ISBN-13: 9783319578811 / English / Paperback / 2017 / 366 pp.
Part I Introduction to the Human Visual System (HVS)
1 Visual Attention.- 1.1 Visual Attention: A Historical Review.- 1.1.1 Von Helmholtz’s “Where”.- 1.1.2 James’ “What”.- 1.1.3 Gibson’s “How”.- 1.1.4 Broadbent’s “Selective Filter”.- 1.1.5 Deutsch and Deutsch’s “Importance Weightings”.- 1.1.6 Yarbus and Noton and Stark’s “Scanpaths”.- 1.1.7 Posner’s “Spotlight”.- 1.1.8 Treisman’s “Glue”.- 1.1.9 Kosslyn’s “Window”.- 1.2 Visual Attention and Eye Movements.- 1.3 Summary and Further Reading
2 Neurological Substrate of the HVS.- 2.1 The Eye.- 2.2 The Retina.- 2.2.1 The Outer Layer.- 2.2.2 The Inner Nuclear Layer.- 2.2.3 The Ganglion Layer.- 2.3 The Optic Tract and M/P Visual Channels.- 2.4 The Occipital Cortex and Beyond.- 2.4.1 Motion-Sensitive Single-Cell Physiology.- 2.5 Summary and Further Reading
3 Visual Psychophysics.- 3.1 Spatial Vision.- 3.2 Temporal Vision.- 3.2.1 Perception of Motion in the Visual Periphery.- 3.2.2 Sensitivity to Direction of Motion in the Visual Periphery.- 3.3 Color Vision.- 3.4 Implications for Attentional Design of Visual Displays.- 3.5 Summary and Further Reading
4 Taxonomy and Models of Eye Movements.- 4.1 The Extraocular Muscles and the Oculomotor Plant.- 4.2 Saccades.- 4.3 Smooth Pursuits.- 4.4 Fixations (Microsaccades, Drift, and Tremor).- 4.5 Nystagmus.- 4.6 Implications for Eye Movement Analysis.- 4.7 Summary and Further Reading
Part II Eye Tracking Systems
5 Eye Tracking Techniques.- 5.1 Electro-OculoGraphy (EOG).- 5.2 Scleral Contact Lens/Search Coil.- 5.3 Photo-OculoGraphy (POG) or Video-OculoGraphy (VOG).- 5.4 Video-Based Combined Pupil/Corneal Reflection.- 5.5 Classifying Eye Trackers in “Mocap” Terminology.- 5.6 Summary and Further Reading
6 Head-Mounted System Hardware Installation.- 6.1 Integration Issues and Requirements.- 6.2 System Installation.- 6.3 Lessons Learned from the Installation at Clemson.- 6.4 Summary and Further Reading
7 Head-Mounted System Software Development.- 7.1 Mapping Eye Tracker Screen Coordinates.- 7.1.1 Mapping Screen Coordinates to the 3D Viewing Frustum.- 7.1.2 Mapping Screen Coordinates to the 2D Image.- 7.1.3 Measuring Eye Tracker Screen Coordinate Extents.- 7.2 Mapping Flock Of Birds Tracker Coordinates.- 7.2.1 Obtaining the Transformed View Vector.- 7.2.2 Obtaining the Transformed Up Vector.- 7.2.3 Transforming an Arbitrary Vector.- 7.3 3D Gaze Point Calculation.- 7.3.1 Parametric Ray Representation of Gaze Direction.- 7.4 Virtual Gaze Intersection Point Coordinates.- 7.4.1 Ray/Plane Intersection.- 7.4.2 Point-In-Polygon Problem.- 7.5 Data Representation and Storage.- 7.6 Summary and Further Reading
8 Head-Mounted System Calibration.- 8.1 Software Implementation.- 8.2 Ancillary Calibration Procedures.- 8.2.1 Internal 2D Calibration.- 8.2.2 Internal 3D Calibration.- 8.3 Summary and Further Reading
9 Table-Mounted System Hardware Installation.- 9.1 Integration Issues and Requirements.- 9.2 System Installation.- 9.3 Lessons Learned from the Installation at Clemson.- 9.4 Summary and Further Reading
10 Table-Mounted System Software Development.- 10.1 Linux Tobii Client Application Program Interface.- 10.1.1 Tet Init.- 10.1.2 Tet Connect, Tet Disconnect.- 10.1.3 Tet Start, Tet Stop.- 10.1.4 Tet CalibClear, Tet CalibLoadFromFile, Tet CalibSaveToFile, Tet CalibAddPoint, Tet CalibRemovePoints, Tet CalibGetResult, Tet CalibCalculateAndSet.- 10.1.5 Tet SynchronizeTime, Tet PerformSystemCheck.- 10.1.6 Tet GetSerialNumber, Tet GetLastError, Tet GetLastErrorAsText.- 10.1.7 Tet CallbackFunction.- 10.2 A Simple OpenGL/GLUT GUI Example.- 10.3 Caveats.- 10.4 Summary and Further Reading
11 Table-Mounted System Calibration.- 11.1 Software Implementation.- 11.2 Summary and Further Reading
12 Using an Open Source Application Program Interface.- 12.1 API Implementation and XML Format.- 12.2 Client/Server Communication.- 12.3 Server Configuration.- 12.4 API Extensions.- 12.5 Interactive Client Example using Python.- 12.5.1 Using Gazepoint’s Built-in Calibration.- 12.5.2 Using Gazepoint’s Custom Calibration Capabilities.- 12.6 Summary and Further Reading
13 Eye Movement Analysis.- 13.1 Signal Denoising.- 13.2 Dwell-Time Fixation Detection.- 13.3 Velocity-Based Saccade Detection.- 13.4 Eye Movement Analysis in Three Dimensions.- 13.4.1 Parameter Estimation.- 13.4.2 Fixation Grouping.- 13.4.3 Eye Movement Data Mirroring.- 13.5 Summary and Further Reading
14 Advanced Eye Movement Analysis.- 14.1 Signal Denoising.- 14.2 Velocity-Based Saccade Detection.- 14.3 Microsaccade Detection.- 14.4 Validation: Computing Accuracy, Precision, and Refitting.- 14.5 Binocular Eye Movement Analysis: Vergence.- 14.6 Ambient/Focal Eye Movement Analysis.- 14.7 Transition Entropy Analysis.- 14.8 Spatial Distribution Analysis.- 14.9 Summary and Further Reading
15 The Gaze Analytics Pipeline.- 15.1 Gaze Analytics in Five Easy Steps.- 15.1.1 Step 0: Data Collection.- 15.1.2 Step 1 (dirs): Directory Creation.- 15.1.3 Step 2 (raw): Extract Raw Gaze Data.- 15.1.4 Step 3 (graph or process): Graph or Process Raw Data.- 15.1.5 Step 4 (collate): Collate Data Prior to Statistical Analysis.- 15.1.6 Step 5 (stats): Perform Statistical Analyses.- 15.2 Gaze Analytics: A Worked Example.- 15.2.1 Scanpath Visualization.- 15.2.2 Traditional Eye Movement Metrics.- 15.2.3 Advanced Eye Movement Analysis.- 15.3 Summary and Further Reading
16 Eye Movement Synthesis.- 16.1 Procedural Simulation of Eye Movements.- 16.1.1 Modeling Saccades.- 16.1.2 Modeling Fixations.- 16.2 Adding Synthetic Eye Tracking Noise.- 16.3 Summary and Further Reading
Part III Eye Tracking Methodology
17 Experimental Design.- 17.1 Formulating a Hypothesis.- 17.2 Forms of Inquiry.- 17.2.1 Experiments Versus Observational Studies.- 17.2.2 Laboratory Versus Field Research.- 17.2.3 Idiographic Versus Nomothetic Research.- 17.2.4 Sample Population Versus Single-Case Experiment Versus
Dr. Duchowski is a professor of Computer Science at Clemson University. His research and teaching interests include visual attention and perception, eye tracking, computer vision, and computer graphics. He has produced a corpus of publications related to eye tracking research and has delivered courses and seminars on the subject at international conferences. He maintains Clemson's eye tracking laboratory, and teaches a regular course on eye tracking methodology, attracting students from a number of disciplines across campus.
Focusing on recent advances in analytical techniques, this third edition of Andrew Duchowski’s successful guide has been revised and extended. It includes new chapters on calibration accuracy, precision, and correction; advanced eye movement analysis; binocular eye movement analysis; practical gaze analytics; and eye movement synthesis.
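For orientation, the calibration-quality measures named above are conventionally defined as accuracy (the mean angular offset of gaze samples from a known validation target) and precision (the RMS sample-to-sample dispersion of those samples). The Python sketch below illustrates these standard definitions on hypothetical validation data; it is not code from the book, and the function name and sample values are invented for illustration.

```python
import numpy as np

def accuracy_precision_deg(gaze_deg, target_deg):
    """Accuracy and precision of gaze samples recorded while a known
    validation target was fixated; positions are in degrees of visual
    angle, gaze_deg has shape (n, 2), target_deg has shape (2,)."""
    gaze = np.asarray(gaze_deg, dtype=float)
    target = np.asarray(target_deg, dtype=float)
    # Accuracy: mean angular offset of the samples from the target.
    accuracy = float(np.mean(np.linalg.norm(gaze - target, axis=1)))
    # Precision: RMS of sample-to-sample angular distances (dispersion).
    steps = np.linalg.norm(np.diff(gaze, axis=0), axis=1)
    precision = float(np.sqrt(np.mean(steps ** 2)))
    return accuracy, precision

# Hypothetical validation samples jittering around a target at (10, 5) degrees.
samples = [[10.2, 5.1], [10.3, 4.9], [10.1, 5.0], [10.4, 5.2]]
print(accuracy_precision_deg(samples, [10.0, 5.0]))
```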
Eye Tracking Methodology opens with useful background information, including an introduction to the human visual system and key issues in visual perception and eye movement. The author then surveys eye-tracking devices and provides a detailed introduction to the technical requirements necessary for installing a system and developing an application program. Modern programming examples (in Python) are included, and the author outlines the gaze analytics pipeline, a step-by-step data-processing sequence leading from raw data to statistical analysis.
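As a rough illustration of what such a pipeline does between raw data and statistics, the sketch below classifies raw gaze samples with a simple velocity threshold and then collates per-trial fixation metrics ready for statistical analysis. It is a minimal, self-contained example built on assumptions (a 30°/s threshold, synthetic noisy data), not the book’s gaze analytics scripts.

```python
import numpy as np

def classify_fixations(t, x, y, vel_thresh_dps=30.0):
    """Label each raw gaze sample as fixation (True) or saccade (False)
    using a simple sample-to-sample velocity threshold in degrees/second."""
    t, x, y = map(np.asarray, (t, x, y))
    v = np.hypot(np.diff(x), np.diff(y)) / np.diff(t)      # instantaneous velocity
    return np.concatenate([[True], v < vel_thresh_dps])    # first sample kept as fixation

def collate_trial(t, is_fix):
    """Collapse labelled samples into per-trial metrics for statistics."""
    t = np.asarray(t, dtype=float)
    runs, start = [], None
    for i, f in enumerate(is_fix):                          # find runs of fixation samples
        if f and start is None:
            start = i
        elif not f and start is not None:
            runs.append((t[start], t[i - 1]))
            start = None
    if start is not None:
        runs.append((t[start], t[-1]))
    durations = [b - a for a, b in runs]
    return {"fixation_count": len(runs),
            "mean_fixation_duration": float(np.mean(durations)) if durations else 0.0}

# Hypothetical raw samples: one saccade from (1, 0) to (8, 0) degrees at t = 0.5 s.
rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 101)
x = np.where(t < 0.5, 1.0, 8.0) + rng.normal(0.0, 0.05, t.size)
y = rng.normal(0.0, 0.05, t.size)
print(collate_trial(t, classify_fixations(t, x, y)))
```

In the book’s own pipeline the choice of denoising, detection thresholds, and output metrics would follow the dedicated chapters listed in the table of contents above; this sketch only shows the overall raw-to-collated flow.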
Focusing on the use of modern video-based, corneal-reflection eye trackers (the most widely available and affordable type of system), Andrew Duchowski examines a number of interesting and challenging applications in human factors, collaborative systems, virtual reality, marketing, and advertising. His primary focus is on methodology and on how the analysis of eye movements can enhance the research and development of anything that is inspected visually.
Stefan Robila, reviewing the second edition, said: “The book is written in an easy-to-understand language. Given its breadth, it may be most appropriate for scientists and students starting in this field. ... Overall, I found it to be a solid book on a fascinating topic.” (ACM Computing Reviews, October 2008)