ETRA 2012: Santa Barbara, CA, USA
- Carlos Hitoshi Morimoto, Howell O. Istance, Stephen N. Spencer, Jeffrey B. Mulligan, Pernilla Qvarfordt: Proceedings of the 2012 Symposium on Eye-Tracking Research and Applications, ETRA 2012, Santa Barbara, CA, USA, March 28-30, 2012. ACM 2012, ISBN 978-1-4503-1221-9
Gaze visualization
- Andrew T. Duchowski, Margaux M. Price, Miriah D. Meyer, Pilar Orero: Aggregate gaze visualization with real-time heatmaps. 13-20
- Akira Egawa, Susumu Shirayama: A method to construct an importance map of an image using the saliency map model and eye movement analysis. 21-28
- Thies Pfeiffer: Measuring and visualizing attention in space with 3D attention volumes. 29-36
- Kai Essig, Daniel Dornbusch, Daniel Prinzhorn, Helge J. Ritter, Jonathan Maycock, Thomas Schack: Automatic analysis of 3D gaze coordinates on scene objects using data from eye-tracking and motion-capture systems. 37-44
Eye tracking systems
- Kenneth Holmqvist, Marcus Nyström, Fiona Bríd Mulvey: Eye tracker data quality: what it is and how to measure it. 45-52
- Dmitri Model, Moshe Eizenman: A probabilistic approach for the estimation of angle kappa in infants. 53-58
- Flavio Luiz Coutinho, Carlos Hitoshi Morimoto: Augmenting the robustness of cross-ratio gaze tracking methods to head movement. 59-66
Gaze informed user interfaces
- Reynold J. Bailey, Ann McNamara, Aaron Costello, Srinivas Sridharan, Cindy Grimm: Impact of subtle gaze direction on short-term spatial information recall. 67-74
- Srinivas Sridharan, Ann McNamara, Cindy Grimm: Subtle gaze manipulation for improved mammography training. 75-82
- Roman Bednarik, Hana Vrzakova, Michal Hradis: What do you want to do next: a novel approach for intent prediction in gaze-based interaction. 83-90
- Takumi Toyama, Thomas Kieninger, Faisal Shafait, Andreas Dengel: Gaze guided object recognition using a head-mounted eye tracker. 91-98
Visual attention: studies, tools, methods
- Izabela Krejtz, Agnieszka Szarkowska, Krzysztof Krejtz, Agnieszka Walczak, Andrew T. Duchowski: Audio description as an aural guide of children's visual attention: evidence from an eye-tracking study. 99-106
- Nadir Weibel, Adam Fouse, Colleen Emmenegger, Sara Kimmich, Edwin L. Hutchins: Let's look at the cockpit: exploring mobile eye-tracking for observational research on the flight deck. 107-114
- Ryo Yonetani, Hiroaki Kawashima, Takashi Matsuyama: Multi-mode saliency dynamics model for analyzing gaze and attention. 115-122
- Ralf Biedert, Jörn Hees, Andreas Dengel, Georg Buscher: A robust realtime reading-skimming classifier. 123-130
Gaze based interaction
- Sophie Stellmach, Raimund Dachselt: Designing gaze-based user interfaces for steering in virtual environments. 131-138
- Diako Mardanbegi, Dan Witzner Hansen, Thomas Pederson: Eye-based head gestures. 139-146
- Henna Heikkilä, Kari-Jouko Räihä: Simple gaze gestures and the closure of the eyes as an interaction technique. 147-154
Eye tracking systems issues I
- Lisa M. Tiberio, Roxanne L. Canosa: Self-localization using fixations as landmarks. 155-160
- Michael Bartels, Sandra P. Marshall: Measuring cognitive workload across different eye tracking hardware platforms. 161-164
- Michael Raschke, Xuemei Chen, Thomas Ertl: Parallel scan-path visualization. 165-168
- Hui Tang, Joseph J. Topczewski, Anna M. Topczewski, Norbert J. Pienta: Permutation test for groups of scanpaths using normalized Levenshtein distances and application in NMR questions. 169-172
- Lech Swirski, Andreas Bulling, Neil A. Dodgson: Robust real-time pupil tracking in highly off-axis images. 173-176
- Mélodie Vidal, Andreas Bulling, Hans Gellersen: Detection of smooth pursuits using eye movement shape features. 177-180
Eye tracking applications I
- Carlo Robino, Sofia Crespi, Ottavia Silva, Claudio de'Sperati: Parsing visual stimuli into temporal units through eye movements. 181-184
- Simon J. Büchner, Jan Malte Wiener, Christoph Hölscher: Methodological triangulation to assess sign placement. 185-188
- Tom Foulsham, Alan Kingstone: Goal-driven and bottom-up gaze in an active real-world search task. 189-192
- Adrian Madsen, Adam M. Larson, Lester C. Loschky, N. Sanjay Rebello: Using ScanMatch scores to understand differences in eye movements between correct and incorrect solvers on physics problems. 193-196
- Prateek Hejmady, N. Hari Narayanan: Visual attention patterns during program debugging with an IDE. 197-200
- Ralf Biedert, Andreas Dengel, Mostafa Elshamy, Georg Buscher: Towards robust gaze-based objective quality measures for text. 201-204
Eye tracking systems issues II
- Juan J. Cerrolaza, Arantxa Villanueva, Maria Villanueva, Rafael Cabeza: Error characterization and compensation in eye tracking systems. 205-208
- Jan Drewes, Guillaume S. Masson, Anna Montagnini: Shifts in reported gaze position due to changes in pupil size: ground truth and compensation. 209-212
- Akihiro Tsukada, Takeo Kanade: Automatic acquisition of a 3D eye model for a wearable first-person vision device. 213-216
- Laura Sesma, Arantxa Villanueva, Rafael Cabeza: Evaluation of pupil center-eye corner vector for gaze estimation using a web cam. 217-220
- Thomas B. Kinsman, Karen M. Evans, Glenn Sweeney, Tommy P. Keane, Jeff B. Pelz: Ego-motion compensation improves fixation detection in wearable eye tracking. 221-224
Eye tracking applications II
- Morten Lund Dybdal, Javier San Agustin, John Paulin Hansen: Gaze input for mobile devices by dwell and gestures. 225-228
- Aulikki Hyrskykari, Howell O. Istance, Stephen Vickers: Gaze gestures or dwell-based interaction? 229-232
- Howell O. Istance, Stephen Vickers, Aulikki Hyrskykari: The validity of using non-representative users in gaze communication research. 233-236
- Zhen Liang, Qiang Fu, Zheru Chi: Eye typing of Chinese characters. 237-240
- Per Ola Kristensson, Keith Vertanen: The potential of dwell-free eye-typing for fast assistive gaze communication. 241-244
Systems, tools, methods
- Benedict C. O. F. Fehringer, Andreas Bulling, Antonio Krüger: Analysing the potential of adapting head-mounted eye tracker calibration to a new user. 245-248
- Craig Hennessey, Jacob Fiset: Long range eye tracking: bringing eye tracking into the living room. 249-252
- Dmitri Model, Moshe Eizenman: A general framework for extension of a tracking range of user-calibration-free remote eye-gaze tracking systems. 253-256
- Takashi Nagamatsu, Michiya Yamamoto, Ryuichi Sugano, Junzo Kamahara: Mathematical model for wide range gaze tracking system based on corneal reflections and pupil using stereo cameras. 257-260
- Yanxia Zhang, Andreas Bulling, Hans Gellersen: Towards pervasive eye tracking using low-level image features. 261-264
- Jeffrey B. Mulligan: A GPU-accelerated software eye tracking system. 265-268
- Jayson Turner, Andreas Bulling, Hans Gellersen: Extending the visual field of a head-mounted eye tracker for pervasive eye-based interaction. 269-272
- Akira Utsumi, Kotaro Okamoto, Norihiro Hagita, Kazuhiro Takahashi: Gaze tracking in wide area using multiple camera observations. 273-276
- Corey Holland, Oleg V. Komogortsev: Eye tracking on unmodified common tablets: challenges and solutions. 277-280
- Oleg Spakov: Comparison of eye movement filters used in HCI. 281-284
- Enkelejda Tafaj, Gjergji Kasneci, Wolfgang Rosenstiel, Martin Bogdan: Bayesian online clustering of eye movement data. 285-288
- Pieter J. Blignaut, Tanya René Beelders: The precision of eye-trackers: a case for a new measure. 289-292
- Pieter J. Blignaut, Tanya René Beelders: TrackStick: a data quality measuring tool for Tobii eye trackers. 293-296
- Samuel John, Erik Weitnauer, Hendrik Koesling: Entropy-based correction of eye tracking data for static scenes. 297-300
- Detlev Droege, Dietrich Paulus: A flexible gaze tracking algorithm evaluation workbench. 301-304
- Christopher McMurrough, Vangelis Metsis, Jonathan Rich, Fillia Makedon: An eye tracking dataset for point of gaze detection. 305-308
- Geoffrey Tien, M. Stella Atkins, Bin Zheng: Measuring gaze overlap on videos between multiple observers. 309-312
- Peter Kiefer, Florian Straub, Martin Raubal: Towards location-aware mobile eye tracking. 313-316
- Anneli Olsen, Ricardo Matos: Identifying parameter values for an I-VT fixation filter suitable for handling data sampled with various sampling frequencies. 317-320
- Andrew D. Ouzts, Andrew T. Duchowski: Comparison of eye movement metrics recorded at different sampling rates. 321-324
- Andrew D. Ouzts, Andrew T. Duchowski, Toni Gomes, Rupert A. Hurley: On the conspicuity of 3-D fiducial markers in 2-D projected environments. 325-328
- Michal Hradis, Shahram Eivazi, Roman Bednarik: Voice activity detection from gaze in video mediated communication. 329-332
- Hideyuki Kubota, Yusuke Sugano, Takahiro Okabe, Yoichi Sato, Akihiro Sugimoto, Kazuo Hiraki: Incorporating visual field characteristics into a saliency map. 333-336
Uses and applications
- Tanya René Beelders, Pieter J. Blignaut: Measuring the performance of gaze and speech for text input. 337-340
- Xiaoyu Zhao, Elias Daniel Guestrin, Dimitry Sayenko, Tyler Simpson, Michel J. A. Gauthier, Milos R. Popovic: Typing with eye-gaze and tooth-clicks. 341-344
- Ville Rantanen, Jarmo Verho, Jukka Lekkala, Outi Tuisku, Veikko Surakka, Toni Vanhala: The effect of clicking by smiling on the accuracy of head-mounted gaze tracking. 345-348
- Tanya René Beelders, Pieter J. Blignaut: Using eye gaze and speech to simulate a pointing device. 349-352
- Antonio Diaz Tula, Filipe Morgado Simoes de Campos, Carlos H. Morimoto: Dynamic context switching for gaze based interaction. 353-356
- Sophie Stellmach, Raimund Dachselt: Investigating gaze-supported multimodal pan and zoom. 357-360
- Ralf Biedert, Andreas Dengel, Christoph Käding: Universal eye-tracking based text cursor warping. 361-364
- Anders Møller Nielsen, Anders Lerchedahl Petersen, John Paulin Hansen: Gaming with gaze and losing with a smile. 365-368
- Daniela Giordano, Isaak Kavasidis, Carmelo Pino, Concetto Spampinato: Content based recommender system by using eye gaze data. 369-372
- Masahiro Toyoura, Tomoya Sawada, Mamoru Kunihiro, Xiaoyang Mao: Using eye-tracking data for automatic film comic creation. 373-376
- Shahram Eivazi, Roman Bednarik, Markku Tukiainen, Mikael von und zu Fraunberg, Ville Leinonen, Juha E. Jääskeläinen: Gaze behaviour of expert and novice microneurosurgeons differs during observations of tumor removal recordings. 377-380
- Bonita Sharif, Michael Falcone, Jonathan I. Maletic: An eye-tracking study on the role of scan time in finding source code defects. 381-384
- Ralf Biedert, Andreas Dengel, Georg Buscher, Arman Vartan: Reading and estimating gaze on smart phones. 385-388
- Poja Shams, Erik Wästlund, Lars Witell: Revisiting Russo and Leclerc. 389-392
- Rui Li, Jeff B. Pelz, Pengcheng Shi, Cecilia Ovesdotter Alm, Anne R. Haake: Learning eye movement patterns for characterization of perceptual expertise. 393-396
- Michael E. Holmes, Sheree Josephson, Ryan E. Carney: Visual attention to television programs with a second-screen application. 397-400
- Peter G. Mahon, Roxanne L. Canosa: Prisoners and chickens: gaze locations indicate bounded rationality. 401-404
- M. Stella Atkins, Xianta Jiang, Geoffrey Tien, Bin Zheng: Saccadic delays on targets while watching videos. 405-408
- Catrin Hasse, Dietrich Grasshoff, Carmen Bruder: How to measure monitoring performance of pilots and air traffic controllers. 409-412
- Oskar Palinko, Andrew L. Kun: Exploring the effects of visual cognitive load and illumination on pupil diameter in driving simulators. 413-416
