Inferring target locations from gaze data: A smartphone study
Author(s) / Creator(s)
Mueller, Stefanie
Abstract / Description
Although smartphones are widely used in everyday life, studies of viewing behavior mainly employ desktop computers. This study examines whether closely spaced target locations on a smartphone can be decoded from gaze. Subjects wore a head-mounted eye tracker and fixated a target that successively appeared at 30 positions spaced by 10.0 x 9.0 mm. Two conditions were conducted: "hand-held" (phone in the subject's hand) and "mounted" (phone on a surface). Linear mixed models were fitted to examine whether gaze differed between targets, and t-tests on root-mean-squared errors evaluated the deviation between gaze and target positions. To decode target positions from gaze data, a classifier was trained and its performance assessed for every subject and condition. While gaze positions differed between targets (main effect of "target"), gaze deviated from the true positions. The classifier's accuracy for the 30 locations varied considerably between subjects ("mounted": 30 to 93%; "hand-held": 8 to 100%).
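The decoding step described in the abstract can be illustrated with a minimal sketch. This is not the study's actual pipeline: the simulated data, the assumed 5 x 6 grid layout of the 30 positions, the noise levels, and the nearest-centroid classifier are all illustrative assumptions; only the 30-position count and the 10.0 x 9.0 mm spacing come from the abstract.

```python
import random
import math

# Illustrative sketch only: simulated gaze, not the study's recordings.
# 30 target positions, here assumed to lie on a 5 x 6 grid
# spaced 10.0 x 9.0 mm as stated in the abstract.
TARGETS = [(col * 10.0, row * 9.0) for row in range(6) for col in range(5)]

def simulate_gaze(target, noise_mm, n=20, rng=random):
    """Draw n noisy gaze samples (Gaussian noise) around a target position."""
    tx, ty = target
    return [(rng.gauss(tx, noise_mm), rng.gauss(ty, noise_mm)) for _ in range(n)]

def nearest_target(gaze):
    """Nearest-centroid decoding: assign a gaze sample to the closest target."""
    gx, gy = gaze
    return min(range(len(TARGETS)),
               key=lambda i: math.hypot(TARGETS[i][0] - gx, TARGETS[i][1] - gy))

def decoding_accuracy(noise_mm, rng):
    """Fraction of simulated gaze samples decoded to the correct target."""
    hits = total = 0
    for idx, target in enumerate(TARGETS):
        for gaze in simulate_gaze(target, noise_mm, rng=rng):
            hits += nearest_target(gaze) == idx
            total += 1
    return hits / total

rng = random.Random(0)
# Precise gaze decodes almost perfectly; noisier gaze degrades accuracy,
# loosely mirroring the wide between-subject range reported in the abstract.
print(decoding_accuracy(noise_mm=1.0, rng=rng))
print(decoding_accuracy(noise_mm=6.0, rng=rng))
```

Because the assumed target spacing (about 9-10 mm) is small relative to plausible gaze noise, decoding accuracy in this toy setup is highly sensitive to the noise level, which is one way the large between-subject spread in the reported accuracies could arise.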
Keyword(s)
fixations, mobile devices, accuracy, gaze positions
Date of first publication
2019-06
Is part of
11th ACM Symposium on Eye Tracking Research & Applications 2019, Denver, Colorado, USA
Publisher
PsychArchives
Citation
Mueller, S. (2019, June). Inferring target locations from gaze data: A smartphone study. PsychArchives. https://doi.org/10.23668/psycharchives.2500
File: ETRA19_final3.pdf (Adobe PDF, 2.25 MB, MD5: 65cfea944219f72c33053b055f4dc3b5)
PsychArchives acquisition timestamp: 2019-06-19T13:12:34Z
Made available on: 2019-06-19T13:12:34Z
Persistent Identifier: https://hdl.handle.net/20.500.12034/2124
Persistent Identifier: https://doi.org/10.23668/psycharchives.2500
Language of content: eng
Is related to: https://doi.org/10.23668/psycharchives.2491
Is related to: https://doi.org/10.1145/3314111.3319847
Dewey Decimal Classification number(s): 150
DRO type: conferenceObject