
Codebase for studies testing an information-seeking account of eye movements during language comprehension

Primary language: Jupyter Notebook. License: MIT.

An information-seeking account of eye movements in signed and spoken language comprehension

This repository contains all of the data processing, analysis, and modelling code for a set of studies testing an information-seeking account of eye movements across different language processing contexts: signed vs. spoken language, written text vs. speech, clear vs. noisy speech, and speech with vs. without eye gaze.

The files for each study follow these naming conventions:

  • Signed vs. spoken language (children): speed_acc_child_asl
  • Written text vs. spoken language (adults): speed_acc_adult_text
  • Clear speech vs. noisy speech & Gaze vs. No-Gaze (adults): speed_acc_adult_ng
  • Clear speech vs. noisy speech (children): speed_acc_child_noise
  • Gaze vs. No-Gaze (children): speed_acc_child_gaze

You can find the raw data, data processing, and tidy data files here:

You can find the paper writeups here (note that these RMarkdown files should be buildable from scratch):

For more details about the analyses reported in the papers, see the following analysis files: