EyeGraph: Modularity-aware Spatio-Temporal Graph Clustering for Continuous Event-based Eye Tracking (Supplemental Material)

Neural Information Processing Systems 

Both the dataset and the source code are released under two licenses: (1) the Creative Commons CC-BY-NC 4.0 license and (2) a custom license. Users/data requestors must agree to both licenses; if any term(s) of the two licenses conflict, the custom license takes priority over the Creative Commons CC-BY-NC 4.0 license.

Each session per participant in the conventional lab setting consists of four recordings, each lasting approximately four minutes. In the first two recordings, the participants wore the DAVIS346 camera, whereas in the last two recordings they wore the Pupil-Core eye tracker. The randomized movement pattern of the visual stimulus (a white circle) was identical across each cross-device recording pair (DAVIS346 and Pupil-Core) but varied between the two recordings captured with the same wearable device.

In the ambient luminance-changing setting, each session per participant (seated in an office environment similar to the conventional lab setting) also consists of four recordings. For the first two recordings, the participant wears the DAVIS346 sensor under two lighting conditions:

- Constant lighting condition: near-eye illuminance maintained at 65 lux throughout the experiment.
- Variable lighting condition: near-eye illuminance alternates between 65 lux and 8 lux every one-minute span.

For the last two recordings, the participant wears the Pupil-Core eye tracker under the same two lighting conditions.

The participant mobility setting mirrors the ambient luminance-changing setting for data recording, but with two mobility conditions (under a constant default near-eye illuminance of 65 lux):

- Stationary condition: sitting in an office environment.
- Mobile condition: moving freely while carrying a 14-inch laptop that displays the visual stimuli.
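The alternating illuminance schedule in the variable lighting condition can be expressed as a simple function of elapsed recording time. The sketch below is purely illustrative (the function name and one-minute period parameter are our own, not part of the released tooling) and encodes the 65 lux / 8 lux alternation described above:

```python
def near_eye_lux(elapsed_s: float, period_s: float = 60.0) -> int:
    """Scheduled near-eye illuminance (lux) at a given elapsed time.

    Illustrative only: the schedule starts at 65 lux and toggles
    between 65 lux and 8 lux after each one-minute span, as in the
    variable lighting condition.
    """
    minute_index = int(elapsed_s // period_s)
    return 65 if minute_index % 2 == 0 else 8

# First minute is bright, second minute is dim, and so on.
print(near_eye_lux(30.0))   # 65
print(near_eye_lux(90.0))   # 8
```

Under this schedule, a four-minute recording contains two bright spans and two dim spans of equal length.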