SR Research

Platforms:

  • Windows 7 / 10

  • Linux

  • macOS

Required Python Version:

  • Python 3.6+

Supported Models:

  • EyeLink 1000

  • EyeLink 1000 Plus

Additional Software Requirements

The SR Research EyeLink implementation of the ioHub common eye tracker interface uses the pylink package written by SR Research. If using a PsychoPy3 standalone installation, this package should already be included.

If you are manually installing PsychoPy3, please install the appropriate version of pylink. Downloads are available to SR Research customers from their support website.

EyeTracker Class

class psychopy.iohub.devices.eyetracker.hw.sr_research.eyelink.EyeTracker[source]

The SR Research EyeLink implementation of the Common Eye Tracker Interface can be used by providing the following EyeTracker path as the device class in the iohub_config.yaml device settings file:

eyetracker.hw.sr_research.eyelink

Examples

  1. Start ioHub with SR Research EyeLink 1000 and run tracker calibration:

    from psychopy.iohub import launchHubServer
    from psychopy.core import getTime, wait
    
    
    iohub_config = {'eyetracker.hw.sr_research.eyelink.EyeTracker':
                    {'name': 'tracker',
                     'model_name': 'EYELINK 1000 DESKTOP',
                     'runtime_settings': {'sampling_rate': 500,
                                          'track_eyes': 'RIGHT'}
                     }
                    }
    io = launchHubServer(**iohub_config)
    
    # Get the eye tracker device.
    tracker = io.devices.tracker
    
    # run eyetracker calibration
    r = tracker.runSetupProcedure()
    
  2. Print all eye tracker events received for 2 seconds:

    # Check for and print any eye tracker events received...
    tracker.setRecordingState(True)
    
    stime = getTime()
    while getTime()-stime < 2.0:
        for e in tracker.getEvents():
            print(e)
    
  3. Print current eye position for 5 seconds:

    # Check for and print current eye position every 100 msec.
    stime = getTime()
    while getTime()-stime < 5.0:
        print(tracker.getPosition())
        wait(0.1)
    
    tracker.setRecordingState(False)
    
    # Stop the ioHub Server
    io.quit()
    
clearEvents(event_type=None, filter_id=None, call_proc_events=True)

Clears any DeviceEvents that have occurred since the last call to the device’s getEvents() or clearEvents() methods.

Note that calling clearEvents() at the device level only clears the given device’s event buffer. The ioHub Process’s Global Event Buffer is unchanged.

Parameters

None

Returns

None

enableEventReporting(enabled=True)[source]

enableEventReporting is the device-type independent method that is equivalent to the EyeTracker-specific setRecordingState method.

getConfiguration()

Retrieve the configuration settings information used to create the device instance. This will be the default settings for the device, found in iohub.devices.<device_name>.default_<device_name>.yaml, updated with any device settings provided via launchHubServer(…).

Changing any values in the returned dictionary has no effect on the device state.

Parameters

None

Returns

The dictionary of the device configuration settings used to create the device.

Return type

(dict)

getEvents(*args, **kwargs)

Retrieve any DeviceEvents that have occurred since the last call to the device’s getEvents() or clearEvents() methods.

Note that calling getEvents() at a device level does not change the Global Event Buffer’s contents.

Parameters
  • event_type_id (int) – If specified, provides the ioHub DeviceEvent ID for which events should be returned. Events that have occurred but do not match the event ID specified are ignored. Event type IDs can be accessed via the EventConstants class; all available event types are class attributes of EventConstants.

  • clearEvents (bool) – Can be used to indicate if the events being returned should also be removed from the device event buffer. True (the default) indicates to remove events being returned. False results in events being left in the device event buffer.

  • asType (str) – Optional kwarg giving the object type to return events as. Valid values are ‘namedtuple’ (the default), ‘dict’, ‘list’, or ‘object’.

Returns

New events that the ioHub has received since the last getEvents() or clearEvents() call to the device. Events are ordered by the ioHub time of each event, with the oldest event at index 0. The event object type is determined by the asType parameter passed to the method. By default a namedtuple object is returned for each event.

Return type

(list)
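The filtering and ordering semantics described above can be sketched with stand-in events (the namedtuple and the event type ID below are illustrative only; real code would use the event objects and EventConstants attributes provided by ioHub):

```python
from collections import namedtuple

# Stand-in for an ioHub event: real events are namedtuples whose 'type'
# field holds an EventConstants ID. The ID value below is made up.
Event = namedtuple("Event", ["type", "time"])
FIXATION_END_ID = 56  # hypothetical; use EventConstants in real code

def filter_events(events, event_type_id=None):
    """Mimic getEvents(event_type_id=...): keep matching events,
    ordered by ioHub time with the oldest event at index 0."""
    if event_type_id is not None:
        events = [e for e in events if e.type == event_type_id]
    return sorted(events, key=lambda e: e.time)

events = [Event(FIXATION_END_ID, 2.5), Event(99, 1.0), Event(FIXATION_END_ID, 0.5)]
print(filter_events(events, FIXATION_END_ID))
# [Event(type=56, time=0.5), Event(type=56, time=2.5)]
```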

getLastGazePosition()[source]

getLastGazePosition returns the most recent (x, y) eye position, in Display device coordinate space, received by the ioHub Server from the EyeLink device. This is the position on the calibrated 2D surface that the eye tracker reports as the current eye position; the units are the units in use by the Display device.

If binocular recording is being performed, the average position of both eyes is returned.

If no samples have been received from the eye tracker, or the eye tracker is not currently recording data, None is returned.

Parameters

None

Returns

None: If the eye tracker is not currently recording data or no eye samples have been received.

tuple: Latest (gaze_x, gaze_y) position of the eye(s).

Return type

None or tuple
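The binocular averaging behaviour described above can be sketched with a small hypothetical helper (not part of the ioHub API); each eye position is an (x, y) tuple, or None when that eye has no data:

```python
def average_gaze(left_pos, right_pos):
    """Average the available eye positions; return None when neither
    eye has data, mirroring getLastGazePosition's described behaviour."""
    eyes = [p for p in (left_pos, right_pos) if p is not None]
    if not eyes:
        return None
    return (sum(p[0] for p in eyes) / len(eyes),
            sum(p[1] for p in eyes) / len(eyes))

print(average_gaze((100.0, 50.0), (110.0, 40.0)))  # (105.0, 45.0)
print(average_gaze(None, None))                    # None
```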

getLastSample()[source]

getLastSample returns the most recent EyeSampleEvent received from the EyeLink system. Any position fields are in Display device coordinate space. If the eye tracker is not recording or is not connected, then None is returned.

Parameters

None

Returns

None: If the eye tracker is not currently recording data.

EyeSample: If the eye tracker is recording in a monocular tracking mode, the latest sample event of this event type is returned.

BinocularEyeSample: If the eye tracker is recording in a binocular tracking mode, the latest sample event of this event type is returned.

Return type

None, EyeSample, or BinocularEyeSample

getPosition()

The getPosition method is the same as the getLastGazePosition method, provided as a consistent cross device method to access the current screen position reported by a device.

See getLastGazePosition for further details.

isRecordingEnabled()[source]

isRecordingEnabled returns True if the eye tracking device is currently connected and sending eye event data to the ioHub server. If the eye tracker is not recording, or is not connected to the ioHub server, False will be returned.

Parameters

None

Returns

True == the device is recording data; False == Recording is not occurring

Return type

bool

runSetupProcedure()[source]

Start the EyeLink Camera Setup and Calibration procedure.

During the system setup, the following keys can be used on either the Host PC or Experiment PC to control the state of the setup procedure:

  • C = Start Calibration

  • V = Start Validation

  • ENTER should be pressed at the end of a calibration or validation to accept the results or, in the case of validation, to apply the optional drift correction that can be performed as part of the validation process in the EyeLink system.

  • ESC can be pressed at any time to exit the current state of the setup procedure and return to the initial blank screen state.

  • O = Exit the runSetupProcedure method and continue with the experiment.

sendCommand(key, value=None)[source]

The sendCommand method sends an EyeLink command key and value to the EyeLink device. Any valid EyeLink command can be sent using this method. Note, however, that doing so is a device-dependent operation and will have no effect on other implementations of the Common Eye Tracker Interface, unless the other eye tracking device happens to support the same command and value format.

If both key and value are provided, internally they are combined into a string of the form:

“key = value”

and this is sent to the EyeLink device. If only key is provided, it is assumed to include both the command name and any values or arguments required by the EyeLink in the one argument, which is sent to the EyeLink device untouched.
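The key/value combination rule can be expressed as a small helper (the function name is illustrative; the real method builds the string internally before sending it to the device):

```python
def format_eyelink_command(key, value=None):
    """Build the command string described above: 'key = value' when a
    value is supplied, otherwise the key string untouched."""
    if value is not None:
        return "{} = {}".format(key, value)
    return key

print(format_eyelink_command("sample_rate", 500))  # sample_rate = 500
print(format_eyelink_command("clear_screen 0"))    # clear_screen 0
```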

sendMessage(message_contents, time_offset=None)[source]

The sendMessage method sends a string (max length 128 characters) to the EyeLink device.

The message will be time stamped and inserted into the native EDF file, if one is being recorded. If no native EyeLink data file is being recorded, this method is a no-op.
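Since messages longer than 128 characters are not supported, a defensive wrapper applied before calling sendMessage might look like the following (the helper is an illustration, not part of the API):

```python
EDF_MSG_MAX_LEN = 128  # message length limit described for sendMessage

def clip_edf_message(message_contents):
    """Return the message truncated to the 128-character limit so a
    call to tracker.sendMessage() never exceeds it."""
    return message_contents[:EDF_MSG_MAX_LEN]

# Usage sketch: tracker.sendMessage(clip_edf_message("TRIAL_START 1"))
print(len(clip_edf_message("x" * 300)))  # 128
```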

setRecordingState(recording)[source]

setRecordingState enables (recording=True) or disables (recording=False) the recording of eye data by the eye tracker and the sending of any eye data to the ioHub Server. The eye tracker must be connected to the ioHub Server by using the setConnectionState(True) method for recording to be possible.

Parameters

recording (bool) – If True, the eye tracker will start recording data; if False, recording stops.

Returns

the current recording state of the eye tracking device

Return type

bool

trackerSec()[source]

trackerSec returns the current EyeLink Host Application time in sec.msec format.

trackerTime()[source]

trackerTime returns the current EyeLink Host Application time in msec format as a long integer.
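The two methods report the same Host clock in different units, so converting trackerTime's msec value to trackerSec's sec.msec format is a division by 1000 (sketch; the helper name is illustrative):

```python
def tracker_msec_to_sec(tracker_time_msec):
    """Convert EyeLink Host time from msec (as returned by trackerTime)
    to the sec.msec format returned by trackerSec."""
    return tracker_time_msec / 1000.0

print(tracker_msec_to_sec(1234567))  # 1234.567
```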

Supported Event Types

The EyeLink implementation of the ioHub eye tracker interface supports monocular or binocular eye samples as well as fixation, saccade, and blink events.

Eye Samples

class psychopy.iohub.devices.eyetracker.MonocularEyeSampleEvent(object)[source]

A MonocularEyeSampleEvent represents the eye position and eye attribute data collected from one frame or reading of an eye tracker device that is recording from only one eye, or is recording from both eyes and averaging the binocular data.

Event Type ID: EventConstants.MONOCULAR_EYE_SAMPLE

Event Type String: ‘MONOCULAR_EYE_SAMPLE’

time

time of event, in sec.msec format, using psychopy timebase.

eye

Eye that generated the sample. Either EyeTrackerConstants.LEFT_EYE or EyeTrackerConstants.RIGHT_EYE.

gaze_x

The horizontal position of the eye on the computer screen, in Display Coordinate Type Units. Calibration must be done prior to reading (meaningful) gaze data.

gaze_y

The vertical position of the eye on the computer screen, in Display Coordinate Type Units. Calibration must be done prior to reading (meaningful) gaze data.

angle_x

Horizontal eye angle.

angle_y

Vertical eye angle.

raw_x

The uncalibrated x position of the eye in a device specific coordinate space.

raw_y

The uncalibrated y position of the eye in a device specific coordinate space.

pupil_measure_1

Pupil size. Use pupil_measure1_type to determine what type of pupil size data was being saved by the tracker.

pupil_measure1_type

Coordinate space type being used for pupil_measure_1.

ppd_x

Horizontal pixels per visual degree for this eye position as reported by the eye tracker.

ppd_y

Vertical pixels per visual degree for this eye position as reported by the eye tracker.

velocity_x

Horizontal velocity of the eye at the time of the sample; as reported by the eye tracker.

velocity_y

Vertical velocity of the eye at the time of the sample; as reported by the eye tracker.

velocity_xy

2D Velocity of the eye at the time of the sample; as reported by the eye tracker.

status

Indicates if eye sample contains ‘valid’ data. 0 = Eye sample is OK. 2 = Eye sample is invalid.

class psychopy.iohub.devices.eyetracker.BinocularEyeSampleEvent(object)[source]

The BinocularEyeSampleEvent event represents the eye position and eye attribute data collected from one frame or reading of an eye tracker device that is recording both eyes of a participant.

Event Type ID: EventConstants.BINOCULAR_EYE_SAMPLE

Event Type String: ‘BINOCULAR_EYE_SAMPLE’

time

time of event, in sec.msec format, using psychopy timebase.

left_gaze_x

The horizontal position of the left eye on the computer screen, in Display Coordinate Type Units. Calibration must be done prior to reading (meaningful) gaze data.

left_gaze_y

The vertical position of the left eye on the computer screen, in Display Coordinate Type Units. Calibration must be done prior to reading (meaningful) gaze data.

left_angle_x

The horizontal angle of the left eye relative to the head.

left_angle_y

The vertical angle of the left eye relative to the head.

left_raw_x

The uncalibrated x position of the left eye in a device specific coordinate space.

left_raw_y

The uncalibrated y position of the left eye in a device specific coordinate space.

left_pupil_measure_1

Left eye pupil diameter.

left_pupil_measure1_type

Coordinate space type being used for left_pupil_measure_1.

left_ppd_x

Pixels per degree for left eye horizontal position as reported by the eye tracker. Display distance must be set correctly for this value to be accurate.

left_ppd_y

Pixels per degree for left eye vertical position as reported by the eye tracker. Display distance must be set correctly for this value to be accurate.

left_velocity_x

Horizontal velocity of the left eye at the time of the sample; as reported by the eye tracker.

left_velocity_y

Vertical velocity of the left eye at the time of the sample; as reported by the eye tracker.

left_velocity_xy

2D Velocity of the left eye at the time of the sample; as reported by the eye tracker.

right_gaze_x

The horizontal position of the right eye on the computer screen, in Display Coordinate Type Units. Calibration must be done prior to reading (meaningful) gaze data.

right_gaze_y

The vertical position of the right eye on the computer screen, in Display Coordinate Type Units. Calibration must be done prior to reading (meaningful) gaze data.

right_angle_x

The horizontal angle of the right eye relative to the head.

right_angle_y

The vertical angle of the right eye relative to the head.

right_raw_x

The uncalibrated x position of the right eye in a device specific coordinate space.

right_raw_y

The uncalibrated y position of the right eye in a device specific coordinate space.

right_pupil_measure_1

Right eye pupil diameter.

right_pupil_measure1_type

Coordinate space type being used for right_pupil_measure_1.

right_ppd_x

Pixels per degree for right eye horizontal position as reported by the eye tracker. Display distance must be set correctly for this value to be accurate.

right_ppd_y

Pixels per degree for right eye vertical position as reported by the eye tracker. Display distance must be set correctly for this value to be accurate.

right_velocity_x

Horizontal velocity of the right eye at the time of the sample; as reported by the eye tracker.

right_velocity_y

Vertical velocity of the right eye at the time of the sample; as reported by the eye tracker.

right_velocity_xy

2D Velocity of the right eye at the time of the sample; as reported by the eye tracker.

status

Indicates if the eye sample contains ‘valid’ data for the left and right eyes. 0 = Eye sample is OK. 2 = Right eye data is likely invalid. 20 = Left eye data is likely invalid. 22 = Both left and right eye data are likely invalid.
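The status codes above can be decoded per eye with a small helper (illustrative only; not part of the ioHub API):

```python
def decode_binocular_status(status):
    """Map a BinocularEyeSampleEvent status code (0, 2, 20, or 22)
    to per-eye validity, following the convention described above."""
    return {"left_valid": status not in (20, 22),
            "right_valid": status not in (2, 22)}

print(decode_binocular_status(0))   # both eyes OK
print(decode_binocular_status(20))  # left eye data likely invalid
```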

Fixation Events

Successful eye tracker calibration must be performed prior to reading (meaningful) fixation event data.

class psychopy.iohub.devices.eyetracker.FixationStartEvent(object)[source]

A FixationStartEvent is generated when the beginning of an eye fixation (in very general terms, a period of relatively stable eye position) is detected by the eye tracker's sample parsing algorithms.

Event Type ID: EventConstants.FIXATION_START

Event Type String: ‘FIXATION_START’

time

time of event, in sec.msec format, using psychopy timebase.

eye

Eye that generated the event. Either EyeTrackerConstants.LEFT_EYE or EyeTrackerConstants.RIGHT_EYE.

gaze_x

Horizontal gaze position at the start of the event, in Display Coordinate Type Units.

gaze_y

Vertical gaze position at the start of the event, in Display Coordinate Type Units.

angle_x

Horizontal eye angle at the start of the event.

angle_y

Vertical eye angle at the start of the event.

pupil_measure_1

Pupil size. Use pupil_measure1_type to determine what type of pupil size data was being saved by the tracker.

pupil_measure1_type

EyeTrackerConstants.PUPIL_AREA

ppd_x

Horizontal pixels per degree at start of event.

ppd_y

Vertical pixels per degree at start of event.

velocity_xy

2D eye velocity at the start of the event.

status

Event status as reported by the eye tracker.

class psychopy.iohub.devices.eyetracker.FixationEndEvent(object)[source]

A FixationEndEvent is generated when the end of an eye fixation (in very general terms, a period of relatively stable eye position) is detected by the eye tracker's sample parsing algorithms.

Event Type ID: EventConstants.FIXATION_END

Event Type String: ‘FIXATION_END’

time

time of event, in sec.msec format, using psychopy timebase.

eye

Eye that generated the event. Either EyeTrackerConstants.LEFT_EYE or EyeTrackerConstants.RIGHT_EYE.

duration

Duration of the event in sec.msec format.

start_gaze_x

Horizontal gaze position at the start of the event, in Display Coordinate Type Units.

start_gaze_y

Vertical gaze position at the start of the event, in Display Coordinate Type Units.

start_angle_x

Horizontal eye angle at the start of the event.

start_angle_y

Vertical eye angle at the start of the event.

start_pupil_measure_1

Pupil size at the start of the event.

start_pupil_measure1_type

EyeTrackerConstants.PUPIL_AREA

start_ppd_x

Horizontal pixels per degree at start of event.

start_ppd_y

Vertical pixels per degree at start of event.

start_velocity_xy

2D eye velocity at the start of the event.

end_gaze_x

Horizontal gaze position at the end of the event, in Display Coordinate Type Units.

end_gaze_y

Vertical gaze position at the end of the event, in Display Coordinate Type Units.

end_angle_x

Horizontal eye angle at the end of the event.

end_angle_y

Vertical eye angle at the end of the event.

end_pupil_measure_1

Pupil size at the end of the event.

end_pupil_measure1_type

EyeTrackerConstants.PUPIL_AREA

end_ppd_x

Horizontal pixels per degree at end of event.

end_ppd_y

Vertical pixels per degree at end of event.

end_velocity_xy

2D eye velocity at the end of the event.

average_gaze_x

Average horizontal gaze position during the event, in Display Coordinate Type Units.

average_gaze_y

Average vertical gaze position during the event, in Display Coordinate Type Units.

average_angle_x

Average horizontal eye angle during the event.

average_angle_y

Average vertical eye angle during the event.

average_pupil_measure_1

Average pupil size during the event.

average_pupil_measure1_type

EyeTrackerConstants.PUPIL_AREA

average_velocity_xy

Average 2D velocity of the eye during the event.

peak_velocity_xy

Peak 2D velocity of the eye during the event.

status

Event status as reported by the eye tracker.
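As an illustration of working with these attributes, fixation durations can be summarized from events retrieved with getEvents(asType='dict') (a sketch; only the 'duration' field from the attribute list above is assumed):

```python
def mean_fixation_duration(fixation_end_events):
    """Mean duration (sec) over FixationEndEvent dicts; returns None
    when the list is empty."""
    durations = [e["duration"] for e in fixation_end_events]
    if not durations:
        return None
    return sum(durations) / len(durations)

events = [{"duration": 0.250}, {"duration": 0.350}]
print(mean_fixation_duration(events))  # 0.3
```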

Saccade Events

Successful eye tracker calibration must be performed prior to reading (meaningful) saccade event data.

class psychopy.iohub.devices.eyetracker.SaccadeStartEvent(object)[source]

Event Type ID: EventConstants.SACCADE_START

Event Type String: ‘SACCADE_START’
time

time of event, in sec.msec format, using psychopy timebase.

eye

Eye that generated the event. Either EyeTrackerConstants.LEFT_EYE or EyeTrackerConstants.RIGHT_EYE.

gaze_x

Horizontal gaze position at the start of the event, in Display Coordinate Type Units.

gaze_y

Vertical gaze position at the start of the event, in Display Coordinate Type Units.

angle_x

Horizontal eye angle at the start of the event.

angle_y

Vertical eye angle at the start of the event.

pupil_measure_1

Pupil size. Use pupil_measure1_type to determine what type of pupil size data was being saved by the tracker.

pupil_measure1_type

EyeTrackerConstants.PUPIL_AREA

ppd_x

Horizontal pixels per degree at start of event.

ppd_y

Vertical pixels per degree at start of event.

velocity_xy

2D eye velocity at the start of the event.

status

Event status as reported by the eye tracker.

class psychopy.iohub.devices.eyetracker.SaccadeEndEvent(object)[source]

Event Type ID: EventConstants.SACCADE_END

Event Type String: ‘SACCADE_END’
time

time of event, in sec.msec format, using psychopy timebase.

eye

Eye that generated the event. Either EyeTrackerConstants.LEFT_EYE or EyeTrackerConstants.RIGHT_EYE.

duration

Duration of the event in sec.msec format.

start_gaze_x

Horizontal gaze position at the start of the event, in Display Coordinate Type Units.

start_gaze_y

Vertical gaze position at the start of the event, in Display Coordinate Type Units.

start_angle_x

Horizontal eye angle at the start of the event.

start_angle_y

Vertical eye angle at the start of the event.

start_pupil_measure_1

Pupil size at the start of the event.

start_pupil_measure1_type

EyeTrackerConstants.PUPIL_AREA

start_ppd_x

Horizontal pixels per degree at start of event.

start_ppd_y

Vertical pixels per degree at start of event.

start_velocity_xy

2D eye velocity at the start of the event.

end_gaze_x

Horizontal gaze position at the end of the event, in Display Coordinate Type Units.

end_gaze_y

Vertical gaze position at the end of the event, in Display Coordinate Type Units.

end_angle_x

Horizontal eye angle at the end of the event.

end_angle_y

Vertical eye angle at the end of the event.

end_pupil_measure_1

Pupil size at the end of the event.

end_pupil_measure1_type

EyeTrackerConstants.PUPIL_AREA

end_ppd_x

Horizontal pixels per degree at end of event.

end_ppd_y

Vertical pixels per degree at end of event.

end_velocity_xy

2D eye velocity at the end of the event.

average_gaze_x

Average horizontal gaze position during the event, in Display Coordinate Type Units.

average_gaze_y

Average vertical gaze position during the event, in Display Coordinate Type Units.

average_angle_x

Average horizontal eye angle during the event.

average_angle_y

Average vertical eye angle during the event.

average_pupil_measure_1

Average pupil size during the event.

average_pupil_measure1_type

EyeTrackerConstants.PUPIL_AREA

average_velocity_xy

Average 2D velocity of the eye during the event.

peak_velocity_xy

Peak 2D velocity of the eye during the event.

status

Event status as reported by the eye tracker.

Default Device Settings

# This section includes all valid sr_research.eyelink.EyeTracker Device
# settings that can be specified in an iohub_config.yaml
# or in a Python dictionary form and passed to the launchHubServer
# method. Any device parameters not specified when the device class is
# created by the ioHub Process will be assigned the default value
# indicated here.
#
eyetracker.hw.sr_research.eyelink.EyeTracker:
    # name: The unique name to assign to the device instance created.
    #   The device is accessed from within the PsychoPy script 
    #   using the name's value; therefore it must be a valid Python
    #   variable name as well.
    #
    name: tracker

    # enable: Specifies if the device should be enabled by ioHub and monitored
    #   for events.
    #   True = Enable the device on the ioHub Server Process
    #   False = Disable the device on the ioHub Server Process. No events for
    #   this device will be reported by the ioHub Server.
    #    
    enable: True

    # saveEvents: *If* the ioHubDataStore is enabled for the experiment, then
    #   indicate if events for this device should be saved to the
    #   device's event group in the hdf5 event file.
    #   True = Save events for this device to the ioDataStore.
    #   False = Do not save events for this device in the ioDataStore.
    #    
    saveEvents: True

    # streamEvents: Indicate if events from this device should be made available
    #   during experiment runtime to the PsychoPy Process.
    #   True = Send events for this device to  the PsychoPy Process in real-time.
    #   False = Do *not* send events for this device to the PsychoPy Process in real-time.
    #    
    streamEvents: True

    # auto_report_events: Indicate if events from this device should start being
    #   processed by the ioHub as soon as the device is loaded at the start of an experiment,
    #   or if events should only start to be monitored on the device when a call to the
    #   device's enableEventReporting method is made with a parameter value of True.
    #   True = Automatically start reporting events for this device when the experiment starts.
    #   False = Do not start reporting events for this device until enableEventReporting(True)
    #   is set for the device during experiment runtime.
    #
    auto_report_events: False

    # event_buffer_length: Specify the maximum number of events (for each
    #   event type the device produces) that can be stored by the ioHub Server
    #   before each new event results in the oldest event of the same type being
    #   discarded from the ioHub device event buffer.
    #
    event_buffer_length: 1024

    # device_timer: The EyeLink EyeTracker class uses the polling method to
    #   check for new events received from the EyeTracker device. 
    #   device_timer.interval specifies the sec.msec time between device polls.
    #   0.001 = 1 msec, so the device will be polled at a rate of 1000 Hz.   
    device_timer:
        interval: 0.001

    # monitor_event_types: The eyelink implementation of the common eye tracker 
    #   interface supports the following event types. If you would like to 
    #   exclude certain events from being saved or streamed during runtime, 
    #   remove them from the list below.
    #    
    monitor_event_types: [ MonocularEyeSampleEvent, BinocularEyeSampleEvent, FixationStartEvent, FixationEndEvent, SaccadeStartEvent, SaccadeEndEvent, BlinkStartEvent, BlinkEndEvent]
    
    calibration:
        # IMPORTANT: Note that while the gaze position data provided by ioHub
        # will be in the Display's coordinate system, the EyeLink internally
        # always uses a (0,0) to (pixel_width, pixel_height) coordinate system,
        # since calibration point positions are given internally as integers.
        # If the actual display coordinate system were passed to the EyeLink,
        # coordinate types like deg and norm would allow only very coarse
        # target locations during calibration.
        
        # type: sr_research.eyelink.EyeTracker supports the following
        #   calibration types:
        #   THREE_POINTS, FIVE_POINTS, NINE_POINTS, THIRTEEN_POINTS
        type: NINE_POINTS

        # auto_pace: If True, the eye tracker will automatically progress from
        # one calibration point to the next. If False, a manual key or button press
        # is needed to progress to the next point.
        # 
        auto_pace: True

        # pacing_speed: The number of sec.msec that a calibration point should
        # be displayed before moving onto the next point when auto_pace is set to true.
        # If auto_pace is False, pacing_speed is ignored.
        #
        pacing_speed: 1.5
        
        # screen_background_color: Specifies the r,g,b,a background color to 
        #   set the calibration, validation, etc, screens to. Each element of the color
        #   should be a value between 0 and 255. 0 == black, 255 == white. In general
        #   the last value of the color list (alpha) can be left at 255, indicating
        #   that the color is not mixed with the background color at all.
        screen_background_color: [128,128,128,255]
        
        # target_type: Defines what form of calibration graphic should be used
        #   during calibration, validation, etc. modes. sr_research.eyelink.EyeTracker
        #   supports the CIRCLE_TARGET type.
        #   
        target_type: CIRCLE_TARGET

        # target_attributes: The associated target attributes must be supplied
        #   for the given target_type. If target type attribute sections are provided
        #   for target types other than the entry associated with the specified
        #   target_type value, they will simply be ignored.
        #
        target_attributes:
            # outer_diameter and inner_diameter are specified in pixels
            outer_diameter: 33
            inner_diameter: 6
            outer_color: [255,255,255,255]
            inner_color: [0,0,0,255]

    # network_settings: Specify the Host computer IP address. Normally
    #   leaving it set to the default value is fine.
    #
    network_settings: 100.1.1.1

    # default_native_data_file_name: The sr_research.eyelink.EyeTracker supports
    #   saving a native eye tracker EDF data file. The
    #   default_native_data_file_name value sets the default name for
    #   the file that will be saved, not including the .edf file type extension.
    #
    default_native_data_file_name: et_data

    # simulation_mode: Indicate if the eye tracker should provide mouse simulated
    #   eye data instead of sending eye data based on a participant's actual
    #   eye movements.
    #
    simulation_mode: False
    
    # enable_interface_without_connection: Specifies if the ioHub Device
    #   should be enabled without truly connecting to the underlying eye tracking
    #   hardware. If True, ioHub EyeTracker methods can be called but will
    #   provide no-op results and no eye data will be received by the ioHub Server.
    #   This mode can be useful for working on aspects of an eye tracking experiment when the
    #   actual eye tracking device is not available, for example stimulus presentation
    #   or other non eye tracker dependent experiment functionality.
    #    
    enable_interface_without_connection: False

    runtime_settings:
        # sampling_rate: Specify the desired sampling rate to use. Actual
        #   sample rates depend on the model being used. 
        #   Overall, possible rates are 250, 500, 1000, and 2000 Hz.
        #
        sampling_rate: 250

        # track_eyes: Which eye(s) should be tracked? 
        #   Supported Values:  LEFT_EYE, RIGHT_EYE, BINOCULAR
        #        
        track_eyes: RIGHT_EYE

        # sample_filtering: Defines the native eye tracker filtering level to be 
        #   applied to the sample event data before it is sent to the specified data stream.
        #   The sample filter section can contain multiple key : value entries if 
        #   the tracker implementation supports it, where each key is a sample stream type,
        #   and each value is the associated filter level for that sample data stream.
        #   sr_research.eyelink.EyeTracker supported stream types are: 
        #       FILTER_ALL, FILTER_FILE, FILTER_ONLINE 
        #   Supported sr_research.eyelink.EyeTracker filter levels are:
        #       FILTER_LEVEL_OFF, FILTER_LEVEL_1, FILTER_LEVEL_2
        #   Note that if FILTER_ALL is specified, then other sample data stream values are
        #   ignored. If FILTER_ALL is not provided, be sure to specify the setting
        #   for both FILTER_FILE and FILTER_ONLINE; in that case, if either is not
        #   provided, the missing filter type will have its level set to FILTER_LEVEL_OFF.
        #        
        sample_filtering:
            FILTER_ALL: FILTER_LEVEL_OFF
        
        vog_settings:
            # pupil_measure_types: sr_research.eyelink.EyeTracker supports one
            #   pupil_measure_type parameter that is used for all eyes being tracked. 
            #   Valid options are:
            #       PUPIL_AREA, PUPIL_DIAMETER
            #            
            pupil_measure_types: PUPIL_AREA

            # tracking_mode: Define whether the eye tracker should run in a pupil only
            #   mode or run in a pupil-cr mode. Valid options are: 
            #       PUPIL_CR_TRACKING, PUPIL_ONLY_TRACKING
            #   Depending on other settings on the EyeLink Host and the model and mode of
            #   eye tracker being used, this parameter may not be able to set the
            #   specified tracking mode. Check the mode listed on the camera setup
            #   screen of the Host PC after the experiment has started to confirm whether
            #   the requested tracking mode was enabled. IMPORTANT: only use
            #   PUPIL_ONLY_TRACKING mode if using an EyeLink II system, or using
            #   the EyeLink 1000 in a head **fixed** setup. Any head movement
            #   when using PUPIL_ONLY_TRACKING will result in eye position signal drift.
            #            
            tracking_mode: PUPIL_CR_TRACKING

            # pupil_center_algorithm: The pupil_center_algorithm defines what 
            #   type of image processing approach should
            #   be used to determine the pupil center during image processing. 
            #   Valid possible values for eyetracker.hw.sr_research.eyelink.EyeTracker are:
            #   ELLIPSE_FIT or CENTROID_FIT
            #            
            pupil_center_algorithm: ELLIPSE_FIT

    # model_name: The model_name setting allows the definition of the eye tracker model being used.
    #   For the eyelink implementation, valid values are:
    #       'EYELINK 1000 DESKTOP', 'EYELINK 1000 TOWER', 'EYELINK 1000 REMOTE', 
    #       'EYELINK 1000 LONG RANGE', 'EYELINK 2'
    model_name: EYELINK 1000 DESKTOP

    # manufacturer_name:    manufacturer_name is used to store the name of the
    #   maker of the eye tracking device. This is for informational purposes only.
    #
    manufacturer_name: SR Research Ltd.

    # The parameters below are not used by the EyeLink implementation,
    #   so they can be left as is, or filled out for FYI only.
    #

    # serial_number: The serial number for the specific instance of the device used
    #   can be specified here. It is not used by the ioHub, so is FYI only.
    #
    serial_number: N/A

    # manufacture_date: The date of manufacture of the device
    # can be specified here. It is not used by the ioHub,
    # so is FYI only.
    #   
    manufacture_date: DD-MM-YYYY

    # hardware_version: The device's hardware version can be specified here.
    #   It is not used by the ioHub, so is FYI only.
    #
    hardware_version: N/A
    
    # firmware_version: If the device has firmware, its revision number
    #   can be indicated here. It is not used by the ioHub, so is FYI only.
    #
    firmware_version: N/A

    # model_number: The device model number can be specified here.
    #   It is not used by the ioHub, so is FYI only.
    #
    model_number: N/A
    
    # software_version: The device driver and / or SDK software version number.
    #   This field is not used by ioHub, so is FYI only. 
    software_version: N/A

    # device_number: The device number to assign to the device.
    #   device_number is not used by this device type.
    #
    device_number: 0

Last Updated: April, 2019

