OpenSesame EyeLink tutorial


Daniel Schreij 1, Sebastiaan Mathôt 1,2, and Lotje van der Linden 2
1 VU University Amsterdam, Dept. of Cognitive Psychology
2 Aix-Marseille Université, Laboratoire de Psychologie Cognitive

Last updated 02/04/13
http://www.cogsci.nl/opensesame

1 Communicating with the EyeLink through the OpenSesame GUI

Creating a basic eye-tracking experiment using the drag-and-drop interface of OpenSesame (Mathôt, Schreij, & Theeuwes, 2012) is very easy. After installing the OpenSesame plug-ins <http://osdoc.cogsci.nl/plug-ins/eyelink-plug-ins> for communicating with the SR Research EyeLink, you will see that some extra icons have appeared in the item toolbar (Figure 1).

Figure 1. The EyeLink plug-in icons in the item toolbar

These icons represent the various options you have for communicating with the EyeLink device. Of course, it is also possible to communicate with the EyeLink directly through the scripting interface, but this is described in a later section. Basically, an experiment using the EyeLink consists of the following phases.

1.1 Connecting to the EyeLink (once per experiment)

eyelink_calibrate

Before you can start measuring the eyes, you first have to establish a connection with the EyeLink and then calibrate it, so that it can associate the position of the measured eyes with coordinates on the screen. This is all done with the eyelink_calibrate module, whose icon is depicted above. This module automatically connects to the EyeLink, which you typically only need to do once per experiment, so this module is usually also inserted only once, at the top of the experiment tree (at the very beginning of the experiment). Once the connection is established, you enter calibration mode, in which you will be presented with the (probably familiar) EyeLink calibration screen.
Here you have the following options:

C: Start the calibration
V: Validate the calibration you just performed (hence always after having pressed C first)
Enter/Return: Switch between eye-camera view and calibration mode
A: Automatically determine the pupil-detection threshold

Q: Exit calibration mode and start or continue the experiment

If measurements suddenly become less accurate, you always have the possibility to perform a recalibration during drift correction (see 1.2).

The eyelink_calibrate module further offers the following options:

Tracker attached (yes/no): indicates whether the EyeLink should really be contacted, or whether this is just a dummy run of the experiment, during which no real eye measurements are performed. This is useful for testing or debugging an experiment in which you have already implemented all communication with the EyeLink, but would like to perform a test run on a machine without an EyeLink attached.

Calibration beep (yes/no): indicates whether a beep should be played when the calibration target moves.

Enable drift correction if disabled (EyeLink 1000) (yes/no): indicates whether active drift correction should be re-enabled when it is disabled, as is the default for the EyeLink 1000. Please note that leaving this option unchecked will not disable active drift correction when it is enabled. For more information, see eyelink_drift_correct.

Saccade velocity threshold (default: 35): indicates how fast an eye movement needs to be before it is registered as a saccade and indexed in the saccade report by the EyeLink Data Viewer. It can usually be left at its default value. This value can also be adjusted in the EyeLink Data Viewer after the data has been collected.

Saccade acceleration threshold (default: 9500): indicates how large the acceleration of an eye movement needs to be before it is registered as a saccade. This can usually be left at its default value.

1.2 Drift correction (before every block or trial)

eyelink_drift_correct

There is always a small drift, due to factors such as movement of the head and slippage of the head gear (in the case of a head-mounted tracker). This can slightly offset the calibration.
Luckily, you can check whether this is the case wherever you have placed the eyelink_drift_correct module in your experiment. When the EyeLink is in drift-correction mode, it pauses the experiment until the participant fixates on the central dot on the screen and, optionally, presses space to continue. While the EyeLink is paused in drift-correction mode, it is also possible to recalibrate it by pressing Q. This brings you back into calibration mode, with the options described in section 1.1.

There is an important difference between the EyeLink 1000 and previous versions of the device (EyeLink 1 and 2) when it comes to the default drift-correction mode. For the EyeLink 1 and 2, drift correction includes a slight one-point recalibration. This is necessary, because slippage of the helmet causes a systematic drift that needs to be compensated for. In contrast, the EyeLink 1000 only checks whether fixation is stable and accurate, but does not adjust the calibration. The rationale behind this is that, because the EyeLink 1000 is not head-mounted, there is no systematic drift that accumulates throughout the experiment, although there is random drift that varies from trial to trial. In other words, for the EyeLink 1000 drift correction should be thought of as a verification process (and thus optional, but recommended), whereas for the EyeLink 1 and

2 drift correction is an active recalibration (and thus crucial). If you want to re-enable active recalibration for the EyeLink 1000, check the 'Enable drift correction if disabled (EyeLink 1000)' box in the eyelink_calibrate plug-in.

There is no real convention as to how often you have to perform drift correction. Most commonly it is done at the start of every trial, but (for the EyeLink 1 and 2) some have claimed that this introduces noise in their data (for instance in some motivational experiments) and that doing it once at the start of every block is better. You will just have to be the judge of what works best for your experiment.

1.3 Measuring eye data (each trial)

eyelink_start_recording

Place this module where you want to start recording with the eye tracker, usually at the beginning of a trial. This module has one customizable option:

Log message (default: start_trial): when recording is started, OpenSesame will send the message you specify here to the EyeLink log file, to mark the beginning of the trial.

eyelink_wait

This module pauses the experiment until a certain event occurs. The type of event can be selected from the module's drop-down list, which comprises:

Saccade start
Saccade end
Fixation start
Fixation end
Blink start
Blink end

eyelink_stop_recording

This module terminates the EyeLink recording process for the current trial and sends the message specified under Log message to the EyeLink log file, to mark the end of the trial.

Figure 2. An example experiment structure.

1.4 Sending custom messages to the EyeLink

eyelink_log

You also have the ability to send the EyeLink custom messages, which will be stored in the log file. This is especially useful if you

want to specify periods of interest in the data file afterwards. For instance, if you start recording at the beginning of a trial, but are only interested in the eye movements after the appearance of a cue that appears somewhere during the trial, you could send the message "SHOWING CUE" to the EyeLink at the moment the cue is presented. The EyeLink Data Viewer then enables you to filter eye movements from the moment that the "SHOWING CUE" message was sent until the end of the trial. This gives you more freedom, as you can simply measure eye movements for the entire trial and afterwards select the part of the collected data that you actually want to use for further analysis.

1.5 Example experiment structure

The structure of an experiment that makes use of the EyeLink could look something like Figure 2. The very first thing that happens is establishing a connection with the EyeLink and calibrating it (eyelink_calibrate). This connection stays active for the remainder of the experiment and hence only needs to be established once, at the beginning. After the welcome/instruction screen has been shown, the experiment commences with the first block and trial. Each trial begins with drift correction (eyelink_drift_correct), during which the experimenter also has the option to recalibrate the EyeLink, if necessary. After drift correction has been successfully completed (i.e. when the participant has fixated on the screen center and pressed space to continue), the command is sent to the EyeLink to start measuring eye movements (eyelink_start_recording). Immediately thereafter, the stimulus display is shown. After the participant has given a response, a feedback display is shown and any variables that still need to be logged are sent to the EyeLink (eyelink_log). Here you can also log all your relevant (independent) variables, such as the properties of your target, simply by using the '[variable name]' notation (for an alternative, see 3.1).
In this case your log messages would look something like this:

var target_color [col]
var target_x_coordinate [x_pos]

Finally, the command is sent to the EyeLink to stop recording and finish its log for the current trial. The connection to the EyeLink is automatically closed when your experiment finishes, so there is no need to worry about that!

2 Communicating with the EyeLink through the scripting interface

You can also access the EyeLink using Python in an inline_script item. The EyeLink class is a property of the experiment class, which (in an inline_script item) you can access like this:

self.experiment.eyelink  # For all versions of OpenSesame
exp.eyelink              # Shortcut for OpenSesame 0.26 and later

Most functions of this class perform the same actions as the modules described in the previous section. The eyelink_calibrate module, for instance, has its equivalent in the self.experiment.eyelink.calibrate() function. However, there are many functions that have no corresponding module in the GUI; the scripting interface hence offers extra functionality. Another convenient function is the equivalent of the eyelink_log module:

self.experiment.eyelink.log("message for the EDF")
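For dummy runs on a machine without a tracker (cf. the Tracker attached option in 1.1), it can be instructive to see what a minimal stand-in for this object might look like. The sketch below is our own illustration and not part of the plug-in: only the method names log() and sample() are taken from this tutorial, while the start_recording()/stop_recording() names, the default messages, and the fixed screen-center sample are assumptions.

```python
class DummyEyeLink:
    """Illustrative stand-in for the eyelink object, for dummy runs
    on a machine without a tracker (hypothetical, not the plug-in)."""

    def __init__(self):
        self.messages = []    # messages that would go into the EDF log
        self.recording = False

    def start_recording(self, msg="start_trial"):
        # Mark the start of a trial, as eyelink_start_recording does.
        self.recording = True
        self.log(msg)

    def stop_recording(self, msg="stop_trial"):
        # Mark the end of a trial, as eyelink_stop_recording does.
        self.log(msg)
        self.recording = False

    def log(self, msg):
        # A real tracker writes msg to the EDF file; here we just store it.
        self.messages.append(msg)

    def sample(self):
        # A real tracker returns the current gaze position; here we
        # always report the center of an assumed 1024x768 display.
        return 512, 384
```

During a dummy run, self.experiment.eyelink could be swapped for such an object, so that inline scripts, including their calls to log() and sample(), run unchanged.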

The log() function is especially useful if multiple events of interest are handled by the same inline_script item, such as displaying a fixation dot and, after a certain duration, a stimulus. In this case you might want your output file to contain a message for each of those events, which can be achieved simply by calling the above function, with an appropriate message, immediately after you have drawn something to the display. For example:

my_canvas1.show()
self.experiment.eyelink.log("display fixation")
self.sleep(self.get('SOA'))
my_canvas2.show()
self.experiment.eyelink.log("display stimulus")

Some other examples:

time, start_pos = self.experiment.eyelink.wait_for_saccade_start()
time, start_pos, end_pos = self.experiment.eyelink.wait_for_saccade_end()
x, y = self.experiment.eyelink.sample()

The full list of available functions can be found online <http://osdoc.cogsci.nl/plug-ins/eyelink-plug-ins>.

3 Advanced communications with the EyeLink

The EyeLink logging function provides advanced options for registering experimental variables, interest areas and other things. You can do this both with the self.experiment.eyelink.log() function and with the eyelink_log module. Whenever you want the EyeLink to perform a special operation with your message, you have to start the message with the !V command, followed by the action you want it to take. Below are some cases that illustrate the available options. It is important to note that !V commands only work when they are issued while the EyeLink is recording (and hence between the eyelink_start_recording and eyelink_stop_recording modules). It is possible to issue certain commands outside the recording phase, but this falls outside the scope of this tutorial. There is a lot that can be done with these commands, and a full list can be found in the EyeLink Data Viewer help file, under the chapter Protocol for EyeLink Data to Viewer Integration.
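Because !V commands are only honored while the tracker is recording, one defensive pattern is to route all Data Viewer messages through a small wrapper that refuses to send them outside the recording phase, so the mistake is caught early rather than silently ignored. The helper below is a sketch of ours, not plug-in functionality; the recording flag is assumed to be tracked by the calling script.

```python
def send_viewer_command(eyelink, recording, command):
    """Send a "!V" Data Viewer command via eyelink.log(), but only
    while the tracker is recording.

    eyelink   -- any object with a log() method (stands in for
                 self.experiment.eyelink)
    recording -- True between the start and stop of recording
                 (assumed to be tracked by the caller)
    command   -- the message body, e.g. "TRIAL_VAR target red"
    """
    if not recording:
        # Outside the recording phase the command would be ignored
        # by the Data Viewer, so fail loudly instead.
        raise RuntimeError("!V commands only work while the EyeLink is recording")
    eyelink.log("!V " + command)
```

An inline script could then call send_viewer_command(self.experiment.eyelink, recording, "TRIAL_VAR target red") and be certain that a mistimed message raises an error during testing.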
We recommend reading the Protocol for EyeLink Data to Viewer Integration chapter if you really want to use the full potential of the command directives that the EyeLink interface has to offer.

3.1 Storing trial variables

You might want to keep a record of the variables and values that you used during a trial. This can be done through the command:

!V TRIAL_VAR <variable name> <variable value>

For instance, if you want to register that the target color in a specific trial was red, you can issue the command:

self.experiment.eyelink.log("!V TRIAL_VAR target red")
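If a trial has several independent variables, it can be convenient to generate all TRIAL_VAR messages from one dictionary instead of hand-writing each string. The helper below is a hypothetical convenience of ours, not part of the plug-in; it merely formats messages in the syntax shown above.

```python
def trial_var_messages(variables):
    """Build one "!V TRIAL_VAR <name> <value>" message per variable.

    variables -- dict mapping variable names to values, e.g. the
                 independent variables of the current trial. Names are
                 sorted so the output order is deterministic.
    """
    return ["!V TRIAL_VAR %s %s" % (name, value)
            for name, value in sorted(variables.items())]
```

In an inline script you could then loop over the result and pass each message to self.experiment.eyelink.log().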

Make sure that you specify unique variable names, otherwise the last specified variable might overwrite an earlier registered variable with the same name.

3.2 Designating interest areas

You also have the possibility to specify interest areas in advance. Interest areas are specific regions of the screen for which you want to know whether people fixated inside them (and how frequently), or made a saccade in their direction. For instance, if your experiment contains a spatial cue and a target that appears at another position, you might want to specify interest areas around these elements, to determine later whether the eyes actually fixated on the cue or the target. With an Interest Area Report created in the EyeLink Data Viewer you then get an ordered list of which interest areas were visited by the eyes, along with all the other related information (fixation duration, saccade start point, etc.). You can specify an interest area with the command:

!V IAREA <shape> <index> <left x> <top y> <right x> <bottom y> [label]

An interest area can be given two basic shapes, for which the syntax of the command differs slightly. The first possibility is a rectangle, which is for instance referenced like this:

self.experiment.eyelink.log("!V IAREA RECTANGLE 1 10 5 20 15 cue")

The above command will log a rectangular interest area stretching from x coordinates 10 to 20 and y coordinates 5 to 15. You also need to specify an index for each interest area that you make (in this case 1). Simply start with an index of 1 and increment this number for each new interest area that you create. The label string at the end is optional, but we recommend that you always specify it. If we would like to create a second interest area with the shape of an ellipse, the command to issue would look something like this:

self.experiment.eyelink.log("!V IAREA ELLIPSE 2 300 200 400 300 target")

It is also possible to specify a custom shape for an interest area.
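Assembling IAREA strings by hand is error-prone, so for illustration a small helper could build them and keep the index bookkeeping in one place. This is a sketch of ours, not plug-in functionality, and it covers only the rectangle and ellipse variants whose syntax is shown above.

```python
def iarea_message(shape, index, left, top, right, bottom, label=None):
    """Build a "!V IAREA <shape> <index> <l> <t> <r> <b> [label]" message.

    shape  -- "RECTANGLE" or "ELLIPSE" (the two basic shapes above)
    index  -- unique interest-area number, starting at 1
    left, top, right, bottom -- bounding-box coordinates in pixels
    label  -- optional but recommended, as noted above
    """
    if shape not in ("RECTANGLE", "ELLIPSE"):
        raise ValueError("unsupported shape: %s" % shape)
    msg = "!V IAREA %s %d %d %d %d %d" % (shape, index, left, top, right, bottom)
    if label is not None:
        msg += " " + label
    return msg
```

(The custom freehand shape takes a different argument list and is deliberately not covered by this sketch.)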
A custom shape has a slightly different syntax and is designated by specifying FREEHAND as the shape:

!V IAREA FREEHAND <id> <x1,y1> <x2,y2> ... <xn,yn> [label]

The label parameter is once again optional. <xi,yi> refers to the coordinates of a point; the x and y coordinates of each point are separated by a comma.

4 References

Mathôt, S., Schreij, D., & Theeuwes, J. (2012). OpenSesame: An open-source, graphical experiment builder for the social sciences. Behavior Research Methods, 44, 314-324.