Many psychology experiments measure reaction times or decision times.
The script simple-detection-visual-pygame.py implements a simple detection experiment with pygame: the participant must press a key as quickly as possible when a cross appears at the center of the screen.
Download it and run it with:
    python simple-detection-visual-pygame.py
The results are saved in reaction_times.csv, which you can inspect with any text editor.
If you are an R aficionado, you can also load this file into R and explore it there.
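If you prefer Python, pandas gives a quick overview. A minimal sketch; check the actual column names of the file with df.head() first, as the name 'rt' below is an assumption:

    import pandas as pd

    df = pd.read_csv("reaction_times.csv")
    print(df.head())              # inspect the actual column names
    # assuming one column holds the reaction times, e.g. 'rt':
    print(df["rt"].describe())    # mean, quartiles, etc.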
Now run the Expyriment version, simple-detection-visual-expyriment.py. Then, in the subfolder data, locate a file with a name starting with simple-detection... and the extension .xpd. This is a text file containing the reaction times. To analyse them, download analyse_rt.py and run it on this file.
Compare the code of simple-detection-visual-expyriment.py and simple-detection-visual-pygame.py. This should convince you that Expyriment will make your life simpler if you need to program a psychology experiment.
Modify simple-detection-visual-expyriment.py to display a white disk instead of a cross.
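One possible approach, sketched below: replace the fixation-cross stimulus with a stimuli.Circle (in recent versions of Expyriment, Circle takes a radius in pixels; line_width=0 yields a filled disk):

    from expyriment import stimuli

    # a filled white disk, radius 50 pixels, centered on the screen
    disk = stimuli.Circle(radius=50, colour=(255, 255, 255), line_width=0)
    disk.preload()
    disk.present()   # use this where the original script presented the cross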
Modify simple-detection-visual-expyriment.py to display a white disk on half of the trials and a gray disk on the other half (these two experimental conditions should be shuffled randomly). Then modify it to display disks with four levels of gray. You can thus assess the effect of luminance on detection time. (See xpy_simple_reaction_times/grey-levels.py for a solution using Expyriment’s design.Block and design.Trial objects.)
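Before looking at the solution, here is a rough sketch of how the design could be declared with design.Block and design.Trial; the grey values and the number of trials per level are arbitrary choices:

    from expyriment import design, stimuli

    exp = design.Experiment(name="Grey levels")
    block = design.Block(name="detection")
    for level in (64, 128, 192, 255):        # four levels of grey
        for _ in range(10):                  # 10 trials per level
            trial = design.Trial()
            trial.set_factor("grey", level)
            trial.add_stimulus(stimuli.Circle(radius=50,
                                              colour=(level, level, level)))
            block.add_trial(trial)
    block.shuffle_trials()                   # randomly shuffle the conditions
    exp.add_block(block)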
Modify simple-detection-visual-expyriment.py to play a short sound (click.wav) instead of displaying a visual stimulus (hint: use stimuli.Audio()). You have thus created a simple auditory detection experiment.
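The change is small; a sketch, assuming click.wav sits next to the script:

    from expyriment import stimuli

    target = stimuli.Audio("click.wav")
    target.preload()
    target.present()   # starts playing the sound (does not block)
    # ...then wait for the keypress and record the reaction time as before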
The script simple-detection-audiovisual.py contains three blocks of trials: a first one in which the target is always visual, a second one in which it is always a sound, and a third one in which the stimulus is, at random, visual or auditory. Are we slowed down in this mixed condition? Use analyse_audiovisual_rt.py to analyse the reaction times.
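The block structure could be declared as follows (a sketch; simple-detection-audiovisual.py may well organize things differently):

    import random
    from expyriment import design

    exp = design.Experiment(name="Audiovisual detection")
    conditions = {"visual": ["visual"] * 40,
                  "audio": ["audio"] * 40,
                  "mixed": ["visual", "audio"] * 20}
    for name, modalities in conditions.items():
        block = design.Block(name=name)
        random.shuffle(modalities)       # only matters for the mixed block
        for modality in modalities:
            trial = design.Trial()
            trial.set_factor("modality", modality)
            block.add_trial(trial)
        exp.add_block(block)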
Exercise: add Python code to simple-detection-audiovisual.py to display instructions at the start of the experiment.
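A minimal sketch, to be placed just after control.start(); the wording of the instructions is of course up to you:

    from expyriment import control, design, stimuli

    exp = design.Experiment(name="Audiovisual detection")
    control.initialize(exp)
    control.start()

    instructions = stimuli.TextScreen(
        "Instructions",
        "Press a key as quickly as possible when a target appears.\n\n"
        "Press any key to start.")
    instructions.present()
    exp.keyboard.wait()    # wait for a keypress before the first trial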
In the previous example, the user just had to react to a stimulus. This involved a very simple type of decision (“is a target present or not?”).
Other tasks involve making a decision about some property of the stimulus.
Exercise:
- Modify simple-detection-visual-expyriment.py to display, rather than a cross, a random integer between 0 and 9 (hint: use stimuli.TextLine()). The task is now to decide whether the digit is odd or even, by pressing one of two keys (see the sketch after this list).
Comparing the average decision time to the time needed to react to a simple cross provides a (rough) estimate of the time it takes to decide about the parity of a digit. Incidentally, one can wonder what happens with multi-digit numbers: are we influenced by the flanking digits?
- Add feedback: when the subject presses the wrong key, play the sound wrong-answer.ogg.
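Here is a sketch of the trial loop for this parity task, with the feedback sound; the response keys (f for even, j for odd) and the number of trials are arbitrary choices:

    import random
    from expyriment import control, design, stimuli, misc

    exp = design.Experiment(name="Parity decision")
    control.initialize(exp)
    control.start()

    error_sound = stimuli.Audio("wrong-answer.ogg")
    error_sound.preload()

    for _ in range(20):
        digit = random.randint(0, 9)
        stimuli.TextLine(str(digit), text_size=60).present()
        key, rt = exp.keyboard.wait([misc.constants.K_f, misc.constants.K_j])
        correct = (key == misc.constants.K_f) == (digit % 2 == 0)  # f = "even"
        if not correct:
            error_sound.present()
        exp.data.add([digit, key, rt, correct])

    control.end()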
Exercise: Create a script that presents, on each trial, a random number between 1 and 99, and asks the subject to decide whether the presented number is smaller or larger than 55. Plot the reaction times as a function of the number (a sketch follows below).
Do you replicate the distance effect reported by Dehaene, S., Dupoux, E., & Mehler, J. (1990), “Is numerical comparison digital? Analogical and symbolic effects in two-digit number comparison,” Journal of Experimental Psychology: Human Perception and Performance, 16, 626–641?
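For the plot, a sketch with pandas and matplotlib, assuming your script saved one row per trial in a csv file (the file and column names below are hypothetical):

    import pandas as pd
    import matplotlib.pyplot as plt

    df = pd.read_csv("comparison_rt.csv")       # hypothetical output file
    means = df.groupby("number")["rt"].mean()   # average RT for each number
    means.plot(style="o-")
    plt.xlabel("number")
    plt.ylabel("mean reaction time (ms)")
    plt.show()
    # the distance effect predicts slower responses for numbers close to 55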
In a lexical decision experiment, a string of characters is flashed at
the center of the screen and the participant has to decide if it is an actual
word or not, indicating his/her decision by pressing a left or right
button. Reaction time is measured from the word onset, providing an
estimate of the speed of word recognition.
Then modify the lexical decision script to read the stimuli from a comma-separated text file, stimuli.csv, containing two columns.
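Reading such a file takes only a few lines of Python; the column names used below (item and category) are hypothetical, so adapt them to the headers actually present in stimuli.csv:

    import csv

    trials = []
    with open("stimuli.csv", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            # hypothetical columns: the string to display and its condition
            trials.append((row["item"], row["category"]))
    print(trials)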
Exercise: using the Lexique lexical database (http://www.lexique.org), find how many words of grammatical category (cgram) ‘NOM’ and of length 5 (nblettres), with a lexical frequency (freqfilms2) comprised between 10 and 100 per million, there are in this database (answer=367). Save these words (i.e. the content of the field Words) into a words.csv file (you may have to clean it manually, i.e. remove unwanted columns, using Excel or LibreOffice Calc).
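If you prefer to do the query programmatically, here is a sketch with pandas, assuming you downloaded the Lexique table as a tab-separated file named Lexique382.tsv (the file name and column labels may differ between versions):

    import pandas as pd

    lex = pd.read_csv("Lexique382.tsv", sep="\t")
    sel = lex[(lex.cgram == "NOM") &
              (lex.nblettres == 5) &
              (lex.freqfilms2 >= 10) & (lex.freqfilms2 <= 100)]
    print(len(sel))   # should be 367
    # the word forms are in the column 'ortho' in Lexique 3.82;
    # adapt if your version names this field differently
    sel[["ortho"]].to_csv("words.csv", index=False)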
Transform lexdec_v1.py into an auditory lexical decision script, using the sound files from the lexical decision folder (../experiments/xpy_lexical_decision/).
The Stroop effect (Stroop, John Ridley (1935). “Studies of interference in serial verbal reactions”. Journal of Experimental Psychology, 18(6), 643–662. doi:10.1037/h0054651) may be the best-known psychology experiment: naming the color of the ink is difficult when it conflicts with the word itself.
This is interpreted as evidence that reading is automatic, i.e. that it cannot be inhibited.
In the previous chapter, we created Stroop cards with Pygame.
The reading times are saved in the subfolder data. Compute the average reading time as a function of language (using R or Python).
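In Python, this is a one-liner once the data are loaded; a sketch, assuming the result files are csv files with columns language and rt (adapt to the actual file format and headers):

    import glob
    import pandas as pd

    # concatenate all result files found in the data subfolder
    df = pd.concat(pd.read_csv(f) for f in glob.glob("data/*.csv"))
    print(df.groupby("language")["rt"].mean())   # mean reading time per language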
Exercise: Program a Stroop task with a single colored word displayed at each trial. To obtain actual naming times, you will need to record (!) the subject’s vocal responses. A simple solution is to run an audio-recording application while the script is running. Your script should play a brief sound each time it presents a target. Then, with an audio editor (e.g. Audacity), you can locate the presentation times of the stimuli and the onsets of the vocal responses. Check out the program “CheckVocal” at https://github.com/0avasns/CheckVocal, which does just that!
In some experiments, we know in advance the precise timing of all stimuli (the program flow does not depend on external events). A script that reads the timing of audiovisual stimuli from a csv file and presents them at the expected times is available at https://www.github.com/chrplr/audiovis.
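The core idea is to schedule every stimulus relative to a single clock started at the beginning of the run, rather than chaining waits between stimuli (which would accumulate drift). A minimal sketch with Expyriment; in the real script the onsets and contents would come from the csv file:

    from expyriment import control, design, stimuli

    exp = design.Experiment(name="Fixed timing")
    control.initialize(exp)
    control.start()

    trials = [(1000, "ONE"), (2500, "TWO"), (4000, "THREE")]  # (onset in ms, text)
    start = exp.clock.time
    for onset, text in trials:
        stim = stimuli.TextLine(text)
        stim.preload()
        exp.clock.wait(onset - (exp.clock.time - start))  # sleep until the onset
        stim.present()

    control.end()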
The sentence-picture-matching.py script presents a sound, followed by a picture and waits for the participant to press a button.
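Its core logic probably boils down to something like this sketch (the file names and the delay are placeholders):

    from expyriment import control, design, stimuli

    exp = design.Experiment(name="Sentence-picture matching")
    control.initialize(exp)
    control.start()

    sound = stimuli.Audio("sentence1.wav")     # placeholder file names
    picture = stimuli.Picture("picture1.png")
    sound.preload()
    picture.preload()

    sound.present()                 # play the sentence
    exp.clock.wait(2000)            # give the sound time to play out
    picture.present()
    key, rt = exp.keyboard.wait()   # wait for the button press, get the RT

    control.end()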
Exercise: Modify the previous script to present two pictures and use Expyriment’s TouchScreenButtonBox to record the subject’s response, following the example in expyriment/touchscreen_test/touchscreen-test.py.