12. Experiments
12.1. Simple reaction times
Many psychology experiments measure reaction times or decision times.
The script simple-detection-visual-pygame.py
is a simple detection experiment programmed with pygame. The task is simple: the participant must press a key as quickly as possible when a cross appears at the center of the screen.
Download it and run it with:
python simple-detection-visual-pygame.py
The results are saved in reaction_times.csv
which you can inspect with any text editor.
If you are an R aficionado, you can open it and type:
data = read.csv('reaction_times.csv')
summary(data)
attach(data)
plot(RT)
dev.new()
plot(RT ~ Wait)
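If you prefer Python, a quick equivalent summary can be computed with the standard library alone. This is a sketch, assuming the file has a column named RT as above:

```python
# Summarize the reaction times stored in reaction_times.csv.
# Assumes the file has a header row with a column named "RT",
# as produced by simple-detection-visual-pygame.py.
import csv
import statistics

def summarize_rts(path):
    """Return basic descriptive statistics of the RT column."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    rts = [float(row["RT"]) for row in rows]
    return {
        "n": len(rts),
        "mean": statistics.mean(rts),
        "median": statistics.median(rts),
        "min": min(rts),
        "max": max(rts),
    }
```

For plots comparable to the R ones, you could load the same file with pandas and matplotlib instead; the summary above only needs the standard library.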
Here are my results:
Browse the code of simple-detection-visual-pygame.py
It is pretty technical! This is because Pygame is meant to program simple video games, not psychology experiments.
A more adequate library for this task is Expyriment (another one is PsychoPy).
From now on, we are going to use Expyriment to generate experiments.
Make sure you have installed Expyriment:
$ python
>>> import expyriment
If the error message ModuleNotFoundError: No module named 'expyriment'
appears, check Software Installation.
Let us start by downloading simple-detection-visual-expyriment.py
and run it with:
python simple-detection-visual-expyriment.py
Then, in the subfolder data, locate a file with a name starting with simple-detection... and the extension .xpd. This is a text file containing the reaction times. To analyse them, download analyse_rt.py and run:
python analyse_rt.py data/simple-detection-visual-expyriment_*.xpd
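.xpd files are plain text, so you can also read them yourself. Here is a minimal sketch of a reader; it assumes the usual Expyriment layout (comment lines starting with '#', then a comma-separated header row and one row per trial) — open your file in a text editor first to check:

```python
# Minimal reader for Expyriment .xpd data files (a sketch).
# Assumes comment lines start with "#" and the remaining lines form
# a comma-separated table with a header row.
import csv

def read_xpd(path):
    """Return a list of dicts, one per trial, skipping '#' comment lines."""
    with open(path, newline="") as f:
        lines = [line for line in f if not line.startswith("#")]
    return list(csv.DictReader(lines))
```

Each trial is then a plain dictionary, e.g. `trial["RT"]`, which makes ad-hoc analyses easy without any extra dependency.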
Compare the code of simple-detection-visual-expyriment.py and simple-detection-visual-pygame.py. This should convince you that using Expyriment will make your life simpler if you need to program a psychology experiment.
The documentation of expyriment is available at http://docs.expyriment.org/. Have a quick look at it, especially http://docs.expyriment.org/expyriment.stimuli.html
The basic principles of the expyriment
module are introduced in https://docs.expyriment.org/Tutorial.html.
I provide a minimal template at expyriment_minimal_template.py
that one can use to start writing an expyriment script.
Exercises:

- Modify simple-detection-visual-expyriment.py to display a white disk instead of a cross.
- Modify simple-detection-visual-expyriment.py to display a white disk on half of the trials and a gray disk on the other half (these experimental conditions should be shuffled randomly). Then modify it to display disks with four levels of gray. Thus you can assess the effect of luminosity on detection time. (See expy_simple_reaction_times/grey-levels.py for a solution using Expyriment’s design.Block and design.Trial objects.)
- Modify simple-detection-visual-expyriment.py to play a short sound (click.wav) in lieu of displaying a visual stimulus (hint: use stimuli.Audio()). Thus, you have created a simple audio detection experiment.
- Download and run simple-detection-audiovisual.py:

  python simple-detection-audiovisual.py

  There are three blocks of trials: a first one in which the target is always visual, a second one in which it is always a sound, and a third one in which the stimulus is, randomly, visual or auditory. Are we slowed down in the latter condition? Use analyse_audiovisual_rt.py to analyse the reaction times.

Exercise: add Python code to simple-detection-audiovisual.py to display instructions at the start of the experiment.
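For the grey-levels exercise, the randomization of conditions can be sketched in plain Python before wiring it into Expyriment’s design.Block and design.Trial objects. The RGB values below are illustrative, not taken from grey-levels.py:

```python
# Build a randomized trial list for the grey-levels exercise: each of
# four grey levels is presented n_repetitions times, in shuffled order.
# The RGB triplets are illustrative choices, not the ones in grey-levels.py.
import random

GREY_LEVELS = [(64, 64, 64), (128, 128, 128), (192, 192, 192), (255, 255, 255)]

def make_trial_list(n_repetitions, seed=None):
    """Return a shuffled list of grey levels, one entry per trial."""
    trials = [grey for grey in GREY_LEVELS for _ in range(n_repetitions)]
    rng = random.Random(seed)  # a seed makes the order reproducible
    rng.shuffle(trials)
    return trials
```

In the actual script, each entry of this list would become the fill colour of the stimuli.Circle presented on that trial.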
12.2. Decision times
In the previous example, the user just had to react to a stimulus. This involved a very simple type of decision (“is a target present or not?”).
Other tasks involve making a decision about some property of the stimulus.
Exercise:

- Modify simple-detection-visual-expyriment.py to display, rather than a cross, a random integer between 0 and 9 (hint: use stimuli.TextLine()). Now, the task is to decide if the figure is odd or even, by pressing one of two keys. A solution: parity.py
- Improve the previous script to play a feedback sound (wrong-answer.ogg) when the participant presses the wrong key. Here is a solution: parity_feedback.py
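The decision logic of the parity task can be isolated from the presentation code. Here is a minimal sketch; the key assignments are an assumption for illustration, not those used in parity.py:

```python
# Decision logic for the parity task: map each digit to the expected
# response key and score a participant's answer.
# The key assignment below ("f" = odd, "j" = even) is an assumed example.
RESPONSE_KEYS = {"odd": "f", "even": "j"}

def expected_key(digit):
    """Return the key the participant should press for this digit."""
    parity = "odd" if digit % 2 else "even"
    return RESPONSE_KEYS[parity]

def is_correct(digit, pressed_key):
    """True if the pressed key matches the digit's parity."""
    return pressed_key == expected_key(digit)
```

Keeping this logic in small functions makes it easy to test, and the feedback version only needs to call is_correct() before deciding whether to play wrong-answer.ogg.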
12.3. Numerical distance effect
Exercise: Create a script to present, at each trial, a random number between 1 and 99, and ask the subject to decide whether the presented number is smaller or larger than 55. Plot the reaction times as a function of the number.
Do you replicate the distance effect reported by Dehaene, S., Dupoux, E., & Mehler, J. (1990) in “Is numerical comparison digital? Analogical and symbolic effects in two-digit number comparison.” Journal of Experimental Psychology: Human Perception and Performance, 16, 626–641?
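Once you have collected (number, RT) pairs, the distance effect can be summarized by grouping reaction times by numerical distance to the reference. A minimal sketch:

```python
# Sketch of a distance-effect analysis: average reaction times as a
# function of the numerical distance between the target and 55.
# Input: an iterable of (number, rt) pairs collected in the experiment.
from collections import defaultdict

REFERENCE = 55

def mean_rt_by_distance(trials):
    """Group RTs by |number - REFERENCE| and return {distance: mean RT}."""
    groups = defaultdict(list)
    for number, rt in trials:
        groups[abs(number - REFERENCE)].append(rt)
    return {d: sum(rts) / len(rts) for d, rts in sorted(groups.items())}
```

If the distance effect holds, mean RT should decrease as the distance grows; plotting the returned dictionary makes this visible at a glance.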
12.4. Posner’s attentional cueing task
Exercise (**): Read about Posner’s attentional cueing task and program the experiment.
See a solution in Posner-attention/posner_task.py (you will need Posner-attention/right-arrow.png, Posner-attention/star.png and Posner-attention/left-arrow.png).
12.5. Stroop Effect
The Stroop effect (Stroop, John Ridley (1935). “Studies of interference in serial verbal reactions”. Journal of Experimental Psychology. 18 (6): 643–662. doi:10.1037/h0054651) may be the best-known psychology experiment. Naming the color of the ink is difficult when there is a conflict with the word itself. This is interpreted as evidence that reading is automatic, i.e. cannot be inhibited.
In the previous chapter, we created Stroop cards with Pygame.
Stroop card (see create_stroop_cards.py)
Download stroop.zip. Extract the files and run:
python stroop_task.py
The times are in the subfolder data. Compute the average reading times as a function of the language (using R or Python).
Exercise: Program a Stroop task with a single colored word displayed at each trial. To record actual naming times, you will need to record (!) the subject’s vocal response. A simple solution is to run an audio recording application while the script is running. Your script should play a brief sound each time you present a target. Then, with an audio editor (e.g. Audacity), you can locate the times of presentation of stimuli and the onsets of vocal responses. Check out the program “CheckVocal” at https://github.com/0avasns/CheckVocal, which does just that!
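To see the principle behind automatic onset detection, here is a deliberately naive sketch that scans a mono 16-bit WAV file for the first sample above an amplitude threshold. Real recordings are noisier, which is why dedicated tools like CheckVocal exist:

```python
# Naive voice-onset detection in a mono 16-bit WAV recording:
# return the time of the first sample whose amplitude exceeds a threshold.
# Real speech onsets need smoothing/noise handling; this is only a sketch.
import struct
import wave

def first_onset(path, threshold=1000):
    """Time (in seconds) of the first sample above threshold, or None."""
    with wave.open(path, "rb") as w:
        assert w.getsampwidth() == 2 and w.getnchannels() == 1
        n = w.getnframes()
        samples = struct.unpack("<%dh" % n, w.readframes(n))
        rate = w.getframerate()
    for i, s in enumerate(samples):
        if abs(s) > threshold:
            return i / rate
    return None
```

Subtracting the time of the target-presentation click from the detected onset gives an estimate of the naming latency.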
12.6. A general audio visual stimulus presentation script
In some experiments, we know in advance the precise timing of all stimuli (the program flow does not depend on external events). A script that reads the timing of audiovisual stimuli in a csv file and presents them at the expected times is available at https://www.github.com/chrplr/audiovis
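The core idea of such a timeline-driven script can be sketched independently of any presentation library: read the events, sort them by onset, and compute the delay to wait before each one. The column names onset_ms and stimulus below are assumptions for illustration, not the actual format used by audiovis:

```python
# Sketch of a timeline-driven presentation loop: read (onset, stimulus)
# events from a CSV file and compute the waiting time before each event.
# Column names "onset_ms" and "stimulus" are assumed for this example.
import csv

def load_timeline(path):
    """Return a list of (onset_ms, stimulus) pairs sorted by onset."""
    with open(path, newline="") as f:
        events = [(float(r["onset_ms"]), r["stimulus"])
                  for r in csv.DictReader(f)]
    return sorted(events)

def waits_between_events(events):
    """Delay (ms) to wait before each event, relative to the previous one."""
    onsets = [t for t, _ in events]
    return [onsets[0]] + [b - a for a, b in zip(onsets, onsets[1:])]
```

A real script would then loop over the events, wait the computed delay (using the experiment clock), and present each stimulus in turn.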
12.7. Sound-picture matching using a touchscreen
The sentence-picture-matching.py
script presents a sound, followed by a picture and waits for the participant to press a button.
Exercise: Modify the previous script to present two pictures and use expyriment’s TouchScreenButtonBox to record the subject’s response, using the example from expyriment/touchscreen_test/touchscreen-test.py
12.8. More examples using Expyriment
Besides the examples from this course, you can find more expyriment scripts at