“Blindness to background”

My former colleague Dr Catherine O’Hanlon, now at Aberystwyth, and I have just published a paper on an interesting effect we found in small children. The roots of this study go back 7 years to when my son was two and I was reading a picture book with him. I was asking him to find various colours, and I noticed that he struggled to find colours that were in the background of the picture, like “blue” in a seascape or “yellow” in a beach scene. Apparently, he automatically ignored the background and attended only to colours in foreground objects. I thought that was an interesting phenomenon, and asked my developmental-psychology colleague Catherine if this was well known in small children. She said it hadn’t been reported, so we went ahead and tested more children to see if it was just my offspring or more widespread. It turns out that many two- and three-year-olds show this effect. They seem to assume that you must be talking about objects, even when you ask them something neutral like “Can you find green?”.
We think this bias helps young children acquire language.


Here are links to the press release and the paper itself.

ETN-FPI TS2

Yes, it’s acronym time — by ETN-FPI TS2, I mean the second Training School of the European Training Network on Full-Parallax Imaging, which was held at the University of Valencia in September 2016. Chris Kaspiris-Rousellis and I attended, and had a marvellous time learning about optics. I had done some geometrical and wave optics as part of my undergraduate physics degree, but it was great to get back to it, refresh my memory and learn more. The course was brilliantly run by Manuel Martinez and Genaro Saavedra, and consisted of morning lectures followed by afternoon practical sessions in the laboratory.

Mantis videos

In our lab, we run experiments on praying mantis vision. We show the insects stimuli on a computer screen (mainly boring things like moving bars, or little black dots meant to simulate a bug) and record how they move via a webcam, which stores a short video clip for each trial. The webcam is positioned so that it films only the mantis, not the computer screen. At the moment, the experimenter manually codes each video, making a simple judgement like “mantis looked left”, “mantis moved right” or “mantis did not move”.
Here are some example video clips of mantids responding to different visual stimuli.
We would love to be able to analyse these automatically, e.g. to get parameters such as “angle through which mantis head turns” or “number of strikes”.
As you can see, this is pretty challenging. There are big variations from experiment to experiment, and smaller variations from session to session even within the same experiment. The overall lighting, the position of the mantis, what else is in shot, etc etc, all vary.
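If you fancy having a go, a crude starting point is simple frame differencing. Here is a minimal Matlab sketch along those lines; the filename and the thresholds are made-up placeholders, and real clips would certainly need something more robust:

```matlab
% Minimal sketch: classify gross left/right movement in one clip
% by frame differencing. Filename and thresholds are placeholders.
v = VideoReader('example_trial.avi');     % hypothetical clip name
prev = rgb2gray(readFrame(v));
xs = [];                                  % centroid column of motion pixels, per frame
while hasFrame(v)
    curr = rgb2gray(readFrame(v));
    moved = abs(double(curr) - double(prev)) > 25;  % motion mask (arbitrary threshold)
    if nnz(moved) > 50                    % skip frames that are only sensor noise
        [~, cols] = find(moved);
        xs(end+1) = mean(cols);           %#ok<AGROW>
    end
    prev = curr;
end
if numel(xs) < 2 || abs(xs(end) - xs(1)) < 10
    disp('mantis did not move');
elseif xs(end) < xs(1)
    disp('mantis moved left (in image coordinates)');
else
    disp('mantis moved right (in image coordinates)');
end
```

Of course, something like this would only capture whole-body movement; measuring the angle of a head turn or counting strikes would need proper tracking of individual body parts, which is exactly the hard bit.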






Online lectures on the human visual system

I recently organised a Training School on 3D Displays and the Human Visual System as part of the European Training Network on Full-Parallax Imaging. The Network aims to train 15 Early Stage Researchers who not only have expertise in optics and engineering, but also an understanding of human visual perception. This first Training School aimed to equip the researchers, who come from a variety of disciplinary backgrounds but mainly engineering, with the relevant knowledge of the human visual system.

As part of this, I gave three lectures. I’m sharing the videos and slides here in case these lectures are useful more widely. Drop me a comment below to give me any feedback!

Lecture 1: The eye.


Basic anatomy and physiology of the eye. Photoreceptors, retina, pupil, lens, cornea. Rods, cones, colour perception. Dynamic range, cone adaptation. Slides are here.

Lecture 2: The human contrast sensitivity function.


Spatial and temporal frequency. Difference between band-pass luminance CSF and low-pass chromatic CSF; applications to encoding of colour. Fourier spectra of images. Models of the CSF. Slides are here.
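If you want to play with a CSF model after watching this, here is a tiny Matlab sketch of one classic formula from the literature, the Mannos and Sakrison (1974) model of the band-pass luminance CSF (just one standard textbook model, not necessarily the one in the slides):

```matlab
% Mannos & Sakrison (1974) band-pass luminance CSF.
% f is spatial frequency in cycles/degree; the plotting range is my choice.
f = logspace(-1, 2, 200);                               % 0.1 to 100 cyc/deg
csf = 2.6 * (0.0192 + 0.114*f) .* exp(-(0.114*f).^1.1);
loglog(f, csf);
xlabel('Spatial frequency (cycles/deg)');
ylabel('Relative contrast sensitivity');
title('Mannos-Sakrison luminance CSF');                 % peaks near 8 cyc/deg
```

Note how sensitivity falls off at both low and high frequencies, i.e. the function is band-pass; the chromatic CSF, by contrast, is low-pass.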

Lecture 3: Stereopsis.


Disparity, stereopsis. Stereoacuity and stereoblindness. Binocular vision disorders: amblyopia, strabismus. The correspondence problem. Issues with mobile eyes; epipolar lines, search zones. Different forms of human stereopsis: fine/coarse, contour vs RDS. Slides are here.

Brainzone

The Brain Zone advisory committee with Dame Eliza Manningham-Buller (in blue), the chair of the Wellcome Trust board of governors, at the launch of the new exhibition at the Centre for Life.


Eye tracking with small children using a touchscreen

Back in 2011, my then colleague Dr Catherine O’Hanlon and I carried out a study in which young children gave responses using a touchscreen, while we simultaneously tracked their eye movements. I’ve been asked a few times by other researchers about how we managed this, so I thought it would be useful to share some information here.
Setup

Here (8Mb) you should be able to download the PowerPoint presentation I gave at EyeTrackBehavior in 2011. It’s about the practical issues rather than the science, with photos of our set-up and videos of the experiments.

Some code fragments for using the SMI RED eye tracker with Matlab are here.
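In case that link dies: the SDK is a C library, so the general pattern in Matlab is loadlibrary plus calllib. The fragment below is a from-memory sketch of that pattern, not a copy of our fragments; the DLL, header, function and struct-field names all depend on your SDK version, so treat every identifier as an assumption to check against the iView X documentation. (GetMouse here is Psychtoolbox’s, which works because a touchscreen presents itself to the operating system as a mouse.)

```matlab
% Sketch of polling an SMI tracker from Matlab via the iView X SDK.
% All names below (DLL, header, functions, struct fields) are assumptions
% to be checked against your SDK version.
loadlibrary('iViewXAPI.dll', 'iViewXAPI.h');             % the SDK's C library
calllib('iViewXAPI', 'iV_Connect', '127.0.0.1', 4444, '127.0.0.1', 5555);
pSample = libstruct('SampleStruct');                     % buffer for gaze samples
responseGiven = false;
while ~responseGiven                                     % one trial
    if calllib('iViewXAPI', 'iV_GetSample', pSample) == 1    % 1 = RET_SUCCESS
        gazeX = pSample.leftEye.gazeX;                   % field names: check SDK
        gazeY = pSample.leftEye.gazeY;
        % ... log gaze, check fixation windows, etc. ...
    end
    [x, y, buttons] = GetMouse;                          % Psychtoolbox; touch = click
    responseGiven = any(buttons);                        % the child touched the screen
end
calllib('iViewXAPI', 'iV_Disconnect');
```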

I’ve also uploaded a zip file (62Mb, compressed with 7-zip) containing the complete Matlab code, audio clips and images we used to run our experiments. This worked in 2011, but it no longer runs on my 2016 version of Matlab, so don’t expect it to work for you. However, it might be useful to anyone aiming to do something similar, so I thought why not share it.

Demo videos from our mantis 3D glasses paper

We uploaded six nice demo videos as Supplementary Material for our Scientific Reports paper. Unfortunately, the links are currently broken (I have emailed), and in any case the videos are provided in a slightly clunky way where you have to download them. So I thought I would write a blog post explaining the videos here.

Here is a video of a mantis shown a 2D “bug” stimulus (zero disparity). A black disk spirals in towards the centre of the screen. Because the disk is black, it is visible as a dark disk in both eyes, i.e. it’s an ordinary 2D stimulus. The mantis therefore sees it, correctly, in the screen plane, 10cm in front of the insect. The mantis knows its fore-arms can’t reach that far, so it doesn’t bother to strike.


Next, here’s a video of a mantis shown the same bug stimulus in 3D. Now the disk is shown in blue for the left eye and green for the right (bearing in mind that the mantis is upside down). Because the mantis’s left eye is covered by a green filter, the green disk is invisible to it – it’s just bright on a bright background, i.e. effectively not there, whereas the blue disk appears dark on a bright background.
This is a “crossed” geometry: the lines of sight from each disk to the eye that can see it cross over in front of the screen, at a distance of about 2.5cm in front of the insect. This is well within the mantis’s catch range, so the insect strikes out, trying to catch the bug. You can sometimes see children doing the same thing when they see 3D TV for the first time!
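For the record, this geometry is just similar triangles. If the eyes are separated by I and the screen is at distance D, then placing the two eyes’ images a distance s apart on the screen, in the crossed direction, makes the lines of sight intersect at d = D·I/(I + s) in front of the animal. Here is a quick Matlab check (the interocular separation below is an illustrative guess, not the measured value from our paper):

```matlab
% Crossed-disparity geometry by similar triangles.
% Numbers are illustrative, not the values from the paper.
D = 10;                % screen distance (cm), as in this post
I = 0.7;               % interocular separation (cm) -- illustrative guess
d = 2.5;               % desired apparent distance of the bug (cm)
s = I * (D - d) / d;   % required on-screen separation: 2.1 cm here
fprintf('On-screen separation s = %.2f cm\n', s);
fprintf('Lines of sight intersect at %.2f cm\n', D * I / (I + s));  % recovers d
```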


Here’s a slo-mo version recorded with our high-speed camera. Unfortunately the quality has taken a big hit, but at least you get to see the details of the strike…


Sceptical minds (the best kind) might wonder if this is the correct explanation. What if the filters don’t work properly and there’s lots of crosstalk? In that case, the mantis would be seeing a single dark disk in our “2D” condition and two dimmer disks in our “3D” condition. Maybe the two disks are the reason it strikes, and it’s nothing to do with the 3D. Or maybe there’s some other artefact. As a control, we swapped the green and blue disks over, effectively swapping the left and right eyes’ images. Now the lines of sight don’t intersect at all, i.e. the images are not consistent with a single object anywhere in space. Sure enough, the mantis doesn’t strike. And of course, we put the blue and green filters on different eyes in different insects, so we could be sure the difference really was due to the binocular geometry, not to the colours or similar confounds.


Here’s the figure from our paper which illustrates this geometry and also shows the results: