Why don’t we see the world upside down?

This question comes up occasionally, and I was recently asked something similar by email, so I thought it would be a good idea to write a blog post that everyone can see. There’s also a great article on this here: http://mentalfloss.com/uk/biology/30542/your-eyes-see-everything-upside-down

First off, the image of the world projected onto our retina is upside down. This is just a consequence of geometry. This image from the Wikipedia article on pinhole cameras shows this nicely:

Our eye is more sophisticated than a pinhole camera — it has a lens so it can collect light over the whole of our pupil and bring it to a focus on our retina — but that isn’t important here. The retinal image is still upside-down. So why don’t we see the world upside-down?

One way of answering that is to point out that our eyes don’t, actually, “see” anything at all. Seeing happens in the brain. All your brain needs to know is the relationship between which photoreceptors are receiving the light, and where the object is in the world. We’ve learnt that if we want to touch an object whose image appears at the bottom of our eye, we usually have to raise our hands up (in the direction of our shoulders) while extending them, not move them down (towards our feet). So long as we know the correct mapping, it doesn’t actually matter where on the eye the information is.
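To make the geometry concrete, here is a toy projection calculation. It is just an illustrative Python sketch of my own (the 17 mm image distance is a round number of roughly the right order for the eye, not a precise figure):

```python
# Toy pinhole projection: a point at height y, a distance d in front of the
# pinhole, lands at height -y * f / d on an image plane a distance f behind.
# The minus sign is the inversion.

def project(y, d, f=0.017):  # f = 17 mm, roughly the right order for the eye
    """Image-plane height (m) of a point at height y (m), distance d (m)."""
    return -y * f / d

print(project(y=1.0, d=2.0))   # 1 m above the axis, 2 m away -> -0.0085 (below)
print(project(y=-0.5, d=2.0))  # 0.5 m below the axis -> +0.00425 (above)
```

Points above the axis land below it on the image plane and vice versa, and the same applies left to right; yet, as long as the brain applies the right mapping, none of this matters for perception.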

“Blindness to background”

My former colleague Dr Catherine O’Hanlon, now at Aberystwyth, and I have just published a paper on an interesting effect we found in small children. The roots of this study go back seven years, to when my son was two and I was reading a picture book with him. I was asking him to find various colours, and I noticed that he struggled to find colours that were in the background of the picture, like “blue” in a seascape or “yellow” in a beach scene. Apparently, he automatically ignored the background and attended only to colours in foreground objects. I thought that was an interesting phenomenon, and asked my developmental-psychology colleague Catherine whether it was well known in small children. She said it hadn’t been reported, so we went ahead and tested more children to see whether the effect was peculiar to my offspring or more widespread. It turns out that many two- and three-year-olds show it. They seem to assume that you must be talking about objects, even when you ask them something neutral like “Can you find green?”.
We think this bias helps young children acquire language.


Here are links to the press release and the paper itself.

ETN-FPI TS2

Yes, it’s acronym time — by ETN-FPI TS2, I mean the second Training School of the European Training Network on Full-Parallax Imaging, which was held at the University of Valencia in September 2016. Chris Kaspiris-Rousellis and I attended, and had a marvellous time learning about optics. I had done some geometrical and wave optics as part of my undergraduate physics degree, but it was great to get back to it, refresh my memory and learn more. The course was brilliantly run by Manuel Martinez and Genaro Saavedra, and consisted of morning lectures followed by afternoon practical sessions in the laboratory.

Mantis videos

In our lab, we run experiments on praying mantis vision. We show the insects videos on a computer (mainly boring stimuli like moving bars, or little black dots which are meant to simulate a bug) and film their responses with a webcam, which stores a short video clip for each trial. The webcam is positioned so it films only the mantis, not the computer screen. At the moment, the experimenter then manually codes each video, making a simple judgment like “mantis looked left”, “mantis moved right”, “mantis did not move”, etc.
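For a concrete flavour of that kind of stimulus, here is a minimal sketch of a bug-like dot sweeping across a grey screen. It uses PsychoPy purely for illustration; I am not saying anything here about the software we actually use:

```python
# Illustrative only: a small black dot sweeping across a grey screen,
# the kind of simple "bug-like" stimulus described above.
from psychopy import visual, core

win = visual.Window(size=(800, 600), color='grey', units='height')
dot = visual.Circle(win, radius=0.02, fillColor='black', lineColor='black')

clock = core.Clock()
while clock.getTime() < 3.0:          # sweep for three seconds
    t = clock.getTime() / 3.0
    dot.pos = (-0.6 + 1.2 * t, 0.0)   # move left to right across the screen
    dot.draw()
    win.flip()

win.close()
```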
Here are some example video clips of mantids responding to different visual stimuli.
We would love to be able to analyse these automatically, e.g. to get parameters such as “angle through which mantis head turns” or “number of strikes”.
As you can see, this is pretty challenging. There are big variations from experiment to experiment, and smaller variations from session to session even within the same experiment. The overall lighting, the position of the mantis, and what else is in shot all vary.
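Just to illustrate the sort of approach we have in mind, a crude first attempt might difference consecutive frames and ask which side of the image changed most. This is a hypothetical sketch using OpenCV, with made-up thresholds and a made-up clip filename; given the variability described above, it would certainly need tuning for each experiment:

```python
# Rough sketch: estimate whether there is motion, and on which side of the
# frame, by differencing consecutive greyscale frames. Illustrative only;
# thresholds and filename are invented, and real clips would need
# per-experiment tuning (lighting, mantis position, etc. all vary).
import cv2

def code_clip(path, diff_thresh=25, motion_thresh=500):
    cap = cv2.VideoCapture(path)
    prev = None
    left_motion = right_motion = 0
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        grey = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        grey = cv2.GaussianBlur(grey, (5, 5), 0)   # suppress sensor noise
        if prev is not None:
            mask = cv2.absdiff(grey, prev) > diff_thresh  # changed pixels
            half = mask.shape[1] // 2
            left_motion += int(mask[:, :half].sum())
            right_motion += int(mask[:, half:].sum())
        prev = grey
    cap.release()
    if left_motion + right_motion < motion_thresh:
        return "mantis did not move"
    return "more motion on the left" if left_motion > right_motion else "more motion on the right"

print(code_clip("trial_001.avi"))  # hypothetical clip filename
```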

Online lectures on the human visual system

I recently organised a Training School on 3D Displays and the Human Visual System as part of the European Training Network on Full-Parallax Imaging. The Network aims to train 15 Early Stage Researchers who have expertise not only in optics and engineering, but also an understanding of human visual perception. This first Training School aimed to equip the researchers, who come from a variety of disciplinary backgrounds but mainly engineering, with the relevant knowledge of the human visual system.

As part of this, I gave three lectures. I’m sharing the videos and slides here in case these lectures are useful more widely. Drop me a comment below to give me any feedback!

Lecture 1: The eye.


Basic anatomy and physiology of the eye. Photoreceptors, retina, pupil, lens, cornea. Rods, cones, colour perception. Dynamic range, cone adaptation. Slides are here.

Lecture 2: The human contrast sensitivity function.


Spatial and temporal frequency. Difference between the band-pass luminance CSF and the low-pass chromatic CSF; applications to the encoding of colour. Fourier spectra of images. Models of the CSF. Slides are here.
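To give a flavour of what a model of the CSF looks like, here is one classic parametric form, the Mannos and Sakrison (1974) luminance CSF, which is band-pass and peaks at roughly 8 cycles per degree. (This particular model is my choice of illustration here, not necessarily the one featured in the lecture.)

```python
# Mannos & Sakrison (1974) model of the luminance contrast sensitivity
# function: band-pass, peaking at roughly 8 cycles/degree.
import numpy as np

def csf_mannos_sakrison(f):
    """Relative contrast sensitivity at spatial frequency f (cycles/degree)."""
    return 2.6 * (0.0192 + 0.114 * f) * np.exp(-(0.114 * f) ** 1.1)

for f in [1, 4, 8, 16, 32]:
    print(f"{f:>2} cpd: {csf_mannos_sakrison(f):.3f}")
```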

Lecture 3: Stereopsis.


Disparity, stereopsis. Stereoacuity and stereoblindness. Binocular vision disorders: amblyopia, strabismus. The correspondence problem. Issues with mobile eyes; epipolar lines, search zones. Different forms of human stereopsis: fine/coarse, contour vs RDS. Slides are here.
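As a worked example of the geometry behind disparity (my own illustrative numbers, not taken from the slides): for interocular distance I and fixation distance d, a point a small distance Δd beyond fixation has an angular disparity of roughly I·Δd/d² radians.

```python
# Back-of-envelope disparity: a point Delta_d beyond fixation at distance d
# has angular disparity of about I * Delta_d / d**2 radians, where I is the
# interocular separation. Numbers below are illustrative.
import math

I = 0.065        # interocular distance, ~6.5 cm
d = 1.0          # fixation distance, 1 m
delta_d = 0.01   # point 1 cm beyond fixation

disparity_rad = I * delta_d / d ** 2
arcsec = math.degrees(disparity_rad) * 3600
print(f"disparity ~ {arcsec:.0f} arcsec")   # ~134 arcsec
```

For comparison, typical human observers can detect disparities of a few tens of arcseconds, and the best just a few arcseconds.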

Brain Zone

The Brain Zone advisory committee with Dame Eliza Manningham-Buller (in blue), the chair of the Wellcome Trust board of governors, at the launch of the new exhibition at the Centre for Life.


Eye tracking with small children using a touchscreen

Back in 2011, my then colleague Dr Catherine O’Hanlon and I carried out a study in which young children gave responses using a touchscreen, while we simultaneously tracked their eye movements. I’ve been asked a few times by other researchers about how we managed this, so I thought it would be useful to share some information here.
Setup

Here (8 MB) you should be able to download a PowerPoint presentation I gave at EyeTrackBehavior in 2011. It’s about the practical issues rather than the science, with photos of our set-up and videos of the experiments.

Some code fragments for using the SMI RED eye tracker with Matlab are here.

I’ve also uploaded a zip file (62 MB, compressed with 7-zip) containing the complete Matlab code, audio clips and images we used to run our experiments. This worked in 2011, but it no longer runs on my 2016 version of Matlab, so don’t expect it to work for you. However, it might still be useful to anyone aiming to do something similar, so I thought why not share it.