Mantis videos

In our lab, we run experiments on praying mantis vision. We show the insects videos on a computer (mainly boring stimuli like moving bars, or little black dots meant to simulate a bug) and film their responses with a webcam, which stores a short video clip for each trial. The webcam is positioned so it films only the mantis, not the computer screen. At the moment the experimenter then codes each video manually, making a simple judgement like “mantis looked left”, “mantis moved right”, “mantis did not move”, etc.
Here are some example video clips of mantids responding to different visual stimuli.
We would love to be able to analyse these automatically, e.g. to get parameters such as “angle through which mantis head turns” or “number of strikes”.
As you can see, this is pretty challenging. There are big variations from experiment to experiment, and smaller variations from session to session even within the same experiment. The overall lighting, the position of the mantis, what else is in shot, etc etc, all vary.






Online lectures on the human visual system

I recently organised a Training School on 3D Displays and the Human Visual System as part of the European Training Network on Full-Parallax Imaging. The Network aims to train 15 Early Stage Researchers who have expertise in optics and engineering, but who also understand human visual perception. This first Training School aimed to equip the researchers, who come from a variety of disciplinary backgrounds (mainly engineering), with the relevant knowledge of the human visual system.

As part of this, I gave three lectures. I’m sharing the videos and slides here in case these lectures are useful more widely. Drop me a comment below to give me any feedback!

Lecture 1: The eye.


Basic anatomy / physiology of the eye. Photoreceptors, retina, pupil, lens, cornea. Rods, cones, colour perception. Dynamic range, cone adaptation. Slides are here.

Lecture 2: The human contrast sensitivity function.


Spatial and temporal frequency. The difference between the band-pass luminance CSF and the low-pass chromatic CSF; applications to the encoding of colour. Fourier spectra of images. Models of the CSF. Slides are here.

Lecture 3: Stereopsis.


Disparity, stereopsis. Stereoacuity and stereoblindness. Binocular vision disorders: amblyopia, strabismus. The correspondence problem. Issues with mobile eyes; epipolar lines, search zones. Different forms of human stereopsis: fine/coarse, contour vs RDS. Slides are here.

Brainzone

The Brain Zone advisory committee with Dame Eliza Manningham-Buller (in blue), the chair of the Wellcome Trust board of governors, at the launch of the new exhibition at the Centre for Life.


Eye tracking with small children using a touchscreen

Back in 2011, my then colleague Dr Catherine O’Hanlon and I carried out a study in which young children gave responses using a touchscreen, while we simultaneously tracked their eye movements. I’ve been asked a few times by other researchers about how we managed this, so I thought it would be useful to share some information here.
Setup

Here (8Mb) you should be able to download a PowerPoint presentation I gave at EyeTrackBehavior in 2011. It covers the practical issues rather than the science, with photos of our set-up and videos of the experiments.

Some code fragments for using the SMI RED eye tracker with Matlab are here.

I’ve also uploaded a zip file (62Mb, compressed with 7-zip) containing the complete Matlab code, audio clips and images we used to run our experiments. This worked in 2011 but no longer runs on my 2016 version of Matlab, so don’t expect it to work for you. However, it might be useful to anyone aiming to do something similar, so I thought why not share it.

Outreach: maths week at Shotton Hall Academy

This week I had an opportunity to speak to some aspiring mathematicians about their future, and to ask them to think about maths: not just to answer the questions, and not just to ask ‘why’ without purpose, but to try to find a middle ground between the two. I also tried to include some of the research the institute does, to tie the session to my own work.

The first thing I asked the kids to do was to give me an example of using maths. Most of them came up with interesting ideas that focused on measuring quantities and making calculations. I then asked them to come up with a definition. They all said similar things, along the lines of ‘measuring, calculating, and using data’. Which is a great answer! But not what I was looking for. ‘OK, give me an example of something that doesn’t use maths’ was my next question. A few hazarded guesses, and were all shown that maths IS in there somewhere! We agreed that a great definition of maths is simply problem solving; everything else you do with maths is just a tool for solving the problem.

I then talked about the earnings and aspirations of mathematicians (did you know that, aside from medicine, maths is one of the best-paid degrees you can take?) and went into my own mathematical work: not the modelling (a bit too complicated for 15-year-olds) but the actual calculation of parallax. We went through an example and the kids had to answer the question themselves. They had to solve the problem; I didn’t spoon-feed it to them. I think this made for an interesting discussion.
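This isn’t the exact example we worked through in the session, but here is a rough sketch (in Matlab, with made-up numbers) of the kind of parallax calculation I mean:

% Illustrative parallax calculation (made-up numbers, not the classroom example)
eyeSep  = 6.5;   % separation between the two eyes, cm
objDist = 50;    % distance to the object, cm
parallaxDeg = 2 * atand( (eyeSep/2) / objDist );   % angle between the two lines of sight, ~7.4 deg
fprintf('Parallax of an object %g cm away: %.1f degrees\n', objDist, parallaxDeg);

The nearer the object, the larger this angle.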

That was the end of it, just time to wrap up (DO MATHS AT A LEVEL) and then to give them all a brain sweet for taking part so nicely. All in all I think it went well!

Percentage of variance explained

I was just looking for a good explanation of this online to point a lab member towards, and I couldn’t find anything suitable, so I thought I’d write something myself. (Since then, I’ve found that https://en.wikipedia.org/wiki/Fraction_of_variance_unexplained seems pretty good.)

The idea is you have a set of experimental data {Yi} (i=1 to N). These might be responses collected in N different conditions, for example.
The mean is

M = sum{Yi} / N

and the variance about this mean is the total variance

TV = var(Yi) = sum{ (Yi-M)^2 } / N

Now suppose you have some model or fit which predicts values Fi. The residuals between the fit and the data are Ri=(Yi-Fi). The mean of the squared residuals is

RV = sum { (Yi - Fi )^2 } / N

(this is identical to the variance of the residuals if Fi and Yi have the same mean, as they do in linear regression)
The fraction of UNexplained variance is RV/TV, so the fraction of explained variance is 1-RV/TV.

In a perfect world, Fi would be equal to Yi for all i, and therefore these residuals would all be zero. RV=0 and thus all of the variance is explained. The other end of the spectrum is where all the fit values are the same, just equal to the mean of the data: Fi=M. Then we can see from the equations above that RV=TV and none of the variance is explained.

The percentage of variance which is explained is

PV = 100(1-RV/TV).
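As a quick worked illustration (with made-up numbers), the whole calculation in Matlab is just:

% Worked example with made-up data and fitted values
Y = [2.1 3.8 5.2 6.9 9.1];    % observed responses in N = 5 conditions
F = [2.0 4.0 6.0 8.0 10.0];   % values predicted by some model or fit
M  = mean(Y);                 % mean of the data
TV = mean((Y - M).^2);        % total variance about the mean
RV = mean((Y - F).^2);        % mean squared residual between fit and data
PV = 100 * (1 - RV/TV)        % percentage of variance explained (about 91% here)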

On Converting Pixels to Degrees

The Warping Effect

In our mantis contrast sensitivity paper (http://tinyurl.com/mantiscsf) we used the following formula to calculate the spatial frequency of grating stimuli in cycles per degree (cpd):

fs = 1 / ( 2 * atan( ppx / (2 * sr * viewD) ) * 180/pi )

where ppx is the grating period in pixels, sr is the screen resolution in px/cm, and viewD is the viewing distance in cm.

This equation assumes a linear correspondence between screen pixels and the visual angle they subtend (in degrees). The conversion ratio is effectively an average over one grating period; for example, a spatial period of 100 px corresponds to 20.25 degrees, making the effective conversion ratio 4.94 px/deg.
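As a quick check of these numbers (a sketch assuming sr = 40 px/cm and viewD = 7 cm, the values used for the plots below), they drop straight out of Matlab:

% Naive (linear) pixels-to-degrees conversion for the example above
ppx   = 100;   % grating period, px
sr    = 40;    % screen resolution, px/cm
viewD = 7;     % viewing distance, cm
periodDeg = 2 * atand( ppx / (2 * sr * viewD) );   % angle subtended by one period: 20.25 deg
pxPerDeg  = ppx / periodDeg;                        % effective conversion ratio: ~4.94 px/deg
fs        = 1 / periodDeg;                          % spatial frequency, cpd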

When viewD is sufficiently large (as is typical in human experiments), the assumption of a linear correspondence between screen pixels and subtended angle holds (i.e. the differences are negligible). For small viewD, however, the differences can be considerable: gratings subtend smaller visual angles the further they are from the screen centre, so the stimulus appears “squeezed” towards the screen edges.

The plots below demonstrate this effect. The top two plots show a horizontal sinusoidal grating of period 5 px (sr = 40 px/cm) as perceived by an observer 7 cm away from the monitor. Instead of a constant frequency in cpd, there is actually a range of frequencies, going from low (at the screen centre) to high (at both ends). The bottom plot is a frequency spectrum showing how broadband this actually is. Using the equation above, fs = 0.978 cpd for this stimulus, while the actual spectrum goes from 0.08 all the way up to 0.6 cpd.

[Figure: the grating as perceived by the observer, and its frequency spectrum]

How to correct for this

One way to correct for this is to incrementally increase the grating period as we move away from the screen centre. To an observer standing sufficiently far from the monitor, such a stimulus would appear squeezed at the centre and stretched at the sides. At the right viewing distance, however, every grating period subtends the same visual angle.

This can be done in multiple ways in Matlab. I’ve found the following particularly intuitive and easy to implement:

  1. Calculate the visual angle (in degrees) corresponding to each pixel on the screen, given a viewing distance and a screen resolution (the function px2deg does this for you; a rough sketch of such a function is given further below).
  2. Use the resulting array of angles to calculate the grating/stimulus luminance levels.

For example, to render a sinusoidal grating of frequency 0.1 cpd on a monitor that is 1920px wide I do:

scrWidthPx = 1920;  % screen width, px
screenReso = 40;    % screen resolution, px/cm
viewD = 7;          % viewing distance, cm
freq = 0.1;         % desired spatial frequency, cpd
degs = px2deg(scrWidthPx, screenReso, viewD);  % visual angle of each pixel, deg
y = cos(2*pi*degs*freq);                       % luminance profile of the corrected grating

The array y now contains 1920 luminance levels which, when rendered horizontally across the 1920 screen pixels, would simulate a grating of 0.1 cpd at the specified viewing distance. The plots below demonstrate how this is perceived by the observer.

[Figure: the corrected grating as perceived by the observer, and its frequency spectrum]

The rendered stimulus is now narrow-band as intended.
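For illustration only, here is my own minimal sketch of what a function like px2deg might compute (not necessarily the implementation referred to above): it maps each pixel column to its visual angle from the screen centre, for an observer directly in front of the centre.

function degs = px2deg_sketch(scrWidthPx, screenReso, viewD)
% Visual angle (deg) of each pixel column, measured from the screen centre
%   scrWidthPx - screen width in pixels
%   screenReso - screen resolution in px/cm
%   viewD      - viewing distance in cm
xCm  = ((1:scrWidthPx) - (scrWidthPx + 1)/2) / screenReso;  % pixel positions, cm from centre
degs = atand(xCm / viewD);                                   % angle of each pixel, deg
end

Feeding such an array into cos(2*pi*degs*freq), as above, produces a grating whose period is constant in visual angle rather than in pixels.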

Precautions

The correction above assumes a certain viewing distance and that the observer is positioned directly in front of the screen centre. If the observer’s position changes, the perceived stimulus will have a different spectrum, so it is important to ensure that subjects (in our experiments, mantids) are placed at the correct viewing distance, both along the line of sight and laterally, to minimise any errors.

How does this affect our CSF data?

The “warping effect” described above must be taken into consideration when interpreting the data we published in the mantis CSF paper. Each spatial frequency we rendered was actually perceived by the mantis as a broadband signal, so the true mantis CSF is likely narrower-band than what we reported.

Which Windows/Matlab version with Psychophysics Toolbox?

My colleague Nicholas Port was just asking me about my experience with different versions of Windows & Matlab running Psychophysics Toolbox. I asked my team about their experience, and I thought it might be useful to record their comments here:

Paul: “I used windows 7 and Matlab 2012b for psychophysical experiments before and I didn’t have any issues. However when I was using an Eyelink 1000 (external eyetracker) and timing was important for some reason the functionality only worked on 64 bit, so it is certainly worth considering which version you should use. Windows 7 and 2012A is more than enough.”

Zoltan: “I used Windows 8.1 with Matlab R2014A. I had some issues with timing because the desktop mode is handled differently.
If you do production-quality stuff where timing is important, go with Windows 7 for the time being. I never had any issues with the 64-bit Matlab. However, there is a software limitation with Windows 7: the refresh rate you can set on any monitor is capped at 85 Hz. You will need to do some hacking in the registry if you have a Radeon card, and you can conveniently make a new custom resolution with anything nvidia.
So, currently, the most stable stuff is a Windows 7 PC, 64-bit Matlab, and an nVidia GPU. If you need more power, switch off the swap file, visual effects and have plenty of RAM.”

Ghaith: “I’ve been using PTB on Windows 7 using both 32/64-bit versions of Matlab with no problems.”

Partow: “I had some problems in downloading psychtoolbox with 32-bit on a Windows 7. 64 is fine.”
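Whatever combination you settle on, it is worth sanity-checking the display timing. Here is a minimal sketch using standard Psychtoolbox calls (adjust the screen index for your own setup):

% Minimal display-timing sanity check with Psychtoolbox
AssertOpenGL;                              % make sure the Screen() mex file is available
screenId = max(Screen('Screens'));         % usually the external/stimulus display
win = Screen('OpenWindow', screenId, 0);   % open a full-screen black window
ifi = Screen('GetFlipInterval', win);      % measured frame duration, seconds
fprintf('Measured refresh: %.2f Hz (%.3f ms per frame)\n', 1/ifi, 1000*ifi);
Screen('CloseAll');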

Going on an international conference: VSS 2015 in Florida

One of the questions I regularly get asked as a PhD student / research scientist is what it’s like going away on science business to conferences and the like. I thought I would share my recent experience of a conference in Florida, USA, to hopefully answer those questions and let people see some of the highs (and lows) of conferences in general!

First things first, and it’s the question I get asked all the time: I did not personally pay for the flights, accommodation, conference attendance, or even food while I was there. If you buy something non-essential (such as beer, or a trip to the gym) it comes out of your own pocket, but anything you need in order to be at the conference and comfortable is covered by the university (so long as it’s approved by your supervisor).

The initial part of any conference is always rubbish: the travelling there, on your own. For me this time that meant a 1am alarm on Thursday morning, a taxi to the airport, and flights from Newcastle to Amsterdam, to Atlanta, to Tampa, which included a missed connection (thanks to immigration controls) and about 26 hours without sleep. This is, in my opinion, the worst bit of conferences away: loneliness. This one wasn’t so bad, because once we arrived I was staying at an apartment complex with many of my fellow lab mates and scientists, including Sid and Zoltan. We hired a car for the week (paid for) to ensure we could get to the Walmart and back to the airport at the end of the week. It definitely worked out cheaper, but driving on the wrong side of the car, on the wrong side of the road, in three-lane traffic took some getting used to.

[Photo: our hire car]

Upon arrival we pretty much had time for a quick tea at a local restaurant before bed, then it was up nice and early in the morning for the first day of the conference. Sid, Zoltan and I arrived comfortably in time to register, which was a very well-oiled machine for a conference of thousands of scientists. You always get a name badge so you can talk to anyone; you just have to hope your interests overlap.

 

[Photo: lots of scientists at registration]

Now I imagine most of you are thinking ‘well yeah, but hang about! You get to go to Florida, paid, for a week!?’. That is certainly an upside of VSS: glorious sunshine every day. However, the conference ran pretty much all day, every day, so on no day did we get a ‘day off’ to just relax or explore. That said, at these things, particularly when there are literally hundreds and hundreds of scientific talks and thousands and thousands of scientific posters, you need to pick and choose which you go to, and make sure you don’t run out of steam. Our lab is fairly specialised, so we all went to similar talks and posters, and in the sessions relevant to us we always had somebody giving a talk or presenting a poster. I won’t bore you with all the details, but many of our lab presented, as well as other members of the institute, and we all did a smashing job (if I do say so myself!). Zoltan, Jenny and I had posters to present, while Sid (and also Ignacio, a long-term collaborator of Jenny’s) had talks to give.

[Photos: our posters and talks, and the conference abstract]

This sort of leads me onto the next bit that is very cool about being at a conference, even though it’s something small: having your name written down in the abstract book for all to see. You feel practically famous!

The talks themselves were all very interesting, and that wasn’t the only thing VSS put on for the scientists. As I’m sure some of you are aware, there was a big hoo-hah about ‘the dress’ earlier in the year. Well, the science behind it was explained by members of our institute (Anya, pictured below with Jenny), and some people wore the different coloured versions of ‘the dress’, including our own Jenny! This was at a general demonstration night, where all manner of optical illusions and tricks were shown and explained using visual science. I think this was the highlight for me personally, as it also had a beach BBQ with two free beers each!

[Photos: an illusion viewed from the wrong side; Jenny and Anya wearing versions of ‘the dress’]

While I was out in Florida it wasn’t all just work, work, work. I got the chance to visit the local gym for a training session, as well as catch up with an old friend who had moved out there some months before, which was a lovely extra bonus. Having to pick and choose which talks to go to, to make sure you didn’t miss anything you cared about, also meant you occasionally had a couple of hours free… ample time to catch a bit of sun by the pool!

[Photo: the pool]

After that, however, it was time to go home, which meant another lonely flight back, and back to reality (where you have to submit a receipt for every purchase if you want the money back!).

As conferences go, VSS 2015 was a particular gem, thanks to the location and the fact that my fellow lab mates were there. Sometimes it can be a lot worse, with a lot of time spent alone in your hotel room in the cold, with nothing but a book to keep you entertained! But the whole conference thing is most definitely a perk!