Picturing science

Our energetic team submitted a number of entries to the Royal Society’s Picturing Science competition.

Jenny: This is an image I created in the scientific programming environment Matlab while doing some maths, trying to understand a particular aspect of stereo ("3D") vision. The funny thing is that I can now no longer even remember how I generated it! I think it is probably a Fourier phase spectrum of some sort. The image popped up in the course of my work; I saved the figure and carried on. Much later I came back to it and was struck by how beautiful and complex it is. In its repeating cells of interlocking curves, it reminds me of Celtic knotwork like that found in the Lindisfarne gospels.


Jenny: An occupational hazard of being a scientist's daughter is being roped into experiments. For this project, we were interested in where children looked when making judgments about pictures. We developed a system where we displayed pictures to children on a computer touchscreen; the children's eye movements as they scanned the picture were monitored by an eye-tracker in front of them, and the children then touched the screen to indicate their judgment. We had the children sit on a car seat so that their head stayed in roughly the same position and the eye-tracker didn't lose sight of their eyes. An adjustable arm helped us position the touchscreen at the right distance for each child, while the company Tracksys kindly loaned us an eye-tracker. My little girl helped us get it all working and patiently recorded the words we needed for the experiment, so the computer could "speak" in a nice, friendly child's voice.


Jenny: This photo shows undergraduate student Steven Errington checking the calibration of a mirror stereoscope. This is one of the oldest forms of 3D display, and uses mirrors to present different images to the two eyes. The observer will sit with their head in the headrest in front of Steven, and the two mirrors in front of them will ensure that their left eye views a computer monitor to their left, while their right eye views a different monitor on their right. It's essential the two monitors are aligned both in space and time - that is, that they update their images at exactly the same time. Steven has clamped a photodiode in front of each monitor (that's the white cable running down by his hand) and fed their outputs into an oscilloscope. Photodiodes output a voltage which depends on the light falling on them, so each new image presented on the computer monitor shows up as a "blip" on the oscilloscope. Steven can then check that the blips occur at exactly the same time, to sub-millisecond precision.
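The same timing comparison can be done in software once the two photodiode traces are digitised. Here is a minimal sketch (my own illustration, not the lab's actual code - the thresholds, sample rate, and function names are all assumptions) that finds the first frame-onset "blip" in each trace and reports the offset between the two monitors:

```python
# Hypothetical sketch of the monitor-synchrony check: given two sampled
# photodiode traces (in volts), find the first frame-onset "blip" in
# each and report the timing offset between the two monitors.

def first_onset(trace, threshold, sample_rate_hz):
    """Return the time (s) of the first sample crossing the threshold."""
    for i, v in enumerate(trace):
        if v >= threshold:
            return i / sample_rate_hz
    return None

def monitor_offset(left_trace, right_trace, threshold=0.5,
                   sample_rate_hz=100_000):
    """Offset (s) between left- and right-monitor frame onsets.

    At a 100 kHz sample rate one sample is 10 microseconds, comfortably
    inside the sub-millisecond precision mentioned above.
    """
    t_left = first_onset(left_trace, threshold, sample_rate_hz)
    t_right = first_onset(right_trace, threshold, sample_rate_hz)
    return t_right - t_left

# Toy traces: the right monitor's blip arrives 5 samples (50 us) late.
left = [0.0] * 10 + [1.0] * 5
right = [0.0] * 15 + [1.0] * 5
print(monitor_offset(left, right))  # ~5e-05 s, i.e. 50 microseconds
```

In practice one would look at many successive blips rather than just the first, but the idea is the same: a consistent non-zero offset means the monitors are out of step.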


Jenny: Dr Ronny Rosner, an expert in insect visual neurophysiology, prepares to dissect a praying mantis in Dr Claire Rind’s insect lab at Newcastle University. This is part of a major new project in my lab, “Man, Mantis and Machine”, funded by the Leverhulme Trust. The praying mantis is the only non-vertebrate known to have a form of 3D vision, which it uses to help it strike at prey. We want to understand the neuronal circuits in its brain which help it do this, in order to compare them to similar circuits in human beings and other animals, and to 3D vision algorithms in computers and robots. Ronny will be joining my lab next year to bring his expertise to bear on these questions.

Paul: This experiment is designed to test orientation cues with stereo 3D (S3D) displays. The subject sits behind a curtain with a square cut out of it, and so can’t see that the television is in fact twisted through an angle rather than perpendicular to them. Because the display is S3D, the subject cannot detect this change in orientation and assumes the screen is frontoparallel (perpendicular). This means that when we show the stimuli (in this experiment, a pair of rotating cubes), they look warped unless they are projected for the angle at which the subject is actually sitting ("orthostereo"). By contrast, when the curtain is removed and the television can be seen to be rotated, the subject's brain corrects for the screen not being perpendicular: now the orthostereo cube (rendered for the viewing angle) looks warped, and the conventionally projected cube looks correct.
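The geometry behind the orthostereo correction can be sketched with a toy calculation. This is my own illustration under simplifying assumptions (single eye at the origin, screen rotated about the vertical axis, all names and values hypothetical), not the actual rendering code:

```python
import math

# Toy geometry: the eye is at the origin looking down the z axis at a
# screen whose centre is d metres away. The screen is rotated by theta
# radians about the vertical axis, so a point at screen coordinate u
# sits at world position (u*cos(theta), 0, d + u*sin(theta)).

def apparent_angle(u, d, theta):
    """Visual direction (radians) of screen coordinate u on the rotated screen."""
    return math.atan2(u * math.cos(theta), d + u * math.sin(theta))

def ortho_screen_coord(x, d, theta):
    """Screen coordinate that makes a point appear where a frontoparallel
    point at x would appear - i.e. the 'orthostereo' correction."""
    t = x / d  # tangent of the desired visual angle
    return x / (math.cos(theta) - t * math.sin(theta))

d = 1.0
# With no rotation the correction is the identity...
print(ortho_screen_coord(0.2, d, 0.0))  # 0.2
# ...but at 30 degrees the same point must be drawn further out, and it
# then lands at exactly the intended visual direction, atan(0.2/1.0):
u = ortho_screen_coord(0.2, d, math.radians(30))
print(math.degrees(apparent_angle(u, d, math.radians(30))))  # ~11.3 degrees
```

The point of the demo above is that images corrected this way only look right from the position they were computed for, which is exactly what the curtain manipulation probes.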


Jenny: Three local sixth-formers did a summer project in my lab, funded by the Nuffield Foundation. As part of this, they collected experimental data from members of the public in Newcastle's Centre for Life. Here, they are getting their equipment set up ready for another busy day running experiments. Their work ended up being published in two scientific papers in the journal i-Perception, with the young people as authors on both.



Kids Kabin in Walker

The time certainly does fly when you’re busy! Since my last blog post I have worked hard on the stimuli to get them to work as we hoped they would, and have sorted out an occluder (something to block the edges of the screen from the viewer, so they can’t gauge the orientation of the screen). I am now in the process of recruiting volunteers and running the experiment.

On Monday I helped out at an ION outreach programme at the Kids Kabin in Walker, an after-school club. School classes have been visiting the centre during school hours to learn things not covered in the classroom, such as pottery and cooking, including a session on the brain run by ION. In this session the children (of varying ages) are taught how the brain works and how its different parts handle different tasks. They try activities they wouldn’t normally encounter, such as the Stroop task (naming the colour of the ink a word is printed in, rather than reading the word itself: so the word BLUE printed in red ink should be answered "red"), and wearing custom-made glasses which warp the child's view of the world while they attempt a simple task, like throwing a ball into a basket. The emphasis of the classes is on the various regions of the cortex and how they work together to perform even simple tasks. The day was good fun for the kids and for the grown-ups! Looking forward to going next week!
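For anyone curious how Stroop items are put together, here is a minimal sketch (entirely my own, hypothetical names throughout) that generates incongruent word/ink pairings, where the word and the ink colour always disagree:

```python
import random

# Minimal sketch of generating incongruent Stroop items: each trial
# pairs a colour word with a *different* ink colour, so the reader
# must suppress the automatic urge to read the word and instead name
# the ink.

COLOURS = ["red", "green", "blue", "yellow"]

def stroop_trial(rng):
    """Return (word, ink) with word and ink guaranteed to differ."""
    word = rng.choice(COLOURS)
    ink = rng.choice([c for c in COLOURS if c != word])
    return word.upper(), ink  # the correct answer is the ink colour

rng = random.Random(1)
for _ in range(3):
    word, ink = stroop_trial(rng)
    print(f"word {word} printed in {ink} ink -> say '{ink}'")
```

The interference people feel on these trials is the whole point of the demo: reading is so automatic that naming the ink takes real effort.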

Welcome Ronny!

And I’m delighted to announce that Ronny Rosner is the third and final member of the M3 team. Ronny already has considerable experience studying neuronal processing by both behavioural and electrophysiological approaches in flies and locusts, including both extracellular and intracellular recordings. He’s currently in Uwe Homberg’s lab in Marburg, Germany. Ronny won’t be joining us till January 2014, though he will be visiting before then. He’s already spent a week visiting Newcastle, during which I took this photo of him in Claire Rind’s lab, about to dissect a mantis brain for us. Germany has a particularly strong tradition of insect neuroscience, so I’m delighted Ronny will be bringing that over to Newcastle to fuse with our existing insect and wider neurophysiology community. In fact I am really excited that I’ve managed to recruit such a talented team for the M3 project, and am looking forward eagerly to 5 years of fun science and productivity!


Mantid photos

Lisa’s been taking more fab photos of the mantids. They are so cool!

Fantastic close-up of a mantis that has jumped onto the CRT!


Mantid striking at image on computer screen


Eyeball to eyeball




Here are some photos showing our experimental set-up.

Mantis watching simulated bug on the CRT


Close-up of mantis watching a simulated bug on a CRT


Top-down view. You can see the mantis leaning out to the right to try and pursue the bug.



Graphs are good

All week Lisa’s been a bit gloomy about the behaviour of the mantids. Apparently they have not been cooperating well, doing defensive postures at the screen or even jumping off their perch. But she’s plugged doggedly on with true PhD student grit :). Anyway, we just sat down and created a script to load up the Matlab files and graph such data as she has managed to collect. And it’s great! Really beautiful, does exactly what we expected, demonstrates our protocol is sensible and suggests it’s worthwhile proceeding to more interesting stimuli. Just goes to show, you don’t always have an accurate sense of how good your data is as it comes in.

In other news, Paul has taken delivery of a 3D Dell laptop. He’s having a few teething problems getting it to display in 3D, but I shall be very interested to see the results. MSc computing students Mike and Nick are starting their projects in the lab this week, and James is back from holiday. So it’s going to be a nice diverse lab meeting this Friday.

Welcome Ghaith!

Ghaith Tarawneh is the second person to accept one of the M3 positions. Ghaith is just writing up his PhD in Microelectronic Circuit Design in the Electrical, Electronic and Computer Engineering department here at Newcastle. He also has an MSc in Mechatronics from Newcastle (where he was the highest-ranking student), and a BSc in Computer Engineering from Princess Sumaya University for Technology in Jordan (where he was the highest-ranking student – spot the pattern). He has skills in just about everything electronic, computer or programming-related. In the initial phase of the project, Ghaith’s technical wizardry is going to be essential for key challenges like automated recognition of mantis behaviour and displaying 3D images to insect eyes. Longer-term, Ghaith is keen to move into the computational neuroscience aspects of the project, figuring out the circuits which underlie mantis vision. Given the “machine” aspect of the M3 project, it’s an intriguing thought that Ghaith has the skills to implement these computations in hardware as well as in software if he so desires.
Ghaith will be the first RA to start work on the project, with a start date of 1st June if all goes well.

Welcome Vivek!


Delighted to welcome Vivek Nityananda as the first of the three Leverhulme-funded research associates working on the “Man, Mantis and Machine” project. Vivek currently holds a Marie Curie International Incoming Fellowship from the European Union. For this, he’s working on visual search and attention in bumblebees with Prof Lars Chittka at Queen Mary, London. Previously, he’s worked on acoustic communication in frogs and bush-crickets at the University of Minnesota and the Indian Institute of Science in Bangalore. Vivek will be starting work on the M3 project once his Marie Curie fellowship has finished in September. I’m delighted he’ll be joining us, and look forward to a highly productive time together.


Tantalising a mantis

Very excited that Lisa has been making great progress with the mantid behaviour experiments. Here is a video showing 3 trials of one of Lisa’s mantids performing a visual tracking task. The mantis is being filmed from underneath as it hangs from a stand – they seem happiest when upside-down. In front of the mantis is a CRT monitor where we are displaying the visual stimuli in the experiment. You’ll hear a Ping! when a fixation stimulus appears on the screen. This is a “simulated bug” which runs in a spiral in towards the centre of the screen. This ensures the mantis begins each trial looking at the centre of the screen. Then you’ll hear a second Ping! as the bug starts to run either left or right (the tracking stimulus). Watch the mantis go mad as it tries to get its claws on the little beastie!
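The two-phase stimulus described above (spiral in to the centre, then run off left or right) can be sketched as a simple trajectory function. This is purely my own illustration; the parameter names, durations, and speeds are assumptions, not the lab's actual stimulus code:

```python
import math

# Hypothetical sketch of the two-phase "simulated bug": it first
# spirals in to the screen centre (fixation phase), then runs off
# horizontally to the left or right (tracking phase).

def bug_position(t, fix_dur=2.0, r0=200.0, turns=3.0,
                 speed=150.0, direction=+1):
    """(x, y) in pixels relative to screen centre at time t (seconds)."""
    if t < fix_dur:
        # Fixation phase: the radius shrinks linearly to zero while the
        # bug circles the centre, so it ends exactly at (0, 0).
        frac = t / fix_dur
        r = r0 * (1.0 - frac)
        angle = 2.0 * math.pi * turns * frac
        return r * math.cos(angle), r * math.sin(angle)
    # Tracking phase: run left (-1) or right (+1) at constant speed.
    return direction * speed * (t - fix_dur), 0.0

print(bug_position(0.0))                  # starts out at the spiral's edge
print(bug_position(2.0))                  # (0.0, 0.0): at centre as tracking begins
print(bug_position(3.0, direction=-1))    # one second into a leftward run
```

Because the spiral ends exactly at the centre when the tracking phase begins, the mantis is guaranteed to be fixating the middle of the screen when the bug sets off.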

M3

Spent some time this morning with Lisa coding up a stimulus she thought might attract the mantids’ attention. Apparently it worked – the first time she presented it, the mantis leapt right off its perch onto the screen!