A week of debugging

This week has been a varied and interesting one. I began by sorting out some acetate sheets for my perspex, from which I could draw some stimuli, have them represented on the computer, and then check whether my program was working as I intended (see the pictures below for examples of the images, lines and 3D cube used; I also tested some random points to see if those worked too, and eventually they did, not pictured). The immediate problem I could see was that the pictures lined up perfectly when my angle of rotation was zero, but rotating clockwise translated the images left, and anticlockwise translated them right. From this I realised that the centre of rotation I had organised (the centre of the television) was not in fact where the origin of my screen space was. I adjusted the stand accordingly, moving the television backwards, and it all worked perfectly.
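The geometry behind that bug is easy to sketch. Here is a minimal Python illustration (not the experiment code, and the 10 cm offset is made up) of how rotating about a centre that doesn’t coincide with the screen-space origin adds a sideways translation on top of the rotation:

```python
import math

def rotate_about(point, centre, angle_rad):
    """Rotate a 2D point about an arbitrary centre of rotation."""
    x, y = point[0] - centre[0], point[1] - centre[1]
    c, s = math.cos(angle_rad), math.sin(angle_rad)
    return (centre[0] + c * x - s * y, centre[1] + s * x + c * y)

# If the physical centre of rotation (the TV centre) does not coincide
# with the screen-space origin, any non-zero rotation drags the origin
# sideways, which is exactly the left/right translation seen in the pictures.
origin = (0.0, 0.0)
tv_centre = (0.0, -10.0)   # hypothetical 10 cm misalignment

print(rotate_about(origin, tv_centre, 0.0))                 # zero angle: no shift
shifted = rotate_about(origin, tv_centre, math.radians(10))
print(shifted)                                              # rotated: the origin has moved
```

At zero rotation the two frames agree, so the misalignment only shows up once you rotate, which is exactly the symptom above.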

After that I ran through the experiment myself for a full set of data and analysed the results, which showed, as I suspected, that the slight translation made very little difference to the perception of the cube.

I am still working on getting the cubes working in OpenGL, which is becoming less painful the more I look at it; the cubes now warp, and the backs can’t be seen when they shouldn’t be. Still a work in progress, watch this space. The advantage I have is that the ‘extra parts’, such as turning it 3D, randomly interleaving many different variables and recording the results, should be relatively straightforward, as they can just be ported from phase 1.

I helped out at Kids Kabin again this week (Ann, the supervisor, came to make up numbers and said my calling in life must eventually involve teaching, quite a nice compliment!). I have begun working on my dissertation in earnest (reading and re-reading appropriate papers, starting my introduction and materials and methods sections, etc.). I have also signed up to attend a conference on Matlab in late June (only a week before my wedding) and have worked on my presentation for a conference in a fortnight’s time.

Next week I intend to continue with the OpenGL work. I am thinking of different ways I could present the problem in case what we are trying is physically impossible: possibly an adjustment task where the cube warps depending on left or right keypresses, and the participant adjusts it until they believe the cube is no longer warped, starting from different angles. I think that would be both feasible to program and quite interesting. Will see what Jenny thinks next week.

Slow and Steady wins the race.

I have spent the entire week battling with OpenGL. OpenGL is a graphics library that is supposed to make life easier for you by doing much of the hard calculation needed to project an object correctly onto the screen. Unfortunately that is the exact opposite of what I want to do, as I’m working with oblique angles instead of frontoparallel (straight-on) viewing. The advantage OpenGL has, however, is that it is very good at figuring out what should be visible at any one moment. So in this program the backs of my cubes aren’t visible when they shouldn’t be, which is exactly why we opted to use OpenGL for phase 2. Unfortunately I now need to reassess how to work out the oblique warping all over again. Fortunately I intend to use my previous thinking as a ‘roadmap’ to avoid the pitfalls I suffered before.
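To make ‘oblique’ concrete, here is a little sketch (illustrative only, in Python rather than my Matlab, with made-up distances) of the underlying geometry: each vertex has to be projected along the line of sight onto a screen plane that is rotated relative to the viewer, rather than onto the usual frontoparallel plane.

```python
import math

def project_to_oblique_screen(point, eye, screen_angle_deg):
    """Intersect the ray from the eye through a 3D point with a screen
    plane that passes through the origin and is rotated by
    screen_angle_deg about the vertical (y) axis.  At 0 degrees this is
    the ordinary frontoparallel perspective projection."""
    theta = math.radians(screen_angle_deg)
    normal = (math.sin(theta), 0.0, math.cos(theta))  # frontoparallel normal is (0, 0, 1)
    direction = tuple(p - e for p, e in zip(point, eye))
    denom = sum(n * d for n, d in zip(normal, direction))
    # Plane through origin: normal . x = 0; ray: x = eye + t * direction.
    t = -sum(n * e for n, e in zip(normal, eye)) / denom
    return tuple(e + t * d for e, d in zip(eye, direction))

eye = (0.0, 0.0, 60.0)          # hypothetical viewing distance
vertex = (10.0, 5.0, -20.0)     # a cube vertex behind the screen plane

print(project_to_oblique_screen(vertex, eye, 0.0))    # standard frontoparallel projection
print(project_to_oblique_screen(vertex, eye, 30.0))   # oblique screen: a different image point
```

The second result is the sort of mapping the experiment needs, and it differs from the frontoparallel projection the standard pipeline produces by default.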

I now have 7 weeks until the project is supposed to be finished. If I can get the cubes working in the next two weeks I’ll have more than enough time to run through some volunteers and collect some results. How do I know I have 7 weeks left? My wedding is the Saturday after the project is done. Which is altogether much more scary than science!

A frustrating week

This week I have been working on the problem of trying to add some solidity to my wire-frame cubes by putting surfaces on them. Which sounds fairly trivial, but has taken me the best part of a week to sort out. I needed to use Screen(‘FillPoly’, …) and enter the coordinates my sph2cart(theta, phi, fancy R) returned. But this in itself raises a problem: because of the 3D being used, and the entire thing ending up flattened onto one plane, filling raises more problems than it solves, and the back surface can be seen when it shouldn’t be (PTB does nothing for hidden faces, or at least not automatically).
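One common workaround, since PTB won’t do hidden-surface removal for you, is to cull back faces yourself before filling: if the cube’s faces are defined with a consistent counter-clockwise winding, a face that has turned away from the viewer comes out clockwise after projection, so its signed area flips sign. A quick Python sketch of that test (illustrative only; the sign convention depends on whether your y-axis points up or down):

```python
def signed_area(poly2d):
    """Twice the signed area of a projected polygon (shoelace formula).
    Positive for counter-clockwise vertex order (y-axis-up convention)."""
    total = 0.0
    for i in range(len(poly2d)):
        x1, y1 = poly2d[i]
        x2, y2 = poly2d[(i + 1) % len(poly2d)]
        total += x1 * y2 - x2 * y1
    return total

def visible_faces(projected_faces):
    """Keep only faces still wound counter-clockwise after projection;
    those are the ones to hand to the polygon-fill routine."""
    return [face for face in projected_faces if signed_area(face) > 0]

front = [(0, 0), (1, 0), (1, 1), (0, 1)]   # CCW: facing the viewer
back = [(0, 0), (0, 1), (1, 1), (1, 0)]    # CW: facing away, should be culled

print(visible_faces([front, back]))        # only the front face survives
```

The same test could be done in Matlab on the projected vertex lists before each fill call.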

So it looks like I may have to write a new piece of code using OpenGL, which I would prefer to avoid, as it’s a different beast altogether and will take a while. I’m going to look today at what I hope will be half measures, to see if I can find a workaround in PTB. Not too hopeful though!

In other news, I have received and set up the active stereo on my laptop. I have to say it looks brilliant, and the screen quality is super HD anyway. So I have been working with that and seeing if I can adapt Matlab so it works in active stereo (managed it for basic Matlab code, not yet for my script). The number of volunteers I am getting from the ION database is brilliant, and I now have more than enough to do phases 1 and 2. I’m reluctant to turn any of them away in case I do need more for the power of the stats used. I have also been working on the abstract and presentation for the illusion conference in Leicester in June (excited for that).

Here are a couple of photos of the lab setup for this experiment and what the participant can see through the screen. As is clear, the curtain makes the orientation of the screen plane appear frontoparallel (perpendicular) whereas the other pictures show it is not!

 

How the illusion appears

Behind the curtain

Picturing science

Our energetic team submitted a number of entries to the Royal Society’s Picturing Science competition.

Jenny: This is an image I created in the scientific programming environment Matlab while doing some maths, trying to understand a particular aspect of stereo ("3D") vision. The funny thing is that I can now no longer even remember how I generated it! I think it is probably a Fourier phase spectrum of some sort. I threw up this image in my work, saved the figure, and carried on. Much later I came back to it and was struck by how beautiful and complex it is. In its repeating cells of interlocking curves, it reminds me of Celtic knotwork like that found in the Lindisfarne gospels.


Jenny: An occupational hazard of being a scientist's daughter is being roped into experiments. For this project, we were interested in where children looked when making judgments about pictures. We developed a system where we displayed pictures to children on a computer touchscreen; the children's eye movements as they scanned the picture were monitored by an eye-tracker in front of them, and the children then touched the screen to indicate their judgment. We had the children sit on a car seat so that their head stayed in roughly the same position and the eye-tracker didn't lose sight of their eyes. An adjustable arm helped us position the touchscreen at the right distance for each child, while the company Tracksys kindly loaned us an eye-tracker. My little girl helped us get it all working and patiently recorded the words we needed for the experiment, so the computer could "speak" in a nice, friendly child's voice.


Jenny: This photo shows undergraduate student Steven Errington checking the calibration of a mirror stereoscope. This is one of the oldest forms of 3D display, and uses mirrors to present different images to the two eyes. The observer will sit with their head in the headrest in front of Steven, and the two mirrors in front of them will ensure that their left eye views a computer monitor to their left, while their right eye views a different monitor on their right. It's essential the two monitors are aligned both in space and time - that is, that they update their images at exactly the same time. Steven has clamped a photodiode in front of each monitor (that's the white cable running down by his hand) and fed their outputs into an oscilloscope. Photodiodes output a voltage which depends on the light falling onto them, so each new image presented on the computer monitor shows up as a "blip" on the oscilloscope. Steven can then check that the blips are occurring at exactly the same time, to sub-millisecond precision.


Jenny: Dr Ronny Rosner, an expert in insect visual neurophysiology, prepares to dissect a praying mantis in Dr Claire Rind’s insect lab at Newcastle University. This is part of a major new project in my lab, “Man, Mantis and Machine”, funded by the Leverhulme Trust. The praying mantis is the only non-vertebrate known to have a form of 3D vision, which it uses to help it strike at prey. We want to understand the neuronal circuits in its brain which help it do this, in order to compare them to similar circuits in human beings and other animals, and to 3D vision algorithms in computers and robots. Ronny will be joining my lab next year to bring his expertise to bear on these questions.

Paul: This experiment is designed to test orientation cues with stereo 3D (S3D) displays. The subject sits behind a curtain with a square cut out of it, and can’t see that the television is in fact twisted through an angle, and therefore not perpendicular. Because the display is S3D, the subject cannot detect this change in orientation and assumes that the screen is frontoparallel (perpendicular). This means that when we show the stimulus (in this experiment, a pair of rotating cubes) it looks warped unless it is projected for the angle the subject is sitting at (orthostereo). In contrast, when the curtain is removed and the television can be seen to be rotated, the subject’s brain corrects for it not being perpendicular, and sees the orthostereo cube (rendered for the angle) as warped and the perpendicular cube as correctly projected.


Jenny: Three local sixth-formers did a summer project in my lab, funded by the Nuffield Foundation. As part of this, they collected experimental data from members of the public in Newcastle's Centre for Life. Here, they are getting their equipment set up, ready for another busy day running experiments. Their work ended up being published in two scientific papers, on which the young people were authors, both in the journal i-Perception.



Kids Kabin in Walker

The time certainly does fly when you’re busy! Since my last blog post I have worked hard on the stimuli to get them working as we hoped they would, and have sorted out an occluder (something to block the edges of the screen from the viewer, so they can’t gauge the orientation of the screen). I am now in the process of recruiting volunteers and running the experiment.

On Monday I helped out at an ION outreach programme at the Kids Kabin in Walker, an after-school club. School classes have been visiting the centre during school hours to learn things not seen in a classroom, such as pottery and cooking, including a session on the brain run by ION. In this session the children (of varying ages) are taught how the brain works and how the different parts of the brain handle different tasks. They are engaged in activities they wouldn’t normally see, such as the Stroop task (reading out the colour of a word’s font rather than the word itself; for instance, BLUE RED YELLOW printed in red, green and blue ink would be read as RED GREEN BLUE). They also wear some custom-made glasses which warp the child’s view of the world; while wearing them, a simple task, like throwing a ball into a basket, is attempted. The emphasis of the classes is on teaching the various regions of the cortex and how they work together to perform even simple tasks. The day was good fun for the kids and for the grown-ups! Looking forward to going next week!

Welcome Ronny!

And I’m delighted to announce that Ronny Rosner is the third and final member of the M3 team. Ronny already has considerable experience studying neuronal processing by both behavioural and electrophysiological approaches in flies and locusts, including both extracellular and intracellular recordings. He’s currently in Uwe Homberg’s lab in Marburg, Germany. Ronny won’t be joining us till January 2014, though he will be visiting before then. He’s already spent a week visiting Newcastle, during which I took this photo of him in Claire Rind’s lab, about to dissect a mantis brain for us. Germany has a particularly strong tradition of insect neuroscience, so I’m delighted Ronny will be bringing that over to Newcastle to fuse with our existing insect and wider neurophysiology community. In fact I am really excited that I’ve managed to recruit such a talented team for the M3 project, and am looking forward eagerly to 5 years of fun science and productivity!


Mantid photos

Lisa’s been taking more fab photos of the mantids. They are so cool!

Fantastic close-up of a mantis that has jumped onto the CRT!


Mantid striking at image on computer screen


Eyeball to eyeball




Here are some photos showing our experimental set-up.

Mantis watching simulated bug on the CRT


Close-up of mantis watching a simulated bug on a CRT


Top-down view. You can see the mantis leaning out to the right to try and pursue the bug.



Graphs are good

All week Lisa’s been a bit gloomy about the behaviour of the mantids. Apparently they have not been cooperating well, doing defensive postures at the screen or even jumping off their perch. But she’s plugged doggedly on with true PhD student grit :). Anyway, we just sat down and created a script to load up the Matlab files and graph such data as she has managed to collect. And it’s great! Really beautiful, does exactly what we expected, demonstrates our protocol is sensible and suggests it’s worthwhile proceeding to more interesting stimuli. Just goes to show, you don’t always have an accurate sense of how good your data is as it comes in.

In other news, Paul has taken delivery of a 3D Dell laptop. Having a few teething problems getting it to display in 3D, but I shall be very interested to see the results. MSc computing students Mike and Nick are starting their projects in the lab this week, and James is back from holiday. So it’s going to be a nice diverse lab meeting this Friday.

Welcome Ghaith!

Ghaith Tarawneh is the second person to accept one of the M3 positions. Ghaith is just writing up his PhD in Microelectronic Circuit Design in the Electrical, Electronic and Computer Engineering department here at Newcastle. He also has an MSc in Mechatronics from Newcastle (where he was the highest-ranking student), and a BSc in Computer Engineering from Princess Sumaya University for Technology in Jordan (where he was the highest-ranking student – spot the pattern). He has skills in just about everything electronic, computer or programming-related. In the initial phase of the project, Ghaith’s technical wizardry is going to be essential for key challenges like automated recognition of mantis behaviour and displaying 3D images to insect eyes. Longer-term, Ghaith is keen to move into the computational neuroscience aspects of the project, figuring out the circuits which underlie mantis vision. Given the “machine” aspect of the M3 project, it’s an intriguing thought that Ghaith has the skills to implement these computations in hardware as well as in software if he so desires.
Ghaith will be the first RA to start work on the project, with a start date of 1st June if all goes well.