What a productive lab… Lisa and Paul gave great talks as part of their MRes yesterday; today Lisa is delivering multiple talks to visiting sixth-formers as part of Biology Open Day, while Paul is up in Edinburgh on a Matlab course. Nick and Mike are working hard on their interim reports, due in tonight, while Parto is relieved to have finally got her visa renewal confirmed and is collecting control data with a lighter heart. James and I had a very useful meeting with colleagues this morning on taking the smartphone work further. Ghaith only started work with us on Wednesday and, from what I see when I walk past his desk, he is already making brilliant progress automating the video analysis. How fab to be part of such a talented and motivated team!
Thanks to everyone who came along to help Lisa and me move our mantids across to their new home in the dedicated insect facility down the road. Especially Nick and Mike, who probably didn’t imagine when they came to the UK to do an MSc in Computer Games Engineering that this would involve transporting predatory insects! Although Nick — how about doing us a giant mantis in stereoanamorphic perspective? How scary would that be?
This week I have been working very hard at data collection. Getting the cubes (which we managed to get correctly aligned last week) into a workable experiment program took some debugging, but we got it sorted shortly after the Leicester conference. Since then 9 participants have come in to take part in the experiment, and the data so far look good. The conference in Leicester went very well. I was well received, the audience seemed engaged and asked questions, and a couple of neuroscientists in the crowd caught up with me later to knock heads over ideas that I could use for my PhD. Jenny has asked me to start thinking about what it is that I wish to look at, and I have a few ideas, most of them relating to comfort (and therefore discomfort) while watching 3D TV. The program on the laptop was a huge hit and everybody wanted to have a go!
My deadlines for the dissertation and presentation/poster/abstract are getting closer, so over the next few weeks I will be statistically analysing my data, organising my work and writing it up. I am presenting to the behaviour staff next Thursday, and that will be my most pressing task to handle over the weekend.
I was just having a problem with MakeTexture in Psychtoolbox. I like to use
PsychImaging('AddTask', 'General', 'NormalizedHighresColorRange');
so that black=0 and white=1, not 255. This makes my code transfer easily to high-bit-depth devices like my DATAPixx. But I was having a problem that MakeTexture would only work with the 0-255 colour range. Argh what a pain! But thankfully googling revealed that this was my mistake.
“— In the Psychtoolbox forum, “IanA” wrote:
Answering my own question, I realised after I posted it that MakeTexture
requires a bitdepth passed to it explicitly, and then the normalised range works
as expected. I had expected the default (0) to follow the window itself, but
reading the documentation properly 😉 I see the default is 8-bit.
So I just needed to do

tex = Screen('MakeTexture', window, imageMatrix, [], [], 1);

instead of

tex = Screen('MakeTexture', window, imageMatrix);

and it all worked. Love Psychtoolbox!
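Put together, the whole thing looks roughly like this (a minimal sketch, not my actual experiment code — the screen choice and the luminance-ramp image are just for illustration):

```matlab
% Sketch: normalized 0-1 colour range with a floating-point texture.
PsychImaging('PrepareConfiguration');
PsychImaging('AddTask', 'General', 'NormalizedHighresColorRange');
win = PsychImaging('OpenWindow', max(Screen('Screens')), 0.5);  % mid-grey = 0.5

img = repmat(linspace(0, 1, 256), 256, 1);  % luminance ramp, values in 0-1

% floatprecision = 1 (the fifth optional argument) keeps the 0-1 values
% intact; without it MakeTexture assumes 8-bit, 0-255 input.
tex = Screen('MakeTexture', win, img, [], [], 1);

Screen('DrawTexture', win, tex);
Screen('Flip', win);
KbStrokeWait;
sca;
```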
Last week’s debugging finally came to fruition on Friday, with Jenny and me knocking our brains together for the past three days and coming up with the solution to the problem. After altering the viewing frustum using some mathematics, and then changing the code to load identity matrices and render the texture, we got a perfectly lined-up, textured, solid cube in OpenGL, which mapped perfectly onto the points already worked out using vector mathematics and trigonometry. So the next step (which is going to be the majority of this week’s work) is to implement it in a Matlab program so that the experiment can be run and recorded. Some images of the cube lined up against a Perspex proof are shown here. We put in the coordinates of one of the faces at a funny angle of rotation relative to both the Perspex and the table, calculated the coordinates in world space (in cm), and put them into the code. OpenGL recreates the same points perfectly on the screen for that viewpoint. A laser was used to make sure the origin was lined up correctly on the screen (you can see the red laser crosshair on the chinrest in one of the photos), and a laser distance measurer was used to judge the distance from the screen.
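For the curious, the frustum set-up boils down to something like this (a sketch with made-up screen dimensions and eye position, not our actual code; it assumes Psychtoolbox’s MOGL OpenGL wrappers):

```matlab
% Sketch: off-axis viewing frustum so that OpenGL world coordinates (in cm,
% origin at the screen centre) land exactly on the physical screen.
global GL;
InitializeMatlabOpenGL;                      % set up MOGL before opening the window
win = PsychImaging('OpenWindow', max(Screen('Screens')), 0);

screenW = 52; screenH = 29;                  % display size in cm (hypothetical)
eye  = [0 0 60];                             % eye position in cm, screen-centred coords
near = 1; far = 200;                         % clip planes in cm

Screen('BeginOpenGL', win);
glMatrixMode(GL.PROJECTION);
glLoadIdentity;
% Project the physical screen rectangle onto the near clip plane:
glFrustum((-screenW/2 - eye(1)) * near/eye(3), ( screenW/2 - eye(1)) * near/eye(3), ...
          (-screenH/2 - eye(2)) * near/eye(3), ( screenH/2 - eye(2)) * near/eye(3), ...
          near, far);
glMatrixMode(GL.MODELVIEW);
glLoadIdentity;
glTranslatef(-eye(1), -eye(2), -eye(3));     % shift the world so the eye sits at the origin
% ...draw the textured cube here, vertices in cm in screen-centred coordinates...
Screen('EndOpenGL', win);
```

With the eye position measured (the laser distance measurer gives eye(3)), any point specified in physical cm renders at the matching physical location on the screen.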
This week I am also submitting an abstract to NEPG, the North East Postgraduate Conference, a coming together of many northern universities (from Edinburgh to Leeds) to present data and ideas. I hope to be given a talk slot, or a poster place. Either way it’s an exciting prospect to present up here, so my peers can see what it is that I am doing. Regardless of acceptance or not, I look forward to seeing other work and attending talks.
To finish off the week, I am giving a talk in Leicester on Friday titled ‘An illusion using 3D technology’, which will be my first time presenting my data to anybody. While I am not in any way nervous yet, I imagine I will be come 9:15, when I open the talks of the day as the first speaker. Wish me luck!