This post is about one of the main Virtual Reality projects I’m involved in.
The Virtual Supermarket is a VR environment designed specifically for conducting research, developed by Atoms2Bits B.V. I’ve been involved since June 2015 - initially as a freelancer for hardware development, and later as a partner. [update: I’ve written more about the hardware development in this post]
Its unique value is that we have opened up the possibility of eye tracking in Virtual Reality. This means we can track what people look at:
In this post I will briefly explain what the Virtual Supermarket is, how it works, and how I got involved in this project.
Eye tracking is best explained using an example:
What you see here is the video recorded from inside the Virtual Reality headset. A software algorithm determines the position of the pupil based on the video (the red oval and dot).
Because this is built into a Virtual Reality headset, we know exactly what is presented on the screen inside the headset. Combined with the position of the pupil, this lets us determine what catches the viewer’s attention on the screen.
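In essence this is a two-step pipeline: find the pupil in the eye-camera image, then map that position to screen coordinates using a per-participant calibration. The post doesn’t describe the actual algorithm, so here is only a minimal sketch of the general idea; the dark-pupil thresholding, the affine calibration, and all names are my own simplifications, not the real implementation:

```python
import numpy as np

def estimate_pupil_center(eye_frame, dark_fraction=0.01):
    """Estimate the pupil center in a grayscale eye-camera frame.

    Simplified "dark pupil" approach: under infrared illumination the
    pupil is the darkest region of the image, so we threshold at the
    darkest `dark_fraction` of pixels and return the centroid of that
    region as (x, y) in pixel coordinates.
    """
    threshold = np.quantile(eye_frame, dark_fraction)
    ys, xs = np.nonzero(eye_frame <= threshold)
    return xs.mean(), ys.mean()

def pupil_to_screen(pupil_xy, calibration):
    """Map a pupil position to screen coordinates with an affine fit.

    `calibration` is a 2x3 matrix that would be obtained by having the
    participant fixate on known screen points before the session.
    """
    x, y = pupil_xy
    return calibration @ np.array([x, y, 1.0])
```

A real tracker would be far more robust (ellipse fitting, glint removal, blink handling), but the structure — per-frame pupil localization followed by a calibrated gaze mapping — is the same.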
To learn more about the hardware behind this, please check out my work on Eye Tracking hardware for VR.
This innovative solution has been developed in collaboration with Georgia Tech (USA), Eindhoven University (NL), Radboud University (NL) and NHTV University of Applied Science (NL). UMC Utrecht (Medical University) is one of the early adopters to explore the possibilities.
Eye tracking in VR provides new opportunities for research. In a research lab, where you put 100+ participants through the same test, it is very important to keep the environment as controlled as possible. In other words: you want to give every participant the same experience. Virtual environments are very consistent in this respect, and since they are computer models, they are also well suited to collecting data for research purposes.
Since eye tracking is not readily available for Virtual Reality at this time, we had to develop it ourselves. This is where I was initially brought into the project. Together with Joris Helming I’ve developed several methods to enable eye tracking for the Oculus Rift DK2, Oculus Rift CV1, and HTC Vive. We have created a separate label for this: Follow The Bits.
Especially when working with medical universities, and therefore with medical data, we are very careful with participant data. By design, we do not store any personal data or video footage, either on-premises or in the cloud. The only way to identify a participant is by combining our data with the dataset kept by the customer.
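Conceptually, this means our side only ever sees an opaque session identifier, while the table that links that identifier to a real person stays with the customer. A minimal sketch of that split (the names and structure here are illustrative, not our actual implementation):

```python
import secrets

def new_session_id() -> str:
    """Opaque, random session identifier; encodes nothing about the person."""
    return secrets.token_urlsafe(12)

# Our side stores only the opaque ID alongside the recorded gaze data:
session = {"session_id": new_session_id(), "gaze_samples": []}

# The customer keeps the linking table, entirely outside our systems:
customer_link_table = {session["session_id"]: "entry in the customer's study roster"}
```

Without access to the customer’s linking table, the data we hold cannot be traced back to an individual.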
- Strategic development
- Product Management (note: I don’t write any code for this project)
- Custom hardware development: eye tracking is not available by default, so we developed it ourselves under a separate label: Follow The Bits.
- Image credits: header image and screenshots by yours truly, product photography by Joris Helming fotografie