Researchers from King's College London have developed new eye tracking technology which enables young children to engage in an immersive Virtual Reality (VR) experience while undergoing Magnetic Resonance Imaging (MRI).
The new eye tracking technology enables gaze-based human-computer interaction to operate immediately and robustly, without any explicit set-up tasks. Integrated as the control interface of the MR-compatible VR system, it allows young children to engage instantly in an immersive and interactive VR experience during MRI scans.
The novel gaze-controlled VR system has been successfully tested with 23 children aged 2 to 13, allowing them to play games and watch films during MRI using only their eyes.
In human-computer interaction (HCI), gaze estimation is the process of determining where a person is looking. Most gaze-based HCI systems require calibration or a complex setup at the start of each session, which prevents instant control. This limitation hampers usability, making immediate interaction difficult, especially for young children, who benefit most from technology that is quick and easy to use.
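To illustrate why an explicit calibration step gets in the way, the sketch below shows the kind of conventional session-start calibration that the new technology avoids: raw eye measurements are mapped to screen coordinates by fitting a simple polynomial to a set of known on-screen targets that the user must fixate before interaction can begin. This is a generic, hypothetical Python example, not code from the study; the function names and the choice of a second-order polynomial are assumptions made only for illustration.

    # Illustrative sketch (not from the study): a typical explicit calibration
    # step of the kind adaptive, set-up-free approaches aim to remove. Raw eye
    # features (e.g. pupil-centre coordinates) are mapped to screen coordinates
    # by fitting a simple polynomial to known calibration targets.
    import numpy as np

    def fit_calibration(eye_features, screen_points):
        """Fit a 2nd-order polynomial mapping from eye features to screen coords.

        eye_features: (N, 2) array of raw eye measurements, one per target.
        screen_points: (N, 2) array of the known target positions on screen.
        Requires at least 6 calibration targets for the 6 polynomial terms.
        """
        x, y = eye_features[:, 0], eye_features[:, 1]
        # Design matrix of polynomial terms: 1, x, y, xy, x^2, y^2
        A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])
        coeffs, *_ = np.linalg.lstsq(A, screen_points, rcond=None)
        return coeffs

    def estimate_gaze(coeffs, eye_feature):
        """Map a single raw eye measurement to an estimated screen position."""
        x, y = eye_feature
        terms = np.array([1.0, x, y, x * y, x**2, y**2])
        return terms @ coeffs

The practical cost is the part before any interaction: the user must sit still and fixate each target in turn, which is exactly the step that is hard to ask of a young child at the start of an MRI scan.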
A research team from the School of Biomedical Engineering & Imaging Sciences has developed an innovative eye tracking technology that allows for instant user control. This technology forms the core interaction interface of their groundbreaking MR-compatible VR system, designed to immerse children in an interactive virtual world during MRI scans.
MRI scans are noisy and stressful, often causing discomfort and movement, especially in children, which can lead to scan failure. Instant interaction is crucial because it engages children with the VR quickly, reducing anxiety and minimising movement.
To make the gaze-controlled VR experience engaging, game and video content has been developed, allowing easy customisation to each child's preferences. Interaction is simple: the child holds their gaze on items on the screen, triggering actions like playing a game, watching a video, or interacting with their favourite cartoon character. Maintaining a continuous sense of control is crucial for immersion in the VR system. Therefore, the innovative eye tracking system updates itself based on user interaction. The more the child interacts, the more accurate the gaze estimation becomes.
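A rough sketch of how such interaction-driven refinement could work is given below. It is an illustrative assumption, not the authors' implementation: gaze dwelling on an item triggers it, and because the item's position on screen is known, each confirmed selection can be reused as an implicit calibration point that nudges a running correction applied to later gaze estimates. All names, the dwell time, and the learning rate are hypothetical.

    # Illustrative sketch only (not the authors' implementation): dwell-based
    # gaze selection in which each confirmed selection is reused as an implicit
    # calibration point to refine a running offset correction.
    import time

    DWELL_SECONDS = 1.0   # hypothetical dwell time needed to trigger an item
    LEARN_RATE = 0.2      # hypothetical update rate for the offset correction

    class AdaptiveGazeSelector:
        def __init__(self, items):
            # items: {name: (centre_x, centre_y, radius)} of selectable targets
            self.items = items
            self.offset = (0.0, 0.0)   # running correction added to raw gaze
            self.dwell_start = None
            self.dwell_item = None

        def corrected(self, gaze):
            return (gaze[0] + self.offset[0], gaze[1] + self.offset[1])

        def update(self, raw_gaze, now=None):
            """Feed one gaze sample; return an item name if a selection fires."""
            now = time.monotonic() if now is None else now
            gx, gy = self.corrected(raw_gaze)
            hit = None
            for name, (cx, cy, r) in self.items.items():
                if (gx - cx) ** 2 + (gy - cy) ** 2 <= r ** 2:
                    hit = name
                    break
            if hit != self.dwell_item:
                # Gaze moved to a different item (or away): restart the dwell timer.
                self.dwell_item, self.dwell_start = hit, now
                return None
            if hit is not None and now - self.dwell_start >= DWELL_SECONDS:
                # Selection confirmed: treat the item centre as the true gaze
                # position and nudge the offset correction towards it.
                cx, cy, _ = self.items[hit]
                self.offset = (
                    self.offset[0] + LEARN_RATE * (cx - gx),
                    self.offset[1] + LEARN_RATE * (cy - gy),
                )
                self.dwell_item, self.dwell_start = None, None
                return hit
            return None

In a scheme like this no explicit calibration screen is ever shown: the estimate starts coarse and tightens as the child plays, which matches the idea that the more the child interacts, the more accurate the gaze estimation becomes.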
The VR experience also aims to reduce a child's tendency to move their head during the scan as much as possible, but not all head movement can be eliminated. To address this, the team also used the DISORDER method, previously developed at King's College London for baby MRI scans, which retrospectively corrects motion in the acquired images.
Combining these two innovations enables a new system that can acquire high-quality brain MR images from awake young children.
Our new technology shows promise to solve virtually all of the limitations of existing systems. Our approach opens new possibilities for awake MR studies in young children for both clinical and research purposes, potentially reducing the need for non-trivial interventions like anesthesia and enabling a new generation of MR based studies of awake brain processing in this formative period of life.
Dr Kun Qian, Post-Doctoral Researcher in the Centre for the Developing Brain, School of Biomedical Engineering & Imaging Sciences and study lead author
Today it is normal to be using phones, tablets and computers for entertainment and many other important tasks. Everyone expects these devices to respond immediately and be intuitive to use, so forcing users to endure calibration processes and delays in starting is ever more challenging for them. The instant gaze technology we have created makes gaze-based user interfaces feel completely comfortable and natural to use. Feedback has been incredibly positive, both from children and adults, suggesting that this technology could make a real difference to MR examinations, with widespread benefits.
Professor Jo Hajnal, Professor of Imaging Science in the Centre for the Developing Brain, School of Biomedical Engineering & Imaging Sciences and programme leader for this study
Read the full study: Instant interaction driven adaptive gaze control interface.