COLUMBUS, Ohio – A novel technology aims to redefine the virtual reality experience by expanding it to incorporate a new sensory connection: taste.
The interface, dubbed 'e-Taste', uses a combination of sensors and wireless chemical dispensers to facilitate the remote perception of taste, what scientists call gustation. The sensors are attuned to recognize molecules such as glucose and glutamate, chemicals that represent the five basic tastes of sweet, sour, salty, bitter and umami. Once that chemical data is captured as an electrical signal, it is wirelessly passed to a remote device for replication.
Field testing done by researchers at The Ohio State University confirmed the device's ability to digitally simulate a range of taste intensities, while still offering variety and safety for the user.
"The chemical dimension in the current VR and AR realm is relatively underrepresented, especially when we talk about olfaction and gustation," said Jinghua Li , co-author of the study and an assistant professor of materials science and engineering at Ohio State . "It's a gap that needs to be filled and we've developed that with this next-generation system."
The system, whose development was inspired by Li's previous biosensor work, uses an actuator with two parts: an interface to the mouth and a small electromagnetic pump. The pump connects to a liquid channel of chemicals that vibrates when an electric charge passes through it, pushing the solution through a special gel layer into the subject's mouth.
Depending on how long the solution interacts with this gel layer, the intensity of any given taste can easily be adjusted, said Li.
"Based on the digital instruction, you can also choose to release one or several different tastes simultaneously so that they can form different sensations," she said.
The study was published today in the journal Science Advances.
Taste is a subjective sense that can change from one moment to the next. Yet this complex perception is the product of two of the body's chemical sensing systems working in tandem to ensure that what you eat is safe and nutritious: gustation and olfaction (the sense of smell).
"Taste and smell are greatly related to human emotion and memory," said Li. "So our sensor has to learn to capture, control and store all that information."
Although it is difficult to replicate similar taste sensations across a majority of people, researchers found in human trials that participants could distinguish between different sour intensities in the liquids generated by the system with about 70% accuracy.
Further tests assessing e-Taste's ability to immerse players in a virtual food experience also analyzed its long-range capabilities, showing that remote tasting could be initiated in Ohio from as far away as California. In another experiment, subjects tried to identify which of five food options they perceived: lemonade, cake, fried egg, fish soup or coffee.
While these results open up opportunities to pioneer new VR experiences, the team's findings are especially significant because they could give scientists a more intimate understanding of how the brain processes sensory signals from the mouth, said Li.
Plans to enhance the technology center on further miniaturizing the system and improving its compatibility with the different chemical compounds in food that produce taste sensations. Beyond helping to build a better and more dynamic gaming experience, the study notes that the work could be useful in promoting accessibility and inclusivity in virtual spaces for individuals with disabilities, such as those with traumatic brain injuries or Long Covid, which brought gustatory loss to mainstream attention.
"This will help people connect in virtual spaces in never-before-seen ways," said Li. "This concept is here and it is a good first step to becoming a small part of the metaverse."
Other Ohio State co-authors include Shulin Chen, Yizhen Jia, Tzu-Li Liu, Qi Wang, Prasad Nithianandam and Chunyu Yang. Additional co-authors are Bowen Duan and Zhaoqian Xie from Dalian University of Technology, Xiao Xiao and Changsheng Wu from the National University of Singapore, and Xi Tian from Tsinghua University.
This work was supported by the National Science Foundation, the National Institute of Biomedical Imaging and Bioengineering, the Chronic Brain Injury Pilot Award Program at Ohio State, the Center for Emergent Materials, the Center for Exploration of Novel Complex Materials, the Institute for Materials Research, the National Natural Science Foundation of China and the Dalian Outstanding Young Talents in Science and Technology.
#