Advances in virtual reality (VR) technology leverage the brain’s visual and auditory systems and brain-body interactions to immerse users in virtual environments. However, there is significant untapped potential for transformative cross-fertilization at the nexus of VR experiences, neurotechnologies and frontier brain research: neuroscience can be deployed to drive greater VR immersion, while VR can be used in research to further our understanding of the brain and mind, and in the clinic to improve health outcomes.
On May 22-23, 2019, CIFAR convened a roundtable for Fellows in the Azrieli Program in Brain, Mind & Consciousness along with global leaders in the VR industry to explore how to drive forward the development of more immersive designed realities as well as a richer understanding of the human brain. Key insights and next steps that emerged from the discussions include: the importance of non-visual (e.g., auditory, haptic and social) information for immersive VR experiences; the opportunity, and need, created by VR to better understand how humans distinguish between real and simulated experiences; and the effects of prolonged VR use on health and child development. Participants also raised important ethical issues related to privacy and data ownership, and explored how academia and industry could enhance collaboration through improved data sharing. This roundtable built upon the conversations started in CIFAR’s panel on “The Future of VR: Neuroscience and Biosensor Driven Development” at the 2018 Game Developers Conference, and will pave the way for further opportunities for exchange and collaboration between researchers, content creators and technology developers.
Impacted Stakeholders
- Researchers in cognitive and behavioural neuroscience, ethics and law, and mental health
- Rehabilitation and mental health practitioners interested in the use of VR for therapies
- Developers of neuroimaging/-monitoring tools and of virtual/augmented/mixed reality technologies
- Designers, engineers, content creators and legal counsels in VR gaming and film industries
Key Insights
- VR makes it easier for researchers to study “real-life” scenarios by creating a more immersive and naturalistic experience than traditional lab techniques, while still allowing researchers to maintain a degree of control. However, because consequences in VR simulations are not the same as in real life, researchers need to be careful in how they interpret results from such experiments.
- One model of perception suggests that our brain is a “prediction machine” with “top-down” information processing, using past experiences to make inferences about the signals it receives. Thus, given the appropriate sensory inputs, humans can be tricked into perceiving a simulation as real. Improving neuroscientific understanding in this area can inform VR development, e.g., by identifying the most essential parameters for creating an immersive VR experience or realistic animated characters that do not fall into the “uncanny valley”. At the same time, by creating simulations that allow researchers to manipulate sensory inputs from the environment or the consequences of actions, VR tools could facilitate the study of how humans perceive reality.
- Visual and auditory information helps drive emotional engagement within a VR experience. However, our perceived reality can be enhanced by layering additional feedback channels such as haptics (the sensation of physical touch) and even passive inputs such as heart rate, eye movement and breath. Using multisensory integration to create VR experiences that are more immersive and inclusive (e.g., for users with disabilities) is an area of active investigation; a minimal sketch of reliability-weighted cue combination, one standard model of such integration, follows this list.
- While VR is typically focused on the individual experience, humans are intrinsically social – experiencing a phenomenon with other people and moving in synchrony lead to increased trust and cooperation. This synchrony in body movement as well as brainwaves can be shaped by the beat and groove of music. Thus, understanding non-verbal auditory and motor aspects of social interactions can be instructive for creating better VR experiences and for exploring how VR can integrate a shared or multi-user element.
- VR has already seen significant use in the clinic, as a tool for cognitive assessment, for therapy and rehabilitation across a variety of conditions (including PTSD and stroke), and for training physicians and surgeons (e.g., preoperative rehearsal or practising forced-error scenarios). However, there are ongoing questions about whether VR-based tools outperform existing clinical practice.
- VR developers are exploring how to make control and input as intuitive as possible for users, whether with handheld controllers or newer possibilities such as eye tracking. At the same time, advances in tools for mind-machine interfaces (such as electroencephalography and electromyography), together with the growth of artificial intelligence and machine learning (which help “decode” the captured neurological data), are opening new avenues for “mindreading”, or control by thought. Challenges remain for such “neurocontrol” methods to achieve sufficient resolution (capturing signal from specific brain regions or motor units) and discrimination (e.g., between conscious or intentional actions and subconscious or unintentional ones); a toy decoding pipeline is sketched after this list.
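As a concrete illustration of the multisensory integration mentioned in the third insight above, the following is a minimal sketch (not drawn from the roundtable) of reliability-weighted cue combination, a standard maximum-likelihood model in which estimates from two senses are averaged with weights inversely proportional to their noise. The function name and numbers are purely illustrative.

```python
# Illustrative sketch: reliability-weighted (maximum-likelihood) cue combination.
# Two noisy estimates of the same quantity (e.g., an object's position as seen and as felt)
# are averaged with weights inversely proportional to their variances.

def integrate_cues(visual_estimate, visual_var, haptic_estimate, haptic_var):
    """Combine two noisy sensory estimates of the same quantity."""
    w_visual = (1 / visual_var) / (1 / visual_var + 1 / haptic_var)
    w_haptic = 1 - w_visual
    combined = w_visual * visual_estimate + w_haptic * haptic_estimate
    combined_var = 1 / (1 / visual_var + 1 / haptic_var)  # lower than either cue alone
    return combined, combined_var

# Example: a reliable visual cue (variance 1.0) dominates a noisier haptic cue (variance 4.0).
print(integrate_cues(visual_estimate=10.0, visual_var=1.0,
                     haptic_estimate=14.0, haptic_var=4.0))
# -> (10.8, 0.8)
```

One design implication of this model is that degrading a dominant cue (e.g., dimming or blurring visuals) shifts perceptual weight onto haptic or auditory channels, which is one reason layered, consistent feedback can deepen immersion.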
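The “neurocontrol” pipeline mentioned in the last insight can be pictured as three stages: signal capture, feature extraction and decoding. Below is a hypothetical, minimal sketch using EEG band-power features and an off-the-shelf classifier; the channel count, frequency bands, sampling rate and command labels are assumptions for illustration, and random numbers stand in for real recordings.

```python
# Hypothetical neurocontrol sketch: EEG epochs -> band-power features -> command classifier.
import numpy as np
from scipy.signal import welch
from sklearn.linear_model import LogisticRegression

def bandpower_features(epochs, fs=256, bands=((8, 12), (13, 30))):
    """epochs: (n_trials, n_channels, n_samples). Returns mean power per channel per band."""
    freqs, psd = welch(epochs, fs=fs, axis=-1)   # power spectral density per trial and channel
    feats = []
    for lo, hi in bands:                         # e.g., alpha (8-12 Hz) and beta (13-30 Hz)
        mask = (freqs >= lo) & (freqs <= hi)
        feats.append(psd[..., mask].mean(axis=-1))
    return np.concatenate(feats, axis=-1)        # shape: (n_trials, n_channels * n_bands)

# Toy data standing in for recorded EEG: 100 trials, 8 channels, 2 s at 256 Hz,
# each labelled with an intended command (0 = "select", 1 = "move").
rng = np.random.default_rng(0)
X_raw = rng.standard_normal((100, 8, 512))
y = rng.integers(0, 2, size=100)

clf = LogisticRegression(max_iter=1000).fit(bandpower_features(X_raw), y)
predicted_command = clf.predict(bandpower_features(X_raw[:1]))
```

The resolution and discrimination challenges noted above show up directly in such a pipeline: with few channels and coarse band-power features, the decoder cannot localize activity to specific brain regions or motor units, and it cannot distinguish intentional commands from incidental activity without additional signals or training data.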
Priorities and Next Steps
- Better bridging the VR and research communities can stimulate the development of synergistic initiatives. Doing so would involve increasing VR developers’ and content creators’ knowledge of relevant neuro- and cognitive science concepts such as perception, metacognition and agency. It would also require building a greater understanding among researchers about what is feasible from a VR technological perspective.
- Virtual, augmented and mixed reality devices are measurement tools that generate large amounts of behavioural and neurological data, both in test modes by industry and through consumer use. To advance science, industry and academia should work together to identify ways to improve researchers’ access to this data while safeguarding user privacy and research ethics. As one option, VR companies could explore adding parameters to create academic vs. consumer versions of their games. At the same time, creators should evaluate their development needs and whether they could leverage relevant expertise within academia.
- There needs to be more research into how VR use may affect brain development in childhood, both to understand how its use in educational programs could be beneficial and how excessive use could be harmful, particularly while a child is developing their sense of reality. More broadly, it is as yet unclear how the design of VR experiences could shape users’ social behaviour.
- Better tools for reading and capturing thoughts, which open the door to more intuitive and less effortful control by users, also raise questions about privacy and data ownership that will need to be tackled by interdisciplinary dialogue among neuroscientists, technology developers, ethicists, social scientists, legal scholars and more. It might be instructive to look at how other fields, such as artificial intelligence or genomics, are tackling the ethical implications arising from their research and applications (e.g., through the creation of stakeholder/user communities controlling their own data).
Roundtable Participants
- Kent Bye, Voices of VR Podcast
- Craig Chapman, University of Alberta / CIFAR
- Ryan Chapman, Motive.io
- Brea Chouinard, University of Alberta
- Chad Dezern, Insomniac Games
- Noah Falstein, MindMaze / The Inspiracy
- Walter Greenleaf, Stanford University / MindMaze
- Peter King, Samsung
- Sid Kouider, École normale supérieure / CNRS / CIFAR
- Nick LaMartina, Electronic Arts
- David Menon, University of Cambridge / CIFAR
- Holly Nguyen, Facebook
- Aniruddh Patel, Tufts University / CIFAR
- Brian Pene, Autodesk
- Yelena Rachitsky, Facebook / Oculus
- Brian Schwab, Magic Leap
- Anil Seth, University of Sussex / CIFAR
- Omer Shapira, NVIDIA
- Laurel Trainor, McMaster University / CIFAR
- Timoni West, Unity
- Dan Wetmore, CTRL-Labs
- Joel Zylberberg, York University / CIFAR
Further Reading
2018 Game Developers Conference panel – The Future of VR: Neuroscience and Biosensor Driven Development (event brief)
Opening the (Virtual) Doors of Perception (research brief)
Is today’s artificial intelligence actually conscious? Not just yet (research brief)
Woman who is blind can see emotions (news article on recent research from the lab of Melvyn Goodale, co-director of the Azrieli Program in Brain, Mind & Consciousness)
For more information, contact Amy Cook, Senior Director, Knowledge Mobilization.