
Neurophilosophical and Ethical Aspects of Virtual Reality Therapy in Neurology and Psychiatry

Published online by Cambridge University Press:  10 September 2018


Abstract:

Highly immersive virtual reality (VR) systems have been introduced into the consumer market in recent years. The improved technological capabilities of these systems, as well as their combination with biometric sensors—for example, electroencephalography (EEG) in a closed-loop hybrid VR-EEG system—open up a range of new potential medical applications. This article first provides an overview of the past and current clinical applications of VR systems in neurology and psychiatry and introduces core concepts in neurophilosophy and VR research (such as agency, trust, presence, and others). Then, important adverse effects of highly immersive VR simulations and the ethical implications of standalone and hybrid VR systems for therapy in neurology and psychiatry are highlighted. These new forms of VR-based therapy may strengthen patients in exercising their autonomy. At the same time, however, these emerging systems present ethical challenges, for example in terms of moral and legal accountability in interactions involving “intelligent” hybrid VR systems. A user-centered approach that is informed by the target patients’ needs and capabilities could help to build beneficial systems for VR therapy.

Type: Articles

Copyright © Cambridge University Press 2018

Introduction

In recent years, several high-performance virtual reality (VR) systems have been introduced to the consumer market. These systems are characterized—in comparison with the previous generation of VR technology—by high-resolution and seamless rendering, minimal lag, and other technical advances that generally increase the immersiveness of the user experience, as well as accessible programming interfaces that allow for the creation of customized VR environments. In addition to their obvious commercial appeal for the entertainment market, these systems could also enable a variety of clinical applications. Hybrid systems that combine advanced VR with auxiliary technology such as high-density motion capture and the sensing of an individual’s (neuro)physiological data, for example, would create new opportunities for treating psychiatric and neurological disorders. Today, VR systems are already used, still mostly in clinical research, for treating neurological and psychiatric symptoms and disorders (see Table 1 for an overview).

Table 1. Summary of the Current Use of VR Systems in Clinical Research and Therapy in Neurology and Psychiatry

* Where applicable, and given the breadth of the literature, (recent) reviews and controlled trials are preferentially listed here as a guide to further exploration of the literature.

MCI, mild cognitive impairment; PTSD, posttraumatic stress disorder.

To date, however, very little systematic discussion of the neurophilosophical and ethical challenges from the clinical use of these new VR systems is available. Here, I discuss some of these aspects centered on the vulnerabilities and needs of patients as the intended end users in neurology and psychiatry. Furthermore, I also look into the role of therapists and other medical professionals employing such systems in the future. For the patients, I show how these systems may provide a safe environment for exploring therapeutic options in otherwise highly stressful (or stigmatized) contexts; for example, in exposure therapy in social anxiety disorder. Hybrid VR-based systems that combine VR with electroencephalography (EEG), for example, could also help paralyzed patients in exercising their autonomy by providing communication capabilities and a platform for social interactions.

On the other hand, the immersiveness of these new systems and possibilities for creating persuasive simulations of avatars or virtual body parts (even full body illusions) may produce adverse effects, for example a disturbed sense of agency, particularly in vulnerable individuals.

Furthermore, patients could also become dependent on the assistive or therapeutic effects of advanced VR technology that could negatively affect their autonomy. I will argue that the new VR systems open promising avenues for new therapeutic approaches in neurology and psychiatry, but that this development requires a careful and proactive ethical deliberation that integrates perspectives from end users, philosophers, and developers as well as medical professionals and clinical researchers.

Core Concepts at the Intersection of Philosophy, Cognitive Science, and Psychology

I will first give a short overview of core concepts at the intersection of philosophy, cognitive science, and psychology that may help readers to better understand the actual and potential adverse effects of highly immersive VR, and their ethical implications, from a subject-centered perspective.

The 4E Framework of cognition

Important concepts and theories that have emerged at the intersection of philosophy, psychology and cognitive science are embodiment,Footnote 46 enactivism,Footnote 47 the extended mind hypothesis,Footnote 48,Footnote 49 and embeddedness.Footnote 50 In an attempt at unification, these are often referred to as the 4E framework of cognition.Footnote 51 Historically, this development could be understood as a reaction to the rise of cognitivism as the main explanatory model for mental processes; for example (and most prominently in the work of Noam Chomsky), in linguistics.Footnote 52 Although the debate on the relative merits and demerits of both approaches has engulfed philosophy (particularly phenomenology), cognitive science and psychology for many years, we also find promising attempts to identify cross-fertilizing aspects, rather than further deepening and entrenching the scholarly divide.Footnote 53

In the realm of VR studies, however, it seems safe to say that concepts pertaining to the 4E framework for studying the effects of VR on user experience and behavior are far more widely used than cognitivist models. Many even consider highly immersive VR as a promising, if not ideal, testbed for studying phenomenological and action-related concepts such as the sense of agency, the active self, and others.

Body Ownership and Body Memory

An important aspect of embodiment theories is a person’s sense of (and ability to claim) body ownership; that is, the ability to generate a stable and clear representation of one’s body (and its boundaries) in space. The primordial nature and importance of this mechanism for maintaining a stable embodied self-representation over time seems quite evident. Phenomenologically inspired work on body representation under unusual circumstances (e.g., in experiments including the so-called “rubber hand illusion”Footnote 54 or the “body ownership illusion” in experiments with brain-computer interfaces [BCI]),Footnote 55 however, demonstrates the frailty of our body representation. Connected to the concept of body ownership is the idea of a body memory. Whereas body ownership refers to the stable mental representation of one’s body, body memory encompasses the level of implicitly formed and embodied body-related memories. These are often re-actualizations of past body-related experiences or body-related abilities in present activities. They may be complex action sequences and kinematic patterns, such as playing the piano or riding a bike, but they may also, for example, anticipate the sensations that occur when one’s body hits the water after jumping off the diving board at the pool. Like body ownership, body memory can also be subject to alterations; for example, in individuals with phantom pain after losing a limb.Footnote 56

The Self, Identity, and Authenticity

In philosophical and psychological scholarship on the self, we find many different (and sometimes incommensurable) theories and positions.Footnote 57 At one end of this spectrum are constructivist ideas that deny the ontological existence of a unified and holistic self in favor of constructed narratives about one’s self (the “narrative self”). At the other end, we encounter more cognitively and neurobiologically (particularly in motor cognition theories) grounded concepts that view the core feature of the self as the ability to perceive one’s self as an active agent of one’s own but not of others’ actions (the “active self”) and are thus related to the concept of agency (see subsequent discussion). Other theories emphasize the importance of interpersonal relationships (the “relational self”) and social embeddedness (the “social self”) in shaping our identity. As the related concepts (personal) identity and authenticity are also contentiously debated, we shall have to be content with using the concept of “self,” particularly the “active self” here. The model of an active self has proven much more fruitful in studying the sense of agency in cognitive science and psychology, particularly using VR.

Agency and Sense of Agency

A fundamental aspect of the active self is that we perceive ourselves, through our “sense of agency,” as the agents of our own actions but not of others’ actions. Important factors for a stable sense of agency are the feeling (and certainty) of voluntarily initiating, performing, and terminating our actions.Footnote 58

This fundamental sense of agency is also a constitutive element for building a self-consciousness, and it can become unstable under pathological conditions; for example, during psychosis or in some movement disorders, or through experimental manipulation.Footnote 58 This malleability of the sense of agency adds strong motivation to current translational research in clinical neuroscience, including the use of VR experiments, to understand the core mechanisms, both functionally and at the neurobiological level, of agency.

For systematic purposes, researchers have decomposed the sense of agency into different levels of processing. One popular model distinguishes a level of implicit processing (the “feeling of agency”), linked to basic sensorimotor and affective processes, from an explicit level (the “judgment of agency”), linked to higher-order cognitive processes such as intention formation and decisionmaking.Footnote 59,Footnote 60 The precise interplay (and hierarchy) of these processing levels, however, as well as the precise role of sensorimotor integration for the emergence and stable maintenance of a sense of agency, are not yet understood in depth.Footnote 61,Footnote 62 From the perspective of embodiment theory, “active agency” (related to the concept of the “active self”) is an important concept. In active agency, it is the acquisition of skilled action through procedural motor learning and the internal representation of action goals (in close interaction with the environment) that is crucial for developing a stable sense of agency.

Intriguingly, combining a highly immersive VR system with an EEG-based BCI for real-time closed-loop control of the simulations in VR (based on the online analysis of the brain signals), would provide an interesting testbed for investigating the role of sensorimotor functions for the sense of agency. With such a hybrid closed-loop VR-BCI system, researchers could, for example, modulate the degree to which the sensorimotor system is involved in the control of human avatars (or robotic simulations), or modulate the realism of human avatars (or the human-likeness of humanoid robots) to investigate agency (and sense of agency) under experimentally controlled conditions.
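To make the closed-loop idea more concrete, the following Python sketch simulates one such control cycle in miniature. It is purely illustrative: the feature extraction, the threshold “classifier,” and the avatar realism parameter are hypothetical stand-ins for a real-time EEG pipeline, not an actual BCI toolchain.

```python
# Hypothetical sketch of a closed-loop VR-BCI cycle: an EEG-derived
# feature is "classified" online, and the result drives a parameter of
# the VR simulation (here, avatar realism on a 0-1 scale).

def extract_feature(eeg_window):
    """Toy stand-in for online EEG analysis: mean signal power."""
    return sum(x * x for x in eeg_window) / len(eeg_window)

def classify_intent(power, threshold=1.0):
    """Toy motor-imagery detector: above-threshold power counts as 'move'."""
    return "move" if power > threshold else "rest"

def update_avatar_realism(realism, intent, step=0.1):
    """Adapt avatar realism (0 = cartoonish, 1 = photorealistic) per cycle."""
    if intent == "move":
        realism = min(1.0, realism + step)  # reward detected control
    return realism

def closed_loop(eeg_windows, realism=0.5):
    """Run the loop over a stream of EEG windows; return final realism."""
    for window in eeg_windows:
        intent = classify_intent(extract_feature(window))
        realism = update_avatar_realism(realism, intent)
    return realism

# Example: three windows, two of which exceed the power threshold.
windows = [[0.1, 0.2], [2.0, 1.5], [1.8, 1.9]]
print(round(closed_loop(windows), 2))  # realism rose from 0.5 to 0.7
```

The point of the sketch is the loop structure itself: experimental manipulations of agency would consist in changing `update_avatar_realism` (or the realism step) while holding the user’s control signal constant.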

Core Concepts in VR Research

Immersion

Much like being immersed in water during diving, the encompassing nature of a VR environment with seamless rendering, minimal lag, and convincing graphic simulations (that respect, for example, actual and intuitive physics) can lead to a state of sensory (and cognitive) immersion. Increasing the immersiveness of VR systems—for example, by expanding the field of view and improving graphics performance—is therefore one of the main priorities in VR development for entertainment purposes. As the immersiveness of any VR system primarily depends on such technical aspects, it can be assessed empirically and thus compared between different systems.

For systematic purposes, it should be noted that immersion consists of different features, each of which jointly (or, depending on the context, separately) contributes to the overall feeling of immersion when using VR. One important feature is sensory fidelity: visual or auditory perceptions in VR that closely match our perceptions in the real world. Whereas sensory fidelity seems to be a particularly important feature in “serious” applications of VR (such as medical or educational VR), being able to do “impossible” things, such as flying, is an important feature for entertainment purposes.Footnote 63

As greater immersion, however, increases the “costs” of computational processing—a problem limiting the miniaturization and portability of VR systems—it seems reasonable to ask what minimal amount of immersion is required for an optimal VR user experience in any given context, be it gaming or medical therapy. This question has not yet been answered.

Presence, Co-presence, Social Presence, and Telepresence

In short, presence in VR, as opposed to the perceptual aspect of VR experience modulated by immersion, usually refers to users’ feeling of “being there” in the virtual environment (rather than experiencing situatedness in the room in which they use VR).Footnote 64 It is influenced by users’ internal willingness (and propensity) to “dive” into the VR experience, by their attention, and, also, by the immersiveness of the VR simulation.Footnote 65

Co-presence refers to the experience of sharing the VR environment with other persons. Social presence, although related to co-presence, further includes the possibility to interact meaningfully with co-present users in VR. Although these are certainly highly subjective experiences, presence, co-presence, and social presence can be studied empirically with questionnaires and in-depth user interviews.Footnote 66

With the emergence of systems for virtual therapy, whether in psychotherapy, physical therapy, or other forms of virtual teletherapy, these different facets of presence will likely play an important role in patients’ comfort level and acceptance of these new approaches.

Virtual Body Ownership and Body Memory

Body ownership, as discussed, is the ability to build a stable representation of one’s body (and its physical boundaries) in space. In a virtual environment, the sense of these boundaries, based on sensory and proprioceptive input, may easily conflict with the content of the visual simulations. Developing highly realistic and even personalized avatars is, therefore, an important goal for VR design to minimize this potential mismatch and give users the experience of possessing (and controlling) a virtual body. In a medical context, for example in developing VR-based treatments for eating disorders, the ability to modulate an individual’s body image via changes in the avatar’s virtual body could be an important tool for developing new therapies. For body memory, the unique possibility of changing the user perspective in VR from an egocentric to an allocentric view, or of letting the user experience the virtual avatar body of another person (body “swapping” or “transfer”), also provides unique opportunities for research in psychology and the cognitive sciences as well as for therapies in neurology and psychiatry.Footnote 67,Footnote 68,Footnote 69,Footnote 70

Realistic Avatars and the “Uncanny Valley”

Advances in VR graphics performance, motion capture, and the rendering of an individual’s physiognomy (particularly facial features) allow VR developers to steadily increase the realism of VR avatars. However, research on human–machine interaction from robotics and experiences from animation filmmaking suggest the existence of an “uncanny valley” effect—a drop in users’ rating of the familiarity of a humanoid robot (or puppet) as its human likeness increases (see Figure 1). The drop in familiarity in this model seems to occur whenever humanoid robots, puppets, or graphical avatars are very human-like, yet not quite “there;” for example, in terms of realistic kinematics and the animation of facial expressions.

Figure 1. The putative “uncanny valley” effect in animated humanoid robots, puppets, and prostheses (and possibly also VR avatars).Footnote 71

Whether and to which degree this effect of uncanniness translates to computer-generated avatars in VR has not been studied comprehensively thus far. Moreover, some researchers are skeptical whether the uncanny valley effect actually exists and, if so, whether it significantly affects the VR users’ experiences and acceptance and enjoyment of the VR environment.Footnote 72,Footnote 73

Therefore, gathering and analyzing more data on users’ rating of human-like avatars (and robots in VR) with validated qualitative research instruments could help to clarify this matter. Considering the likely further development in computer graphics and avatar simulation, however, this open question could soon also turn out to be moot once we get truly seamlessly rendered, fluent, and highly personalized avatars in VR: a bridge over the uncanny valley.

Trust

For interactions in a highly immersive VR environment, the question of trust seems crucial to ensure a positive user experience and will, ultimately, be an important determinant of whether VR will be accepted by patients and adopted by the medical community. For building trust in any particular VR system, basic preconditions have to be fulfilled: the VR system has to (1) avoid harm to the user, (2) meet the general expectations of the user (although surprise can also be a welcome element, for example in gaming), (3) be intuitive to handle, and (4) have a high level of ability to achieve the user’s specific goals.Footnote 74 Furthermore, preliminary evidence from VR research suggests that a high level of presence increases a user’s trust in a VR system.Footnote 75 But trust is, of course, a two-way street: apart from the technological specifications, the user’s propensity to trust technology in general (and VR in particular) may also strongly influence whether the user develops trust toward a VR system.

For medical applications, for example VR-assisted rehabilitation or psychotherapy, the question of trust will extend beyond the question of technical reliability, and will include an interpersonal element; that is, human therapists. VR therapy would create new modes of patient–therapist interactions: for example, a human therapist may be present in the same room as the patient while also being co-present in the VR environment, but the therapist could also be telepresent in VR while actually being thousands of miles away from the patient.

How these different modes of interaction affect the amount (and strength) of trust that builds between patients and therapists under such substantially different circumstances would also be an important topic for future research on emerging VR therapy.

Adverse Effects of VR on Medical, Action-Related, and Phenomenological Aspects of User Experience

To better contextualize the emerging ethical challenges from VR therapy in neurology and psychiatry, I will first delineate some adverse effects that users of highly immersive VR systems can (and could) experience.

Reliability of Technical Performance: From a Glitch Into the Ditch?

From a purely technical perspective, it is important to recognize that the current commercially available VR systems have been developed primarily to provide entertainment to consumers. This means, however, that the demanding, sometimes exacting, technical standards usually required of an experimental apparatus in science (or of medical devices) are not necessarily met by the current generation of VR headsets. One issue that apparently limits the technical reliability of the current systems is random fluctuation in rendering latency.Footnote 76 To provide one example of how such a seemingly minute technical problem may amplify into an adverse effect, consider the following scenario: if such fluctuations lead to unexpected glitches in the graphics performance, this could create a mismatch between a user’s visual experience and bottom-up sensory (and proprioceptive) input, which, in turn, could make the user stumble or fall. Research on medical applications of VR systems should, therefore, also investigate whether the technical reliability of the currently used systems is sufficient to avert such adverse effects.
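To illustrate what monitoring for such latency fluctuations might look like in practice, here is a minimal Python sketch of a frame-time glitch detector. The 90 Hz target and the 2x-budget threshold are illustrative assumptions, not specifications of any actual headset:

```python
# Hypothetical frame-timing monitor: flag "glitches" (frames that blow
# the render budget) and quantify jitter in a stream of frame times.
# The 90 Hz target and 2x threshold are assumed for illustration only.

TARGET_FRAME_MS = 1000.0 / 90.0  # ~11.1 ms per-frame budget at 90 Hz

def find_glitches(frame_times_ms, factor=2.0):
    """Return indices of frames whose render time exceeds factor * budget."""
    budget = TARGET_FRAME_MS * factor
    return [i for i, t in enumerate(frame_times_ms) if t > budget]

def jitter_ms(frame_times_ms):
    """Simple jitter measure: standard deviation of frame times."""
    n = len(frame_times_ms)
    mean = sum(frame_times_ms) / n
    return (sum((t - mean) ** 2 for t in frame_times_ms) / n) ** 0.5

# Example: mostly on-budget frames with one long stall.
times = [11.0, 11.2, 10.9, 45.0, 11.1]
print(find_glitches(times))  # [3]
print(round(jitter_ms(times), 1))
```

In a clinical study, logging such statistics alongside adverse-event reports would allow researchers to test whether stalls of a given magnitude actually correlate with stumbles, falls, or discomfort.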

Cybersickness: An Important Challenge for VR Therapy

Some users of VR systems experience, to varying degrees, nausea, vertigo, fatigue, headache and other symptoms summarized under the term “cybersickness.”Footnote 77 Although stopping the use of the VR system will usually quickly resolve the symptoms, the individual propensity for developing cybersickness (and anxiety about recurring cybersicknessFootnote 78) may prevent the affected users from repeatedly using the system. Moreover, some evidence suggests that individuals with existing damage to the central nervous system might be more prone to developing cybersickness (for example in multiple sclerosis).Footnote 79 Finding technical solutions to minimize cybersickness will, therefore, be an important challenge to VR engineers and programmers to increase the applicability of VR, particularly in medicine. To this end, one important research focus could be the identification of (neuro)physiological parameters, such as heart rate or EEG data, that reliably indicate the presence (or, better yet, the impending onset) of cybersickness in VR users.Footnote 80 Here too, a closed-loop VR-EEG system could provide important insights into the feasibility of adaptively controlling the VR environment based on the real-time analysis of neurophysiological (and other biometric) data to reduce (or prevent) cybersickness in susceptible users.
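As a toy illustration of this adaptive idea, the following Python sketch maps a biometric proxy (heart rate above a personal baseline) to a graded mitigation, such as narrowing the field of view, a countermeasure some VR applications already use against motion sickness. All thresholds here are hypothetical:

```python
# Hypothetical closed-loop cybersickness mitigation: a biometric proxy
# (heart-rate elevation over a personal baseline) selects a graded
# countermeasure. Thresholds are illustrative, not clinically derived.

def mitigation_level(heart_rate, baseline, mild=10, severe=25):
    """Map heart-rate elevation over baseline to a mitigation action."""
    delta = heart_rate - baseline
    if delta >= severe:
        return "pause_simulation"
    if delta >= mild:
        return "narrow_fov"
    return "none"

def run_session(heart_rates, baseline=65):
    """Return the mitigation chosen for each sample in a session."""
    return [mitigation_level(hr, baseline) for hr in heart_rates]

print(run_session([66, 78, 95, 70]))
# ['none', 'narrow_fov', 'pause_simulation', 'none']
```

A real system would, of course, need individually calibrated baselines and validated physiological markers (possibly EEG-based), which is precisely the open research question named above.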

Immersion and Presence: Too Much Into It?

The improved immersion of modern VR technology, compared with older generations of VR systems, is certainly a key feature of its success. We do not yet know much, however, about the long-term effects of highly immersive simulations in VR on vulnerable individuals such as children, adolescents, or patients. Recently, an “internet gaming disorder” was introduced in the fifth edition of the Diagnostic and Statistical Manual of Mental Disorders (DSM-5).Footnote 81 To some degree at least, engaging and immersive computer games and “social” Internet services seem to tap regions of the “reward network” in the human brain that are similar to the circuits involved in substance abuse.Footnote 82 One ought to be critical of an unreflective medicalization of social phenomena, and the existing research on the subject is far from conclusive (and very uneven in quality). However, the evidence thus far provides reason to proceed with caution and to perform more research on the effects of highly immersive gaming and VR environments, and of persuasive (social) presence, on users’ mental health, particularly in vulnerable groups and specifically with respect to the addictiveness of the technology.

Virtual Body Ownership: Altered Bodies, Altered Minds?

As we have seen, (future) VR will presumably offer unprecedented opportunities to manipulate the shape, movement pattern, and other features of highly realistic and personalized avatars; for example, to assist in the treatment of anorexia nervosa and other eating disorders. As we have discussed briefly, illusions of body ownership can readily be evoked in healthy individuals as well as individuals with a disability by relatively low-tech setups, such as the rubber hand illusion.Footnote 83 With the advent of highly realistic avatar simulations in VR, researchers are now building whole VR embodiment laboratories as testbeds for illusions of (virtual) body ownership.Footnote 84 Such a setup could have many beneficial uses: performing full virtual body swaps in VR could help to reduce stigma and prejudice and enhance empathy; creating persuasive illusions of body ownership in immersive VR could help to alleviate pain in phantom limbs, or help in treating eating disorders, body dysmorphic disorders, and other psychiatric conditions that are difficult to treat in some cases with existing therapeutic regimens.Footnote 85

Here too, however, researchers should proceed cautiously when using such altered avatars for treatment purposes and should study in depth the immediate and long-term effects on body ownership, body memory, and body image.

Trust: A Multilayered Affair

For a VR system to become a viable tool for assisting therapists and physicians in medical treatment, trust will be an absolutely essential prerequisite. This trust, as I discussed, is multilayered: users have to trust in the VR system’s safety and reliability, they also need to trust the person recommending the use of VR therapy, and they need to develop trust in any therapist who is present in the room (while using the VR system) and/or is co- or telepresent in the VR environment.

Such a system of discernible, yet interconnected, levels of trust means that the failure of any one component—whether a technical malfunction or human error—may erode the user’s trust in the system as a whole.

I will take the example of a closed-loop hybrid VR-EEG system. In such a system, the VR simulation could be modulated adaptively by the subject’s brain activity in real time. This technological expansion of VR would create opportunities for promising new medical applications; for example, a VR-EEG system for adaptively controlling anxiety. At the same time, the inherent opacity of some advanced machine learning algorithms (in deep learning, for example)—the black-box problem—could make such a system less predictable for users and therefore less trustworthy.
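The transparency concern can be made concrete with a small contrast: a rule-based controller can return a human-readable rationale with every adjustment it makes, which a black-box model typically cannot. The anxiety index and the rules in this Python sketch are hypothetical:

```python
# Hypothetical, inspectable controller for VR exposure intensity:
# every adjustment is returned together with a human-readable rationale,
# in contrast to an opaque learned policy. The anxiety index (0-1) and
# the rule thresholds are illustrative assumptions.

def adjust_exposure(anxiety_index, current_intensity):
    """Rule-based step for VR exposure therapy.

    Returns (new_intensity, rationale) so the decision stays inspectable
    by patient and therapist alike.
    """
    if anxiety_index > 0.8:
        return max(0.0, current_intensity - 0.2), "anxiety high: easing exposure"
    if anxiety_index < 0.3:
        return min(1.0, current_intensity + 0.1), "anxiety low: increasing exposure"
    return current_intensity, "anxiety moderate: holding level"

intensity, rationale = adjust_exposure(0.9, 0.5)
print(intensity, "-", rationale)
```

Whether such transparent-by-design controllers can match the performance of opaque learned models is an empirical question, but the trade-off between adaptivity and predictability is exactly the one at stake here.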

Agency and Sense of Agency: New Disorders in VR?

As I have discussed, the combination of advanced graphics and avatar simulation with increasing immersiveness in modern VR systems, as well as the prospect of closed-loop interaction in hybrid VR-EEG/VR-BCI systems, provides exciting tools for performing innovative research at the intersection of philosophy, psychology, cognitive science, and medicine. The user’s agency and sense of agency (similar to the question of multilayered trust) will likely play a central role in such a research program and in VR therapy.

We should recognize that historically, from the perspective of neurophilosophy and practical ethics, there is not one unified concept of “agency,” but rather a variety of notions and definitions. From a deontological perspective, for example, a subject’s autonomy expresses itself in the ability and in the exercise (in “active agency”) of the subject’s intentional actions. Other concepts stress the importance of the subjects’ relationship to other persons in exercising their agency (“relational agency”) and in sharing parts of their agential capabilities in cases of relegating some aspects of the decisionmaking for and/or exercise of actions to another human or to an intelligent system (“shared agency”).Footnote 86 With respect to closed-loop systems (in this case systems for deep brain stimulation [DBS]), Sara Goering and colleagues have suggested that: “Acting autonomously may mean that you develop your motives through dialogue with others, that your motives are not entirely your own, and that even your actions may be shared in some sense.”Footnote 87

Now, in theory, emerging highly immersive VR systems offer intriguing possibilities for modulating a user’s sense of agency. Imagine, for example, the following scenarios:

  1) Researchers could design and modify highly realistic simulations of whole-body avatars or body parts in VR. Imagine a user with a missing limb who can control a virtual simulation of that limb, or a prosthetic limb, in VR; for example, with a closed-loop VR-EEG system. Modulating the realism of the simulated limb—for example, from cartoonish or blurred to a near-perfect rendering of the subject’s actual limb (based, for example, on old videos)—could conceivably alter the subject’s sense of agency and feeling of control (as well as virtual body ownership and other factors). Could such modification, or manipulation, of a subject’s sense of agency (particularly if performed against the will of the subject and/or with incomplete understanding by the subject) lead to hitherto unknown (or unrecognized) disorders of agency? One could imagine, for example, that some modifications of the sense of agency could lead to something like “agential uncertainty”: a feeling of loss of control and unease about one’s (sense of) agency.

  2) Similarly, the attempts to increase the immersiveness of VR by merging the VR headsets with other sensory input devices, such as haptic gloves, for example, could also provide opportunities to modify sensory bottom-up aspects of the user’s experience. Imagine, for example, a subject with chronic pain in the hand, caused by a “complex regional pain syndrome” (CRPS), a debilitating disorder of pain and sympathetic dystrophy that can develop after injuries or surgery. CRPS is notoriously difficult to treat; therefore, exploring the viability of alleviating a patient’s pain by modulating sensory processing pathways through combining haptic input devices with immersive VR would seem to be a reasonable endeavor. But what if the external manipulation of sensory input, in this case vibrotactile input by the haptic glove (but also by modifying other sensory input channels), leads to a feeling of uncertainty about the source and validity of subjects’ first-person sensory experience, “epistemic uncertainty” and/or “phenomenological unease?”

Although hypothetical at the moment, it might turn out that such emerging (hybrid) VR systems could alter core aspects of a user’s phenomenological experience and action-related processes, such as the senses of agency and selfhood, and others. Concomitant research into such possible adverse effects on the sense of agency and other action-related and phenomenological aspects of user experience, therefore, needs to become an important and integral part of further research on VR therapy.

Ethical Aspects of VR Therapy in Neurology and Psychiatry

Better technical abilities and the emergence of hybrid systems, such as closed-loop VR-EEG or VR systems with sensory input devices, could help in developing new therapies for neurological and psychiatric patients. At the same time, closer human–machine interaction and the increasing “intelligence” of adaptive systems (based on advanced machine learning and big data) may also create important ethical challenges for VR therapy. Although the focus of our analysis of the ethical implications here will be on medical VR systems, particularly for neurological and psychiatric disorders, similar developments for consumer-directed VR systems will also create important ethical, legal, and social challenges, which will be explored on another occasion.

Autonomy and Agency of Research Subjects and Patients in VR Therapy

Autonomy is a central principle of research ethics and medical ethics within the tradition of Western biomedical ethics.Footnote 88 In the context of the emerging VR systems discussed previously here, the following aspects with respect to autonomy of research subjects and patients seem particularly important in VR therapy in neurology and psychiatry.

  1) First, the technological prospects of the emerging systems, particularly hybrid and closed-loop systems, offer opportunities for individuals with a neurological or psychiatric disorder to better exercise their autonomy (and agency) in cases in which these capabilities are impaired, for example through paralysis or in debilitating anxiety. Imagine, for example, a closed-loop hybrid VR-BCI system with which a severely paralyzed patient—for example, in a locked-in state with otherwise minimal communication capabilities—would gain the capability to move around in a virtual environment and communicate with co-present avatars (operated by/representing, for example, their loved ones). This would certainly restore their previously impaired autonomy and agency and could significantly improve their quality of life and their capability for informed decisionmaking.

  2) On the other hand, the potential target patient groups in neurology and psychiatry (stroke, movement disorders, dementia, severe paralysis, debilitating social or other forms of anxiety, autism spectrum disorder, depression, and other conditions) include particularly vulnerable patients. Giving these vulnerable patients tools for restoring particular capabilities should therefore not diminish the responsibility of therapists (and other medical professionals) to assess the patient's mental capacity for decisionmaking and for giving informed consent. Here again, however, being able to communicate and/or socially interact through such a hybrid VR system at all would be an important prerequisite for the attending therapists to actually assess the patient's mental health and cognitive functioning.

  3) Highly immersive and interactive VR offers great opportunities for social interaction, particularly for individuals who might otherwise have difficulties building and maintaining social contacts, such as in autism spectrum disorder, severe social anxiety, or agoraphobia. Previously, I have briefly mentioned the notion of "relational agency," a concept in which agency extends beyond the individual subject to the level of intersubjective experience and relationships, enabling meaningful (inter)action with the environment and others. Similarly, the notion of "relational autonomy" conceives of the subject not as an isolated being who makes decisions and performs actions solely on the basis of intrinsic motives and intentions, but as a subject for whom interpersonal relations and shared experiences are an important part of the capacity to exercise autonomy.Footnote 89 In VR-assisted therapy, particularly (but not only) in individual and group psychotherapy, such a relational concept of autonomy could, to some degree, help to level the playing field between patient and therapist and lead to less hierarchical modes of therapy.

  4) The possibility of creating personal avatars in VR that need not resemble a subject's actual physical appearance could also help to reduce stigma and bias, particularly in remote interactions, for example with telepresent therapists in VR-assisted therapy. At the same time, these degrees of freedom in abstracting or personalizing VR avatars could be problematic in therapeutic settings in which an impaired or distorted body image is part of a patient's diagnosis, as might be the case in eating disorders or body dysmorphic disorder. Patients with such vulnerabilities should therefore be closely supported by therapists when personalizing avatars, to prevent adverse effects on their body image.

Accountability of “Intelligent” Closed-Loop VR Systems: Keeping Patients and Therapists in the Loop

VR systems for entertainment as well as for clinical applications should ideally be both intuitive and reliable to use and adaptive and interactive, to maximally engage the user. This poses a significant challenge to VR developers, which may explain the appeal of advanced machine learning algorithms, for example artificial neural networks for "deep learning," for building "intelligent" and adaptive VR systems.

However, as in many other contexts in which humans interact with "intelligent" systems, such as autonomous vehicles or BCIs, the opacity of the underlying algorithms may create gaps in moral and legal accountability. With respect to "intelligent" BCI systems, there is an ongoing debate on whether (and, if so, to what degree) keeping humans, whether the users themselves or others (for example, a therapist), "in the loop" may help to preserve accountability and strengthen the autonomy of the subject in such systems.Footnote 90

Using opaque, black-box algorithms for online data analysis in hybrid VR-EEG/VR-BCI systems could create similar problems of accountability that would be particularly precarious if such systems are developed (and marketed) for medical use. This again poses a challenge to regulatory bodies to closely scrutinize the transparency of such systems and develop rules and guidelines for the necessary levels of algorithmic transparency, perhaps adapted to the potential medical risks of any particular clinical application.

Building systems that are adaptive and may learn with the user, yet are governed by a transparent set of algorithms that ensure accountability in cases of unexpected system failure (with negative effects for users or third persons), could become a crucial model for developing “intelligent” and transparent medical devices, whether a BCI or VR-EEG system, in the future.Footnote 91
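One way such a "human in the loop" requirement could be operationalized is sketched below under purely illustrative assumptions: the adaptive system applies low-risk changes autonomously but must obtain an explicit therapist decision for consequential ones, and every decision is written to an audit trail so that accountability can be reconstructed after an unexpected failure. The class and function names are hypothetical, not drawn from any existing device.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class AuditLog:
    """Timestamped trail linking each system action to a responsible agent."""
    entries: list = field(default_factory=list)

    def record(self, action: str, approved_by: str) -> None:
        self.entries.append(
            (datetime.now(timezone.utc).isoformat(), action, approved_by))

def apply_adaptation(action: str, risk: str, therapist_approves, log: AuditLog) -> bool:
    """Low-risk adaptations proceed automatically; high-risk ones
    require an explicit therapist decision before taking effect."""
    if risk == "low":
        log.record(action, approved_by="system")
        return True
    if therapist_approves(action):
        log.record(action, approved_by="therapist")
        return True
    log.record(f"REJECTED: {action}", approved_by="therapist")
    return False

# Example session: a cosmetic change passes automatically; an exposure
# step needs (and here is denied) therapist approval.
log = AuditLog()
apply_adaptation("dim ambient light", "low", lambda a: True, log)
apply_adaptation("introduce phobic stimulus", "high", lambda a: False, log)
```

The design choice worth noting is that the log records who approved each action, not merely what happened; it is precisely this attribution that the accountability debate cited above turns on.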

The Gadget Fallacy: Technology-Driven versus User-Centered Approaches to VR Therapy

Exciting technological progress usually breeds bubbling enthusiasm in technology-oriented communities, which often quickly boils over into hype and unrealistic expectations. VR technology is not immune to this "hype cycle" (from initial excitement, to a peak of inflated expectations, into a trough of disillusionment, and up again to a plateau of actual productivity), and technology analysts disagree about where exactly the latest generation of VR technology currently finds itself on that fateful journey and how it intersects with other technology hypes such as big data and deep learning.Footnote 92

For medical technology, as in assessing medical artificial intelligence (AI) systems, the community needs to discuss critically whether to adopt a technology-driven or a user-centered approach in translating emerging VR technology to the clinic. In the technology-driven approach, the priorities of the companies that develop VR systems ultimately determine the range of capabilities and possible medical applications. In a bottom-up, user-centered approach, the development of medical VR systems would be driven by the actual needs and priorities of the vulnerable patients for whom VR systems could offer important new therapies, elicited through qualitative research using focus groups, narrative interviews, and other methods. Strengthening and incentivizing such end-user-oriented research should therefore be a priority in (public) funding schemes that support the development of medical VR systems.

Nonmaleficence and Regulatory Frameworks: Avoiding Harmful Effects in VR Therapy

As sketched previously here, highly immersive VR systems, particularly hybrid systems such as a VR-EEG system, may have a range of adverse effects, from relatively well-known, transient effects such as cybersickness to potentially longer-lasting effects on a user's well-being, for example if highly immersive VR turned out to carry a risk of addiction for susceptible individuals.

As these effects are not yet well understood, let alone grounded in systematic empirical studies, research on adverse effects in healthy users and in potential patient groups as end users should be prioritized and made integral to the research and development of such VR systems, particularly for clinical applications. Otherwise, physicians will lack a sufficient evidence base for ensuring the nonmaleficent ("first, do no harm") use of this emerging technology.
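To make such monitoring integral to the system rather than an afterthought, a VR session loop could, for instance, screen a brief self-reported symptom score at intervals and halt exposure once a safety threshold is crossed. The scoring scale, threshold values, and function name below are illustrative assumptions, not a validated clinical instrument.

```python
def should_pause_session(symptom_scores: list, threshold: int = 10) -> bool:
    """Pause the VR session if the latest self-reported symptom score
    (e.g., a short cybersickness checklist, higher = worse) exceeds a
    predefined safety threshold, or has risen sharply since the
    baseline score taken at the start of the session."""
    if not symptom_scores:
        return False
    latest = symptom_scores[-1]
    baseline = symptom_scores[0]
    return latest >= threshold or (latest - baseline) >= threshold // 2

# Example: scores collected every few minutes during a session.
assert should_pause_session([2, 3, 4]) is False   # stable, below threshold
assert should_pause_session([2, 5, 11]) is True   # absolute threshold crossed
assert should_pause_session([1, 4, 7]) is True    # sharp rise from baseline
```

A rule of this kind would also generate exactly the systematic adverse-event data that, as argued above, is currently missing from the evidence base.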

An open question in that respect, from a regulatory and legislative perspective, is whether VR systems for medical applications should fall under existing medical device regulation, and whether the existing guidelines are sufficient to prevent harmful effects of this technology.

Any regulatory response to medical VR technology should of course tie in and be harmonized with comprehensive proposals on effective regulatory frameworks for similar emerging medical technologies, such as BCIs, medical robots, or intelligent decision-support systems (“medical AI systems”).Footnote 93,Footnote 94,Footnote 95

Summary and Outlook

To summarize, the new generation of highly immersive VR headsets and emerging hybrids such as VR-EEG/VR-BCI systems allow for many intriguing and potentially beneficial prospective clinical applications in neurology and psychiatry. Closed-loop hybrid VR-BCI systems, for example, could help patients with severe paralysis to communicate and interact with co-present friends or relatives in a rich and stimulating VR environment. Modifying and personalizing highly realistic avatars in immersive VR environments could help people with social anxiety or autism spectrum disorder to interact with others in a safe space of their choosing in VR. Highly realistic simulations of a patient's limb could be used to augment motor rehabilitation, for example in spinal cord injury, or to treat phantom pain sensations through virtual "rubber limb" illusions. Telepresence of therapists in VR may bring psychotherapy to people in remote places that otherwise would perhaps have no access to mental health services.

However, the immersive nature of modern VR systems and the many possibilities for modifying user experience in VR may also produce adverse effects, particularly in vulnerable users such as children and adolescents or patients with brain disorders. Importantly, the potential phenomenological, psychological, and action-related adverse effects hypothesized here (agential uncertainty, phenomenological unease, or epistemic uncertainty, but also others such as self-alienation) are of course not restricted to users with neurological or psychiatric disorders. They could also occur in patients who use (closed-loop) VR systems for rehabilitation after limb amputation (or severe traumatic injuries) and in users of VR systems for entertainment. However, patients in neurology and psychiatry with impaired brain functions, particularly in cognitive or emotional processing, could be especially vulnerable and prone, neurophysiologically as well as psychologically, to experiencing these effects, and perhaps less resourceful in coping with them.

When relying on “intelligent” hybrid systems, such as VR-EEG, for providing therapy, it should also be discussed whether users have the right to know when they are interacting with an AI system and how granular the consent process for such AI-assisted therapies should be. As is often the case in the interaction between humans and “intelligent” systems, keeping humans—in this case the patient and therapists—in the loop at key decision points may help to preserve moral and legal accountability.

To ensure a patient-oriented research and development process in the emerging field of VR-assisted therapy, I have discussed how technology-driven approaches might create applications ("use cases" in technology parlance) that do not necessarily cater to the most pressing needs of patients. Promoting a user-centered approach, in which the capabilities and needs of target patient groups guide the design and development of medical VR technology, might therefore help to avoid this "gadget fallacy." This user-centered process could be complemented by input from ethicists, philosophers, and VR developers, as well as medical professionals and clinical researchers, on the technological and design features that best meet the needs of patients.

VR has many technological and creative features and facets that could help to substantially improve the lives of many patients in neurology, psychiatry, and many other medical fields. As with many other powerful technologies before it (pharmacology, genetic engineering, electrical stimulation, AI, and others), the decisive question will not be whether it will be used for medical purposes, but rather how we can ensure that medical VR technology contributes to diminishing the burden on patients and to promoting human flourishing.

Footnotes

This work was (partly) supported by the German Ministry of Education and Research (BMBF) grant 13GW0053D (MOTOR-BIC) and the German Research Foundation (DFG) grant EXC 1086 BrainLinks-BrainTools to the University of Freiburg, Germany.

References

Notes

1. Spicer, R, Anglin, J, Krum, DM, Liew, S-L. REINVENT: A low-cost, virtual reality brain-computer interface for severe stroke upper limb motor recovery. Los Angeles: IEEE; 2017. doi:10.1109/VR.2017.7892338.

2. Pedreira da Fonseca, E, Ribeiro da Silva, NM, Pinto, EB. Therapeutic effect of virtual reality on post-stroke patients: Randomized clinical trial. Journal of Stroke and Cerebrovascular Diseases 2017;26:94–100.

3. Saposnik, G, Cohen, LG, Mamdani, M, Pooyania, S, Ploughman, M, Cheung, D, et al. Efficacy and safety of non-immersive virtual reality exercising in stroke rehabilitation (EVREST): A randomised, multicentre, single-blind, controlled trial. The Lancet Neurology 2016;15:1019–27.

4. Corbetta, D, Imeri, F, Gatti, R. Rehabilitation that incorporates virtual reality is more effective than standard rehabilitation for improving walking speed, balance and mobility after stroke: A systematic review. Journal of Physiotherapy 2015;61:117–24.

5. Yin, CW, Sien, NY, Ying, LA, Chung, SF-CM, Tan May Leng, D. Virtual reality for upper extremity rehabilitation in early stroke: A pilot randomized controlled trial. Clinical Rehabilitation 2014;28:1107–14.

6. Lohse, KR, Hilderman, CGE, Cheung, KL, Tatla, S, Van der Loos, HFM. Virtual reality therapy for adults post-stroke: A systematic review and meta-analysis exploring virtual environments and commercial games in therapy. PLOS ONE 2014;9:e93318.

7. Laver, K, George, S, Thomas, S, Deutsch, JE, Crotty, M. Virtual reality for stroke rehabilitation. Stroke 2012;43:e20–1.

8. Saposnik, G, Mamdani, M, Bayley, M, Thorpe, KE, Hall, J, Cohen, LG, et al. Effectiveness of virtual reality exercises in stroke rehabilitation (EVREST): Rationale, design, and protocol of a pilot randomized clinical trial assessing the Wii gaming system. International Journal of Stroke 2010;5:47–51.

9. Saposnik, G, Teasell, R, Mamdani, M, Hall, J, McIlroy, W, Cheung, D, et al. Effectiveness of virtual reality using Wii gaming technology in stroke rehabilitation. Stroke 2010;41:1477–84.

10. Yang, Y-R, Tsai, M-P, Chuang, T-Y, Sung, W-H, Wang, R-Y. Virtual reality-based training improves community ambulation in individuals with stroke: A randomized controlled trial. Gait & Posture 2008;28:201–6.

11. Henderson, A, Korner-Bitensky, N, Levin, M. Virtual reality in stroke rehabilitation: A systematic review of its effectiveness for upper limb motor recovery. Topics in Stroke Rehabilitation 2007;14:52–61.

12. Le May, S, Paquin, D, Fortin, J-S, Khadra, C. DREAM project: Using virtual reality to decrease pain and anxiety of children with burns during treatments. In: Proceedings of the 2016 Virtual Reality International Conference. New York: Association for Computing Machinery; 2016:24:1–24:4.

13. Malloy, KM, Milling, LS. The effectiveness of virtual reality distraction for pain reduction: A systematic review. Clinical Psychology Review 2010;30:1011–8.

14. Das, DA, Grimmer, KA, Sparnon, AL, McRae, SE, Thomas, BH. The efficacy of playing a virtual reality game in modulating pain for children with acute burn injuries: A randomized controlled trial [ISRCTN87413556]. BMC Pediatrics 2005;5:1.

15. Chiarovano, E, Wang, W, Rogers, SJ, MacDougall, HG, Curthoys, IS, de Waele, C. Balance in virtual reality: Effect of age and bilateral vestibular loss. Frontiers in Neurology 2017;8:5.

16. Tjernström, F, Zur, O, Jahn, K. Current concepts and future approaches to vestibular rehabilitation. Journal of Neurology 2016;263:65–70.

17. Whitney, SL, Alghadir, AH, Anwer, S. Recent evidence about the effectiveness of vestibular rehabilitation. Current Treatment Options in Neurology 2016;18:13.

18. Hsu, S-Y, Fang, T-Y, Yeh, S-C, Su, M-C, Wang, P-C, Wang, VY. Three-dimensional, virtual reality vestibular rehabilitation for chronic imbalance problem caused by Ménière's disease: A pilot study. Disability and Rehabilitation 2017;39:1601–6.

19. Meldrum, D, Herdman, S, Moloney, R, Murray, D, Duffy, D, Malone, K, et al. Effectiveness of conventional versus virtual reality based vestibular rehabilitation in the treatment of dizziness, gait and balance impairment in adults with unilateral peripheral vestibular loss: A randomised controlled trial. BMC Ear, Nose and Throat Disorders 2012;12:3.

20. Liao, Y-Y, Yang, Y-R, Cheng, S-J, Wu, Y-R, Fuh, J-L, Wang, R-Y. Virtual reality-based training to improve obstacle-crossing performance and dynamic balance in patients with Parkinson's disease. Neurorehabilitation and Neural Repair 2015;29:658–67.

21. Mendes, FA dos S, Pompeu, JE, Lobo, AM, da Silva, KG, Oliveira, T de P, Zomignani, AP, et al. Motor learning, retention and transfer after virtual-reality-based training in Parkinson's disease—effect of motor and cognitive demands of games: A longitudinal, controlled clinical study. Physiotherapy 2012;98:217–23.

22. Mirelman, A, Maidan, I, Herman, T, Deutsch, JE, Giladi, N, Hausdorff, JM. Virtual reality for gait training: Can it induce motor learning to enhance complex walking and reduce fall risk in patients with Parkinson's disease? The Journals of Gerontology: Series A 2011;66A:234–40.

23. Ma, H-I, Hwang, W-J, Fang, J-J, Kuo, J-K, Wang, C-Y, Leong, I-F, et al. Effects of virtual reality training on functional reaching movements in people with Parkinson's disease: A randomized controlled pilot trial. Clinical Rehabilitation 2011;25:892–902.

24. Yen, C-Y, Lin, K-H, Hu, M-H, Wu, R-M, Lu, T-W, Lin, C-H. Effects of virtual reality-augmented balance training on sensory organization and attentional demand for postural control in people with Parkinson disease: A randomized controlled trial. Physical Therapy 2011;91:862–74.

25. Moyle, W, Jones, C, Dwan, T, Petrovich, T. Effectiveness of a virtual reality forest on people with dementia: A mixed methods pilot study. The Gerontologist 2017 [epub ahead of print].

26. Teo, W-P, Muthalib, M, Yamin, S, Hendy, AM, Bramstedt, K, Kotsopoulos, E, et al. Does a combination of virtual reality, neuromodulation and neuroimaging provide a comprehensive platform for neurorehabilitation?—A narrative review of the literature. Frontiers in Human Neuroscience 2016;10:284.

27. Cushman, LA, Stein, K, Duffy, CJ. Detecting navigational deficits in cognitive aging and Alzheimer disease using virtual reality. Neurology 2008;71:888–95.

28. Anderson, PL, Edwards, SM, Goodnight, JR. Virtual reality and exposure group therapy for social anxiety disorder: Results from a 4–6 year follow-up. Cognitive Therapy and Research 2017;41:230–6.

29. Bouchard, S, Dumoulin, S, Robillard, G, Guitard, T, Klinger, É, Forget, H, et al. Virtual reality compared with in vivo exposure in the treatment of social anxiety disorder: A three-arm randomised controlled trial. The British Journal of Psychiatry 2017;210:276–83.

30. Gebara, CM, Barros-Neto, TP de, Gertsenchtein, L, Lotufo-Neto, F. Virtual reality exposure using three-dimensional images for the treatment of social phobia. Revista Brasileira de Psiquiatria 2016;38:24–9.

31. Miloff, A, Lindner, P, Hamilton, W, Reuterskiöld, L, Andersson, G, Carlbring, P. Single-session gamified virtual reality exposure therapy for spider phobia vs. traditional exposure therapy: A randomized-controlled trial. Trials 2016;17:60.

32. Meyerbröker, K, Emmelkamp, PMG. Virtual reality exposure therapy in anxiety disorders: A systematic review of process-and-outcome studies. Depression and Anxiety 2010;27:933–44.

33. Lindner, P, Miloff, A, Hamilton, W, Reuterskiöld, L, Andersson, G, Powers, MB, et al. Creating state of the art, next-generation Virtual Reality exposure therapies for anxiety disorders using consumer hardware platforms: Design considerations and future directions. Cognitive Behaviour Therapy 2017;46:404–20.

34. Mölbert, SC, Thaler, A, Mohler, BJ, Streuber, S, Romero, J, Black, MJ, et al. Assessing body image in anorexia nervosa using biometric self-avatars in virtual reality: Attitudinal components rather than visual body size estimation are distorted. Psychological Medicine 2018;48:642–53.

35. Keizer, A, van Elburg, A, Helms, R, Dijkerman, HC. A virtual reality full body illusion improves body image disturbance in anorexia nervosa. PLOS ONE 2016;11:e0163921.

36. Yang, YJD, Allen, T, Abdullahi, SM, Pelphrey, KA, Volkmar, FR, Chapman, SB. Brain responses to biological motion predict treatment outcome in young adults with autism receiving Virtual Reality Social Cognition Training: Preliminary findings. Behaviour Research and Therapy 2017;93:55–66.

37. Didehbani, N, Allen, T, Kandalaft, M, Krawczyk, D, Chapman, S. Virtual Reality Social Cognition Training for children with high functioning autism. Computers in Human Behavior 2016;62:703–11.

38. Ip, HHS, Wong, SWL, Chan, DFY, Byrne, J, Li, C, Yuan, VSN, et al. Virtual reality enabled training for social adaptation in inclusive education settings for school-aged children with autism spectrum disorder (ASD). In: Blended Learning: Aligning Theory with Practices. Cham: Springer; 2016:94–102.

39. Dehn, LB, Kater, L, Piefke, M, Botsch, M, Driessen, M, Beblo, T. Training in a comprehensive everyday-like virtual reality environment compared to computerized cognitive training for patients with depression. Computers in Human Behavior 2018;79:40–52.

40. Falconer, CJ, Rovira, A, King, JA, Gilbert, P, Antley, A, Fearon, P, et al. Embodying self-compassion within virtual reality and its effects on patients with depression. BJPsych Open 2016;2:74–80.

41. Rus-Calafell, M, Garety, P, Sason, E, Craig, TJK, Valmaggia, LR. Virtual reality in the assessment and treatment of psychosis: A systematic review of its utility, acceptability and effectiveness. Psychological Medicine 2018;48:362–91.

42. Veling, W, Pot-Kolder, R, Counotte, J, van Os, J, van der Gaag, M. Environmental social stress, paranoia and psychosis liability: A virtual reality study. Schizophrenia Bulletin 2016;42:1363–71.

43. Veling, W, Moritz, S, van der Gaag, M. Brave new worlds—Review and update on virtual reality assessment and treatment in psychosis. Schizophrenia Bulletin 2014;40:1194–7.

44. Beidel, DC, Frueh, BC, Neer, SM, Bowers, CA, Trachik, B, Uhde, TW, et al. Trauma management therapy with virtual-reality augmented exposure therapy for combat-related PTSD: A randomized controlled trial. Journal of Anxiety Disorders 2017 [epub ahead of print].

45. Gahm, G, Reger, G, Ingram, MV, Reger, M, Rizzo, A. A Multisite, Randomized Clinical Trial of Virtual Reality and Prolonged Exposure Therapy for Active Duty Soldiers with PTSD. Tacoma, WA: Geneva Foundation; 2015.

46. Thompson, E, Varela, FJ. Radical embodiment: Neural dynamics and consciousness. Trends in Cognitive Sciences 2001;5:418–25.

47. Gallagher, S, Allen, M. Active inference, enactivism and the hermeneutics of social cognition. Synthese 2016:1–22.

48. Clark, A, Chalmers, D. The extended mind. Analysis 1998;58:7–19.

49. Sterelny, K. Minds: Extended or scaffolded? Phenomenology and the Cognitive Sciences 2010;9:465–81.

50. Sutton, J, Harris, CB, Keil, PG, Barnier, AJ. The psychology of memory, extended cognition, and socially distributed remembering. Phenomenology and the Cognitive Sciences 2010;9:521–60.

51. Menary, R. Introduction to the special issue on 4E cognition. Phenomenology and the Cognitive Sciences 2010;9:459–63.

52. Pereplyotchik, D. Cognitivism and nominalism in the philosophy of linguistics. In: Psychosyntax. Cham: Springer; 2017:19–44.

53. Adams, F, Aizawa, K. The value of cognitivism in thinking about extended cognition. Phenomenology and the Cognitive Sciences 2010;9:579–603.

54. Kalckert, A, Ehrsson, HH. The moving rubber hand illusion revisited: Comparing movements and visuotactile stimulation to induce illusory ownership. Consciousness and Cognition 2014;26:117–32.

55. Alimardani, M, Nishio, S, Ishiguro, H. Removal of proprioception by BCI raises a stronger body ownership illusion in control of a humanlike robot. Scientific Reports 2016;6:33514.

56. Blumberg, MS, Dooley, JC. Phantom limbs, neuroprosthetics, and the developmental origins of embodiment. Trends in Neurosciences 2017;40:603–12.

57. Gallagher, S. Philosophical conceptions of the self: Implications for cognitive science. Trends in Cognitive Sciences 2000;4:14–21.

58. Moore, JW, Fletcher, PC. Sense of agency in health and disease: A review of cue integration approaches. Consciousness and Cognition 2012;21:59–68.

59. Gentsch, A, Weber, A, Synofzik, M, Vosgerau, G, Schütz-Bosbach, S. Towards a common framework of grounded action cognition: Relating motor control, perception and cognition. Cognition 2016;146:81–9.

60. Synofzik, M, Vosgerau, G, Lindner, A. The experience of free will and the experience of agency: An error-prone, reconstructive process. In: Glannon, W, ed. Free Will and the Brain: Neuroscientific, Philosophical, and Legal Perspectives. New York: Cambridge University Press; 2015:66–79.

61. Ma, K, Hommel, B. The role of agency for perceived ownership in the virtual hand illusion. Consciousness and Cognition 2015;36:277–88.

62. Wegner, DM, Sparrow, B, Winerman, L. Vicarious agency: Experiencing control over the movements of others. Journal of Personality and Social Psychology 2004;86:838–48.

63. Bowman, DA, McMahan, RP. Virtual reality: How much immersion is enough? Computer 2007;40:36–43.

64. Witmer, BG, Singer, MJ. Measuring presence in virtual environments: A presence questionnaire. Presence: Teleoperators and Virtual Environments 1998;7:225–40.

65. Salanitri, D, Lawson, G, Waterfield, B. The relationship between presence and trust in virtual reality. In: Proceedings of the European Conference on Cognitive Ergonomics. New York: Association for Computing Machinery; 2016:16:1–16:4.

66. Poeschl, S, Doering, N. Measuring co-presence and social presence in virtual environments - psychometric construction of a German scale for a fear of public speaking scenario. Studies in Health Technology and Informatics 2015;219:58–63.

67. Waltemate, T, Gall, D, Roth, D, Botsch, M, Latoschik, ME. The impact of avatar personalization and immersion on virtual body ownership, presence, and emotional response. IEEE Transactions on Visualization and Computer Graphics 2018;24:1643–52.

68. Pavone, EF, Tieri, G, Rizza, G, Tidoni, E, Grisoni, L, Aglioti, SM. Embodying others in immersive virtual reality: Electro-cortical signatures of monitoring the errors in the actions of an avatar seen from a first-person perspective. The Journal of Neuroscience 2016;36:268–79.

69. Serino, S, Pedroli, E, Keizer, A, Triberti, S, Dakanalis, A, Pallavicini, F, et al. Virtual reality body swapping: A tool for modifying the allocentric memory of the body. Cyberpsychology, Behavior, and Social Networking 2015;19:127–33.

70. Slater, M, Spanlang, B, Sanchez-Vives, MV, Blanke, O. First person experience of body transfer in virtual reality. PLOS ONE 2010;5:e10564.

71. Smurrayinchester. An SVG version of Image:Moriuncannyvalley.gif. 2007; available at https://commons.wikimedia.org/wiki/File:Mori_Uncanny_Valley.svg (last accessed 2 Feb 2018).

72. Saygin, AP, Chaminade, T, Ishiguro, H, Driver, J, Frith, C. The thing that should not be: predictive coding and the uncanny valley in perceiving human and humanoid robot actions. Social Cognitive and Affective Neuroscience 2012;7:413–22.CrossRefGoogle Scholar

73. Tinwell, A, Grimshaw, M, Nabi DAWilliams, A. Facial expression of emotion and perception of the Uncanny Valley in virtual characters. Computers in Human Behavior 2011;27:741–9.CrossRefGoogle Scholar

74. Salanitri, D, Hare, C, Borsci, S, Lawson, G, Sharples, S, Waterfield, B. Relationship between trust and usability in virtual environments: An ongoing study. In Human-Computer Interaction: Design and Evaluation. Cham: Springer; 2015:4959.CrossRefGoogle Scholar

75. See note 65, Salanitri et al. 2016.

76. Rehfeld, S, Latoschik, ME, Tramberend, H. Estimating latency and concurrency of asynchronous real-time interactive systems using model checking. In: Virtual Reality (VR). Greenville, SC: IEEE;2016:5766.Google Scholar

77. Davis, S, Nesbitt, K, Nalivaiko, E. A systematic review of cybersickness. In: Proceedings of the 2014 Conference on Interactive Entertainment, New York: Association for Computing Machinery; 2014:8:18:9.Google Scholar

78. Pot-Kolder, R, VelingW, Counotte J, van der Gaag, M. Anxiety partially mediates cybersickness symptoms in immersive virtual reality environments. Cyberpsychology, Behavior, and Social Networking 2018; 21:187–93.CrossRefGoogle ScholarPubMed

79. Arafat, IM, Ferdous, SMS, Quarles, J. The effects of cybersickness on persons with multiple sclerosis. In: Proceedings of the 22nd Association for Computing Machinery Conference on Virtual Reality Software and Technology, New York: Association for Computing Machinery; 2016:51–9.Google Scholar

80. Kim, YY, Kim, HJ, Kim, EN, Ko, HD, Kim, HT. Characteristic changes in the physiological components of cybersickness. Psychophysiology 2005;42:616–25.Google ScholarPubMed

81. Petry, NM, Rehbein, F, Gentile, DA, Lemmens, JS, Rumpf, H-J, Mößle, T, et al. An international consensus for assessing internet gaming disorder using the new DSM-5 approach. Addiction 2014;109:1399–406.CrossRefGoogle ScholarPubMed

82. Weinstein, A, Livny, A, Weizman, A. New developments in brain research of internet and gaming disorder. Neuroscience & Biobehavioral Reviews 2017;75:314–30.CrossRefGoogle ScholarPubMed

83. Holmes, NP, Snijders, HJ, Spence, C. Reaching with alien limbs: Visual exposure to prosthetic hands in a mirror biases proprioception without accompanying illusions of ownership. Perception & Psychophysics 2006;68:685701.CrossRefGoogle Scholar

84. Spanlang, B, Normand, J-M, Borland, D, Kilteni, K, Giannopoulos, E, Pomés, A, et al. How to build an embodiment lab: Achieving body representation illusions in virtual reality. Frontiers in Robotics and AI 2014;1. See also note 67, Waltmate et al. 2018; note 70, Slater et al. 2010.Google Scholar

85. Gonzalez-Franco, M, Lanier, J. Model of illusions and virtual reality. Frontiers in Psychology 2017;8:1125. See also note 35, Keizer et al. 2016.

86. Kellmeyer, P, Cochrane, T, Müller, O, Mitchell, C, Ball, T, Fins, JJ, et al. The effects of closed-loop medical devices on the autonomy and accountability of persons and systems. Cambridge Quarterly of Healthcare Ethics 2016;25:623–33.

87. Goering, S, Klein, E, Dougherty, DD, Widge, AS. Staying in the loop: Relational agency and identity in next-generation DBS for psychiatry. AJOB Neuroscience 2017;8:59–70.

88. Beauchamp, TL, Childress, JF. Principles of Biomedical Ethics. New York: Oxford University Press; 2001.

89. Walter, JK, Ross, LF. Relational autonomy: Moving beyond the limits of isolated individualism. Pediatrics 2014;133:S16–23.

90. Gilbert, F, O’Brien, T, Cook, M. The effects of closed-loop brain implants on autonomy and deliberation: What are the risks of being kept in the loop? Cambridge Quarterly of Healthcare Ethics 2018;27:316–25. See also note 86, Kellmeyer et al. 2016; note 87, Goering et al. 2017.

91. See note 85, Gonzalez-Franco, Lanier 2017; see also note 35, Keizer et al. 2016.

92. Robertson, A. How virtual reality developers are surviving the hype cycle. The Verge 2017; available at https://www.theverge.com/2017/10/11/16458806/oculus-rift-htc-vive-vr-hype-game-developers (last accessed 2 Feb 2018).

93. Yang, G-Z, Cambias, J, Cleary, K, Daimler, E, Drake, J, Dupont, PE, et al. Medical robotics—Regulatory, ethical, and legal considerations for increasing levels of autonomy. Science Robotics 2017;2:12.

94. Yuste, R, Goering, S, Arcas, BA, Bi, G, Carmena, JM, Carter, A, et al. Four ethical priorities for neurotechnologies and AI. Nature News 2017;551:159.

95. Elenko, E, Speier, A, Zohar, D. A regulatory framework emerges for digital medicine. Nature Biotechnology 2015;33:697–702.

Table 1. Summary of the Current Use of VR Systems in Clinical Research and Therapy in Neurology and Psychiatry

Figure 1. The putative “uncanny valley” effect in animated humanoid robots, puppets, and prostheses (and possibly also VR avatars).71