
Psychiatric Interventions in Virtual Reality: Why We Need an Ethical Framework

Published online by Cambridge University Press:  07 September 2020


Abstract

Recent improvements in virtual reality (VR) allow for the representation of authentic environments and multiple users in a shared complex virtual world in real time. These advances have fostered clinical applications including in psychiatry. However, although VR is already used in clinical settings to help people with mental disorders (e.g., exposure therapy), the related ethical issues require greater attention. Based on a thematic literature search, the authors identified five themes that raise ethical concerns related to the clinical use of VR: (1) reality and its representation, (2) autonomy, (3) privacy, (4) self-diagnosis and self-treatment, and (5) expectation bias. Reality and its representation is a theme that lies at the heart of VR, but it is also of specific significance in a clinical context when perceptions of reality are concerned, for example, during psychosis. Closely associated is the autonomy of VR users. Although autonomy is a much-considered topic in biomedical ethics, it has not been sufficiently discussed when it comes to applications of VR in psychiatry. In this review, the authors address the different themes and recommend the development of an ethical framework for the clinical use of VR.

Type
Articles
Copyright
© The Author(s), 2020. Published by Cambridge University Press

Introduction

Although the core idea of virtual reality (VR) is decades old,Footnote 1 it has become a very powerful tool in psychology and the everyday worldFootnote 2, Footnote 3 as a (close-to-)perfect simulation of the real world including representations of oneself and other persons. VR can appear in different gradations, depending on its technical implementation. Most convincing are fully immersive settings, i.e., those that most effectively prevent the perceptual distinction between reality and simulation (e.g., by using head-mounted visual displays). There are many examples of possible clinical applications of VR. Virtual bodies can be used in immersive VR to modulate the pain perceptions of chronic pain patients by influencing their bodily self-representations.Footnote 4 It is believed that severely ill patients in particular could benefit most from VR therapies.Footnote 5 The inducement of out-of-body experiences in immersive VR offers another potential therapeutic tool. Initial results suggest that this could reduce other forms of suffering, such as that associated with the fear of death.Footnote 6 In the social domain, people can interact within a shared virtual space using representations of their own bodies recorded via so-called motion capture technologies.Footnote 7, Footnote 8 This makes it possible to modify multiple aspects of ongoing dyadic interactions in real time.Footnote 8, Footnote 9 In the future, these techniques could be integrated into assistive communication tools for people with communication impairments or into training programs, for example, to practice interacting in social situations.Footnote 10

Some clinical applications of VR are already well established in psychiatry. For instance, VR is used for exposure therapies and the treatment of posttraumatic stress disorder.Footnote 11, Footnote 12 A recent literature review discussed clinical trials of the use of VR for a wide range of mental disorders including anxiety disorder, specific phobias, posttraumatic stress disorder, schizophrenia, autism, mild cognitive impairment, and dementia, as well as approaches for stress and pain alleviation.Footnote 13 Immersive VR applications have been explored to treat anxiety disorders, depression, psychosis, substance-related disorders, and eating disorders.Footnote 14, Footnote 15, Footnote 16 A very recent example currently under investigation is an approach for the treatment of persecutory delusions in VR, which is not only fully immersive but also fully automated (i.e., it does not require a therapist's ongoing supervision).Footnote 17 Considering the evidence to date, it seems likely that more and more VR applications will find their way into clinical use.

These developments show that an increasing number of patients are likely to be exposed to VR in the future. These novel VR-based therapies may offer welcome benefits to some or all psychiatric patients, but it is also important to note the possible risks. VR may alter perceptions of the self, of others, and of reality in ways that may be destabilizing, and the approach also opens up the possibility of deliberate or accidental manipulation. Given the particular vulnerability of psychiatric patients, it is clear that ethical guidelines are required.

The Need for an Ethical Debate in VR

In 1991, the Editors of the Lancet published a short article,Footnote 18 in which they argued that medical responsibilities arising from the clinical applications of VR must be taken seriously. They expressed concern about “premature and ill-judged clinical applications,” and proposed that a promising field would be better served by examining medical and ethical responsibilities early rather than awaiting reports of ill-effects, which might then delay its clinical implementation. Since then, multiple authors have argued for the adoption of ethical guidelines for the clinical use of VR.Footnote 19, Footnote 20 In 2016, Michael Madary and Thomas Metzinger published a first code of ethical conduct,Footnote 21 in which they state that “VR technology will eventually change not only our general image of humanity but also our understanding of deeply entrenched notions, such as ‘conscious experience’, ‘selfhood’, ‘authenticity’, or ‘realness’. In addition it will transform the structure of our life-world, bringing about entirely novel forms of everyday social interactions and changing the very relationship we have to our own minds.”Footnote 22 Their deliberations constitute a key contribution to the debate, but they focused on ethical concerns arising in VR research and on the risks for the general public and society. However, as Philipp Kellmeyer and colleagues recently pointed out, the ethical discussion must be extended to consider the disease-related vulnerability of patients in the context of clinical applications of VR.Footnote 23

Although the ethical implications have been pointed out from the very beginning, systematic public debate is still limited. So far, we do not know much about the long-term effects of VR exposure in patientsFootnote 24 or in healthy control persons.Footnote 25 Our aim here is to provide an overview of the ethical matters being discussed in relation to the clinical use of VR, structured as follows: (1) reality and its representation, (2) autonomy, (3) privacy, (4) self-diagnosis and self-treatment, and (5) expectation bias. A preliminary observation is that, with VR, reality and its representation becomes for the first time a relevant theme of ethical discussion. To our knowledge, there is no similar debate about reality and its representation elsewhere in medical ethics or neuroethics. A second observation is that VR, understood as an artificially created and completely controllable environment, raises challenges regarding how to respect the ethical principle of autonomy. Thus, although autonomy is a much-discussed value in medical ethics, the clinical application of VR appears to raise new and interesting questions for the field.

The following discussion is offered to help lay the foundations for the development of ethical guidelines for the clinical use of VR, particularly addressing the individual vulnerability of patients.

Reality and its Representation

One of the characteristic features of VR is its potential to create an experience that feels real, even though the user knows that it is a simulation. This is usually described as “presence,” the feeling or experience of being there.Footnote 26 The immersive and interactive features of VR are fundamental for the experience of “realism.”Footnote 27 If the virtual experience is convincing, behavior, feelings, and bodily reactions will resemble those of a real situation even though the user is aware of its being simulated.Footnote 28 According to Mel Slater, two components are crucial for immersive VR to feel real: first, “place illusion (PI),” based on the “qualia of having a sensation of being in a real place,” and, second, “plausibility illusion (Psi),” understood as the “illusion that the scenario being depicted is actually occurring.”Footnote 29 PI and Psi together determine the degree of immersion, which in turn depends on the sophistication of the technical system that implements the VR environment. While the use of head-mounted displays (HMDs) and motion-capture suits can induce the illusion of being there, some authors doubt whether less immersive systems, such as desktop systems, can produce a similar sensation.Footnote 30, Footnote 31 This seems plausible because it is not possible to visually compare actual reality with the virtual environment while using an immersive VR system. Furthermore, VR can create not just the illusion of an inanimate world but also “social hallucinations”Footnote 32 through the use of avatars, which make the virtual experience even more convincing.Footnote 33

The fact that users of VR actually have the feeling of being in a virtual place, and react to it and its inhabitants as if they were real, holds great therapeutic potential. While some interventions aim to create a convincing, fully controllable replica of reality (e.g., exposure therapies), other approaches may be designed to exploit VR’s capacity to transcend the limits of reality (for instance, flying, or being embodied in a body of another shape, another sex, or a child’s body). At the same time, the literature identifies various risks that may arise in relation to the manipulation and representation of reality in VR research and in its clinical application.

Blurring boundaries.

An obvious problem relates to the exposure of individuals who are vulnerable to disturbed perceptions of reality. What happens if a virtual environment feels real but the knowledge that it is simulated is lost or was not fully present in the first place? Patients who already have difficulties in their perception of reality bear a substantial risk that their judgments of reality will be destabilized further by VR.Footnote 34, Footnote 35 In particular, patients with cognitive impairment or hallucinations could be confused in their distinctions between real and virtual environments. Vulnerable patients may be at increased risk of negative behavioral responses, misinterpretations of events, or even the development of paranoid delusions.Footnote 36 According to Kellmeyer, patients might experience a feeling of “epistemic uncertainty” and/or “phenomenological unease” due to the external manipulation of sensory input.Footnote 37 Accordingly, VR should be used with great caution in individuals with a reduced capacity for reality testing, such as those who suffer from dementia or experience symptoms of psychosis.Footnote 38, Footnote 39

Virtual trauma.

Jose Ramirez and Scott LaBarge call experiences in VR “virtually real experiences” if they successfully convey the described feeling of presence, or actually being there. Accordingly, the potential of VR to induce reality-like experiences raises the risk of a “virtually real trauma.”Footnote 40 Along similar lines, Xueni Pan and Antonia F. Hamilton discuss the exposure of healthy research participants to emotionally stressful situations in VR. Subjects’ self-perception may be harmed if they fail to act ethically in a virtual situation, leading to reduced self-esteem despite the otherwise physically safe framework provided by VR.Footnote 41 An interesting idea in this respect is the so-called “Equivalence Principle,” which states that “if it would be wrong to allow subjects to have a certain experience in reality, then it would be wrong to allow subjects to have that experience in a virtually real setting.”Footnote 42 The position is developed as a counterpoint to the argument that VR makes experiments possible that should not be carried out in reality for ethical reasons. This view was held, for example, by Slater, who recreated a modified version of Milgram’s well-known obedience experiment in VR,Footnote 43 and by others.Footnote 44 The debate about which situations individuals can be exposed to in VR is certainly of great importance with regard to its clinical applications.

The foreseeable infliction of trauma may raise challenging legal questions as well.Footnote 45 In an older but widely reported case, an online player maliciously hacked a virtual world and caused certain players to sexually assault a fellow player.Footnote 46 Although the attacks took place solely in the form of words rather than graphics or animation, the distress described by the victims was real, with one calling for “virtual castration” of the offending player. This case revealed that although virtual environments might not pose direct physical risks, they certainly can pose psychological risks.Footnote 47 The risk of trauma in the use of clinical VR will likely vary, but it will be important to consider whether reasonable care has been taken in the design and deployment of the technology and whether proper informed consent has been obtained.

Illusion of embodiment.

VR offers the ability to redefine the entire environment of users and thus influence their behavior, and can even mislead users about their own appearance and embodiment. Through visual-proprioceptive, visual-motor, or visual-tactile synchrony, embodiment can be convincingly established and manipulated.Footnote 48 Madary and Metzinger consider illusions of embodiment in VR possible because “the human mind is plastic to such a degree that it can misrepresent its own embodiment.”Footnote 49 They state that the potential of targeting and manipulating the phenomenal conscious experience of oneself is the most fundamental theoretical issue underlying concerns about VR.Footnote 50 Related to this is the well-known “Proteus effect,” which refers to the influence on individual behavior of alterations in self-representation brought about by VR. It has been shown that the behavior of a person immersed in VR changes and adapts to the person’s own digital self-representation.Footnote 51 Others, such as James S. Spiegel,Footnote 52 argue that the synchronization of different sensory perceptions can result in a changed self-perception. Empirical evidence for this claim is the finding that when study participants are represented as their future aged selves, they save more money for retirement. This effect was observed during and shortly after exposure, but it is uncertain how long it lasts.Footnote 53 Another study suggests that racial bias against Black people could be permanently reduced by repeatedly embodying White people in Black virtual bodies in VR.Footnote 54 These data suggest that changes in embodiment in VR can substantially influence behavior, offering great therapeutic potential but also the possibility of unpredictable effects and an opportunity for manipulation.

Autonomy

Autonomy is frequently discussed in general bioethics. At least since Beauchamp and Childress’s “Principles of Biomedical Ethics,” autonomy has been considered the most important principle of Western medical ethics.Footnote 55 VR challenges this concept in several ways.

A man-made world.

In the debate about VR, a parallel to Nozick’s “experience machine” is repeatedly drawn: “plugging into an experience machine limits us to a man-made reality, to a world no deeper or more important than that which people can construct. There is no actual contact with any deeper reality, though the experience of it can be simulated.”Footnote 56 As with Nozick’s experience machine, VR, by its very nature, is a man-made simulation that constrains the user to those behavioral options within the VR that were provided by the designer. In the case of mental disorders, patients might interpret their VR experiences and the options provided by the designer in unpredictable ways as they seek to understand their symptoms. Some suggest that VR experiences that are subject to constraints imposed by designers are impoverished, and that patients should not be exposed to a world “that contains no more significance or deeper meaning than that which man can construct.”Footnote 57 Furthermore, using the opportunities of VR in a paternalistic framework, for example through manipulations that cause an unconscious change in the behavior of the patient, could result in a conflict between respecting a patient’s autonomy and acting beneficently to promote the individual’s best interests.Footnote 58 , Footnote 59 , Footnote 60 Virtual environments can be manipulated easily and quickly in contrast to the stability of the real physical world. This allows modifications to be introduced in real-time as direct reactions to the user’s behavior, creating ever-changing, fluid environments. This rapid adaptability of VR makes it a more complex and unpredictable environment and can complicate attempts to safeguard autonomy by providing sufficient information about the proposed VR treatment ahead of time for the purpose of voluntary and informed consent.

Persuasiveness of technology.

The burden of symptoms may drive patients’ decisions to use VR if it provides a more desirable environment than a reality affected by severe symptoms of mental disorder or physical suffering, even at the cost of neglecting life in the real world.Footnote 61, Footnote 62, Footnote 63, Footnote 64 It is possible that VR could even lead to addictive behavior in vulnerable individuals.Footnote 65 Such issues are gaining practical relevance with regard to new research approaches, such as the use of VR for patients in psychiatric hospitals. In a study with dementia patients, the aim of the VR was to enable patients housed in closed facilities to “leave” the institution, that is, to choose landscapes or places they could visit virtually.Footnote 66 Such programs are developed with the idea of strengthening the autonomy of the persons concerned, but they raise the philosophical question of whether the illusion of freedom constitutes a genuine increase in autonomy.

Sense of agency.

Finally, one’s own sense of agency, that is the perception that one is the author or initiator of one’s own actions, could be affected, possibly resulting in “agential uncertainty.”Footnote 67 Patients who are exposed to certain simulations such as virtual body ownership and who also lack a good understanding of the procedure could experience “a feeling of loss of control and unease about [their] sense of agency.”Footnote 68 The risk of no longer perceiving oneself as the author of one’s own actions might be a general risk inherent in VR. It is not only the virtual environment, but also the user herself/himself who is subject to the designer’s ideas regarding the potentialities afforded to the user. An example is the modification of social interactions in VR. Recently, a prototype of a system architecture that enables the automatic modification of nonverbal behavior (e.g., gaze behavior and gestures) in VR in real-time was developed.Footnote 69 , Footnote 70 Such a program could assist people who experience difficulties with social interaction and communication, such as is sometimes the case for persons with autism spectrum disorder. The ethical problem of such an application arises from the fact that the person participating in the ongoing communication cannot influence the modification her/himself. The person’s appearance as well as actions are changed during the interaction with a partner via the process of a so-called “social augmentation.”Footnote 71 Can a person participating in such a “socially augmented” interaction experience her/himself as a fully responsible author of actions that have been modified or will a feeling of manipulation prevail? The potential feeling of “agential uncertainty”Footnote 72 could undermine attempts to use such systems to increase perceived autonomy. In addition, such a program requires a definition of normality, according to which behavior can be evaluated as functional or dysfunctional and modified as necessary. 
Obviously, this question raises a debate about normativism and the degree to which cultural, historical, or societal norms influence our evaluations of normality.Footnote 73

Privacy

Highly immersive VR environments, supported by HMDs with inbuilt eye tracking and body sensors that allow the full description and replication of bodily behavior in VR, can be used to collect personal data and as such raise the issue of privacy.Footnote 74, Footnote 75, Footnote 76 Complex and fine-grained data can be collected and traced back to the person performing the movements, raising the question of how to guarantee data security. Data privacy and security risks and protections must be part of informed consent discussions.Footnote 77 Another important privacy-related question is whether these data should be made available for data mining to improve models of human behavior. In the future, applications could also be developed that link brain-computer interfaces (BCIs) with VR. This would be particularly relevant for neurological patients. The integration of BCIs provokes the question of how to deal with the privacy of neurophysiological data, which may shed light on the inner experiences of persons in a way that is distinct from, and possibly more informative than, merely observing human behavior.Footnote 78

The need to ensure responsible handling of VR data becomes apparent when taking into account another crucial aspect: Various studies have shown that there is a greater willingness to entrust personal information, including information about trauma or abuse, to a virtual character than to a real person.Footnote 79 This adds a new dimension to the question of data security in the context of VR.

Self-Diagnosis and Self-Treatment

VR is a freely available technology. It is likely that members of the public will be able to gain access, in one way or another, to clinical VR applications. If these clinical VR tools are used without the oversight of well-trained clinicians in a safe therapeutic context, the prevalence of self-diagnosis and self-treatment might increase, which poses the danger of misdiagnosis, inadequate treatment, and, in the worst case, the aggravation of an existing condition. The impact of VR on the user might be greater than that of “self-help” books.Footnote 80

Although the use of clinical VR without a physician is problematic, so too would be its use by an insufficiently trained physician. The use of clinical VR should be guided by the same care as is applied, for instance, in pharmaceutical treatments, psychotherapy, or brain stimulation, such as transcranial magnetic stimulation or deep brain stimulation. A topic requiring discussion is whether medical applications of VR technology should fall under existing medical device regulations.Footnote 81

Expectation Bias

Under the term expectation bias we include phenomena related to the overestimation of a therapy’s efficacy, such as publication pressure and the “therapeutic misconception.” In medicine, new treatments are often perceived in a biased manner by both therapists and patients. Young researchers working in academic medicine encounter pressure to be at the “forefront of the application of new technologies.”Footnote 82 This could result in incautious overstatements in publications and in the premature and sometimes unethical adoption of a new technology.Footnote 83 However, it is not necessarily publication pressure that may lead to an overestimation of the therapeutic efficacy of VR. Rather, newly developed therapies are often associated with a disproportionately high level of hope. Patients in particular are eager to find answers to serious problems and may prematurely regard an experimental approach as effective or established, a tendency that is referred to as “therapeutic misconception.” Furthermore, Maria T. Schultheis and Albert S. Rizzo point out that it is often the capabilities rather than the limitations that are highlighted in the case of new technologies.Footnote 84 Honest and clear communication about the evidence of efficacy and the safety of clinical VR applications is therefore imperative.Footnote 85

Conclusions

The potential of VR in mental health is enormously encouraging, but at the same time VR raises ethical concerns, essentially around the representation of reality in VR and the autonomy of VR users. Although the therapeutic potential of VR has not yet been extensively explored, VR-based opportunities encourage hope, particularly for those to whom today’s medicine has little to offer. At the same time, the clinical application of VR for mental health entails the exposure of vulnerable persons. In contrast to pharmacological or even invasive techniques, VR appears comparatively harmless at first glance and likely free of serious side effects, akin to evaluations of the safety of psychotherapy, where the discussion of possible side effects began only very recently.Footnote 86 A closer look reveals that the inherent potential of VR to modify behavior in a desired way also holds the danger of bringing about undesired mental states that might influence the behavior of a person not only in virtuo but also in vivo. Despite these risks, the clinical use of VR offers unique therapeutic opportunities. In some cases, patients may learn to understand and manage the symptoms of their illnesses through VR-enabled training or therapy; in others, at least a short-term alleviation of suffering may be achievable. Used carefully, VR has the potential to be a very effective therapeutic tool. Since VR can potentially replicate reality, ethical questions concerning reality and its representation, for example with respect to embodiment, become significant. These questions are specific to VR and do not arise elsewhere in medical ethics or neuroethics, presenting novel challenges for clinical ethics. The pressure to develop ethical guidelines for clinical VR is increasing as more and more promising approaches are tested in clinical trials.
First steps have been taken, such as the idea of the “Equivalence Principle” by Ramirez and LaBargeFootnote 87 or the criteria for the development and use of VR technology in medicine proposed by Kellmeyer and colleagues, namely “therapeutic alternativism,” “human-oriented value alignment,” and “patient-centred design.”Footnote 88

In 1991, the Editors of the Lancet argued that the medical responsibilities arising from the clinical application of VR need to be carefully considered so that appropriate guidelines can be developed and implemented, because they feared that beneficial clinical applications might be delayed or impeded if we simply awaited the emergence of harmful effects. This concern is more acute than ever.

Footnotes

Funding acknowledgement: This study was funded as part of the project consortium THERENIA within the funding framework “Network of European Funding for Neuroscience Research” (NEURON) under the ERA-NET scheme of the European Commission, with funds from BMBF, Germany (01GP1822), and from the Canadian Institutes of Health Research (CIHR).

References

Notes

1. Lem, S. Summa Technologiae. Minneapolis: University of Minnesota Press; 2014.Google Scholar

2. Bailenson, J. Experience on Demand. What Virtual Reality Is, How it Works, and What it Can Do. New York, London: W & W Norton Company; 2018.Google Scholar

3. Lanier, J. Dawn of the New Everything. A Journey Through Virtual Reality. London, UK: The Bodley Head; 2017.Google Scholar

4. Matamala-Gomez, M, Donegan, T, Bottiroli, S, Sandrini, G, Sanchez-Vives, MV, Tassorelli, C. Immersive virtual reality and virtual embodiment for pain relief. Frontiers in Human Neuroscience 2019;13:279; available at https://www.frontiersin.org/article/10.3389/fnhum.2019.00279 (last accessed 06 Mar 2020).CrossRefGoogle Scholar

5. Kellmeyer, P. Neurophilosophical and ethical aspects of virtual reality therapy in neurology and psychiatry. Cambridge Quarterly of Healthcare Ethics 2018; 27(4):610–27.CrossRefGoogle ScholarPubMed

6. Bourdin, P, Barberia, I, Oliva, R, Slater, M. A virtual out-of-body experience reduces fear of death. PLOS ONE 2017;12(1):e0169343; available at https://doi.org/10.1371/journal.pone.0169343 (last accessed 06 Mar 2020).CrossRefGoogle ScholarPubMed

7. Vogeley, K, Bente, G. “Artificial humans”: Psychology and neuroscience perspectives on embodiment and nonverbal communication. Neural Networks 2010;23(8):1077–90.CrossRefGoogle Scholar

8. Roth, D, Bente, G, Kullmann, P, Mal, D, Purps, CF, Vogeley, K, et al. Technologies for social augmentations in user-embodied virtual reality. In: 25th ACM Symposium on Virtual Reality Software and Technology (VRST ’19), November 12–15, 2019, Parramatta, NSW, Australia. New York, NY: ACM; available at https://doi.org/10.1145/3359996.3364269 (last accessed 06 Mar 2020).Google Scholar

9. Roth, D, Latoschik, ME, Vogeley, K, Bente, G. Hybrid Avatar-Agent technology—A conceptual step towards mediated “social” virtual reality and its respective challenges. I-com 2015;14(2):107–14.CrossRefGoogle Scholar

10. Didehbani, N, Allen, T, Kandalaft, M, Krawczyk, D, Chapman, S. Virtual reality social cognition training for children with high functioning autism. Computers in Human Behavior 2016;62:703–11.CrossRefGoogle Scholar

11. Maples-Keller, JL, Yasinski, C, Manjin, N, Rothbaum, BO. Virtual reality-enhanced extinction of phobias and post-traumatic stress. Neurotherapeutics 2017;14(3):554–63.CrossRefGoogle ScholarPubMed

12. Rizzo, A, Shilling, R. Clinical virtual reality tools to advance the prevention, assessment, and treatment of PTSD. European Journal of Psychotraumatology 2017;8(5 Suppl):1414560.CrossRefGoogle Scholar

13. Park, MJ, Kim, DJ, Lee, U, Na, EJ, Jeon, HJ. A literature overview of virtual reality (VR) in treatment of psychiatric disorders: Recent advances and limitations. Frontiers in Psychiatry 2019;10(505):119; available at https://www.frontiersin.org/article/10.3389/fpsyt.2019.00505 (last accessed 06 Mar 2020).CrossRefGoogle ScholarPubMed

14. Freeman, D, Reeve, S, Robinson, A, Ehlers, A, Clark, D, Spanlang, B, et al. Virtual reality in the assessment, understanding, and treatment of mental health disorders. Psychological Medicine 2017;47(14):2393–400.CrossRefGoogle ScholarPubMed

15. Maples-Keller, JL, Bunnell, BE, Kim, SJ, Rothbaum, BO. The use of virtual reality technology in the treatment of anxiety and other psychiatric disorders. Harvard Review of Psychiatry 2017;25(3):102–13.CrossRefGoogle ScholarPubMed

16. Valmaggia, LR, Latif, L, Kempton, MJ, Rus-Calafell, M. Virtual reality in the psychological treatment for mental health problems: An systematic review of recent evidence. Psychiatry Research 2016;236:189–95.CrossRefGoogle ScholarPubMed

17. Freeman, D, Lister, R, Waite, F, Yu, LM, Slater, M, Dunn, G, et al. Automated psychological therapy using virtual reality (VR) for patients with persecutory delusions: Study protocol for a single-blind parallel-group randomised controlled trial (THRIVE). Trials 2019;20(87):18.Google Scholar

18. The Editors. Being and believing: Ethics of virtual reality. Lancet 1991;338(8762):283–4.CrossRefGoogle Scholar

19. See note 5, Kellmeyer 2018.

20. Rizzo, A, Koenig, ST. Is clinical virtual reality ready for primetime? Neuropsychology 2017;31(8):877–99.

21. Madary, M, Metzinger, TK. Real virtuality: A code of ethical conduct. Recommendations for good scientific practice and the consumers of VR-technology. Frontiers in Robotics and AI 2016;3(3):1–23; available at https://doi.org/10.3389/frobt.2016.00003 (last accessed 06 Mar 2020).

22. See note 21, Madary, Metzinger 2016.

23. Kellmeyer, P, Biller-Andorno, N, Meynen, G. Ethical tensions of virtual reality treatment in vulnerable patients. Nature Medicine 2019;25(8):1185–8.

24. See note 5, Kellmeyer 2018.

25. See note 21, Madary, Metzinger 2016.

26. See note 5, Kellmeyer 2018.

27. Rizzo, A, Schultheis, MT, Rothbaum, BO. Ethical issues for the use of virtual reality in the psychological sciences. In: Bush, SS, Drexler, ML, eds. Ethical Issues in Clinical Neuropsychology. Studies on Neuropsychology, Development, and Cognition. Lisse, The Netherlands: Swets & Zeitlinger; 2003, at 245–77.

28. Slater, M. Place illusion and plausibility can lead to realistic behaviour in immersive virtual environments. Philosophical Transactions of the Royal Society B 2009;364:3549–57.

29. See note 28, Slater 2009.

30. See note 28, Slater 2009.

31. See note 14, Freeman et al. 2017.

32. See note 21, Madary, Metzinger 2016.

33. See note 21, Madary, Metzinger 2016.

34. See note 18, The Editors 1991.

35. Kuntze, MF, Stoermer, R, Mueller-Spahn, F, Bullinger, AH. Ethical codes and values in a virtual world. CyberPsychology & Behavior 2002;5(3):203–6.

36. See note 27, Rizzo et al. 2003.

37. See note 5, Kellmeyer 2018.

38. See note 27, Rizzo et al. 2003.

39. Schultheis, MT, Rizzo, A. Emerging technologies in practice and research. In: Morgan, JE, Ricker, JH, eds. Textbook of Clinical Neuropsychology. Studies on Neuropsychology, Neurology and Cognition. New York, NY: Psychology Press; 2008, at 848–65.

40. Ramirez, EJ, LaBarge, S. Real moral problems in the use of virtual reality. Ethics and Information Technology 2018;20(2):249–63.

41. Pan, X, Hamilton, AFC. Why and how to use virtual reality to study human social interaction: The challenges of exploring a new research landscape. British Journal of Psychology 2018;109(3):395–417.

42. See note 40, Ramirez, LaBarge 2018.

43. Slater, M, Antley, A, Davison, A, Swapp, D, Guger, C, Barker, C, et al. A virtual reprise of the Stanley Milgram obedience experiments. PLoS One 2006;1(1):e39; available at https://doi.org/10.1371/journal.pone.0000039 (last accessed 06 Mar 2020).

44. Skulmowski, A, Bunge, A, Kaspar, K, Pipa, G. Forced-choice decision-making in modified trolley dilemma situations: A virtual reality and eye tracking study. Frontiers in Behavioral Neuroscience 2014;8:426; available at https://doi.org/10.3389/fnbeh.2014.00426 (last accessed 06 Mar 2020).

45. Lemley, MA, Volokh, E. Law, virtual reality, and augmented reality. University of Pennsylvania Law Review 2018;166:1051–138.

46. Dibbell, J. A rape in cyberspace. 1993. Republished in: Dibbell, J. My Tiny Life: Crime and Passion in a Virtual World. New York, NY: Henry Holt & Co; 1999.

47. Buck, S. The ‘rape in cyberspace’ from 25 years ago posed problems we still haven’t solved today: Free speech vs. virtual ‘action’ on the early web. Timeline 2017; available at https://timeline.com/rape-in-cyberspace-lambdamoo-da9cf0c74e9e (last accessed 23 Apr 2020).

48. See note 41, Pan, Hamilton 2018.

49. See note 21, Madary, Metzinger 2016.

50. See note 21, Madary, Metzinger 2016.

51. Yee, N, Bailenson, J. The Proteus effect: The effect of transformed self-representation on behavior. Human Communication Research 2007;33(3):271–90.

52. Spiegel, JS. The ethics of virtual reality technology: Social hazards and public policy recommendations. Science and Engineering Ethics 2018;24(5):1537–50.

53. Hershfield, HE, Goldstein, DG, Sharpe, WF, Fox, J, Yeykelis, L, Carstensen, LL, et al. Increasing saving behavior through age-progressed renderings of the future self. Journal of Marketing Research 2011;48:23–37.

54. Banakou, D, Hanumanthu, PD, Slater, M. Virtual embodiment of white people in a black virtual body leads to a sustained reduction in their implicit racial bias. Frontiers in Human Neuroscience 2016;10:601; available at https://doi.org/10.3389/fnhum.2016.00601 (last accessed 06 Mar 2020).

55. Beauchamp, TL, Childress, JF. Principles of Biomedical Ethics. New York, NY; Oxford: Oxford University Press; 2009.

56. Nozick, R. Anarchy, State and Utopia. New York, NY: Basic Books; 1974.

57. See note 18, The Editors 1991.

58. See note 18, The Editors 1991.

59. See note 21, Madary, Metzinger 2016.

60. See note 27, Rizzo et al. 2003.

61. See note 18, The Editors 1991.

62. See note 23, Kellmeyer et al. 2019.

63. See note 27, Rizzo et al. 2003.

64. Whalley, LJ. Ethical issues in the application of virtual reality to medicine. Computers in Biology and Medicine 1995;25(2):107–14.

65. See note 64, Whalley 1995.

66. Tabbaa, L, Ang, CS, Rose, V, Siriaraya, P, Stewart, I, Jenkins, KG, et al. Bring the outside in: Providing accessible experiences through VR for people with dementia in locked psychiatric hospitals. In: Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems (CHI 2019). New York, NY: ACM; available at https://doi.org/10.1145/3290605.3300466 (last accessed 06 Mar 2020).

67. See note 5, Kellmeyer 2018.

68. See note 5, Kellmeyer 2018.

69. See note 8, Roth et al. 2019.

70. See note 9, Roth et al. 2015.

71. See note 8, Roth et al. 2019.

72. See note 5, Kellmeyer 2018.

73. Foucault, M. Maladie Mentale et Psychologie [Mental Illness and Psychology]. Paris, France: Presses Universitaires de France; 1954.

74. See note 8, Roth et al. 2019.

75. See note 21, Madary, Metzinger 2016.

76. See note 52, Spiegel 2018.

77. See note 21, Madary, Metzinger 2016.

78. See note 8, Roth et al. 2019.

79. See note 41, Pan, Hamilton 2018.

80. See note 20, Rizzo, Koenig 2017.

81. See note 5, Kellmeyer 2018.

82. See note 64, Whalley 1995.

83. See note 39, Schultheis, Rizzo 2008.

84. See note 39, Schultheis, Rizzo 2008.

85. See note 21, Madary, Metzinger 2016.

86. Hardy, GE, Bishop-Edwards, L, Chambers, E, Connell, J, Dent-Brown, K, Kothari, G, et al. Risk factors for negative experiences during psychotherapy. Psychotherapy Research 2019;29(3):403–14.

87. See note 40, Ramirez, LaBarge 2018.

88. See note 23, Kellmeyer et al. 2019.