Ethics of XR in the Virtual Age
“Even outside of more dramatic edge-cases, if someone built an AR driving aid similar to Google Maps but it wasn’t properly secured, it leaves the door open for data manipulation — a popup on your desktop is annoying, but a popup in your AR glasses, or across a portion of your windshield, could be deadly.”
- Chris Boyd
With augmented and virtual reality technologies becoming more widespread and available, we find ourselves in a new age of technology where connections and activities in virtual space are increasingly common. While these changes are exciting and promise great improvements in the day-to-day life of the average person, the long-term effects are still unknown, and the potential negative impacts are discussed far less often. As such, I’d like to take a brief moment to discuss issues that developers and society at large should keep in mind as the technology continues to develop.
When it comes to mental health, I believe the warning Google has added to the disclaimer of their Daydream View system puts it best¹:
Some of the content you experience in this virtual world is so realistic that your body and mind may react to it as if it were real. If the content is frightening, violent, or anxiety provoking, it can cause your body to react physically, including increasing your heart rate and blood pressure. It can also, in some individuals, cause psychological reactions, including anxiety, fear, or even Post Traumatic Stress Disorder (or PTSD).
Realism in VR is constantly improving. While this promises many mental-health benefits, such as the treatment of phobias and anxiety and aid in treating some other mental illnesses, the high level of violence and other adult content in video games, combined with VR’s realism, creates the opportunity for uniquely terrifying experiences. These risks already exist in more traditional mediums, but XR has the potential to amplify and combine them.
In addition, with increasing realism in VR, it is possible that people, especially children, may have difficulty distinguishing between VR and reality. A 2019 study conducted by J.O. Bailey at the University of Texas² indicated that VR may elicit different cognitive responses in children than less immersive mediums do, suggesting that children were more likely to view the VR experience as “real”. The character shown in the VR experience was also much more influential on the children, who were more likely to imitate what the character did.
While many creators of VR gear already mitigate this issue with age requirements, parents who allow their children to use this technology, even children above the suggested age, need to be far more careful about what XR media their children are exposed to than they would be with traditional media.
Any time a major new technology is released, privacy tends to be on everyone’s mind. Personal information that users submit to a new device during registration or use is at risk if it is improperly protected. VR in particular carries the risk that data such as recordings, geolocation, and other private information will be captured and used for marketing or other purposes without the user’s consent. In addition, vendors on these platforms receive payments and other personal information when their products are used, and at this time it is difficult to know whether their security is at an acceptable level. Developers can mitigate some of these issues by vetting vendors, and the average person can make sure to provide their information only to vendors they trust, but voluntary standards and best practices need to be coordinated before significant trust can be placed in the technology.
Developers aren’t the only ones who need to focus on maintaining user privacy, either. As Gizmodo⁴ noted in 2016, Oculus, the hardware’s developer, reserves the right to collect multiple forms of data from you while you use the hardware. This data, according to their terms of service, includes:
“Information about your interactions with our services, like information about the games, content, apps, or other experiences you interact with, and information collected in or through cookies, local storage, pixels, and similar technologies”
“Information about the games, content, or other apps installed on your device or provided through our Services, including from third parties”
“Location information, which can be derived from information such as your device’s IP address. If you’re using a mobile device, we may collect data about the device’s precise location, which is derived from sources such as the device’s GPS signal and information about nearby WiFi networks and cell towers”
“Information about your physical movements and dimensions while you use a virtual reality headset”
The information they collect can be used to market other products to you. With mind-control interfaces a potential future of XR, everything from what we see to our emotions may become a new target of data mining at the expense of our privacy. Considering that Oculus is at the forefront of VR development, these are exactly the sorts of practices we don’t want to see becoming the norm.
Widespread usage of AR also poses new privacy concerns, because the technology can continuously collect and process personal data, and it raises questions about the expectation of privacy in public areas. Even if governments do not create regulations about AR usage, at a minimum they should aid the private sector in creating a code of conduct for both developers and law enforcement, to protect Fourth Amendment rights and their equivalents in other countries and to ease concerns over surveillance and the expectation of privacy. In addition, as XR technology becomes more widespread and closer to reality, it will become increasingly unclear which laws extend from the real world to the virtual world, and to what extent. With how fast the technology is growing, policymakers may be unable to keep up.
“Any tech can be used for good or bad; it’s about encouraging good behavior”
- Thad Starner
When discussing the social consequences of VR, online disinhibition is a frequently raised concern. Online disinhibition refers to the phenomenon in which being able to hide behind a screen lowers your inhibitions, causing you to act differently in online interactions than you would in real life. When using VR, a person could theoretically live a different life in a different world, where they don’t have to be the person they are in reality. Social media can be seen as a precursor to the disinhibition VR can provide, with people able to hide behind their screens and act differently from how they would in a face-to-face interaction.
Long-term exposure to VR also poses the risk of people prioritizing the virtual world over the real one. This has already been seen with other games, most famously World of Warcraft, with examples such as the 13-year-old who died by suicide after a 36-hour WoW binge in 2004, or the 3-year-old who died of malnutrition and dehydration in 2009 while her mother spent 15 hours in-game on the day of her death. With VR having the potential to simulate nearly any conceivable real-life experience, and many that aren’t possible in real life, its increased immersion raises the risk of similar or worse outcomes if precautions aren’t taken. And if VR gives people tools to cope with issues such as social isolation, they may lean further into the virtual world, which only isolates them further in the real one. This issue can be summed up as a simple philosophical question: when virtual reality becomes comparable to, or even indistinguishable from, reality, are real-life experiences by their nature still more valuable than virtual ones?
VR isn’t the only cross-reality technology with the potential for negative societal impacts, either. According to Thad Starner, a professor at the Georgia Institute of Technology, it’s feasible that AR could be used to coordinate against other people, such as during a burglary.
In conclusion, XR technologies are an exciting new medium with possibilities in everything from law enforcement to education to gaming. But the technology is still developing, it faces many challenges going forward, and we simply don’t know yet how it will impact us, whether as individuals or as a society. Society at large and the developers who work on the technology have a duty to stay informed about the potential negative impacts of XR so that informed decisions can be made and those impacts can be mitigated or removed entirely. Hardware developers, meanwhile, should take it upon themselves to limit the amount of information they gather from users, or at a bare minimum ensure users fully understand what they are agreeing to when using their products.
¹ “Daydream View Health and Safety Information — Daydream Help.” Google, support.google.com/daydream/answer/7185037?visit_id=1-636162973848457164-2214380425&p=safetywarrantyreq&rd=1.
² Bailey, Jakki O., et al. “Virtual Reality’s Effect on Children’s Inhibitory Control, Social Compliance, and Sharing.” Journal of Applied Developmental Psychology, JAI, 9 July 2019, www.sciencedirect.com/science/article/abs/pii/S0193397318300315.
³ Slater, Mel, et al. “The Ethics of Realism in Virtual and Augmented Reality.” Frontiers, 11 Feb. 2020, www.frontiersin.org/articles/10.3389/frvir.2020.00001/full.
⁴ Liptak, Andrew. “There Are Some Super Shady Things in Oculus Rift’s Terms of Service (Updated).” Gizmodo, 5 Apr. 2016, gizmodo.com/there-are-some-super-shady-things-in-oculus-rifts-terms-1768678169.