Virtual and augmented reality (VR and AR) will probably be a normal part of life in the future, which means they'll almost certainly become targets for people looking to exploit users. On one level, they don't present much additional risk: we've been entering our credit card details and carrying around Internet-connected cameras for quite a while, and VR/AR tech is really just a new type of interface for the same activities. Depending on how the technology develops, though, there could be some very real risks to security and privacy, even crossing over into the physical world.
VR/AR Security Risks
Luckily, the "fine tracking" data you generate in a virtual world (lots of head, hand, body, and eye movement) isn't very interesting to ordinary criminals. They prefer card details, and users send those over the 'Net all the time already. Vulnerable VR systems do raise some new security risks, however.
Digital clones
As VR matures and criminals get more sophisticated, though, access to your voice, behavior, and movement data could let attackers create digital replicas of you and use them to impersonate you. If we're using VR for work, socializing, or shopping, having an evil twin running around isn't ideal, and it could even be leveraged for ransomware-style extortion.
Digital blackmail
A more immediate issue is what might happen if someone's more sensitive content is leaked. Humans being humans, there's already a huge market for adult VR entertainment, and given that VR opens up pretty much anything as an option, there's likely to be some very weird stuff happening under the headsets. Some of these recordings would almost certainly be juicy blackmail material.
The human joystick
VR hacks can also reach into the physical world, though: researchers have already developed and tested software that subtly edits virtual environments so that users can be steered into physically moving in a chosen direction, a technique known as the "human joystick" attack. Since you're effectively blind with a VR headset on, this could end with you falling down some stairs or stepping into harm's way.
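The core trick behind this kind of attack is closely related to "redirected walking": the rendered view is rotated by a tiny amount each step, and the user unconsciously corrects for the drift, turning their body without noticing. Here is a minimal toy sketch of that idea; the function name and the simple 2D heading model are illustrative, not taken from any published attack code.

```python
def redirect_step(physical_heading_deg, gain_deg_per_step=1.0):
    """Inject a tiny rotation offset into the rendered view each step.

    The user corrects for the drift without noticing it, so their
    physical walking direction slowly rotates toward a target the
    attacker chose. (Toy 2D model; a real attack would have to patch
    the VR runtime's camera transform.)
    """
    return physical_heading_deg + gain_deg_per_step

# After 90 steps at 1 degree per step, the user has been turned a
# full 90 degrees while believing they walked in a straight line.
heading = 0.0
for _ in range(90):
    heading = redirect_step(heading)
```

The reason this works is that per-step offsets below a couple of degrees sit under the human vestibular detection threshold, so the manipulation is effectively invisible to the wearer.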
When the virtual world directly affects the physical
If we end up relying on virtual or augmented reality to relay important information in our daily lives, the systems will need to be very well-secured. Doctors, for example, are likely to use AR to assist in viewing medical data and carrying out procedures. If hackers were able to change that feed, they could potentially end up killing the patient.
Even daily tasks, like shopping in a virtual supermarket or reading AR info on highway road signs, could be altered in potentially life-threatening ways. A DDoS attack aimed at these systems could take them down entirely and create a crisis for people and places that are heavily AR-dependent.
Privacy risks
Security risks aren’t such a big issue yet since there’s simply not enough AR/VR in use for breaches to be very dangerous. Privacy, though, is a debate we’re already having, and the fine tracking and environmental data we generate in VR/AR worlds are going places you might not approve of.
Eye movements
Sites and advertisers are already obsessed with figuring out user behavior, and being able to tell exactly what you're looking at and for how long is orders of magnitude more powerful than many current user metrics. Eye-tracking data is already being used to target ads and provide analytics, and it could potentially be used to run behind-the-scenes tests and build psychological profiles of users.
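To make "exactly what you're looking at and for how long" concrete, here is a hedged sketch of the kind of dwell-time aggregation an analytics pipeline might run. The per-frame stream of item labels and the frame timing are hypothetical; real eye trackers report gaze coordinates that would first be mapped onto on-screen regions.

```python
from collections import defaultdict

def dwell_times_ms(gaze_samples, frame_ms=16):
    """Sum how long the user's gaze rested on each on-screen item.

    `gaze_samples` is a hypothetical per-frame stream of item labels
    derived from the headset's eye tracker (one label per ~16 ms
    frame at 60 Hz). The totals are exactly the kind of attention
    metric an advertiser would want to harvest.
    """
    totals = defaultdict(int)
    for item in gaze_samples:
        totals[item] += frame_ms
    return dict(totals)

# 100 frames of simulated gaze: the banner drew ~1.4 s of attention.
samples = ["ad_banner"] * 50 + ["product_shelf"] * 10 + ["ad_banner"] * 40
banner_attention = dwell_times_ms(samples)
```

A few lines like this are all it takes to turn raw gaze data into a ranked list of what holds your attention, which is why the privacy stakes for eye tracking are so high.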
Body movements and other data
Your eye movements may be the most valuable, but tracking the rest of your body is also a potential gold mine for advertisers and even governments. By tracking your movements and other biophysical markers, they might be able to tell everything from how physically fit you are to your mood on that particular day. (Yes, VR emotion tracking is a thing.)
Your environment
VR gear can also collect information about your physical environment using both your movement data and, in some cases, cameras and other sensors. This could be a large security risk, but it could end up being an even bigger privacy risk. It’s unlikely that this would be legal for any company to collect and use for advertising, but hackers and/or governments could potentially use this against targets to gather intel on them.
Your virtual environment
An attacker might compromise your security by putting you in a fake virtual environment that resembles the one you're used to (VR phishing!), but your privacy is at risk too. How you shape and interact with your virtual world could be a great source of information about your behavior, and even your conversations with others could theoretically be run through natural language processing software and used as more data.
So, never get a VR headset, right?
VR and AR are awesome technologies and will probably make the world a better and more interesting place on the ‘Net. That said, anyone who wants to get one should know the risks and make informed choices about the security practices and privacy policies of the tech they buy and the worlds they enter.
As with any technology, the good comes with some bad, and in this case, the bad is the fact that you’re going to be generating a lot of very personal data that, thanks to AI, can be effectively processed so somebody can learn more about you. Our cybersecurity track record so far has been a lot better than our progress on online privacy, and that trend seems likely to continue with AR/VR. Hopefully everything gets worked out before we start building brain interfaces into the headsets.