The military is using virtual reality, but is it vulnerable to hacking?
Virtual reality and mixed reality goggles are starting to move beyond the realm of sci-fi and gaming into real-world applications, including use by the military. These technologies could be game-changers in certain modern-battlefield scenarios. However, there is growing concern that they could be hacked by terrorists or foreign adversaries, calling their overall usefulness into question.
The DOD is looking to stay ahead of the curve, not just in MR adoption but also in countering the known and unknown vulnerabilities the novel technology will inevitably introduce in tactical settings. To this end, DARPA has rolled out a new program to protect against mind hacking.
The recently announced Intrinsic Cognitive Security program, headed by Dr. Matthew Wilding, will develop “computational science to build tactical mixed reality systems that protect against cognitive attack.”
While there may ultimately be myriad MR exploits, Wilding indicated that accidents during the development of MR technology have already highlighted critical vulnerabilities adversaries could use to their advantage.
“Cognitive effects that have been demonstrated in virtual settings include manipulating emotion, inducing cybersickness, causing confusion or anxiety, and reducing trust in equipment,” wrote Wilding.
Cybersecurity and digital forensic scientist Dr. Ibrahim Baggili, founder of the Baggil(i) Truth Lab at Louisiana State University, told Return that MR is indeed ripe for exploitation.
“Our research at Louisiana State University has shown that we are indeed able to deceive people in mixed reality environments. For instance, some of our past work has shown what we termed ‘the human joystick attack,’ where we can move people from one point in a room to another without their knowledge or consent by manipulating the center of the virtual room,” wrote Baggili.
The “human joystick attack” and other exploits could lead to embarrassment or injury inside a user’s home. In wartime, such an attack could mean lost lives or a lost battle.
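The mechanics behind this class of attack can be illustrated abstractly. The sketch below is a hypothetical simulation, not the LSU team's code: an attacker shifts the virtual room's origin by a tiny amount each frame, and a user who walks to keep their virtual position steady ends up physically displaced without noticing.

```python
# Hypothetical sketch of the "human joystick" idea: by translating the
# virtual room's origin a fraction of a millimeter per frame, an attacker
# causes a user who holds their *virtual* position fixed to drift
# *physically* toward a location of the attacker's choosing.

def simulate_drift(frames: int, offset_per_frame: float) -> float:
    """Return total physical displacement (meters) after `frames` frames,
    assuming the user fully compensates for each virtual-origin shift."""
    physical_position = 0.0
    virtual_origin = 0.0
    for _ in range(frames):
        virtual_origin += offset_per_frame  # attacker nudges the virtual map
        # the user steps so that (physical_position - virtual_origin)
        # stays at zero, i.e. they appear stationary in the virtual room
        physical_position = virtual_origin
    return physical_position

# At a 90 Hz refresh rate, a 0.5 mm/frame shift moves the user about
# 2.7 meters in one minute of use.
meters = simulate_drift(frames=90 * 60, offset_per_frame=0.0005)
print(f"{meters:.2f} m")  # 2.70 m
```

The per-frame offset here is small enough to be plausible as imperceptible, which is the crux of the attack: each individual correction is below what the user consciously registers, while the cumulative displacement is large.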
Although the use of mixed reality by the U.S. military is presently limited, the Pentagon appears keen on adopting various MR applications to increase the efficacy of warfighters. Wilding noted that possible applications include:
- Target choice communication.
- Soldier squad coordination.
- Weapon targeting.
- Remote medic assistance.
- Navigation aid.
- Locating snipers.
- Ally/enemy identification.
- Carrier flight deck coordination.
With these possible uses in mind, the hacks mentioned above appear all the more threatening. If an MR targeting system is compromised, a soldier in a firefight could read allies as enemies and vice versa. Similarly, a team whose MR equipment is hacked could be led astray by a falsified waypoint, or worse, into an ambush.
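One baseline defense against falsified overlay data is to authenticate it so that tampering is detectable before it is rendered. The sketch below is a generic illustration using Python's standard-library `hmac` module, not a DARPA or DOD design; the key handling and waypoint fields are assumptions for demonstration only.

```python
import hashlib
import hmac
import json

# Hypothetical pre-shared key; a fielded system would use hardware-backed
# key storage and rotation, not a constant.
SHARED_KEY = b"example-pre-shared-key"

def sign_waypoint(waypoint: dict) -> dict:
    """Attach an HMAC-SHA256 tag so the headset can verify the waypoint's origin."""
    payload = json.dumps(waypoint, sort_keys=True).encode()
    tag = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return {"waypoint": waypoint, "tag": tag}

def verify_waypoint(message: dict) -> bool:
    """Reject any waypoint whose tag does not match, e.g. one altered in transit."""
    payload = json.dumps(message["waypoint"], sort_keys=True).encode()
    expected = hmac.new(SHARED_KEY, payload, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["tag"])

msg = sign_waypoint({"lat": 31.18, "lon": -92.07, "label": "rally point"})
assert verify_waypoint(msg)            # untampered waypoint: accepted
msg["waypoint"]["lat"] = 30.00         # attacker moves the waypoint
assert not verify_waypoint(msg)        # tampering detected, overlay rejected
```

Authentication alone does not solve the problem ICS targets; it stops spoofed data, but not attacks that abuse legitimate rendering to manipulate perception.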
“We typically create technology and think of the security of the technology as an after fact,” said Baggili. “This is especially true for mixed reality environments.”
Wilding noted in an Oct. 20 presentation that there are five categories of guarantees and models the 36-month ICS program will focus on addressing:
- Physiology — “things about people’s … physical being and how they’re affected by using these systems”;
- Perception — “the ability to see things accurately and perceive things appropriately and not be confused by being in a mixed reality system”;
- Attention — “where you’re not distracted, you’re able to stay on task even though you’re using a mixed reality system”;
- Confidence — where you “continue to have confidence in the system [and] you don’t start ignoring information or turning the equipment off and saying, ‘this stuff doesn’t work’”; and
- Status — “things about the person that shouldn’t be shared.”
The program will focus on “all types of mixed reality technologies … including vision, audio, and haptic.”
Baggili said the “DARPA program is extremely timely” as it pertains to getting ahead of cognitive attacks in mixed reality environments.
“Realizing that systems can always be vulnerable and that humans use them on a daily basis, it becomes important for us to explore methods of protecting humans against deception and cognitive attacks, especially as mixed reality enters critical environments like the military,” added Baggili.
While the ICS program may turn out “cyber resilient” helicopters, secure clouds for MR battlefield equipment, and optical displays optimized to cut down on nausea, its announcement also serves to highlight potential vulnerabilities for civilian MR users.
The augmented reality mobile game Pokemon Go was released in 2016 to great fanfare. By summer 2018, it had over 147 million active users. The next year, it had been downloaded well over 1 billion times. The game had children and childlike adults navigating the streets of their respective cities in search of on-screen creatures. In the process of chasing after the fictional characters, multiple players ended up dying, mainly in traffic accidents.
While poor situational awareness and overzealous fandom appear to have been the causes of the gamers’ deaths, similar games could be targets for bad actors seeking to put people in harm’s way, where real-world threats are concealed on screen or navigation puts users at risk.
Entertainment is not, however, the only field where MR environments could soon become saboteurs’ playgrounds. A hacker could, for instance, mislead a surgeon relying on a heads-up display about where to make an incision.
The mixed reality market is growing rapidly and is projected to triple in size by 2026. Professionals in various sectors understand the benefits of real-time visual overlays, which could mean greater effectiveness in their respective fields. AR-goggled physicians could save more lives. Retailers could sell more baubles. Educators could compensate for gaps in their knowledge without cracking a textbook. But for these technologies to progress beyond gaming and entertainment, real research will have to go into making them more secure.