Sometimes there’s just too much going on for your brain to handle. Like: when you’re zooming down the highway, trying to find your exit, and your friend in the passenger seat is telling a story about this total dick at work. Are you actually listening, or just giving the appropriate number of nods and affirmative grunts? Probably the latter, because there’s too much else going on for you to care about how many times your friend’s lunch was stolen out of the company fridge (at a more focused moment, you probably would have said something appropriate, like: “reallyyyy tho who does that?”).
The same thing happens to soldiers — or, at least, it could. The U.S. Army Research Laboratory, in concert with several artificial intelligence and neuroscience experts, is hoping to use artificial intelligence to control the flow of information to soldiers or pilots in stressful combat-related scenarios. The goal? Let the AI handle all the data, leaving humans free to focus on critical and creative thinking.
“Humans simply cannot process the amount of information that is potentially available,” said Jonathan Touryan, a neuroscientist who conducts military research, in a press release. “Yet, humans remain unmatched in their ability to adapt in complex and dynamic situations, such as a battlefield environment. We need a greater capability to be able to estimate and predict human variability, behavior and intent across different contexts.”
Specific applications for this logistics-managing AI are still unclear — the researchers are focusing their attention on the basic science side. That is, they’re dedicating their energy towards understanding the science and engineering needed for a product that achieves this goal. The specific uses will come later.
But we do know that these tools will be necessary. For several years, the military has been working toward the Warfighter Information Network-Tactical, a real-time network of logistical information about each soldier in an area. For someone trying to do a high-stakes job, that’s a lot of information to be sorting through.
Recently, the researchers ran trials in which they measured a driver’s heart rate and brain activity as they drove down a busy highway (don’t worry, it was in a virtual reality simulation). The researchers wanted to know how well people would hold onto distracting information while dealing with a stressful scenario, like another car pulling quickly in front of theirs, or having to switch lanes on a busy road.
All the while, the researchers were projecting the participants’ brain activity onto a screen using a new tool developed by the Army called CLIVE, the Customizable Lighting Interface for the Visualization of EEGs. They wanted to see if the ability to learn new information was related to how well the brain waves aligned between the driver and passenger, a metric called synchrony. In some studies, higher levels of synchrony have been connected to better communication and understanding (literally being on the same wavelength as someone), though the science has been called into question.
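The article doesn’t say which synchrony measure the Army used; in the EEG literature it can mean phase-locking, coherence, or simple correlation. As a rough illustration of the idea, here is a minimal sketch that treats synchrony as the windowed Pearson correlation between two signals — the function names and window size are assumptions, not details from the ARL study.

```python
import math

def pearson(x, y):
    """Pearson correlation between two equal-length signals."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def windowed_synchrony(driver, passenger, window=128):
    """Crude synchrony trace: correlation over non-overlapping windows.

    Real EEG pipelines would band-filter first and often use phase-based
    measures instead; this is only the simplest possible proxy.
    """
    return [
        pearson(driver[i:i + window], passenger[i:i + window])
        for i in range(0, len(driver) - window + 1, window)
    ]
```

Two identical signals score 1.0 in every window; uncorrelated signals hover near zero, which is the intuition behind calling higher values “alignment.”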
Ultimately, the Army wants to learn when, and how, soldiers fall into sensory overload so that it can design artificial intelligence algorithms that deliver relevant information at the right times. This isn’t giving AI the keys to military operations; it’s more like strategically using computers to do what humans sometimes find themselves too overwhelmed to do.
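To make the idea concrete, here is a hypothetical sketch of such a gatekeeper: it delivers urgent messages immediately, defers low-priority ones while an estimated workload is high, and releases them once the workload eases. The class name, priority scheme, and workload threshold are all invented for illustration — nothing here comes from the Army’s actual design.

```python
import heapq

class InfoGate:
    """Toy message gate: hold low-priority info while the human is overloaded."""

    def __init__(self, overload_threshold=0.7):
        self.threshold = overload_threshold  # workload above this = overloaded
        self.deferred = []                   # min-heap of (priority, seq, message)
        self.seq = 0                         # tie-breaker preserving arrival order

    def submit(self, message, priority, workload):
        """Deliver now if urgent (priority 0) or workload is low; else defer."""
        if priority == 0 or workload < self.threshold:
            return [message] + self.flush(workload)
        heapq.heappush(self.deferred, (priority, self.seq, message))
        self.seq += 1
        return []

    def flush(self, workload):
        """Release deferred messages, most important first, once workload drops."""
        released = []
        while self.deferred and workload < self.threshold:
            released.append(heapq.heappop(self.deferred)[2])
        return released
```

In the driving scenario from earlier, the stolen-lunch story would sit in the deferred heap during the lane change and come back once the road calms down, while “your exit is next” would always get through.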