How can people with disabilities operate an autonomously driving car? Cornelia Engel whisks us away into a not-so-distant future where we can give commands through mere thoughts, thanks to a brain-computer interface.
A dark room. A screen. And Cornelia, whose head is adorned with a futuristic headset. Her eyes are all but boring into the display. A dot speeds across the screen. Suddenly, music starts to play out of nowhere. What's going on? Is this a trick? No! Cornelia started the music using nothing but the power of her thoughts. How did she do it? To answer that question, we need to go back in time. It all started with Cornelia's cousin Markus Burkhart, who was diagnosed with multiple sclerosis several years ago and is now paralyzed from the neck down. Despite this limitation, he runs his own car repair shop and, with the help of eye tracking and special computer programs, even handles the administrative work himself. That may be somewhat tedious and time-consuming, but it works.
Brain Computer Interface in the car: With the Audi Aicon, a vision becomes reality
“I find it fascinating to observe him,” Cornelia explains. “Markus would give anything to be able to drive a car again and to be independent. I wondered whether it would be possible for him to operate a car with an integrated eye-tracking system combined with a brain-computer interface.” And so she found the topic for her thesis at Audi in the Design Interior Interface area.
During her studies at the Mediadesign University of Applied Sciences in Munich, Cornelia Engel got to know the brain-computer interface (BCI) made by EMOTIV. A BCI enables people with cognitive and motor impairments to communicate with their environment via mental commands. The mobile EEG device registers the electrical activity of nerve cells. A computer then translates these signals into commands and passes them on to a device, for example a computer program, a wheelchair, a light switch or even a car.
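Conceptually, such a pipeline has three stages: acquire a window of EEG samples, classify it as one of the user's trained mental commands, and hand the recognized command to the device being controlled. The Python sketch below illustrates that flow only; the class and function names are invented for this article and do not represent EMOTIV's actual SDK or Audi's software.

```python
# Hypothetical sketch of a BCI command pipeline.
# All names (EEGSource, classify_command, CommandDispatcher) are invented for illustration.

import random
from typing import Callable, Dict, List


class EEGSource:
    """Stands in for a mobile EEG headset; yields one window of samples per channel."""

    def __init__(self, channels: int = 14, window: int = 128) -> None:
        self.channels = channels
        self.window = window

    def read_window(self) -> List[List[float]]:
        # Real hardware would stream measured voltages; here we simulate noise.
        return [[random.gauss(0.0, 1.0) for _ in range(self.window)]
                for _ in range(self.channels)]


def classify_command(window: List[List[float]]) -> str:
    """Toy classifier: maps a signal window to a trained mental command.

    A real system would use a model trained on the user's own EEG patterns;
    here a simple threshold on mean signal energy acts as a placeholder.
    """
    energy = sum(sample * sample for channel in window for sample in channel)
    energy /= sum(len(channel) for channel in window)
    return "play_music" if energy > 1.0 else "neutral"


class CommandDispatcher:
    """Routes recognized mental commands to device actions (music, lights, car UI)."""

    def __init__(self) -> None:
        self._actions: Dict[str, Callable[[], None]] = {}

    def register(self, command: str, action: Callable[[], None]) -> None:
        self._actions[command] = action

    def dispatch(self, command: str) -> None:
        action = self._actions.get(command)
        if action is not None:
            action()


if __name__ == "__main__":
    source = EEGSource()
    dispatcher = CommandDispatcher()
    dispatcher.register("play_music", lambda: print("Starting music playback..."))

    # One pass of the loop: read brain signals, translate them, act on the result.
    command = classify_command(source.read_window())
    dispatcher.dispatch(command)
```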
With this approach of reading brain activity to translate thoughts into actions, Cornelia has not only improved usability but also made autonomous driving accessible to more people. Physically impaired people like her cousin stand to benefit from this in particular.
This is the state of the art so far: in the autonomous concept car Audi Aicon, which was presented at the IAA 2017, and its successor, the Audi AI:ME, the passenger can operate the graphical interface with an eye-tracking system in addition to touch and voice control. Several infrared sensors detect which display area the passenger is looking at; the function shown there is then displayed at a larger scale. To activate it, the passenger taps on the touch-sensitive wooden screen. Markus, however, can no longer tap. This is where the brain-computer interface offers a solution, as sketched below.
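The interaction flow can be pictured as two steps: gaze selects and magnifies a function, and a confirmation signal, either a tap on the screen or a mental "select" command from the BCI, activates it. The sketch below is purely illustrative; the names are assumptions and do not reflect Audi's actual implementation.

```python
# Illustrative sketch of the gaze-plus-confirm interaction flow described above.
# The types and names are invented; they do not reflect Audi's actual software.

from dataclasses import dataclass
from enum import Enum, auto


class Confirmation(Enum):
    TOUCH_TAP = auto()    # tap on the touch-sensitive surface
    BCI_SELECT = auto()   # mental "select" command from the brain-computer interface
    NONE = auto()


@dataclass
class UIFunction:
    name: str
    magnified: bool = False


def on_gaze(target: UIFunction) -> UIFunction:
    """Eye tracking: the function the passenger looks at is shown larger."""
    target.magnified = True
    return target


def on_confirm(target: UIFunction, signal: Confirmation) -> bool:
    """Activation: either a tap or a BCI command confirms the magnified function."""
    if not target.magnified or signal is Confirmation.NONE:
        return False
    print(f"Activating {target.name} via {signal.name}")
    return True


if __name__ == "__main__":
    media = on_gaze(UIFunction("media_player"))
    # A passenger who cannot tap confirms with the brain-computer interface instead.
    on_confirm(media, Confirmation.BCI_SELECT)
```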