Aug 2, 2016
Numerous tasks today are handled by machines. Ideally, machines should also be able to assist humans when human performance deteriorates. To intervene appropriately, however, the machine has to understand how the human is doing.
Scientists at Fraunhofer have developed a diagnostic tool that identifies user states in real time and communicates them to the machine. In one example, a camera focuses on the driver's eyes; if they close for more than one second, an alarm is triggered. This helps prevent dangerous microsleep at the wheel.
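The eye-closure rule described above can be sketched in a few lines. This is an assumed reconstruction of the logic, not Fraunhofer's actual implementation; only the one-second threshold comes from the article.

```python
# Hedged sketch: trigger an alarm when the eyes stay closed for more
# than one second (threshold from the article; all other details assumed).

EYE_CLOSURE_THRESHOLD_S = 1.0

def check_microsleep(samples):
    """samples: list of (timestamp_s, eyes_open) tuples from an eye tracker.

    Returns True if the eyes remain closed longer than the threshold.
    """
    closed_since = None
    for t, eyes_open in samples:
        if eyes_open:
            closed_since = None          # eyes reopened: reset the timer
        elif closed_since is None:
            closed_since = t             # eyes just closed: start the timer
        elif t - closed_since > EYE_CLOSURE_THRESHOLD_S:
            return True                  # closed too long: possible microsleep
    return False
```

A real system would run this on a continuous video stream; the list-based form here just makes the rule easy to inspect.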
It is not always as easy for a machine to detect what state the human is in as it is in this case.
Jessica Schwarz, Fraunhofer Institute for Communication, Information Processing and Ergonomics FKIE
Holistic Model Feeds Real-Time Diagnosis
In her doctoral thesis, the graduate psychologist explored ways to precisely determine user states, how these states can contribute to errors, and how automated systems could make use of this data.
"For complex applications it is not sufficient to focus on only one impact factor," says the scientist.
An increased heart rate, for example, does not necessarily mean a person is stressed; it can have several causes. Schwarz therefore studied which factors particularly affect human performance and developed a holistic model that offers a comprehensive view of user states and their causes.
In her model, she differentiates between six dimensions of user state that impact the performance of humans: motivation, workload, attention, fatigue, situation awareness, and the emotional state.
She uses behavioral and physiological measures to detect these states and integrates them with contextual factors, such as environmental conditions, the current level of automation, the task, and the time of day, as well as individual factors such as the user's experience.
"This allows us to assess the user's state in more detail and also identify causes for critical states," Schwarz explains.
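The structure of the model can be illustrated as a simple data sketch: six user-state dimensions plus contextual and individual factors, with a check for dimensions that turn critical. The field names follow the article; the scoring scale, threshold, and function are assumptions for illustration, not part of the published model.

```python
# Illustrative data model for the six user-state dimensions and context
# factors named in the article. Scores, threshold, and helper function
# are assumptions for this sketch.
from dataclasses import dataclass

@dataclass
class UserState:
    # Each dimension scored 0.0 (critical) to 1.0 (good) -- assumed scale.
    motivation: float
    workload: float
    attention: float
    fatigue: float
    situation_awareness: float
    emotional_state: float

@dataclass
class Context:
    noise_level: float        # environmental factor
    automation_level: int     # current level of automation
    time_of_day: int          # hour, 0-23
    user_experience: float    # individual factor

def critical_dimensions(state: UserState, threshold: float = 0.3):
    """Return the names of dimensions below the (assumed) critical threshold."""
    return [name for name, value in vars(state).items() if value < threshold]
```

Combining the state record with a `Context` record is what lets the diagnosis distinguish, say, fatigue at the end of a shift from a loss of attention caused by noise.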
Experiments with Air Traffic Controller Simulation
The doctoral student's theoretical findings were confirmed in experiments. Test subjects took on the role of an air traffic controller and had to guide simulated aircraft safely across a virtual airspace. To introduce stress factors, the number of aircraft was increased, the simulated pilots ignored the "controllers'" instructions, and background noise was added in some situations.
Previously, Schwarz had gathered data on individual factors such as capabilities, level of experience, and well-being. EEG sensors on the head, an ECG chest strap, and an eye tracker recorded physiological variations in the test subjects.
We had previously conducted intensive interviews with real air traffic controllers to enable us to reproduce their challenges with man-machine interfaces as accurately as possible.
Jessica Schwarz, Fraunhofer Institute for Communication, Information Processing and Ergonomics FKIE
A diagnosis interface was subsequently developed that identifies in real time when individual impact factors become critical and transmits this information to the machine.
"Automated systems thus receive very precise information about the current capabilities of the user and can react accordingly," says Schwarz, describing the added value of the software.
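One way such a diagnosis interface could let the machine react is a simple mapping from critical factors to adaptation actions. The factor names and responses below are purely illustrative; the article does not specify which adaptations the system performs.

```python
# Hypothetical sketch: map critical user-state factors reported by the
# diagnosis to machine-side adaptations. All entries are illustrative.

ADAPTATIONS = {
    "fatigue": "sound alarm",
    "workload": "reduce number of parallel tasks",
    "attention": "highlight relevant display elements",
}

def react(critical_factors):
    """Return one adaptation per critical factor; unknown factors are logged."""
    return [ADAPTATIONS.get(f, "log for operator review") for f in critical_factors]
```

The point of the design is that the machine no longer needs to guess the cause: because the diagnosis names the critical dimension, the response can be targeted rather than generic.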
Close to Application
The FKIE aims to complete the research project by the end of this year. The scientists are currently looking for industry partners.
"The technology is very close to application. The know-how to develop specific products for individual use cases is already available," says Schwarz.
Potential fields of application include all highly automated tasks in which critical user states could become a safety issue. For instance, repetitive monitoring tasks in control rooms or training systems for pilots could be enhanced by this technology.
"Machines play an increasingly important role, but are also becoming more and more complex. This poses new challenges for the cooperation between human and machine. Adaptive systems that recognize different situations and adapt to them can solve known automation problems. A key aspect of this, however, is that not only the user understands the machine, but that the machine also understands the state of the human. We have now taken the first step towards this goal," Schwarz sums up.