The performance of Autonomous Vehicles (AVs) is far from what engineers and scientists predicted a few years ago. If you want a good understanding of where we stand when it comes to AI algorithms in AVs, you can look at this blog post from Prof. Rodney Brooks. In spite of all the news about advances in self-driving cars, we are far away from a true AV, that is, level 5 autonomy. The most obvious sign of how far we are from level 5, or even level 4, autonomy is how badly these systems perform in big cities where traffic laws are not strictly followed. For example, look at this short clip from a Tesla in March 2021 on a street in Vietnam, and see how quickly and completely the AI algorithm gives up. But why can we humans drive in such crowded and chaotic environments while AI algorithms fail?
To better understand the huge gap that currently exists between AI algorithms and human drivers, look at this eye-tracking recording. What you see is the gaze behavior of a human who watched this clip in the lab (which is certainly not the same as the gaze behavior of the driver behind this dash-cam). Reading billboards and paying attention to what happens on the roadside, while still getting to the destination safely (at least most of the time!), is something those of us who drive any vehicle, or even ride a bicycle, do regularly. Now compare that to the performance of the AI algorithms in current autonomous vehicles. They have various sensors, cameras, and LIDARs; they scan the whole environment at short intervals; they are equipped with state-of-the-art computational power; and probably millions of hours of computation have been spent on training their algorithms. Yet none of them has managed to go beyond level 2 autonomy, which is to say that all current self-driving cars need constant human supervision.
So, how can we bridge this gap? How can we bring the AI algorithms used in autonomous vehicles to the next level and reach the goals that have eluded us so far?
The human brain has long been a source of inspiration for AI algorithms. But as neuroscience advances and we gain insight into different aspects of brain function, these influences are becoming more and more specific and useful. Case in point: the two sets of experiments we conducted at this institute. In one set of experiments, we studied the role of Locus Coeruleus Noradrenergic (LCNE) activity in response to uncertainties in the environment. Our work, along with that of other researchers in this and other institutes, shows that LCNE activity is instrumental in many important aspects of brain function, including how we selectively observe or respond to certain stimuli among all the stimuli our sensory inputs collect. The good news about LCNE activity is that pupil diameter can be used as a proxy for it, as I have explained in more detail here. In another set of experiments, we studied human gaze behavior in response to complex visual stimuli, recording both the participants' gaze, using an eye-tracking system, and their brain activity, using an EEG system.
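To give a concrete sense of what using pupil diameter as a proxy looks like in practice, here is a minimal sketch, not our actual analysis pipeline, of how a raw pupil-diameter trace could be turned into a normalized signal. The function name, parameters, and preprocessing choices (smoothing window, baseline correction, z-scoring) are illustrative assumptions.

```python
import numpy as np
from scipy.ndimage import uniform_filter1d

def pupil_arousal_proxy(pupil_mm, fs, baseline_s=1.0, smooth_s=0.25):
    """Turn a raw pupil-diameter trace into a simple normalized proxy signal.

    pupil_mm : 1-D array of pupil diameters (mm), already blink-interpolated.
    fs       : sampling rate of the eye tracker in Hz.
    """
    # Smooth out high-frequency tracker noise with a short moving average.
    smoothed = uniform_filter1d(pupil_mm, size=max(1, int(smooth_s * fs)))
    # Subtract a pre-trial baseline so slow drifts (lighting, fatigue) cancel out.
    baseline = smoothed[: int(baseline_s * fs)].mean()
    corrected = smoothed - baseline
    # Z-score so traces from different participants are comparable.
    return (corrected - corrected.mean()) / corrected.std()
```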
Having such a rich collection of measurements from human drivers, we can use the data to train AI algorithms, and that is exactly what has been conspicuously missing from current attempts to improve them. We can collect pupillometry data as a proxy for LCNE activity from a human driver, eye-tracking data to see which features in each scene draw human drivers' attention, and EEG data to study how cortical connections are modulated under different circumstances. We can then compare this with the behavior of the AI algorithms in autonomous vehicles to discover what has hampered their progress so far, and to teach them to be more selective and to use their sensory inputs and computational resources more efficiently. This can be done, and it would pave the way for a qualitative improvement in self-driving cars.
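As one simplified possibility, human gaze data could enter the training loop as an auxiliary supervision signal for a driving model's attention. The sketch below is an illustration under stated assumptions, not a description of any deployed system: it assumes a hypothetical model that outputs a spatial attention map alongside its control prediction, and adds a KL-divergence term that pulls that map toward the recorded human gaze heatmap.

```python
import torch
import torch.nn.functional as F

def gaze_supervised_loss(pred_control, target_control,
                         model_attn, human_gaze, gaze_weight=0.1):
    """Combine a driving loss with a gaze-supervision term.

    pred_control, target_control : (B, k) predicted vs. recorded controls
                                   (e.g. steering, throttle).
    model_attn, human_gaze       : (B, H, W) spatial maps; human_gaze is a
                                   fixation heatmap from the eye tracker.
    """
    # Standard imitation-learning term: match the human driver's actions.
    control_loss = F.mse_loss(pred_control, target_control)

    # Normalize both maps into spatial probability distributions.
    log_attn = F.log_softmax(model_attn.flatten(1), dim=1)
    gaze = human_gaze.flatten(1)
    gaze = gaze / gaze.sum(dim=1, keepdim=True).clamp_min(1e-8)

    # KL term pulls the model's attention toward where humans actually looked.
    attn_loss = F.kl_div(log_attn, gaze, reduction="batchmean")
    return control_loss + gaze_weight * attn_loss
```

The `gaze_weight` factor controls how strongly human attention shapes the model relative to simply imitating the driver's actions.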
As a first step, all these recordings can be done in a lab while the driver drives in a virtual environment, such as a car racing computer game. Below, you can see an example of such a setup in our lab.
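On the software side, the main requirement of such a setup is that the eye-tracker stream and the game telemetry share a common clock so the streams can be aligned offline. The snippet below is a rough sketch of that idea; the `tracker` and `game` interfaces are hypothetical placeholders for whatever eye-tracker SDK and simulator API are actually used.

```python
import csv
import time

def record_session(tracker, game, out_path, duration_s=60.0):
    """Log eye-tracker samples and game telemetry against a shared clock.

    `tracker` and `game` are hypothetical interfaces; each is assumed to
    expose a non-blocking read() returning its latest sample, or None.
    """
    t0 = time.monotonic()  # one reference clock for both streams
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["t", "stream", "payload"])
        while time.monotonic() - t0 < duration_s:
            t = time.monotonic() - t0
            gaze = tracker.read()   # e.g. (x, y, pupil_diameter)
            telem = game.read()     # e.g. (steering, throttle, speed)
            if gaze is not None:
                writer.writerow([t, "gaze", gaze])
            if telem is not None:
                writer.writerow([t, "telemetry", telem])
```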
Last updated on December 7, 2021.