European car safety researchers have developed a camera-based system that watches your facial expressions while you drive, then uses emotion detection algorithms to work out when you're suffering from road rage. The idea is that when you're irritated or angry, you become a more aggressive and less attentive driver, leading to more accidents. The same technology can also be used to measure fatigue, by tracking the percentage of time your eyelids are closed, and then warning you to take a break before you fall asleep at the wheel, ExtremeTech reported.

The system, developed by EPFL's Signal Processing 5 Laboratory (LTS5) in association with PSA Peugeot Citroen, uses an infrared camera placed behind the car's steering wheel to track the seven universal emotions that are effectively hard-coded into the human face. Fear, anger, joy, sadness, disgust, surprise, and suspicion are so intrinsic to human nature that each produces very specific muscle movements, movements that can be fairly easily picked up with an infrared camera and some computer vision software. In the demonstration video, the software tracks your eyes, mouth, and nose, and from their movements it works out which emotion you're currently experiencing. If your face registers "anger" or "disgust" for long enough, the software decides that you are stressed out and probably about to do something stupid.

Because this is a prototype, all the EPFL system does is tell you when you're suffering from road rage, but presumably a production version of the technology would do a lot more. Maybe it would pre-charge your brakes, ready for when you tailgate the guy in front of you? Or maybe, if your car has some autonomous driving features, they could quietly take over, so you think you're still driving, but it's actually your car that's preventing you from swerving out of your lane or piling into the car in front of you.
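To make the "anger or disgust for long enough" rule concrete, here is a minimal sketch of that kind of persistence check. It assumes some per-frame emotion classifier already exists (it is not the LTS5 system); the window length and threshold are made-up illustrative values.

```python
from collections import deque

# Emotions treated as stress indicators, per the description above.
STRESS_EMOTIONS = {"anger", "disgust"}

class RoadRageDetector:
    """Flags sustained stress from a stream of per-frame emotion labels.

    window_frames and stressed_fraction are illustrative assumptions,
    e.g. 150 frames is about 5 seconds of video at 30 fps.
    """

    def __init__(self, window_frames=150, stressed_fraction=0.8):
        self.window = deque(maxlen=window_frames)
        self.stressed_fraction = stressed_fraction

    def update(self, emotion_label):
        """Feed one frame's label; returns True once stress has persisted."""
        self.window.append(emotion_label in STRESS_EMOTIONS)
        if len(self.window) < self.window.maxlen:
            return False  # not enough history yet to judge
        return sum(self.window) / len(self.window) >= self.stressed_fraction
```

The sliding window is what separates a momentary grimace from genuine, sustained road rage: a single angry frame never triggers the alert, only a high fraction of angry frames over the whole window does.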
Maybe such a system could disable your car's horn, too… Moving forward, LTS5 hopes to use its computer vision to detect other states, such as distraction, and to read your lips, which could considerably help with in-car voice recognition.

It's also worth pointing out that similar systems are already in production vehicles: ExtremeTech's car of the year, the Mercedes-Benz S-Class, uses the steering wheel to detect when you're drowsy. These systems don't have quite the same range of emotion detection as camera-based solutions, but really, the ultimate system would combine both steering wheel and computer vision technologies, and also use sensors in your seat, noise sensors in the cockpit (noisy kids), and other clever ways of assessing your fitness to drive.

One day, when autonomous vehicles are the norm, you'll be able to get into your car, it will automatically detect that you're completely wasted, and it will drive you home using a route that minimizes the risk of throwing up. One day.
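Combining steering-wheel, camera, seat, and noise signals is essentially a sensor-fusion problem. A toy sketch of one simple approach, a weighted score over normalized signals, is below; the signal names, weights, and threshold are all invented for illustration and are not any manufacturer's design.

```python
# Hypothetical weights for fusing normalized (0..1) driver-state signals
# into a single stress score. All names and values are illustrative.
WEIGHTS = {
    "camera_anger": 0.4,        # fraction of recent frames classed angry/disgusted
    "steering_jerkiness": 0.3,  # erratic corrections, as steering-based systems use
    "seat_restlessness": 0.2,   # fidgeting picked up by seat sensors
    "cabin_noise": 0.1,         # e.g. noisy kids in the cockpit
}

def driver_stress_score(signals):
    """Weighted sum of the known signals, each clamped to the 0..1 range."""
    return sum(
        WEIGHTS[name] * min(max(value, 0.0), 1.0)
        for name, value in signals.items()
        if name in WEIGHTS  # silently ignore signals we have no weight for
    )

def should_intervene(signals, threshold=0.6):
    """True when the fused score suggests the car should step in."""
    return driver_stress_score(signals) > threshold
```

A real system would be far more sophisticated, but the shape is the point: no single sensor decides anything on its own, and a strong reading on one channel can be confirmed or discounted by the others.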