Does your car read your expressions? AI and machine learning make this possible

Published June 9, 2017

Very soon, your car will recognize you when you get behind the wheel. Using cameras, biometric indicators and facial recognition software, automakers are looking to personalize your driving experience with cars that stare back at you, quietly adjusting the seats and driving modes. They will anticipate your wants, playing your favorite music to suit your mood. And it is not just about convenience; it is also about safety and security.

Describing the sophistication of these systems, Zachary Bolton, a technology and systems engineer at Continental Automotive Group, said, “It's not only about personalization. We can utilize the gleam, the twinkle in your eye to precisely determine where you are looking.” Engineers can then dynamically adjust the so-called human-machine interface, keeping critical information, say, that you are about to exceed the speed limit or that a stalled car is up ahead, directly in the driver's line of sight, whether in a windshield display or on the dashboard. And by tracking downward eye movements, a car can sound a warning when it “sees” that the driver is distracted.
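
As a rough illustration of how such a distraction warning might work, the sketch below flags a driver whose gaze stays below the dashboard for too long. The frame rate, pitch threshold and time limit are assumptions made for the example, not Continental's actual parameters.

```python
# Minimal sketch of gaze-based distraction detection (illustrative only;
# not Continental's actual algorithm). Assumes an upstream eye tracker
# supplies gaze pitch angles in degrees, one sample per camera frame.

FRAME_RATE_HZ = 30          # assumed camera frame rate
DOWNWARD_PITCH_DEG = -20.0  # gaze angles below this count as "looking down"
MAX_EYES_OFF_ROAD_S = 2.0   # warn after this many seconds off the road


def check_distraction(gaze_pitch_samples):
    """Return True if the driver looked down for too long."""
    limit = int(MAX_EYES_OFF_ROAD_S * FRAME_RATE_HZ)
    consecutive = 0
    for pitch in gaze_pitch_samples:
        if pitch < DOWNWARD_PITCH_DEG:
            consecutive += 1
            if consecutive >= limit:
                return True
        else:
            consecutive = 0
    return False


if __name__ == "__main__":
    # 1 second looking at the road, then 2.5 seconds looking down at a phone
    samples = [0.0] * 30 + [-35.0] * 75
    print(check_distraction(samples))  # True -> sound a chime
```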

The in-car systems let drivers register their faces with something as simple as a photo from a driver's license. An interior infrared camera overcomes obstacles such as sunglasses, which would defeat an ordinary video camera, and mounting the camera in the center instrument cluster helps pinpoint the driver's eyes even when he or she is wearing a hat. The biggest technical challenge is glare from sunlight, but it can be filtered out using machine learning, and similar machine-learning-based sensor systems are being developed to protect the camera from the glare of oncoming headlights.
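
A minimal sketch of the enrollment-and-matching step might look like the following. It assumes a separate model has already converted the infrared face image (or the license photo) into a numeric embedding; the threshold and the FaceRegistry class are hypothetical, not any automaker's production code.

```python
# Illustrative sketch of face enrollment and matching. Assumes an upstream
# model turns a face crop into a fixed-length embedding vector; here we
# only compare embeddings with cosine similarity.
import numpy as np

MATCH_THRESHOLD = 0.6  # assumed cosine-similarity cutoff


def cosine_similarity(a, b):
    a, b = np.asarray(a, dtype=float), np.asarray(b, dtype=float)
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))


class FaceRegistry:
    """Stores one enrollment embedding per driver and matches new faces."""

    def __init__(self):
        self._enrolled = {}  # driver name -> embedding

    def enroll(self, name, embedding):
        # e.g. the embedding computed from a driver's-license photo
        self._enrolled[name] = embedding

    def identify(self, embedding):
        best_name, best_score = None, 0.0
        for name, ref in self._enrolled.items():
            score = cosine_similarity(ref, embedding)
            if score > best_score:
                best_name, best_score = name, score
        return best_name if best_score >= MATCH_THRESHOLD else None


registry = FaceRegistry()
registry.enroll("alice", [0.1, 0.9, 0.3])
print(registry.identify([0.12, 0.88, 0.29]))  # "alice" if above threshold
```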

Once the car knows who you are, systems like those in the Chrysler Portal concept car could automatically adjust the seat for maximum comfort, select a driving mode (one driver may like the car to do all the work, while another loves to take charge in sport mode) and propose a destination based on the owner's past behavior.
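
Once the face matcher returns a name, applying a stored profile is conceptually simple. The sketch below is hypothetical; the profile fields, driver names and settings are invented for the example, and the print statements stand in for real vehicle commands.

```python
# Minimal sketch of profile-based personalization (hypothetical; the
# Chrysler Portal's actual software is not public).
from dataclasses import dataclass


@dataclass
class DriverProfile:
    seat_position: int       # e.g. millimetres of seat travel
    drive_mode: str          # "comfort", "sport", ...
    likely_destination: str  # inferred from past trips


PROFILES = {
    "alice": DriverProfile(seat_position=120, drive_mode="comfort",
                           likely_destination="office"),
    "bob": DriverProfile(seat_position=85, drive_mode="sport",
                         likely_destination="gym"),
}


def apply_profile(driver_name):
    profile = PROFILES.get(driver_name)
    if profile is None:
        return  # unknown driver: leave default settings
    # These calls stand in for real vehicle-bus commands.
    print(f"Moving seat to {profile.seat_position} mm")
    print(f"Selecting {profile.drive_mode} mode")
    print(f"Suggesting destination: {profile.likely_destination}")


apply_profile("alice")
```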


Watching the driver's face also gives the car major clues about that person's state of mind. Tech companies such as Intel and carmakers such as Ford have been keenly interested in determining whether a driver is happy or sad. A car could change its tune, playing a pop song to match your mood, or alter the interior lighting to improve your attitude.
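
A toy version of that mood-to-cabin mapping might look like this. The emotion classifier is assumed to exist upstream, and the moods and settings are invented for illustration; neither Intel's nor Ford's systems are described in this detail.

```python
# Illustrative mood-to-cabin mapping. An upstream emotion classifier is
# assumed to label the driver's expression with a mood string.

MOOD_SETTINGS = {
    "happy": {"playlist": "upbeat pop", "lighting": "bright white"},
    "sad": {"playlist": "favorite sing-alongs", "lighting": "warm amber"},
    "neutral": {"playlist": "driver's usual station", "lighting": "default"},
}


def adjust_cabin(mood):
    settings = MOOD_SETTINGS.get(mood, MOOD_SETTINGS["neutral"])
    print(f"Queueing {settings['playlist']}; lighting set to {settings['lighting']}")


adjust_cabin("sad")
```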

Honda's NeuV concept car, for example, has a large, customizable LCD dashboard and a cloud-connected computer that uses artificial intelligence to interact with the driver. The NeuV functions as a helpful, thoughtful AI assistant using an emerging technology developed by SoftBank and Honda known as the “emotion engine.” In its NeuV application, called the Honda Automated Network Assistant (HANA), the emotion engine learns from the driver by detecting the emotions behind the driver's judgments, then makes new recommendations and choices based on the driver's past decisions. HANA can support the owner's daily driving routine, recommend music to match the driver's mood and check on the driver's emotional well-being.
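
The emotion engine's internals are not public, but a crude sketch of “learning from past decisions” could be as simple as tallying which suggestions the driver accepted in each mood and favoring those next time. Everything below, the class, the moods and the suggestions, is invented for illustration.

```python
# Toy sketch of learning from past decisions (purely illustrative; the
# SoftBank/Honda emotion engine is proprietary).
from collections import defaultdict


class PreferenceLearner:
    def __init__(self):
        # (mood, suggestion) -> [times accepted, times offered]
        self._stats = defaultdict(lambda: [0, 0])

    def record(self, mood, suggestion, accepted):
        stats = self._stats[(mood, suggestion)]
        stats[1] += 1
        if accepted:
            stats[0] += 1

    def recommend(self, mood, options):
        def acceptance_rate(option):
            accepted, offered = self._stats[(mood, option)]
            return accepted / offered if offered else 0.0
        return max(options, key=acceptance_rate)


learner = PreferenceLearner()
learner.record("tired", "coffee stop", accepted=True)
learner.record("tired", "news briefing", accepted=False)
print(learner.recommend("tired", ["coffee stop", "news briefing"]))
```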

Honda also unveiled its Cooperative Mobility Ecosystem concept, which connects the power of robotics, big data and artificial intelligence to improve customers' quality of life and change the mobility experience of the future. Under this concept, vehicles will communicate with one another and with infrastructure to eliminate traffic fatalities and ease congestion, while offering new kinds of in-vehicle entertainment and making road users more productive. By delivering autonomous services, these vehicles will also generate new value when their owners are not using them.

The designers say there are practical reasons for detecting a driver's emotional state as well: a calm driver is usually a safe driver. So when you become angry and prone to road rage, the car will recognize it and can potentially quiet its annoying bells and chimes and play some mellow jazz to soothe you.
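
In code, that kind of stress-aware alert management might be sketched like this, with the stress score assumed to come from the same emotion-detection pipeline and the threshold chosen arbitrarily for the example.

```python
# Hedged sketch of tuning cabin alerts to driver stress (illustrative only;
# no production system is described this way in the article).

STRESS_THRESHOLD = 0.7  # assumed cutoff for "prone to road rage"


def tune_alerts(stress_score, pending_alerts):
    """Keep safety-critical alerts, soften everything else when stressed."""
    if stress_score < STRESS_THRESHOLD:
        return pending_alerts, "normal chime volume"
    critical = [a for a in pending_alerts if a["critical"]]
    return critical, "chimes muted, mellow jazz queued"


alerts = [
    {"name": "forward collision warning", "critical": True},
    {"name": "low washer fluid", "critical": False},
]
print(tune_alerts(0.85, alerts))
```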

By replacing keys and remote-control fobs, biometrics such as facial recognition could also make cars extremely difficult to steal. Faraday Future, the electric car company, uses an external camera mounted in the door frame of its prototype FF 91 sport utility vehicle to recognize the car's owner and unlock the vehicle automatically. But such techniques can create new security challenges.

In this digital age, our faces are everywhere: tagged in our friends' Facebook posts, on our Twitter accounts, in online company profiles. Finding an image to print out and foil a car's facial recognition system would not be very difficult.

Fortunately, engineers have devised advanced countermeasures. Stereoscopic video cameras, for instance, can tell the difference between a three-dimensional object and a flat image. Continental's cameras measure the distance of reflected light off different parts of a person's face, confirming that the camera is looking at a real face rather than a high-resolution photo of the owner.
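
A simplified liveness check along those lines might only look at how much the measured distances vary across the face, since a printed photo is flat. The landmark distances and the one-centimeter threshold below are assumptions made for illustration, not Continental's published method.

```python
# Rough sketch of a depth-based liveness check. A stereoscopic camera is
# assumed to return a distance, in metres, for a few landmark points on
# the face; a flat photo shows almost no spread in those distances.
import statistics

MIN_DEPTH_SPREAD_M = 0.01  # assumed: a real face has >1 cm of depth variation


def looks_three_dimensional(landmark_distances):
    """True if the measured face has real depth, False for a flat image."""
    return statistics.pstdev(landmark_distances) >= MIN_DEPTH_SPREAD_M


real_face = [0.64, 0.66, 0.63, 0.67, 0.65]      # nose closer, ears farther
printed_photo = [0.65, 0.65, 0.65, 0.65, 0.65]  # uniformly flat
print(looks_three_dimensional(real_face))      # True  -> proceed to unlock
print(looks_three_dimensional(printed_photo))  # False -> keep the doors locked
```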

High-tech personalization could be used not only to create amenities for individual owners, but also to adapt a vehicle instantly to different drivers. Valets, for instance, could automatically be prevented from driving faster than a specified speed or from accessing personal information stored in the navigation system. Such systems could also be used in ride-sharing to quickly tailor a car's interior to the physical characteristics of different drivers and passengers.
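
A hypothetical sketch of such per-driver restrictions: recognized owners get full access, while an unrecognized driver, a valet for example, gets a capped speed and no navigation history. The limits and profile fields are invented for the example.

```python
# Hypothetical valet-mode restrictions (illustrative only). When the camera
# sees a face that is not an enrolled owner, the car could hand out a
# restricted profile instead of the owner's full one.
from dataclasses import dataclass


@dataclass
class AccessProfile:
    max_speed_kph: int
    navigation_history_visible: bool


OWNER_PROFILE = AccessProfile(max_speed_kph=250, navigation_history_visible=True)
VALET_PROFILE = AccessProfile(max_speed_kph=40, navigation_history_visible=False)


def profile_for(driver_name, enrolled_owners):
    """Return full access for recognized owners, valet limits otherwise."""
    return OWNER_PROFILE if driver_name in enrolled_owners else VALET_PROFILE


print(profile_for(None, {"alice", "bob"}))  # unrecognized face -> valet limits
```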

Ford is planning to add Amazon's Alexa personal assistant to some of its cars. It will not only let drivers call up personalized music stations with a simple voice command, but also let them juggle tasks such as adding items to a grocery list with just a few words.

So very soon, we will see cars that can read a driver's expressions, offering greater convenience, safety and security. Global brands and startups alike are coming up with new innovations every day that will create a more enjoyable and productive experience for drivers.