Cars That Recognize Drivers and Protect Lives: Three Ways In-Vehicle Sensors Will Change Driving in the Autonomous Era
Editor's Note: Contributor Gideon Shmuel is CEO of EyeSight Technologies and an expert in computer vision and gesture recognition technology.
There is no doubt that in the near future, to "drive" will mean riding along in a self-driving car as more of a navigator than a driver. As automakers race to bring fully autonomous vehicles to consumers, driver-facing features are drawing growing attention.
It will be another 10 to 20 years before fully autonomous, reasonably priced self-driving cars are actually on the road. During this development period, semi-autonomous cars need to know more about their drivers, something machine learning and in-car computer vision make readily achievable. Whether the driver is steering or cruising in auto-drive mode, real-world driving calls for improvements in three areas: personalization, safety, and how drivers interact with the car.
As self-driving cars grow more capable and sophisticated, carpooling will take on new meaning. People will share cars to be more environmentally friendly and to cut the cost of owning a car that sits parked in a garage when not in use. As a result, development will likely focus on driver identification and the convenience of personalization.
Facial analysis using computer vision can already determine which registered carpool member is sitting in the driver's seat. Taking it a step further, deep learning and AI will give the car access to that driver's personal profile. Once the system recognizes the driver, it can automatically adjust the cabin temperature, seat position, side and rear-view mirrors, in-vehicle systems, radio volume, and more to suit that individual.
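To make this concrete, here is a minimal sketch of what the identification step might look like, assuming a face-recognition model that turns a camera frame into an embedding vector. The enrolled vectors, profile fields, and threshold below are illustrative assumptions, not any vendor's actual implementation:

```python
# Hypothetical driver identification: match a face embedding against
# the embeddings enrolled by each carpool member, then load a profile.
import numpy as np

ENROLLED = {  # placeholder embeddings; in practice from a face model
    "alice": np.random.rand(128),
    "bob": np.random.rand(128),
}

PROFILES = {  # per-driver cabin preferences
    "alice": {"temp_c": 21.0, "seat_pos": 3, "mirrors": "preset_a", "volume": 8},
    "bob":   {"temp_c": 23.5, "seat_pos": 5, "mirrors": "preset_b", "volume": 12},
}

def cosine(a: np.ndarray, b: np.ndarray) -> float:
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify_driver(embedding: np.ndarray, threshold: float = 0.8):
    """Return (name, profile) for the best-matching enrolled driver, or None."""
    name, score = max(((n, cosine(embedding, e)) for n, e in ENROLLED.items()),
                      key=lambda t: t[1])
    return (name, PROFILES[name]) if score >= threshold else None
```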
Driver data such as age and gender will play a major role in connected cars that double as information terminals and in semi-autonomous vehicles. Cars can serve targeted content relevant to the people currently inside. For example, based on the passengers' faces, the AI could suggest points of interest on a head-up display (HUD), such as a nearby family-friendly restaurant when children are on board. Real-time analytics on car occupants would also let radio and music-streaming services (Spotify, Pandora, Apple Music, etc.) play the ads most likely to interest them.
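As a toy illustration of such occupant-aware targeting, assuming a hypothetical perception step that estimates each occupant's age, content selection can start as a simple rule table:

```python
# Hedged sketch of occupant-aware content targeting. The occupant list is
# assumed to come from a hypothetical in-cabin perception model; the rules
# and categories are illustrative only.
def suggest_poi(occupants: list[dict]) -> str:
    """Pick a nearby point-of-interest category from estimated occupant ages."""
    if any(o["age_estimate"] < 12 for o in occupants):
        return "family-friendly restaurants"
    if all(o["age_estimate"] >= 65 for o in occupants):
        return "scenic rest stops"
    return "popular cafes"

print(suggest_poi([{"age_estimate": 38}, {"age_estimate": 7}]))
# -> family-friendly restaurants
```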
Driver inattention is the leading cause of accidents, and drowsiness is a form of inattention that should not be overlooked. The U.S. National Highway Traffic Safety Administration (NHTSA) says driver fatigue is the direct cause of at least 100,000 police-reported accidents each year.
These crashes result in an estimated 1,550 deaths, 71,000 injuries, and $12.5 billion in financial losses. In-vehicle sensor technology using computer vision can check the iris (gaze direction) to determine whether the eyes are open, measure the rate of blinking, and track head nodding to detect drowsiness, determining in real time whether the driver is alert or distracted. If the driver dozes off behind the wheel, the car can sound an alarm to wake them or switch to autonomous mode, protecting both the driver and the people in other cars on the same road.
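As an illustration of the eye-openness check, here is a hedged sketch using the eye aspect ratio (EAR) measure from Soukupová and Čech's blink-detection work. The six landmarks per eye are assumed to come from a face-landmark model (e.g. dlib or MediaPipe), and the threshold and frame count are illustrative:

```python
# Minimal EAR-based drowsiness check on per-frame eye landmarks.
import numpy as np

def eye_aspect_ratio(eye: np.ndarray) -> float:
    """eye: 6x2 array of landmark points p1..p6 around one eye."""
    v1 = np.linalg.norm(eye[1] - eye[5])  # vertical distance p2-p6
    v2 = np.linalg.norm(eye[2] - eye[4])  # vertical distance p3-p5
    h = np.linalg.norm(eye[0] - eye[3])   # horizontal distance p1-p4
    return (v1 + v2) / (2.0 * h)

EAR_THRESHOLD = 0.21   # eyes likely closed below this value (illustrative)
DROWSY_FRAMES = 48     # ~1.6 s of closed eyes at 30 fps triggers an alarm

closed_streak = 0
def update(left_eye: np.ndarray, right_eye: np.ndarray) -> bool:
    """Feed one frame's landmarks; returns True when the alarm should fire."""
    global closed_streak
    ear = (eye_aspect_ratio(left_eye) + eye_aspect_ratio(right_eye)) / 2.0
    closed_streak = closed_streak + 1 if ear < EAR_THRESHOLD else 0
    return closed_streak >= DROWSY_FRAMES
```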
This concept is nothing new; automakers already offer in-car driver-monitoring cameras. But a camera alone is not enough. A system that can genuinely help combines computer vision with deep-learning software embedded in the car, detecting and instantly analyzing the driver's state while tracking and sensing far more than just eye movement.
The AI can also learn a driver's individual habits and features. If the driver naturally blinks often, the car will treat that as normal behavior rather than issuing a drowsiness warning. Similarly, since some people simply have narrow eyes, the car learns this while analyzing the driver's face and does not raise unnecessary alerts.
Avoiding false warnings matters, but it is equally important to distinguish driver inattention from actual drowsiness. By detecting the angle of the head and whether the eyes are fixed on one point, the system can determine whether the driver is focused on driving or distracted, differentiate between using a mobile phone and falling asleep, and issue the appropriate alert.
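Here is a hedged sketch of how such a system might combine a per-driver baseline with head pose to separate distraction from drowsiness; the inputs are assumed to come from landmark and head-pose models, and every name and threshold is an illustrative assumption, not a production algorithm:

```python
# Rule-based state classification with a per-driver baseline, so naturally
# frequent blinkers or narrow eyes do not trigger false alarms.
from dataclasses import dataclass

@dataclass
class DriverBaseline:
    mean_ear: float        # driver's normal open-eye aspect ratio
    blink_rate_hz: float   # driver's normal blink frequency

def classify_state(ear: float, blink_rate_hz: float,
                   yaw_deg: float, pitch_deg: float,
                   base: DriverBaseline) -> str:
    eyes_nearly_closed = ear < 0.7 * base.mean_ear      # relative, not absolute
    blinking_unusually = blink_rate_hz > 2.0 * base.blink_rate_hz
    looking_away = abs(yaw_deg) > 30                    # e.g. glancing at a phone
    head_drooping = pitch_deg < -20                     # chin sinking toward chest

    if (eyes_nearly_closed and head_drooping) or blinking_unusually:
        return "drowsy"        # wake the driver or hand over to auto mode
    if looking_away:
        return "distracted"    # a gentler attention reminder
    return "attentive"
```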
Today's complicated in-vehicle systems are stressful to operate in themselves. The touchscreen display looks sleek and fits neatly into the dashboard, but tapping through complex menu commands pulls the driver's attention away from the road ahead. Our research shows that drivers look away from the road for an average of five seconds while operating these systems. At 90 km/h, that is 25 meters per second, so five seconds covers about 125 meters: like crossing more than the full length of a soccer field blindfolded. Drivers need a more natural, less confusing interface, such as screen-free gesture control.
Simple gestures drawn from everyday signals (for example, an index finger to the lips for "quiet," a hand wave to the right for "yes" and to the left for "no") reduce the driver's burden and minimize distraction, as in the sketch below. BMW has already implemented gesture control in the 7 Series, making its in-vehicle system more user-friendly.
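For illustration, mapping recognized gestures to in-vehicle actions can be as simple as a dispatch table; the gesture labels and actions below are assumptions for the sketch, not BMW's actual interface:

```python
# Hypothetical gesture-to-action dispatcher. The gesture string is assumed
# to come from an upstream gesture-recognition model.
from typing import Callable

ACTIONS: dict[str, Callable[[], None]] = {
    "finger_to_lips": lambda: print("Muting audio"),
    "wave_right":     lambda: print("Accepting incoming call"),
    "wave_left":      lambda: print("Declining incoming call"),
}

def handle_gesture(gesture: str) -> None:
    """Dispatch a recognized gesture; unknown gestures are silently ignored."""
    action = ACTIONS.get(gesture)
    if action is not None:
        action()

handle_gesture("wave_right")  # -> Accepting incoming call
```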
Technology keeps expanding what cars can do, yet careless driving keeps increasing; chances are that some readers of this article drive that way too. Until fully autonomous cars reach the market, new technologies such as in-vehicle sensors, computer vision, and AI can help curb careless driving and keep the roads safe.
[via VentureBeat] @VentureBeat