How the automotive industry will reshape your driving experience with soon-to-be-everywhere in-car sensors.
High-end cars already carry more than 100 Electronic Control Units (ECUs) and run close to 100 million lines of code. The automotive industry is transitioning from hardware- to software-defined vehicles, a shift visible in the rapidly rising software and electronics cost per vehicle. For a D-segment car, software is expected to grow at a compound annual rate of 11 percent, from today's 10 percent of overall vehicle content to 30 percent by 2030.
While exterior sensors point outward in all directions to detect the countless factors on the road, powerful sensors inside the car will point inward toward drivers and passengers to detect factors even more complex: health state, emotion, attention and human action. At present, many automakers are focusing on developing IoT-enabled vehicles, with applications including healthcare, accident prevention, vehicle safety, driver safety, driver and passenger comfort, and vehicle monitoring.
Gartner predicts 88% of all new vehicles sold worldwide will incorporate connectivity in 2028, with the greatest proportion in the U.S., Japan, South Korea and Europe.
The key to connected vehicles becoming a moneymaking enterprise is taking sensor information and applying it in a way clearly differentiated from what a phone or a smartwatch can provide.
It’s the trend that motivated Ford to work on health features for an edge in the competitive automotive market. The company has experimented with three different types of health features for their test cars.
The first is an electrocardiography (ECG) reader integrated into the driver's seat. While the ECG systems used in hospitals attach electrodes to the skin, this contactless system records its signals through the driver's clothes (though a thick winter jacket or leather coat won't work). The ECG happens seamlessly while driving, and by the time you get to the office the analysis is already waiting in your inbox. With estimates showing that 30% of its European customers will be over 65 in 2050, and with heart problems common among older people, Ford seems to have a safe market for this kind of feature.
The second would be brought in via wearables and apps connected to the car through Ford Sync, which puts the information on the car's central display screen. This also works for a back-seat passenger, such as a child: a parent can safely check the child's blood sugar level on the central screen if necessary, avoiding dangerous gestures like turning their head.
The third is in-car telemedicine, made possible by connectivity: instead of visiting your doctor, your vehicle can connect you with them. Sensors in the car could remotely record the driver's vital signs and send that data to the doctor during the consultation.
Ford is already equipping test cars with all kinds of cameras and sensors to remotely measure vital signs. Body temperature can easily be measured with infrared cameras or in-seat sensors. For respiration and heart rate, low-intensity radar can be used; as the person’s heart beats and their lungs expand and contract, the radar signal changes.
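The principle behind radar vital-sign sensing can be illustrated with a short sketch: chest motion from breathing and heartbeat modulates the radar return, so the two rates can be recovered by looking for the dominant frequency in the respiration band and in the heartbeat band. The signal below is simulated and the band limits are illustrative assumptions, not any manufacturer's actual processing pipeline.

```python
import numpy as np

def estimate_rate_bpm(signal, fs, low_hz, high_hz):
    """Return the dominant frequency in [low_hz, high_hz], in cycles per minute."""
    spectrum = np.abs(np.fft.rfft(signal * np.hanning(len(signal))))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= low_hz) & (freqs <= high_hz)
    peak_hz = freqs[mask][np.argmax(spectrum[mask])]
    return peak_hz * 60.0

# Simulated 60-second radar phase signal: large, slow chest motion from
# breathing (0.25 Hz) plus a much smaller heartbeat motion (1.2 Hz) plus noise.
fs = 50.0
t = np.arange(0, 60, 1.0 / fs)
rng = np.random.default_rng(0)
phase = (1.0 * np.sin(2 * np.pi * 0.25 * t)
         + 0.1 * np.sin(2 * np.pi * 1.2 * t)
         + 0.02 * rng.standard_normal(t.size))

resp = estimate_rate_bpm(phase, fs, 0.1, 0.5)    # respiration band
heart = estimate_rate_bpm(phase, fs, 0.8, 2.5)   # heartbeat band
print(int(round(resp)), int(round(heart)))       # 15 breaths/min, 72 beats/min
```

A real system would of course contend with body movement, multiple occupants and a far noisier return, but the band-separation idea is the same.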
Much attention is paid to post-crash internal injuries to the driver and passengers, which aren't visible but can be life-threatening. This is the context for the medical connected services technology from MDGo, an Israeli start-up. MDGo's software analyses crash data routinely collected by a car's sensors and electronics infrastructure in real time, deduces the impact of a collision on a person's internal organs, head, neck, chest and pelvis, then extrapolates the injuries they may have sustained. It then uses the vehicle's embedded connectivity to send a message alerting emergency medical responders and local hospitals to these possibilities, allowing them to figure out their best course of action before arriving at the scene.
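The data flow described above can be sketched in miniature: crash telemetry goes in, a structured alert for responders comes out. The field names, thresholds and the injury heuristic below are invented purely for illustration; MDGo's actual models are proprietary and far more sophisticated.

```python
import json

def build_crash_alert(delta_v_kmh, airbag_deployed, impact_side, occupied_seats):
    """Map basic crash telemetry to a structured responder alert (toy heuristic)."""
    severity = "severe" if delta_v_kmh > 40 or airbag_deployed else "moderate"
    # Crude region-of-concern lookup based on impact direction (illustrative only).
    regions = {"front": ["head", "chest"], "side": ["pelvis", "chest"], "rear": ["neck"]}
    return {
        "severity": severity,
        "suspected_injury_regions": regions.get(impact_side, []),
        "occupants": occupied_seats,
    }

# A side impact at 55 km/h with airbag deployment and two occupants:
print(json.dumps(build_crash_alert(55, True, "side", ["driver", "front_passenger"])))
```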
Sensors for monitoring the health of users can already be found in many conceptual vehicles of traditional manufacturers and start-up companies. However, there is a real risk that this kind of sensitive information could be misused by insurance companies and healthcare companies as well as hackers.
Automakers have been outfitting vehicles with a growing set of sensors as well as embedded cloud connectivity. Beyond the uses above, such as detecting the health and wellness of the vehicle's occupants (both during an uneventful ride and after a major crash), emotion and activity detection is being designed to alter or even override risky choices to save lives. Drunkenness, stress, confusion, distraction and sleepiness will be detected, but these sensors will also identify and pay attention to the interaction between people in the car, then use artificial intelligence to "understand" the context of human behaviour in order to respond accordingly.
Surprising things happening inside the car can lead to accidents. Drivers can see something disturbing, such as a car accident or an animal injured by a car, or do something distracting, such as spilling coffee or dropping a mobile phone. Emotion and activity detection can recognise when this happens and take safety-related actions, such as briefly going into autonomous mode and slowing down until the driver recovers. If an emergency arises, even with an unconscious or incapacitated driver, cars should be able to call 911 or even drive autonomously to the hospital. Driver inattention is critical, since the vast majority of car accidents are due to human error, so understanding the driver's cognitive state is crucial. For example, the Portland-based company ADAM CogTec developed the Adaptive Driver's Attention Management system (hence the company name), which flashes white LED lights in the driver's peripheral field of vision and analyses physiological responses to the visual stimulus: rapid eye movements, pupil dilation and eyelid position, among other ocular parameters. Tests run at the start of a drive and are repeated periodically to keep track of changes. If the results signal impairment, the vehicle's ADAS technologies can initiate preventative measures to avoid potential accidents. The company's product can distinguish among impairments caused by alcohol, drugs, extreme fatigue or a wandering mind. In lab tests against a breathalyzer, it proved 95% accurate with regard to alcohol impairment.
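The stimulus-response idea behind such screening can be sketched as a toy classifier: flash a light, measure the ocular reaction, and score how far it deviates from an alert baseline. The features and thresholds below are hypothetical placeholders for illustration and bear no relation to ADAM CogTec's actual model.

```python
# Toy sketch of stimulus-response impairment screening. All thresholds
# and feature choices here are invented for illustration.

def classify_response(latency_ms, pupil_dilation_mm, eyelid_openness):
    """Score the ocular reaction to a peripheral light flash; higher = more impaired."""
    score = 0
    if latency_ms > 300:           # sluggish pupil reaction to the flash
        score += 1
    if pupil_dilation_mm < 0.3:    # blunted dilation response
        score += 1
    if eyelid_openness < 0.7:      # drooping eyelids (fraction fully open)
        score += 1
    return "impaired" if score >= 2 else "alert"

print(classify_response(latency_ms=250, pupil_dilation_mm=0.5, eyelid_openness=0.9))  # alert
print(classify_response(latency_ms=420, pupil_dilation_mm=0.2, eyelid_openness=0.6))  # impaired
```

A production system would use a trained model over many repeated stimuli rather than fixed cut-offs, which is what lets it distinguish between causes of impairment rather than merely flagging one.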
But safety isn’t the only objective. Some new features enabled by this technological shift will combine safety and satisfaction. Gimmicks like “gaze detection” will probably be available, if only because the technology allows it. This will enable cars to answer questions like “What is that building?” or “Is that restaurant open?” with an exact answer.
Not a life-changing feature, but for some a welcome addition.
The venerable automobile will be transformed over the next decade or two by autonomous driving technology, and we’re all expecting this transformation. What most consumers are not expecting is a whole new relationship with their cars. While emotion and activity detection, combined with AI, will enable the car to understand, predict, help, and protect you (even from yourself), cars will also become more communicative. For example, what can a car do when the driver is consumed by road rage? There are a few ideas, like playing soothing music, suggesting a place to stop, changing the temperature, or even taking control from the driver and pulling over. Things can become a bit weird (and possibly make an angry driver even angrier…), like the voice of the car’s virtual assistant taking the driver through deep breathing exercises and what is essentially on-the-spot anger management. The car becomes your anger manager, helping you to calm down. Hopefully.
Meanwhile let’s hope that autonomous driving will not turn that soon into obnoxious driving.