
Mobility and IoT connectivity specialist AddSecure recently announced the launch of an AI-powered video telematics solution for driver monitoring system (DMS) and advanced driver assistance system (ADAS) platforms. 

The platform lets fleet managers monitor driver behavior on the road and detect risky driving practices through algorithms that analyze driver data for patterns of fatigue, including yawning, nodding or head drooping. 
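AddSecure has not published implementation details, but fatigue-pattern detection of this kind is typically built on per-frame facial measurements such as eye closure, mouth opening and head pitch, aggregated over a short time window. The Python sketch below is a hypothetical illustration of that general approach; the metric names, thresholds and window sizes are assumptions for demonstration, not AddSecure's code.

```python
from collections import deque
from dataclasses import dataclass


@dataclass
class FrameMetrics:
    """Hypothetical per-frame outputs of a face-landmark model."""
    eye_aspect_ratio: float    # low values suggest closing/closed eyes
    mouth_aspect_ratio: float  # high values suggest a yawn
    head_pitch_deg: float      # strongly negative values suggest head drooping


class FatigueDetector:
    """Flags fatigue patterns over a sliding window of recent frames.

    All thresholds here are illustrative guesses, not calibrated values.
    """

    def __init__(self, window_frames: int = 90):  # roughly 3 s at 30 fps
        self.window = deque(maxlen=window_frames)

    def update(self, frame: FrameMetrics) -> list[str]:
        self.window.append(frame)
        alerts = []
        # PERCLOS-style check: share of frames with eyes (nearly) closed.
        closed = sum(1 for f in self.window if f.eye_aspect_ratio < 0.2)
        if closed / self.window.maxlen > 0.4:
            alerts.append("eyes_closing")
        if sum(1 for f in self.window if f.mouth_aspect_ratio > 0.6) > 15:
            alerts.append("yawning")
        if sum(1 for f in self.window if f.head_pitch_deg < -20) > 30:
            alerts.append("head_drooping")
        return alerts
```

In a real deployment the per-frame metrics would come from a face-landmark model running on the camera feed, and the thresholds would be tuned per driver and lighting condition.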

The system can also alert drivers to potential hazards to help them avoid accidents, provide feedback to improve their driving skills, and warn them in real time when they should take a break or rest – features aimed at digitally transforming the transportation industry.

“AI-powered driver monitoring systems use cameras, sensors and machine learning algorithms to analyze driver behavior, such as eye movements, head positioning and steering patterns,” explains Johan Stråkander, marketing director at AddSecure.

He points out that many car manufacturers, including Tesla, BMW, Mercedes-Benz, Audi and Subaru, have already integrated DMS camera technology into their vehicles.

Berg Insight estimates that the installed base of active video telematics systems in North America reached 2.9 million units in 2021.

Growing at a compound annual growth rate (CAGR) of 16.5 percent, the active installed base is forecast to reach almost 6.3 million units in North America by 2026.

With the growing level of autonomy in passenger vehicles, there is greater dependence on DMS to alert the driver and intervene in time to avert a crash.

Gartner analyst Pedro Pacheco notes DMS has been around for a long time – the first car to have one was the Lexus LS600h back in 2006 – but the technology has been evolving to be more precise.

“For example, it has advanced to operate properly in low-light or intense-light conditions or with different facial features,” he explains. “This minimizes the number of false positives or false negatives.”

Pacheco adds it is quite common for more advanced fleet operators to use DMS (in many cases, aftermarket systems) to monitor driver performance and reduce accidents.

“This is more critical in long-haul driving, where spending long periods behind the wheel raises the chances of drowsiness,” he says. “There are several areas of evolution for computer vision in the field of DMS.”

One is the adoption of emotion AI, which uses computer vision to gauge an individual’s emotions.

“This could be useful for the vehicle to anticipate a driver’s needs by activating systems that improve cabin experience,” he says. “In addition, some automakers have started mounting the DMS camera in a different location – like on the rearview mirror or above – which allows it to have a view of all occupants in the cabin.”

This can be used for different features; for instance, to detect if a child or a pet has been forgotten in the cabin while the driver exits the vehicle.

“Moreover, computer vision can be used to detect particular health conditions of an individual, especially those that are noticeable in elements of the face,” Pacheco says. 

IDC analyst Sandeep Mukunda says the use of AI in DMS would simplify efforts to identify anomalies related to driver attention, ensure compliance, and generate or manage alerts for the vehicle’s driver, owner and insurers.

“The AI-powered dashcam is at a nascent stage of growth, with fleet operators increasingly adopting the technology,” he says. “AI-powered DMS systems are necessary for the growth of autonomous vehicles, especially for boosting confidence amongst stakeholders such as insurance providers and regulatory bodies.”

He adds that with more data captured over time, AI engines will be optimized to determine the need for action and intervene if necessary to avert a crash.

“With Level 4 autonomous driving systems and beyond, DMS systems would transition to in-cabin space monitoring systems for robotaxis and mobility as a service applications,” Mukunda explains. 

Stråkander notes the AI embedded in DMS systems must continue to evolve to improve accuracy and reliability in detecting and predicting driver behavior.

“The next big step for DMS technology is likely to involve the integration of machine learning algorithms that can adapt to different driving scenarios and individual driver behaviors,” he says.

Currently, most DMS systems rely on pre-defined rules and thresholds to detect and classify driver behavior.

However, machine learning algorithms can learn from data and adjust their models based on the specific characteristics of each driver and driving situation.

This can improve the accuracy of DMS systems and make them more robust to changes in lighting conditions, driving styles and environmental factors.
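To make the contrast between fixed rules and learned models concrete, the sketch below compares a single hard-coded threshold with a classifier fitted to logged data using scikit-learn. It is a minimal illustration on synthetic stand-in data; the feature names, labels and threshold are invented for demonstration and do not describe any particular vendor's system.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative features per time window (all synthetic stand-ins):
# [mean eye closure, blink rate, head-pose variance, steering-reversal rate]
rng = np.random.default_rng(0)
X = rng.random((200, 4))
y = (0.7 * X[:, 0] + 0.3 * X[:, 3] > 0.55).astype(int)  # stand-in drowsiness labels


def rule_based(window):
    # Fixed threshold on a single signal, applied to every driver and condition.
    return window[0] > 0.5


# Learned decision boundary: can be refit as labelled windows accumulate,
# adapting to individual drivers, lighting conditions and driving styles.
model = LogisticRegression().fit(X, y)

sample = X[:5]
print([rule_based(w) for w in sample])  # rule verdicts
print(model.predict(sample))            # learned-model verdicts
```

The design point is that the learned boundary weighs several signals at once and can be retrained per driver or per operating condition, whereas the rule treats every driver and every lighting situation identically.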

“Another important area for the evolution of DMS technology is the integration of multiple sensors and data sources,” he says. “DMS systems can use a combination of cameras, infrared sensors and other technologies to capture a more comprehensive view of the driver’s behavior and environment.”

Stråkander says that by integrating data from different sources, DMS systems can improve their accuracy and reliability in detecting and predicting driver behaviors.
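One simple way to combine such sources is confidence-weighted fusion, where each sensor's drowsiness estimate is weighted by how reliable that sensor currently is. The snippet below is a hypothetical sketch of that idea; the sensor names, score ranges and weights are assumptions, not a description of any shipping product.

```python
def fuse_driver_state(camera_score, ir_score, steering_score,
                      camera_conf, ir_conf, steering_conf):
    """Confidence-weighted fusion of per-sensor drowsiness scores in [0, 1].

    Each sensor reports both an estimate and how much it currently trusts
    that estimate, so a camera blinded by glare contributes less than the
    infrared sensor or the steering-based signal.
    """
    scores = (camera_score, ir_score, steering_score)
    confidences = (camera_conf, ir_conf, steering_conf)
    total = sum(confidences)
    if total == 0:
        return None  # no usable sensor data this cycle
    return sum(s * c for s, c in zip(scores, confidences)) / total


# Example: camera washed out by low sun; IR and steering remain reliable.
print(fuse_driver_state(0.2, 0.8, 0.7,
                        camera_conf=0.1, ir_conf=0.9, steering_conf=0.6))
```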