
History of ADAS: From Moving Parts to Vehicles Driven by Code – Part Three


By Dan Clement, Applications Marketing Engineer, onsemi

Introduction

Previous articles in this series discussed the evolution of ADAS systems. In this final article we consider drowsiness detection and driver monitoring, key enablers for fully autonomous vehicles. We also review mirror replacement and surround-view features. Finally, we discuss future developments in this area, including augmented and virtual reality and software-defined vehicles.

Drowsiness Detection and Driver Monitoring

Many ADAS systems were initially mechanical designs, and early efforts to combat driver drowsiness were no exception. Rumble strips on the roadside created mechanical vibrations that alerted a driver who began to veer out of their lane. This simple, if inelegant, solution helped to reduce the number of accidents caused by driver drowsiness (NHTSA, 1998).

A 1998 report published by the U.S. National Highway Traffic Safety Administration (NHTSA) addressed the topic of driver drowsiness and documented contemporary tools used to measure it, both in the laboratory and in vehicles. Physiological signals were commonly used to detect drowsiness, but this approach was only practical in a laboratory environment as it required the study and calibration of individual subjects. The report also stated that in-vehicle approaches were under investigation, including monitoring of eye closure, steering sensors, and lane tracking (NHTSA, 1998). However, limitations in available technologies meant that these systems were only available in commercial vehicles at the time (Dinges, 1995).

The steering angle sensor, which tracks how far and how fast the steering wheel turns, was one of the first electronic drowsiness detection tools, with commercial versions becoming available by the turn of the millennium. Driver drowsiness can be reliably estimated when the information provided by the sensor is combined with camera, speed, and stability information (pitch and yaw) and processed by a software algorithm. During the early phase of each journey, this type of system calibrates against the driver's behavior to establish a performance baseline, against which later changes in driving behavior are compared. These systems usually operate best at highway speeds because they measure tiny changes in steering behavior; city driving, with its frequent changes in direction and speed, makes the algorithms less dependable. Many consider Bosch to be the technology leader in this area, and their product page provides details about their solution. Ultimately, autonomous vehicles, which do not require manual control, will make these systems redundant.
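As a toy illustration of the baseline-and-compare approach described above (not Bosch's actual algorithm), the sketch below calibrates steering variability during the early phase of a trip and flags later windows whose variability departs from that baseline. The function name, inputs, and threshold are all hypothetical.

```python
from statistics import pstdev

def steering_drowsiness_score(baseline_angles, recent_angles):
    """Compare recent steering micro-corrections against a per-trip baseline.

    baseline_angles: steering-angle samples (degrees) from the first minutes
    of the trip, used to calibrate normal behavior for this driver.
    recent_angles: the most recent window of samples.
    Returns a ratio near 1.0 for normal driving; a large ratio indicates
    erratic corrections that a real system might treat as a drowsiness cue.
    """
    baseline_var = pstdev(baseline_angles) or 1e-6  # avoid division by zero
    recent_var = pstdev(recent_angles)
    return recent_var / baseline_var
```

A production system would combine such a score with camera, speed, pitch, and yaw data rather than rely on steering variance alone, which is one reason these systems degrade in city driving.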

A solution suitable for both manual and autonomous vehicles has now emerged: the driver monitoring system (DMS). Developed in the late 1990s, it was not ready for production until the 2020s. A DMS uses a combination of a camera, computer vision, and processing to monitor a driver's face and eyes and ensure they are properly focused on the road. While this approach might appear straightforward, the algorithms it uses are complex. This EE Times article provides further details about the DMS and debunks some myths that surround it (Barnden, 2021).
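One eye-closure metric widely used in drowsiness research is PERCLOS, the fraction of time the eyes are mostly closed over an observation window. The minimal sketch below assumes a hypothetical computer-vision front end that already reports a per-frame eyelid aperture; it is not the algorithm of any particular DMS vendor.

```python
def perclos(eye_openness, closed_threshold=0.2):
    """Fraction of frames in which the eyes are at least 80% closed.

    eye_openness: per-frame eyelid aperture estimates in [0, 1], as produced
    by an assumed vision front end (1.0 = fully open, 0.0 = fully closed).
    A sustained high PERCLOS value is a classic drowsiness indicator.
    """
    closed = sum(1 for aperture in eye_openness if aperture <= closed_threshold)
    return closed / len(eye_openness)
```

The hard part in practice is the front end itself: robustly estimating eyelid aperture across faces, glasses, and lighting conditions, which is why DMS algorithms are far more complex than this metric suggests.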

By 2024, all new vehicles must have a DMS to attain the highest Euro NCAP crash and safety ratings. Several proprietary solutions, ranging from entry-level to premium versions, are now available. This creates a problem for OEMs because the market for this system is highly cost-sensitive and customers are reluctant to pay extra for the feature.

This system is now being extended to monitor all other vehicle occupants, in the form of the occupant monitoring system (OMS), which also functions as a comfort and convenience feature. Drivers can customize vehicle settings using gesture and facial recognition. Video call and social media apps can use the OMS, as can safety features, for example to detect whether a child has been left unattended in a vehicle or to automatically disable airbags in empty seats. Security features can use the OMS for video recording.

The hardware and optics for DMS and OMS differ in that DMS typically uses near-infrared imaging with a global shutter, while OMS typically uses a rolling shutter and visible light. Many OEMs seek to combine DMS and OMS into DOMS (driver and occupant monitoring systems) to lower costs and achieve economies of scale.

Figure 1. OMS Solution, Picture courtesy of Business Wire

An economical solution for DOMS uses high-performance image sensors from onsemi. This novel solution allows a single rolling-shutter image sensor to serve both DMS and OMS. When installed separately, the DMS camera is typically mounted on the steering column or the dash, while the OMS camera is usually located above the rearview mirror or on the pillars. A combined DOMS solution is commonly situated in the rearview mirror.

Surround-View and Mirror Replacement

Surround-view cameras use visible light and are located on the exterior of a vehicle to help improve visibility for the driver when reversing or parking. There are typically four cameras: one at the front, one at the rear, and one on each side of the vehicle. These wide-angle-lens cameras create a fisheye image of the surrounding environment. Image processing algorithms then combine the images from the four cameras, and the result is presented on a dashboard display. This so-called Omniview (or 360° view) feature emulates a camera situated above the vehicle. The 2007 Infiniti EX35 was the first vehicle to have a surround-view system.
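A real surround-view pipeline undistorts each fisheye image, warps it to the ground plane with a per-camera homography, and blends the seams. The toy sketch below shows only the camera-selection step, assigning each cell of a bird's-eye grid to one of the four cameras; the grid size and nearest-edge rule are illustrative assumptions, not any OEM's implementation.

```python
def camera_map(n=6):
    """For each cell of an n-by-n top-down grid, choose the source camera.

    Each cell is assigned to the camera on the nearest edge of the vehicle
    (front, rear, left, or right), the simplest possible seam layout.
    Real systems instead blend overlapping warped images along the seams.
    """
    grid = []
    center = (n - 1) / 2
    for row_idx in range(n):
        row = []
        for col_idx in range(n):
            dy, dx = row_idx - center, col_idx - center
            if abs(dy) >= abs(dx):  # closer to front or rear edge
                row.append('front' if dy < 0 else 'rear')
            else:                   # closer to a side edge
                row.append('left' if dx < 0 else 'right')
        grid.append(row)
    return grid
```

Printing the grid shows four triangular regions meeting at the vehicle's center, which is roughly the seam pattern visible in many production top-down views.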

Figure 2. Example of Surround-View System, courtesy of ExtremeTech

A surround-view system, when used in combination with ultrasonic sensors, is highly effective at preventing collisions with other vehicles and pedestrians during parking. Beyond parking, side cameras can also replace traditional side mirrors. Cameras can be made much smaller than mirrors, making vehicles more aerodynamic and hence more energy efficient.

The adoption of camera mirrors has thus been slower than that of surround-view systems (Howard, 2014), because many drivers still prefer side mirrors and, in many countries, mirrors are required as a backup to cameras.

Software-Defined Vehicle

The term software-defined vehicle is now being used to capture the tremendous change that has been taking place in vehicle design over the past ten years. The level of computing required to support the large number of sensors in autonomous vehicles has been a major force behind the significant changes in how vehicle systems are deployed and managed.

The traditional vehicle architecture used a central CAN or LIN network to connect modules. However, as vehicle subsystems became more complex and required centralized computing, domain controllers replaced this approach. For example, a dedicated ADAS controller is required to combine sensor inputs, process the data, and send instructions to multiple safety systems. Increased data rates have also necessitated higher-speed data transfer protocols. Ultimately, centralized computing will become the norm, driven by the amount of processing required for autonomous driving (Morris, 2021).

Figure 3. How Vehicle Architectures are evolving—on the left, the original system; in the middle, modern style with domain controllers; and on the right, with centralized computing (Photo courtesy of EE Times and Siemens Digital)

Similar to the evolution of cell phones, consumers will also demand more digital features and longer-term value through software updates. OEMs are attracted by over-the-air software updates, which allow the installation of new features and bug fixes. This model also enables new revenue streams through services and subscriptions.

The scale of this transition has led some automobile companies to redefine themselves as software companies instead of car companies. Tesla began as a tech company in 2003, before producing its first Roadster in 2008 (Reed, 2020).

Virtual and Augmented Reality

Virtual and augmented reality and Web 3.0, commonly referred to as the metaverse, are also generating a lot of excitement. Vehicles with head-up displays (HUDs) are already on the road. These will overlay ever more digital content as augmented reality for the driver, for example 3D navigation projections.

In some future Level 5 autonomous vehicles, glass windshields may even be replaced by a solid unit in which giant screens present a completely virtual view. A fascinating vision from Nissan is presented here.

Figure 4. Invisible-to-Visible, Image Courtesy of Nissan

Conclusion

This series began by considering mechanical cruise control (the Speedostat) before charting the evolution of ADAS from mechanical to electrical implementations, through software-defined vehicles to fully autonomous driving.

The automotive industry is undergoing an incredible transformation. While new paradigms are exciting, they are best appreciated after first reviewing the fascinating history of the systems we have come to expect in present day vehicles.

As a market leader in automotive and ADAS, we at onsemi hope you enjoyed these articles. To learn more about our ADAS offerings, please visit our solution page at https://www.onsemi.com/solutions/automotive/adas.

References

Barnden, C. (2021, May 13). Busting Myths of Driver Monitoring Systems. Retrieved from EE Times: https://www.eetimes.com/busting-myths-of-driver-monitoring-systems/

Dinges, D. (1995). An Overview of Sleepiness and Accidents. J. Sleep Res. 4, Suppl. 2, 4-14. Retrieved from https://onlinelibrary.wiley.com/doi/epdf/10.1111/j.1365-2869.1995.tb00220.x

Howard, B. (2014, July 18). What are car surround view cameras, and why are they better than they need to be?, Part Two. Retrieved from Extreme Tech: https://www.extremetech.com/extreme/186160-what-are-surround-view-cameras-and-why-are-they-better-than-they-need-to-be/2

Morris, B. (2021, March 29). E/E Architecture Considerations for AV Development. Retrieved from EE Times: https://www.eetimes.com/e-e-architecture-considerations-for-av-development/

NHTSA. (1998). Drowsy Driving and Automobile Crashes. Retrieved November 30, 2021, from https://rosap.ntl.bts.gov/view/dot/1661

Reed, E. (2020, October 5). History of Tesla: Timeline and Facts. Retrieved from TheStreet: https://www.thestreet.com/technology/history-of-tesla-15088992

Unknown. (2021, December 1). Invisible-to-Visible (I2V). Retrieved from Nissan Motor Corporation: https://www.nissan-global.com/EN/TECHNOLOGY/OVERVIEW/i2v.html

Unknown. (2021, December 1). Software-Defined Vehicles – A forthcoming Industrial Evolution. Retrieved from Deloitte: https://www2.deloitte.com/cn/en/pages/consumer-business/articles/software-defined-cars-industrial-revolution-on-the-arrow.html
