The importance of technology development ecosystems to the safety of autonomous driving

2020-02-26 16:33:35 sunrise-auto

Technical collaboration across the autonomous driving ecosystem helps advance autonomous vehicles and demonstrate that they are safe, efficient, and feasible.

Autonomous vehicles (AVs) are rapidly moving from hype to reality. A recent Emerj report documents the plans of the 11 largest automakers, with Honda, Toyota, and Renault-Nissan planning launches as early as next year. It is clear, however, that deploying mass-produced autonomous vehicles demands far more than traditional cars do. Autonomous driving requires active interaction with drivers, other vehicles, and infrastructure, and far more validation. No single participant can do this alone; it requires cooperation among the different players in the autonomous driving ecosystem. Recent technical collaborations, such as those around 3M's next-generation smart code sign technology and NVIDIA's DRIVE Constellation simulation platform, have demonstrated the importance of the ecosystem in enabling autonomous vehicles.

Some progress has already been made: despite one noteworthy accident, the safety record of Level 3+ systems remains strong. The California Department of Motor Vehicles (DMV) compiles and publishes human-intervention statistics for every company testing self-driving cars on its roads. Last year Waymo's cars traveled 1.2 million miles there, with one intervention per 11,018 miles. That intervention rate is not only nearly half the 2017 figure; it means the distance between interventions is rapidly approaching the average annual mileage of a US driver (13,476 miles), which is itself more than 1.5 times the UK average (7,134 miles).
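
As a quick sanity check on those figures, the arithmetic below reproduces the comparisons in this paragraph; the intervention count is derived from the article's own numbers and rounded.

```python
# Back-of-the-envelope check of the disengagement figures quoted above.
# All inputs come from the article; the result is illustrative only.

waymo_miles = 1_200_000          # miles driven in California (article figure)
miles_per_intervention = 11_018  # one human intervention per this many miles

interventions = waymo_miles / miles_per_intervention
print(f"Approximate interventions logged: {interventions:.0f}")  # ~109

# Comparison points from the article:
us_avg_annual_miles = 13_476
uk_avg_annual_miles = 7_134
print(f"Miles per intervention vs. US annual average: "
      f"{miles_per_intervention / us_avg_annual_miles:.0%}")     # ~82%
print(f"US vs. UK annual mileage ratio: "
      f"{us_avg_annual_miles / uk_avg_annual_miles:.1f}x")       # ~1.9x
```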

Sensor progress

At the core of autonomous vehicles is perception technology, combined with low-latency vehicle-to-vehicle and vehicle-to-infrastructure communication systems, with the combined data interpreted by powerful artificial intelligence (AI) based processors.

There are three core sensor technologies:

Lidar: for depth mapping; current systems can exceed 100 m range with a wide field of view

Radar: for motion measurement (up to 300 km/h), with object detection and tracking at ranges up to 300 m

Camera: for identification and classification

Although not all vehicles will use the same sensor combination (some currently use only radar and cameras, while others use lidar and cameras), each additional sensor provides more data, and through sensor fusion the sensors complement one another, greatly improving the accuracy, safety, and reliability of the overall system and vehicle.
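
To make the complementarity argument concrete, here is a minimal sensor-fusion sketch: inverse-variance weighting of two independent range estimates, the optimal linear combination under uncorrelated Gaussian noise. The lidar and radar readings and their noise levels are hypothetical, not taken from any product.

```python
# Fusing two independent, noisy range estimates (e.g., lidar and radar).
# Inverse-variance weighting is optimal for uncorrelated Gaussian noise.

def fuse(est_a: float, var_a: float, est_b: float, var_b: float):
    """Return the fused estimate and its variance."""
    w_a = 1.0 / var_a
    w_b = 1.0 / var_b
    fused = (w_a * est_a + w_b * est_b) / (w_a + w_b)
    fused_var = 1.0 / (w_a + w_b)
    return fused, fused_var

# Hypothetical readings: lidar is precise, radar less so at long range.
lidar_range_m, lidar_var = 98.4, 0.05**2   # +/- 5 cm standard deviation
radar_range_m, radar_var = 98.9, 0.50**2   # +/- 50 cm standard deviation

rng, var = fuse(lidar_range_m, lidar_var, radar_range_m, radar_var)
print(f"fused range: {rng:.2f} m, std dev: {var**0.5:.3f} m")
# The fused variance is always smaller than either input variance,
# which is why an extra sensor improves the overall estimate.
```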

Each of these core sensor technologies is constantly evolving. ON Semiconductor's next-generation silicon photomultiplier (SiPM) and single-photon avalanche diode (SPAD) solutions enable lidar systems to detect at longer distances, even for low-reflectivity targets, while reducing system size and cost. The company is also developing radar technology that uses the same IC in both short-range and long-range modes, improving accuracy, reducing power consumption, and cutting the number of devices required. In imaging, sensors such as ON Semiconductor's Hayabusa series offer a wider selection of resolutions to meet the diverse needs of autonomous vehicles.

 


Thanks to advanced pixel structures, the Hayabusa series also features an industry-leading super-exposure mode supporting a high dynamic range of more than 140 dB (providing high-quality images in challenging scenes that contain both very dark and very bright areas), together with LED flicker mitigation (LFM) to suppress flicker from increasingly common LED vehicle lights, road signs, and street lighting.
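
For context, "140 dB" can be translated into a linear contrast ratio. Image-sensor dynamic range is conventionally expressed as 20·log10 of the ratio between the brightest and darkest resolvable signals, so the short sketch below shows that 140 dB corresponds to roughly ten million to one.

```python
import math

# Converting sensor dynamic range between dB and linear contrast ratio.
# Convention for image sensors: dB = 20 * log10(brightest / darkest).

def db_to_ratio(db: float) -> float:
    return 10 ** (db / 20)

def ratio_to_db(ratio: float) -> float:
    return 20 * math.log10(ratio)

print(f"140 dB -> {db_to_ratio(140):,.0f} : 1 contrast ratio")   # 10,000,000 : 1
print(f"A plain 12-bit sensor: {ratio_to_db(2**12):.0f} dB")     # ~72 dB
```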

Another important example of progress in sensor technology and the autonomous vehicle ecosystem is that vehicles will be able to communicate with the road infrastructure itself. This can be crucial, for example, for alerting vehicles to dangerous road conditions or accidents ahead.

An autonomous driving ecosystem can improve the efficiency and safety of autonomous vehicles by defining and facilitating how vehicles communicate with the road network to receive such warnings. Short-range wireless communication is a key part of achieving this, but it is expensive to deploy across an entire road network and is vulnerable to hacking, so security mechanisms and cybersecurity solutions must be established.
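
The essential pattern is that a vehicle must verify a roadside warning before trusting it. Real V2X security (e.g., the certificate-based ECDSA signatures of IEEE 1609.2) is far more involved; the sketch below uses a shared-key HMAC from the Python standard library purely to illustrate the sign-and-verify flow, and the message fields are invented.

```python
import hmac, hashlib, json

# Illustrative "verify before trusting" flow for an infrastructure-to-
# vehicle warning. A shared-key HMAC stands in for real V2X signatures.

ROADSIDE_UNIT_KEY = b"demo-key-not-for-production"  # assumption for the demo

def sign_warning(payload: dict) -> dict:
    body = json.dumps(payload, sort_keys=True).encode()
    tag = hmac.new(ROADSIDE_UNIT_KEY, body, hashlib.sha256).hexdigest()
    return {"payload": payload, "tag": tag}

def verify_warning(message: dict) -> bool:
    body = json.dumps(message["payload"], sort_keys=True).encode()
    expected = hmac.new(ROADSIDE_UNIT_KEY, body, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, message["tag"])

msg = sign_warning({"type": "ICE_ON_ROAD", "road": "A1", "km_marker": 42})
assert verify_warning(msg)              # authentic warning is accepted

msg["payload"]["type"] = "ROAD_CLEAR"   # a tampered message...
assert not verify_warning(msg)          # ...is rejected
```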

As a result, 3M has also turned to a vision-based approach, recently announcing a partnership with ON Semiconductor to help improve navigation for vehicles with autonomous driving capabilities. Vision can complement wireless communication systems on major roads, since deploying wireless infrastructure on smaller roads and temporary routes may not be feasible.

 

Image sensors can now “see” far beyond what a human driver can. By co-developing image sensors with 3M, road signs can transmit additional information to vehicles, further assisting drivers beyond traditional advanced driver assistance systems (ADAS) and paving the way for autonomous driving. The results of the collaboration were exhibited at CES in January: ON Semiconductor's AR0234AT CMOS image sensor integrated with 3M's smart code sign technology.

The addition of ON Semiconductor's vision technology not only improves accuracy, provides redundancy, and enables vehicle-to-infrastructure communication where wireless systems cannot be deployed; the visibility of such systems also helps demystify the technology for the public, which can increase consumer trust in self-driving cars.
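
3M's on-sign encoding is proprietary, so the following is a purely hypothetical illustration of the pipeline stage that would follow detection: converting a machine-readable payload recovered from a sign into structured data a planner can consume. The field layout, bit widths, and sign types are all invented for this sketch.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical decoder for a machine-readable road-sign payload.
# The real 3M smart code sign format is proprietary; this layout is
# invented: a 16-bit word as [4b sign type][6b speed][6b distance].

@dataclass
class SignInfo:
    sign_type: str
    speed_limit_kph: Optional[int]
    distance_to_hazard_m: Optional[int]

SIGN_TYPES = {0: "SPEED_LIMIT", 1: "CURVE_AHEAD", 2: "ROADWORK"}

def decode_payload(payload: int) -> SignInfo:
    sign_type = SIGN_TYPES.get((payload >> 12) & 0xF, "UNKNOWN")
    speed = ((payload >> 6) & 0x3F) * 5 or None   # 5 km/h steps, 0 = absent
    dist = (payload & 0x3F) * 50 or None          # 50 m steps, 0 = absent
    return SignInfo(sign_type, speed, dist)

print(decode_payload(0b0000_010000_000000))  # SPEED_LIMIT, 80 km/h
```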

The processors in autonomous vehicles face considerable computing challenges: they must not only fuse the outputs of different sensors but also process the large volumes of data those sensors generate, especially the vision systems. The ecosystem is therefore critical for guiding companies' technology development and easing the load on automotive processors.
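
A rough back-of-the-envelope estimate shows why vision data dominates that load. The resolution, frame rate, bit depth, and camera count below are illustrative assumptions, not the specs of any particular sensor or vehicle.

```python
# Rough estimate of the raw data rate a single camera produces.
# All parameter values are illustrative assumptions.

width, height = 1920, 1200   # pixels
fps = 30                     # frames per second
bits_per_pixel = 12          # raw sensor output

bits_per_second = width * height * fps * bits_per_pixel
print(f"One camera: {bits_per_second / 1e9:.2f} Gbit/s raw")  # ~0.83 Gbit/s

cameras = 8                  # hypothetical multi-camera AV rig
print(f"{cameras} cameras: {cameras * bits_per_second / 1e9:.1f} Gbit/s "
      f"before lidar and radar are added")
```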

A typical example of such a development-ecosystem platform is NVIDIA DRIVE, a complete hardware and software ecosystem that lets system developers collaborate and use advanced development systems to accelerate the design and production of autonomous vehicles. DRIVE combines deep learning, sensor fusion, and surround vision to transform the driving experience and to target ISO 26262 ASIL-D, the highest functional safety level.

An example of this ecosystem in action was exhibited at the GPU Technology Conference in March, where NVIDIA and ON Semiconductor demonstrated an open, cloud-based platform that streams real-time image sensor data into NVIDIA DRIVE Constellation. This supports simulation for large-scale testing and validation, accelerating the development of safe, robust driverless vehicles.
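
Schematically, such a platform closes a loop: a simulator renders synthetic sensor data, the driving stack under test consumes it, and the resulting control commands feed back into the simulation. The toy loop below illustrates that structure only; none of these classes correspond to NVIDIA's actual APIs.

```python
import random

# Toy closed-loop simulation: simulator -> driving stack -> simulator.
# Every class here is a stand-in invented for illustration.

class Simulator:
    def __init__(self):
        self.ego_speed = 20.0  # m/s

    def render_frame(self) -> dict:
        # Stand-in for photoreal rendering plus sensor models.
        return {"obstacle_distance_m": random.uniform(5, 120),
                "ego_speed_mps": self.ego_speed}

    def apply(self, command: dict):
        # Integrate the commanded acceleration over a 0.1 s step.
        self.ego_speed = max(0.0, self.ego_speed + command["accel_mps2"] * 0.1)

class DrivingStack:
    def decide(self, frame: dict) -> dict:
        # Toy policy: brake if an obstacle is inside a 2-second headway.
        headway_m = 2.0 * frame["ego_speed_mps"]
        accel = -4.0 if frame["obstacle_distance_m"] < headway_m else 0.5
        return {"accel_mps2": accel}

sim, stack = Simulator(), DrivingStack()
for _ in range(100):  # 10 simulated seconds at 10 Hz
    command = stack.decide(sim.render_frame())
    sim.apply(command)
print(f"final simulated speed: {sim.ego_speed:.1f} m/s")
```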

Summary

The transportation industry is undergoing disruptive change. Nearly every automaker will put self-driving vehicles into production in the next few years, bringing many benefits to driving and to society, above all a dramatic reduction in road traffic accidents. Semiconductors are at the core of this innovation in emerging transportation. Technological progress has been rapid, but complexity and difficulty are growing exponentially. Automotive companies, technology companies, universities, and governments must collaborate to deploy autonomous vehicles safely, reliably, and on time. Development ecosystems are essential to guide and accelerate that work and to prove that autonomous vehicles are safe, efficient, and feasible.

