Stages of AV Adoption

I expect autonomous vehicle adoption to come in five stages, each of which will be approximately five years long.

We are currently in what I call Stage 0 (2016-2020), Development and Testing. AV development is fully underway, with $100 billion invested or committed, and the leading companies have already moved into testing. By 2019 and 2020, Waymo, Uber, and others will complete testing sufficiently to begin rapidly deploying autonomous ride services (ARS) in Stage 1.

Stage 1 (2021-2025), Launch of ARS, will be the most exciting stage of AV adoption. As the stage title implies, the focus will be on ARS. There will be what is essentially a land rush as major players deploy ARS fleets to key metropolitan markets during this stage. I estimate that as much as $200 billion will be invested to establish ARS during this stage. By the end of the stage (2025), almost a million ARS vehicles will be deployed, generating revenue of more than $150 billion. During this stage, there will also be significant adoption of autonomous vehicles in the retail market, as well as in trucking and delivery. The disruptions previously discussed will begin toward the end of this stage.

Stage 2 (2026-2030) will bring broad acceptance of AVs. ARS will continue to grow, with revenue approaching $750 billion, and more than 15% of miles traveled will be by ARS. The automotive retail market will continue to shrink, and by the end of this stage, potentially 75% of new vehicles sold will be sufficiently autonomous to drive on their own most of the time. This is also the stage where the disruptions of AVs will become apparent to everyone.

Stage 3 (2031-2035), More Advanced AVs, and Stage 4 (2036+), AVs Completely Displace Cars, will be the next two stages in the adoption of AVs. By then the horseless carriage, a little more than a century after its introduction, will have passed into history, replaced by the driverless autonomous vehicle.

For a more detailed explanation and forecasts, see Autonomous Vehicles: Opportunities, Strategies, and Disruptions.

Lessons Learned from the Fatal Uber AV Accident

Michael E. McGrath

Autonomous vehicles (AVs) offer the prospect of enormous changes to our lifestyles and the creation of major industries, so it’s important to examine the lessons learned from the recent fatal Uber AV accident in Arizona. On March 18th, Elaine Herzberg was walking her bicycle far from the crosswalk on a four-lane road in the Phoenix suburb of Tempe at about 10 PM when she was struck by an Uber autonomous vehicle traveling at about 38 miles per hour.

The minimum expectation for an AV is that it be as safe as a human driver, and that appears to be the case in this accident. This was not an example of an AV being less safe than a human driver. “It’s very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway,” stated the local police chief. AVs, unlike human-driven cars, fully record all video and vehicle data. Based on the AV’s onboard video recorder, the pedestrian appeared suddenly from the dark and was visible for only 1 second before impact, and she didn’t appear to notice the car even though the headlights were shining directly on her. A human driver typically needs about 2 seconds to perceive and react to a situation like this. While Uber’s AV may have performed as well as a human driver, we expect more from AVs.
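Some rough arithmetic makes the timing concrete. Using the figures above (about 38 mph, 1 second of visibility, and an assumed 2-second human reaction time), the car covers roughly twice the visible-sighting distance before a typical human driver would even begin to brake:

```python
# Back-of-the-envelope arithmetic for the scenario described above.
# Figures from the article: vehicle speed ~38 mph, pedestrian visible on
# camera for ~1 second, human reaction time ~2 seconds. The unit
# conversion is standard; the scenario numbers are illustrative.

MPH_TO_FT_PER_S = 5280 / 3600  # 1 mph = ~1.467 ft/s

speed_fps = 38.0 * MPH_TO_FT_PER_S                 # ~55.7 ft/s

visible_time_s = 1.0        # time the pedestrian was visible on video
human_reaction_s = 2.0      # assumed perceive-and-react time

distance_while_visible = speed_fps * visible_time_s      # ~56 ft
distance_during_reaction = speed_fps * human_reaction_s  # ~111 ft

print(f"Distance covered while pedestrian was visible: {distance_while_visible:.0f} ft")
print(f"Distance covered during a 2 s human reaction:  {distance_during_reaction:.0f} ft")
```

In other words, a human driver at that speed would travel on the order of a hundred feet before braking even began, which is why visibility measured in single seconds leaves so little margin.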

This appears to be a case where an AV should have outperformed a human driver. By the time she came into view of the camera, she had crossed the left lane and was moving into the right lane. Although hidden by the darkness, she was in open space on the road for at least a couple of seconds. In addition to video, the Uber AV used radar and lidar sensors capable of seeing the surroundings in complete darkness, which should have detected Herzberg’s presence in the roadway. The reason the Uber AV’s lidar failed to detect her in time is still undetermined. It could have been a blind spot in its sensor array, a failure to classify the object properly, or something else. Reuters subsequently reported that Uber reduced the number of sensors, including lidar, when it moved from its earlier Ford Fusion test vehicles to the newer Volvo test vehicle. It’s possible that this reduction created a blind spot in the sensor coverage.

This accident raises questions about the appropriate sensor package. Most AVs use multiple sensors and fuse together what each detects, but some rely primarily on video sensing, seeing what the human eye sees. This accident may cast doubt on relying too heavily on video sensors. It also suggests there may be a benefit to adding thermal sensors to the package. An AV cannot react to every object in motion: slamming on the brakes when a bag blows across the road can itself cause an accident, but if the object is a human or an animal, braking should be initiated. These objects have distinct thermal signatures; humans and animals on or near the road are far warmer than their surroundings.
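The kind of decision rule described above can be sketched very simply. This is a hypothetical illustration only, not any vendor’s actual logic: the function name, thresholds, and the idea of reducing a thermal reading to a single temperature value are all simplifying assumptions.

```python
# Hypothetical sketch of folding a thermal reading into a braking
# decision, as described above. All names and thresholds are
# illustrative assumptions, not real AV software.

AMBIENT_TEMP_C = 15.0      # assumed ambient road/background temperature
WARM_BODY_DELTA_C = 10.0   # objects this much warmer are likely living

def should_brake(object_in_path: bool, object_temp_c: float) -> bool:
    """Brake for warm objects (people, animals) in the vehicle's path;
    ignore cold debris such as a wind-blown bag, which sits at roughly
    ambient temperature."""
    if not object_in_path:
        return False
    return object_temp_c - AMBIENT_TEMP_C >= WARM_BODY_DELTA_C

# A pedestrian (skin temperature well above ambient) triggers braking,
print(should_brake(True, 33.0))   # True
# while a plastic bag at ambient temperature does not.
print(should_brake(True, 15.0))   # False
```

A real system would fuse thermal data with lidar, radar, and video rather than rely on a single threshold, but the sketch shows why a heat signature is a useful discriminator between debris and living obstacles.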

AVs learn far better than humans do. While a human driver involved in this type of accident may learn from it personally, the hundreds of millions of other drivers won’t. This is a big difference between human drivers and AVs. Every AV will learn from this experience and be more cautious in the same conditions. The software, which is the brains of the AV, will store this case in memory to be called upon in any similar circumstances. This ability to distribute learning will make AVs increasingly safer over time. I expect that within a few days of the accident, all companies developing AVs had verified this scenario and made sure their vehicles would have detected the pedestrian and taken corrective action. It is unlikely that a similar accident will occur with any AV in the future.

This accident shows that humans are unpredictable and that unpredictable behavior needs to be accounted for in an AV’s artificial intelligence. The victim in this case appeared to walk directly in front of the vehicle without noticing it. Almost 6,000 pedestrians were killed in the United States in 2016, and most of these were not using crosswalks.

There is also a lesson here on news reporting. A headline that you won’t see is: “Autonomous vehicle avoids hitting pedestrian who would have been killed by a human driver”. Successes aren’t anything more than entries in a database for the artificial intelligence guiding AVs. They don’t make news stories. Unfortunately, this tragic accident could have been simply another entry in that database. So we need to expect that AV incident reports in the press will be overwhelmingly negative. Eventually, there may be reports of deaths caused by drunk drivers that an AV would have avoided. In the meantime, one-sided news may make people more cautious and may motivate regulation of AVs.

This accident also suggests that some AVs may be better and safer than others because of the way they are designed and developed. We already saw a glimpse of this when Waymo claimed that its AVs would have spotted and avoided the pedestrian in the Uber accident. These differences among AVs may raise many interesting competitive issues.

Finally, the publicity of this accident could make pedestrian traffic safer. Herzberg walked her bike from the center median across two lanes when she was struck by the vehicle. The large median at the site of the crash has signs warning people not to cross and to use the crosswalk instead, but the median also has a brick pathway that accommodates people who do cross there.

Autonomous vehicles will profoundly change transportation as we currently know it, and in the process, significantly improve lifestyles and create major new industries. However, it’s important that they be implemented carefully, and the lessons from this incident can help.