A fatal pedestrian crash involving a self-driving Uber SUV in a Phoenix suburb could have far-reaching consequences for the new technology as automakers and other companies race to be the first with cars that operate on their own.
The crash Sunday night in Tempe was the first death involving a fully autonomous test vehicle. The Volvo was in self-driving mode with a human backup driver at the wheel when it struck 49-year-old Elaine Herzberg as she walked a bicycle outside the lines of a crosswalk, police said.
Uber immediately suspended all road testing of its autonomous vehicles in the Phoenix area, Pittsburgh, San Francisco and Toronto. The ride-sharing company has been testing self-driving vehicles for months as it competes with other technology companies and automakers like Ford and General Motors.
Many in those industries had been dreading a fatal crash but knew one was inevitable.
Tempe police Sgt. Ronald Elcock said local authorities haven’t determined fault but urged people to use crosswalks. He told reporters at a news conference Monday that the Uber vehicle was traveling around 40 mph when it struck Herzberg as she stepped onto the street.
Neither she nor the backup driver showed signs of impairment, he said.
“The pedestrian was outside of the crosswalk, so it was midblock,” Elcock said. “And as soon as she walked into the lane of traffic, she was struck by the vehicle.”
The National Transportation Safety Board, which makes recommendations for preventing crashes, and the National Highway Traffic Safety Administration, which can enact regulations, sent investigators.
Uber CEO Dara Khosrowshahi expressed condolences on his Twitter account and said the company is cooperating with investigators.
The public’s image of the vehicles will be defined by stories like the crash in Tempe, said Bryant Walker Smith, a University of South Carolina law professor who studies self-driving vehicles. It may turn out that there was nothing either the vehicle or its human backup could have done to avoid the crash, he said.
Either way, the fatality could hurt the technology’s image and lead to a push for more regulations at the state and federal levels, Smith said.
Autonomous vehicles with laser, radar and camera sensors and sophisticated computers have been billed as the way to reduce the more than 40,000 traffic deaths a year in the U.S. alone. Ninety-four percent of crashes are caused by human error, the government says.
Self-driving vehicles don’t drive drunk, don’t get sleepy and aren’t easily distracted. But they do have faults.
“We should be concerned about automated driving,” Smith said. “We should be terrified about human driving.”
In 2016, the most recent year for which figures are available, more than 6,000 U.S. pedestrians were killed by vehicles.
The federal government has voluntary guidelines for companies that want to test autonomous vehicles, leaving much of the regulation up to states.
Many states, including Michigan and Arizona, have taken a largely hands-off approach, hoping to gain jobs from the new technology, while California and others have taken a harder line.
California is among states that require manufacturers to report any incidents during the testing phase. As of early March, the state’s motor vehicle agency had received 59 such reports.
Arizona Gov. Doug Ducey used light regulations to entice Uber to the state after the company had a shaky rollout of test cars in San Francisco. Arizona has no reporting requirements. Hundreds of vehicles with automated driving systems have been on Arizona’s roads.
Ducey’s office expressed sympathy for Herzberg’s family and said safety is the top priority.
The crash in Arizona isn’t the first involving an Uber autonomous test vehicle. In March 2017, an Uber SUV flipped onto its side, also in Tempe. No serious injuries were reported, and the driver of the other car was cited for a violation.
Herzberg’s death is the first involving a fully autonomous test vehicle but not the first in a car with some self-driving features. The driver of a Tesla Model S was killed in 2016 when his car, operating on its Autopilot system, crashed into a tractor-trailer in Florida.
The NTSB said that driver inattention was to blame but that design limitations with the system played a major role in the crash.
The U.S. Transportation Department is considering further voluntary guidelines that it says would help foster innovation. Proposals also are pending in Congress, including one that would stop states from regulating autonomous vehicles, Smith said.
Peter Kurdock, director of regulatory affairs for Advocates for Highway and Auto Safety in Washington, said the group sent a letter Monday to Transportation Secretary Elaine Chao saying it is concerned about a lack of action and oversight by the department as autonomous vehicles are developed. That letter was planned before the crash.
Kurdock said the deadly accident should serve as a “startling reminder” to members of Congress that they need to “think through all the issues to put together the best bill they can to hopefully prevent more of these tragedies from occurring.”
…