Michelle Cordova
Thursday, March 30, 2017
A recent crash involving an Uber Technologies Inc. driverless car suggests autonomous software sometimes takes the same risks as the humans it may one day replace.
The accident on Friday in Tempe, Arizona, caused no major injuries. Another human-driven car turning left failed to yield, hit the Uber car and flipped it on its side. After a short pause, the company’s self-driving test fleet was back on public roads in Tempe, Pittsburgh and San Francisco early this week.
But the Tempe Police Department report, released Wednesday, recounts a complex story.
The Uber Volvo SUV, outfitted with autonomous driving sensors, was heading south on a wide boulevard with a 40 mile-per-hour speed limit. It had two of the company’s test drivers in front and no paying passengers. The light turned yellow as the vehicle entered an intersection. A green Honda on the other side of the road was trying to make a left turn at the light. The driver thought the way was clear and turned into the oncoming Uber SUV, according to the police report.
In a statement to police, Patrick Murphy, an Uber employee in the car, said the Volvo SUV was traveling 38 miles per hour, a notch below the speed limit. He said the traffic signal turned yellow as the Uber vehicle entered the intersection. He then saw the Honda turning left, but “there was no time to react as there was a blind spot” created by traffic. The Honda hit Uber’s car, pushing it into a traffic pole and causing it to turn on its side.
During the event, the Uber vehicle was in autonomous mode, a spokeswoman for the company and the Tempe police said.
Others involved in the accident, though, didn’t imagine a robot behind the wheel. Alexandra Cole, the driver of the Honda, told police that she could not see any cars coming when she decided to make the left turn. “Right as I got to the middle lane about to cross,” she wrote, “I saw a car flying through the intersection.”
Another witness told police that Cole was not at fault. “It was the other driver’s fault for trying to beat the light and hitting the gas so hard,” Brayan Torres told police in a statement. “The other person just wanted to beat the light and kept going.”
Eyewitness accounts can often be unreliable, and other witnesses in the police report did not say that the Uber car was at fault — something the police agreed with. Still, Torres’s account raises the question of whether Uber’s self-driving sensors spotted the light turning yellow and, if so, whether it decided it could safely continue through the intersection.
One of Uber’s self-driving SUVs ran a red light in San Francisco last year, and on five other occasions the company’s mapping system for its cars failed to recognize traffic lights in the area, the New York Times reported in February.
Uber’s problems show the potential hurdles to winning approval for autonomous vehicles from the public and regulators. The company and rivals, including Alphabet Inc.’s Waymo and major automakers, are working to tweak their software to handle “edge cases” such as unusual driving conditions.
Self-driving cars have more often been criticized for driving too cautiously, slowing or stopping where human drivers would be more aggressive. Autonomous vehicles operated by Waymo have been rear-ended because of such behavior, and the company has been working to make its system drive more like a human.
There is a potential upside for Uber from the Tempe crash: It now has rich, unique data to use in its self-driving program. Last year, after a Waymo car bumped into a bus, the company said it used the incident, and “thousands of variations on it,” to refine its software.
“This is a classic example of the negotiation that’s a normal part of driving — we’re all trying to predict each other’s movements,” the company added.