Thursday, June 22 2017
Davy Andrews is so adept at technology that he’s become the de facto IT troubleshooter in his office. But there’s one bit of tech he won’t touch: self-driving cars.
“I wouldn’t want to be the first to jump into something with that kind of risk,” said Andrews, 33, an administrative assistant at a New York investment firm. “I would have to see enough evidence that it is safer, considerably safer. From where we are right now, it’s hard to imagine getting to that point.”
Autonomous autos are advancing so rapidly that companies like Uber Technologies Inc. and Alphabet Inc.’s Waymo are beginning to offer robot rides to everyday consumers. But it turns out the traveling public may not be ready. A recent survey by the American Automobile Association found that more than three-quarters of Americans are afraid to ride in a self-driving car. And it’s not just Baby Boomers growing increasingly fearful of giving up the wheel to a computer, a J.D. Power study shows — it’s almost every generation.
“One of the greatest deterrents to progress in this field is consumer acceptance,” U.S. Transportation Secretary Elaine Chao told Bloomberg News last week at a department-sponsored conference in Detroit. “If there’s public concern about safety, security and privacy, we will be limited in our ability to help advance this technology.”
Most commuters don’t have access to a self-driving car, so Chao has called on Silicon Valley to “step up” and explain how they work. She and other regulators advocate autonomy as a solution for curbing the hundreds of horrific collisions that happen every day in regular automobiles. Among those that end up being fatal, 94 percent are caused by human error, according to U.S. authorities.
Consumers will only become comfortable with driverless cars after they ride in them, Mary Barra, the chief executive officer of General Motors Co., said this week. The largest U.S. automaker is testing 180 self-driving Chevrolet Bolts and ultimately plans to put them in ride-hailing fleets, though it won’t say when.
“You can talk about it, but until you experience it,” self-driving cars are hard to comprehend, Barra told reporters at the GM factory building the Bolts north of Detroit. “Once you’re in the vehicle and you see the technology, you understand how it works.”
The opportunity for autonomy to make a meaningful impact on public safety is immense. Last year, 40,200 people died in motor-vehicle accidents on U.S. roads, the National Safety Council estimates. That was up 6 percent from the year before.
“Forty thousand people a year is unacceptable,” Alex Epstein, the council’s senior director of digital strategy, said during a panel discussion at the TU-Automotive technology conference in Detroit last week. “It’s a jumbo jet going down every couple days.”
Dangerous as it may be to operate cars themselves, many drivers are anxious about autonomous technology because they associate it with the fragility of electronic devices. Laptops crash and calls drop with nagging regularity. The consequences of a computerized car crashing are much greater.
“While it might be convenient to have a car drive for you, driving is a very high-stakes pursuit,” said Andrews, who has no interest in letting a robot take the wheel of his Volvo. “When things go wrong, it’s not the same as a normal computer error.”
Another culprit killing consumer confidence has been automakers over-hyping the capabilities of today’s driver-assist technologies. That’s led some drivers to drop their hands from the wheel even with systems built to require the driver’s constant attention to the traffic environment, as was the case in last year’s fatal crash of a driver in a Tesla operating in the semi-autonomous Autopilot mode.
Respondents to J.D. Power’s survey mentioned the Tesla crash and recognized that vehicles with autonomous features can still get into accidents, said Kristin Kolodge, executive director of J.D. Power’s driver-interaction research.
“When you’re not in control and the vehicle is in control, now you’re in this dark space where you wonder, ‘What actually happens if the technology fails?’” she said. “This fear of failure is the major reason” consumers are wary.
Regulators investigated the Tesla crash and cleared the company’s Autopilot system of fault in January. And the company hasn’t been the only one to come under scrutiny — Daimler AG last year pulled Mercedes-Benz ads that consumer groups complained had wrongly suggested its E-Class sedan with driver-assist features was fully autonomous.
The television spot showed the driver removing his hands from the wheel, even though the automaker’s Drive Pilot system requires resuming control every 30 seconds.
“The fastest way to make sure the public does not accept these technologies is to over-promise and then have some horrific crash because the consumer believed the capability was higher than it actually was,” Epstein said.
Another impediment to consumer acceptance may arise from semi-autonomous features, which are meant to inspire confidence but instead can feel unnatural and annoying, said Lukas Kuhn, chief technology officer at Tourmaline Labs Inc., a California company that analyzes driving behavior for insurance and ride-sharing companies.
Driver-assist features like adaptive cruise control, which adjusts speed to the flow of traffic, and lane-keeping assistance, which steers a car back between the lines, can feel intrusive rather than intuitive.
“In order to make the user buy into the feature, we have to make it feel more natural,” Kuhn said. “If I can drive this car way better than the machine, why should I take my hands off the wheel?”