Uber’s Self-Driving Car Passengers Were Signing Their Lives Away

The company has rolled out a self-driving fleet in Pittsburgh. But passengers might not be aware of how experimental the pilot still is.

Uber’s fleet of self-driving cars in Pittsburgh is exciting for anyone interested in the future of transportation, but it could pose a serious risk to the passengers riding in the vehicles.

A new report from The Guardian, citing documents obtained under public records laws, shows that until June, Uber required anyone riding in one of its self-driving cars to sign a legal document releasing the company from liability in case of injury or death.

The report shows that a senior Pittsburgh police officer had signed a waiver that read:

“I acknowledge that some or all of the [autonomous vehicles] in which I ride are in a development phase, are experimental in nature, and that riding in an [autonomous vehicle] may involve the potential for death, serious injury, and/or property loss.”

The document also said:

“Risks associated with riding in an [autonomous vehicle] may include, without limitation, those caused by equipment failure, development vehicle operators or other safety drivers, actions of other motorists, weather, temperature, road conditions, negligence or human error.”

Additionally, Uber’s Advanced Technologies Center group ran a friends-and-family program this summer in which riders were asked to assume full responsibility for riding in Uber’s self-driving cars. The document they were asked to sign released Uber from “any risks that may arise from the negligence or carelessness of [Uber and the ATC], operation of the AVs and/or dangerous or defective equipment.”

The sketchiest part of the agreement asked riders to release Uber from liability even after death. According to The Guardian, this clause said, “I hereby agree on behalf of myself, my executors, administrators, heirs [and] next of kin.”

In addition to the legal documents asking passengers to release Uber from liability for death or injury caused by its self-driving cars, The Guardian also points to public comments from company executives. In August, Uber CEO Travis Kalanick told Bloomberg Businessweek, “Nobody has set up software that can reliably drive a car safely without a human.”

The disclosures raise serious questions about whether Uber’s self-driving cars are safe enough for public use and whether they were rushed to market. These concerns have been raised before. But the story gets even stranger: passengers who take Uber’s self-driving taxis in Pittsburgh are not required to sign any waivers and are not explicitly told of the risks involved with the ride. In fact, passengers can’t even choose whether they receive a self-driving car for pickup.

Pittsburgh passengers who hail an Uber within a 12-square-mile area of downtown between 7 a.m. and 10 p.m. can be randomly assigned a self-driving Ford Fusion rather than a normal, human-operated vehicle. For now, rides in the self-driving cars are free.

These new revelations about the risks associated with Uber’s self-driving taxi fleet come in the wake of a wave of bad press for self-driving and driver-assistance technologies.

Last Friday, one of Google’s self-driving cars was involved in an accident when a driver ran a red light and collided with the Google car as it passed through a green light. The crash caused airbags to deploy and crushed both right-side doors of the vehicle. Both the operator of the Google car and the autonomous system itself applied the brakes upon realizing another car was running the red light. Sadly, it wasn’t enough to prevent the collision.

A Google spokesperson told 9to5Google: “Our light was green for at least six seconds before our car entered the intersection. Thousands of crashes happen every day on US roads, and red-light running is the leading cause of urban crashes in the US. Human error plays a role in 94 percent of these crashes, which is why we’re developing fully self-driving technology to make our roads safer.”

Earlier this month, former Tesla Autopilot supplier Mobileye ended its relationship with the company over safety concerns. Company CTO Amnon Shashua said Tesla was “pushing the envelope in terms of safety.” He explained, “[Autopilot] is not designed to cover all possible crash situations in a safe manner…No matter how you spin it, [Autopilot] is not designed for that. It is a driver-assistance system and not a driverless system.”

Although Tesla’s Autopilot feature is different from Uber’s self-driving car technology (the former relies on cameras, radar, and ultrasonic sensors, while the latter uses LiDAR), both have drawn criticism for putting passengers in danger. In July, former Tesla Autopilot engineer Eric Meadows told CNN Money he was pulled over by police on suspicion of drunk driving while using Autopilot mode. He said the car was struggling to make sharp turns on its own. The engineer said he was “pushing the limit” of Autopilot mode and assumed customers would do the same thing.

Luckily, each self-driving car in Pittsburgh carries two Uber employees: a safety driver and a vehicle operator. But as we learned from the Google crash last week, sometimes even an operator behind the wheel isn’t enough to prevent a crash.
