Fatal Uber crash raises red flags about self-driving safety

Every day, as he goes to and from work, Arizona State University urban planning professor David King rides his bike past the intersection where Elaine Herzberg was killed on Sunday night. The seven-lane road (counting turn lanes) in Tempe, Arizona, is wide open, with no bushes or parked cars for a person to jump out from behind. In the immediate vicinity are a large park, an office building, and a nightclub that’s closed on Sundays—few potential distractions for a driver negotiating the area.

Herzberg, a 49-year-old woman who was homeless, was pushing a bicycle laden with her belongings along this road when she was struck by a self-driving Uber vehicle around 10 p.m. Sunday. She later died at a hospital, gaining the grisly distinction of being the first known pedestrian to be killed by a self-driving car.

The Uber vehicle, which was in autonomous mode with a backup operator behind the wheel, was traveling 38 mph at the time of the crash (some reports claimed the car was speeding, but the road has a 45 mph limit), and it made no attempt to slow down or brake, according to police reports.

To King, whose research focuses on the urban impacts of new transportation technologies, the location of the crash—and how it happened—raises red flags about Uber’s approach to road safety. Since Uber arrived in Tempe in March 2017, he’s often seen Uber vehicles testing in that exact spot, charting details of the roadways to refine the company’s internal maps. This seemed like familiar territory for them. Based on what is known about Uber’s technology, King said, a pedestrian or other obstacle should have been readily detected by the AV.

“If there is any real-world scenario where it would be seemingly safe to operate in an automated mode, this should have been it,” he said. “Something went seriously wrong.”

Precisely what went wrong may be unlocked by the federal and local investigations now underway. Already, though, law enforcement officials interpreting video footage from the Uber vehicle’s external cameras seem to have placed the blame squarely on the victim: on a multi-lane corridor with scant crosswalks, Herzberg was crossing outside of one.

“The driver said it was like a flash, the person walked out in front of them,” Sylvia Moir, the chief of the Tempe Police Department, told the San Francisco Chronicle. Viewing the videos, “it’s very clear it would have been difficult to avoid this collision in any kind of mode (autonomous or human-driven) based on how she came from the shadows right into the roadway,” she said.

That video footage has not yet been made public, however, and other observers say it’s too soon to draw conclusions about a situation with no precedent. The fundamental safety promise of autonomous vehicles, after all, is their ability to automatically detect and brake for people, objects, and other vehicles using laser-based LIDAR systems: in darkness and in light, they are supposed to drive far more safely than humans. King believes that releasing those videos, as well as the onboard vehicle data, would be a step toward transparency by Uber and law enforcement—as well as a signal to the public that safety is a priority, whether the blame rests on Uber’s software, its employee, or the victim.

Alain Kornhauser, the director of Princeton University’s transportation program and the faculty chair of Princeton’s Autonomous Vehicle Engineering research group, suggested via email that this crash may have been “a crash that was waiting to happen,” due to the design of the roadway. (What this crash, and the presence of AVs, means for homeless individuals living on the streets is yet another open question.) Kornhauser agreed that the crash data must be made available so that the public and research communities can learn from it.

“We have all of the data to precisely reconstruct what the system knew, when it knew it, and what it did about it,” he wrote. “Out of that, we should be able to recommend improvements to sensors and improvements in reactions that might have avoided or reduced the severity of this crash.”

Whether these future fixes should still be tested on public roads is another uncertainty. Autonomous vehicle developers insist that testing must occur on real-world streets in order to advance the technology to real-world readiness. But those already wary of the technology—which more than half of the American population may be—might wonder why more companies can’t test in places like Mcity, the purpose-built testing grounds on the campus of the University of Michigan, where self-driving cars learn to drive on artificial courses built to resemble real-world road scenarios.

The problem is that the artificial intelligence software that underpins self-driving cars will, at some point, have mastered these testing grounds without having mastered reality. There are limits to controlled experiments, said David Levinson, a professor of civil engineering at the University of Sydney who studies the impacts of new mobility technology. “How do you get from here to there?” he wondered via email. If the public desires the safety benefits of autonomous vehicles—a dramatic reduction in the tens of thousands of deaths from human-caused car crashes annually—it may have to consent to live with these beta versions driving in its midst.

That doesn’t necessarily absolve Uber of potential blame, though. The reason that the company, like others, keeps backup drivers behind the wheel is that the technology is not yet ready to be fully driverless. The humans are there to override the system and react. In the case of the backup driver involved in Sunday night’s incident, “how did he miss the victim too?” Levinson wondered.

On Twitter, Streetsblog’s Angie Schmitt notes that the safety rate for AVs, collectively, is now officially worse than it is for human drivers. “Human-driven cars kill 1.25 people on average per 100 million miles traveled,” she tweeted, citing NHTSA data. Uber has driven about 2 million miles in AVs on U.S. roads. Waymo, Google’s self-driving car unit and the industry’s current leader in terms of mileage, has driven about 4 million, and has staked its reputation on the safety of its vehicles. At Uber’s self-driving car unit, meanwhile, the reported running joke among employees is “Safety Third.”
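To make the arithmetic behind Schmitt’s point explicit—using only the mileage figures cited above, and with the caveat that a single fatality is far too small a sample to be statistically conclusive—one death over the roughly 6 million combined autonomous miles from Uber and Waymo works out to:

\[
\frac{1 \text{ death}}{6 \times 10^{6} \text{ miles}} \times 10^{8} \text{ miles} \approx 16.7 \text{ deaths per 100 million miles,}
\]

more than ten times the 1.25 rate NHTSA reports for human drivers.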

About 20 states now permit some form of autonomous vehicle testing. While many statehouses have enacted safety, insurance, and tax laws specific to AVs, Arizona has opened its arms to the industry, clearing a virtually regulation-free path for companies to rack up mileage on public roads. Kevin Biesty, the deputy director for policy at Arizona’s Department of Transportation, told Reuters in September 2017 that the state’s approach is instead to let automakers tweak the technology on their own as issues come up. “One of the reasons we did not step forward and regulate is because the industry is changing so fast and what you release today might become obsolete in six months,” he said at the time.

Whether Sunday’s incident will be an impetus for stronger regulations, as many traffic safety advocates have called for, remains to be seen.

Alongside Uber, rival firms like Waymo, GM, Tesla, and Intel also pilot vehicles in various Arizona communities. Testing by these and other companies has become a familiar sight to locals in other U.S. cities, too, including Pittsburgh, San Francisco, Ann Arbor, and Miami. Yet in some cases, AV experimentation has occurred without explicit support—let alone enthusiasm—from communities. Pressed by an industry concerned with avoiding a “patchwork” of rules, many state legislatures have already moved to block cities from passing their own safety protections and requirements. Lawmakers in Congress, too, are moving to preempt state and local governments in the same way. But neither Congress nor the U.S. Department of Transportation has issued clear rules, leaving a regulatory void at all levels.

That contrasts sharply with the approach to self-driving vehicles in Europe, where city authorities are the ones establishing the rules of play. Linda Bailey, the executive director of the National Association of City Transportation Officials, released a statement on Monday that highlighted the need for local leadership to guide the technology’s testing and deployment.

“Cities need vehicles to meet a clear minimum standard for safe operations so the full benefits of this new technology are realized on our complex streets,” Bailey said. “We cannot afford for companies’ race-to-market to become a race-to-the-bottom for safety.”

Published on citylab.com on March 30, 2018. 