First Responders Work With Developers to ‘Teach’ Self-Driving Cars to Pull Over
On a dark Friday morning in November, a police officer began tailing a Tesla Model S traveling 70 mph on California’s Route 101, between San Francisco International Airport and Palo Alto. The car’s turn signal was blinking, but it kept passing exits. The officer pulled up alongside, saw the driver slumped over with his head down, and guessed the car was driving itself under what Tesla calls Autopilot. The officer’s lights and sirens, however, failed to rouse the driver.
Every Tesla is equipped with hardware the automaker says could someday enable its vehicles to drive themselves on entire trips, from parking space to parking space, with no driver input. Currently, Tesla limits the system to guiding cars from on-ramp to off-ramp on highways. Here, the car kept driving, safely, with a seemingly incapacitated driver, but it didn’t know how to obey police sirens and pull over.
In this case, there was no way for police to commandeer the car, so they improvised; while a patrol car blocked traffic from behind, the officer following the Tesla pulled in front and began to slow down until both cars came to a stop.
The incident encapsulates both the hopes and anxieties of a driverless future. The 45-year-old Tesla driver failed a field sobriety test, according to the police, and was charged with driving under the influence. The car, which seems to have navigated about 10 miles of nighttime highway driving without the aid of a human, may well have saved a drunk driver from harming himself or others. Neither Tesla nor the police, however, are ready for people to begin relying on the technology in this way.
Drivers, according to Tesla’s disclaimer, are supposed to remain “alert and active” when using Autopilot, and be prepared to take over control if, for instance, the police approach. If the car doesn’t sense hands on the wheel, it’s supposed to slow to a stop and put on its hazard lights. Two days after the incident, Tesla Inc. CEO Elon Musk tweeted that he was “looking into what happened here.” A company spokesperson referred back to Musk’s tweet and declined to share anything the company had learned from the car’s data log.
Exactly. Default Autopilot behavior, if there’s no driver input, is to slow gradually to a stop & turn on hazard lights. Tesla service then contacts the owner. Looking into what happened here. — Elon Musk (@elonmusk) December 3, 2018
The police who stopped the Tesla had never used the technique before; they just knew enough about the car to improvise a response.
“That’s great adaptation,” said Lt. Saul Jaeger of the nearby Mountain View Police Department. Such familiarity is to be expected, perhaps, in the heart of Silicon Valley — the incident occurred halfway between the headquarters of Facebook and Google — but relying on quick-witted law enforcement is not a long-term answer for the safety issues at hand. Automakers, engineers, lawmakers and police must work through questions including how police can stop an autonomous car, what automated vehicles should do after collisions, and how automated cars can be programmed to recognize human authorities.
Five years ago Massachusetts Institute of Technology roboticist John Leonard began taking dashcam videos while driving around Boston, looking to catalog moments that would be difficult for artificial intelligence to navigate. One night he saw a police officer step into an intersection to block traffic and allow pedestrians to cross against the light. He added that to his list.
Of all the challenges facing self-driving technology, “what if” instances like these are among the most daunting, and a big part of the reason truly driverless cars are going to arrive “more slowly than many in the industry predicted,” Leonard said. He’s in a position to know: Leonard took leave from MIT in 2016 to join the Toyota Research Institute and help lead the automaker’s AV efforts.
Waymo, the autonomous-driving startup launched by Google’s parent company and now serving passengers in the Phoenix area, has already run up against the scenario that worried Leonard. In January a sensor-laden Chrysler Pacifica minivan in Waymo’s automated ride-hailing fleet rolled up to a darkened stoplight in Tempe, Ariz. The power had gone out, and a police officer was in the roadway directing traffic. In dashcam footage provided by Waymo, shown alongside a rendering of the car’s computer vision, the minivan stops at the intersection, waits for the cross traffic and a left-turning car coming the other way, then proceeds when waved through by the officer.
Waymo spokeswoman Alexis Georgeson said the company’s fleet can distinguish between civilians and police standing in the roadway and can follow hand signals. “They will yield and respond based on recognition that it is a police officer,” she said. “Our cars do really well navigating construction zones and responding to uniformed officers.”
A Chrysler Pacifica, equipped with Waymo's self-driving system, undergoes testing in Arizona. (Caitlin O'Hara/Bloomberg News)
Waymo is taking a geofenced approach to autonomous vehicles, focusing on developing fleets that could serve as taxis in limited areas and stopping short of full, go-anywhere autonomy, a not-yet-reached threshold known in the industry as Level 5. Working in a limited space allows both for building detailed maps and for easier coordination with government and law enforcement. Rather than trying to launch across different jurisdictions, Waymo picked Chandler, a Phoenix suburb with wide avenues, sunny weather and welcoming state and local governments, for its first living laboratory.
Many of its competitors are taking a similar approach, focusing on fleets that stay within defined boundaries. Ford Motor Co. is testing in Miami and Washington, D.C.; General Motors Co.’s Cruise, Zoox Inc. and Toyota Motor Corp. are among dozens of companies testing autonomous cars in California.
In the summer of 2017, about a year and a half before the debut of its service in Chandler, Waymo invited local police, firefighters and ambulance personnel to a day of testing in which trucks and patrol cars — sirens blaring and lights flashing — approached the driverless minivans from all angles on a closed course. “We’ve had a lot of interaction with their personnel on the research and development of their technology,” said Chandler spokesman Matt Burdick.
Last year, Waymo became the first AV maker to publish a law enforcement interaction protocol. If one of its self-driving cars detects police behind it with lights flashing, the document says, it’s “designed to pull over and stop when it finds a safe place to do so.”
Jeff West, a battalion chief with the Chandler Fire Department, said the Waymo vehicles he’s seen on the road have been quicker to move out of the way than many human-driven cars. “Once it recognizes us, it pulls over,” he said, “versus someone maybe listening to a radio, or they got their air conditioner on.”
Stolen stop signs, goods falling from trucks, and smoke billowing from a car on fire. We’ve seen a lot out on the road. Waymo’s Head of Research speaks to how we are building AI to handle situations that happen once in a million miles. https://t.co/8SUJ9zpNjq — Waymo (@Waymo) February 12, 2019
For now, however, most Waymo cabs have a safety driver at the wheel to take over in any situation that might stump the car. But there haven’t been any run-ins between local police and a human-free driverless car yet, Burdick said.
Should that day come, said Matthew Schwall, head of field safety at Waymo, the police can get in touch with the company’s support team by either calling a 24-hour hotline or pushing the minivan’s help button above the second row of seats. At that point, Waymo’s remote staff can’t take direct control of the vehicle, but they can reroute it — if, for instance, the police want it to move to the side of the roadway after a collision.
Michigan state trooper Ken Monroe took Ford engineers on ride-alongs around Flint last summer. The engineers were especially curious about what he wanted drivers to do as he came up behind them with lights flashing, and how those responses differed depending on whether he was pulling over a car or trying to get past.
“While I was responding to an emergency, they said, ‘OK, you’re approaching this vehicle here. What is the best-case scenario that you can find for that vehicle to do?’ ” They spoke at length, Monroe said, about how an autonomous vehicle could recognize when it was being pulled over. “The biggest cue that we came up with was just the length of time that the police vehicle was behind the AV.”
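Monroe’s cue is concrete enough to sketch in code. The snippet below is purely illustrative and uses invented names and thresholds, not Ford’s logic: it flags a likely traffic stop when a vehicle with emergency lights on has paced the AV from behind for longer than a set number of seconds, rather than closing fast to pass it.

```python
# Illustrative only: a toy version of the "time behind the AV" cue, not Ford's code.
from dataclasses import dataclass

PULL_OVER_AFTER_S = 8.0   # hypothetical threshold: seconds of sustained following

@dataclass
class TrackedVehicle:
    emergency_lights_on: bool
    seconds_behind_ego: float      # how long it has been tracked directly behind us
    closing_speed_mps: float       # >0 means it is gaining on us

def should_pull_over(follower: TrackedVehicle) -> bool:
    """Return True when a lit-up vehicle has stayed on our tail long enough
    that it is likely pulling us over rather than trying to get past."""
    if not follower.emergency_lights_on:
        return False
    # A responder racing to a call closes quickly; one pulling us over paces us.
    pacing_us = abs(follower.closing_speed_mps) < 2.0
    return pacing_us and follower.seconds_behind_ego >= PULL_OVER_AFTER_S

# Example: a patrol car with lights on has matched our speed for 10 seconds.
print(should_pull_over(TrackedVehicle(True, 10.0, 0.5)))   # True
```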
In addition to its testing in Miami and Washington, Ford has been working with police in Michigan for nearly two years as part of preparations for the rollout of autonomous ride-hailing and delivery cars scheduled for 2021. Two years ago, a few dozen troopers from the Michigan State Police came to its offices in Dearborn to talk about its plans. “We emphasized that these are not going to be privately owned vehicles,” said Colm Boran, head of autonomous vehicle systems engineering at Ford. “That instantly helped to relieve some of their concerns.”
Teaching autonomous cars to pull to the right is a relatively straightforward task. The point of the lights and sirens, after all, is to be noticed from far away. “If it’s salient to a human, it’s probably salient to a machine,” said Zachary Doerzaph, director of the Center for Advanced Automotive Research at the Virginia Tech Transportation Institute. The greater challenges come when police and other first responders are outside their vehicles: “It’s all of these other cases where that last 10% of development could take the majority of the time.” Doerzaph’s team is researching such scenarios for a group of automakers, but he can’t yet talk about its findings.
Tesla Model S. (Tesla)
The jargon often used for these atypical moments is “edge cases,” but the term belies the extent of the challenge, Doerzaph said. At any given moment there are thousands of construction zones, crash sites and cops standing in intersections all across the country. The cues that humans use to recognize them are subtle and varied. Humans also recognize basic hand signals and, perhaps most important for the police, acknowledge instructions with eye contact or a nod.
It might prove necessary, as self-driving researchers try to replicate these subtle interactions, to create new modes of communication between cars and police. In theory, when Trooper Monroe gets out of his patrol car on the expressway in Michigan, he could, with a couple of taps on a handheld device, instruct all AVs in the area to steer clear. But these kinds of solutions, while technologically elegant, present a host of logistical and legal hurdles.
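No such standard exists yet, but the kind of broadcast Monroe could send might conceptually look like the sketch below. The message fields, radius check and handler are all hypothetical assumptions, not any agency’s or automaker’s real tooling.

```python
# Hypothetical "keep clear" broadcast from a responder's device to nearby AVs.
# Invented message format; no such protocol currently exists.
import math
from dataclasses import dataclass

@dataclass
class KeepClearZone:
    lat: float
    lon: float
    radius_m: float        # area around the incident that AVs should avoid
    expires_in_s: int      # zones should expire so stale alerts don't linger

def distance_m(lat1, lon1, lat2, lon2):
    """Rough equirectangular distance, good enough over a few hundred meters."""
    dx = math.radians(lon2 - lon1) * math.cos(math.radians((lat1 + lat2) / 2))
    dy = math.radians(lat2 - lat1)
    return 6371000 * math.hypot(dx, dy)

def should_reroute(av_lat, av_lon, zone: KeepClearZone) -> bool:
    """An AV receiving the broadcast checks whether it is inside the zone."""
    return distance_m(av_lat, av_lon, zone.lat, zone.lon) <= zone.radius_m

zone = KeepClearZone(lat=42.97, lon=-83.74, radius_m=300, expires_in_s=900)
print(should_reroute(42.971, -83.741, zone))   # True: inside the zone, steer clear
```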
Inrix Inc., a Washington state-based startup that specializes in digital traffic and parking information, has started offering software to cities that allows them to enter traffic rules and roadway markers into the high-definition maps used by AV developers. City officials can mark the locations of stop signs, crosswalks, bike lanes and so on, and when an AV pings the navigation software to map a route, it will get the rules and restrictions for its trip. Boston, Las Vegas, Austin, Texas, and four other cities are currently using the service, called AV Road Rules.
New @INRIX Traffic Scorecard found Americans lost about 97 hours in 2018 due to congestion, costing them up to $1,348 last year. https://t.co/RD6tFDCNVa pic.twitter.com/IjmtKZdNXj — INRIX ® (@INRIX) February 12, 2019
The maps can be updated constantly. If roadwork blocks a lane, a city can mark the change. Inrix is also working to let police update the map instantly from their cars. “That’s something that we’ve heard that there is interest in, and we are exploring how could we turn that hypothetical functionality into a real tool,” said Avery Ash, head of autonomous mobility at Inrix.
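Conceptually, the exchange Ash describes has two halves: a city (or, someday, an officer) writes a rule into a shared map, and a vehicle planning a route reads back every rule along its path. The sketch below is a deliberately simplified model of that flow; the actual AV Road Rules interface is not public, so every class and field name here is invented.

```python
# A toy model of a shared rules map: cities publish restrictions keyed by
# road segment, and vehicles query the segments on their planned route.
# Invented names; not the actual AV Road Rules API.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class RoadRule:
    kind: str        # "lane_closed", "stop_sign", "school_zone", ...
    detail: str

class RulesMap:
    def __init__(self):
        self._rules = defaultdict(list)   # segment_id -> [RoadRule, ...]

    def publish(self, segment_id: str, rule: RoadRule) -> None:
        """City staff (or an officer's in-car terminal) add a rule for a segment."""
        self._rules[segment_id].append(rule)

    def rules_for_route(self, segment_ids):
        """An AV asks for every rule touching its planned route."""
        return {s: self._rules[s] for s in segment_ids if s in self._rules}

city_map = RulesMap()
city_map.publish("main_st_0400", RoadRule("lane_closed", "right lane blocked by roadwork"))

# The AV's planner checks its route before (or while) driving it.
print(city_map.rules_for_route(["main_st_0300", "main_st_0400", "main_st_0500"]))
```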
Once the AV industry solves the day-to-day traffic stops, accident scenes and roadwork, a long list of true “edge cases” awaits. “What if it’s a terrorist suspect? What if I ordered the car and just throw a backpack in it, and then tell the car to go wherever and then blow it up?” asked Jaeger, the Mountain View lieutenant, who has been working with Waymo engineers since the company was a self-driving car project inside Google. The good news for the industry is that cities, cops and automakers are all motivated to find answers, because they agree the status quo is unacceptable: more than 37,000 people lose their lives in motor vehicle crashes in the United States every year, and the overwhelming majority of collisions are due to human error. Police are some of the main witnesses to this carnage and sometimes its victims. Cars that could detect their sirens from miles away and reliably follow the rules would be a welcome change.
“The human driver is just not predictable,” said Monroe, the state trooper. “It’s very, very difficult.”
Schwall at Waymo said that when he holds training sessions with police — showing them how the company’s fleet of vans works and letting them get inside — he often hears the same question: “They ask when they can have a self-driving police car.”