Should Waymo Robotaxis Always Stop For Pedestrians In Crosswalks?
“My feet are already in the crosswalk,” says Geoffrey A. Fowler, a San Francisco-based tech columnist for the Washington Post. In a video he takes one step from the curb, then stops to see if Waymo robotaxis will stop for him. And they often didn’t.
Waymo’s position? Their cars consider “signals of pedestrian intent” including forward motion when deciding whether to stop — as well as other vehicles’ speed and proximity. (“Do they seem like they’re about to cross or are they just sort of milling around waiting for someone?”) And Waymo “also said its car might decide not to stop if adjacent cars don’t yield.”
Fowler counters that California law says cars must always stop for pedestrians in a crosswalk. (“It’s classic Silicon Valley hubris to assume Waymo’s ability to predict my behavior supersedes a law designed to protect me.”) And Phil Koopman, a Carnegie Mellon University professor who conducts research on autonomous-vehicle safety, agrees that the Waymos should be stopping. “Instead of arguing that they shouldn’t stop if human drivers are not going to stop, they could conspicuously stop for pedestrians who are standing on road pavement on a marked crosswalk. That might improve things for everyone by encouraging other drivers to do the same.”
From Fowler’s video:
I tried crossing in front of Waymos here more than 20 times. About three in ten times the Waymo would stop for me, but I couldn’t figure out what made it change its mind. Heavy traffic vs light, crossing with two people, sticking one foot out — all would cause it to stop only sometimes. I could make it stop by darting out into the street — but that’s not how my mama taught me to use a crosswalk…
Look, I know many human drivers don’t stop for pedestrians either. But isn’t the whole point of having artificial intelligence robot drivers that they’re safer because they actually follow the laws?
Waymo would not admit breaking any laws, but acknowledged “opportunity for continued improvement in how it interacts with pedestrians.”
In an article accompanying the video, Fowler calls it “a cautionary tale about how AI, intended to make us more safe, also needs to learn how to coexist with us.”
Waymo cars don’t behave this way at all intersections. Some friends report that the cars are too careful on quiet streets, while others say the vehicles are too aggressive around schools… No Waymo car has hit me, or any other person walking in a San Francisco crosswalk — at least so far. (It did strike a cyclist earlier this year.) The company touts that, as of October, its cars have 57 percent fewer police-reported crashes compared with a human driving the same distance in the cities where it operates.
Other interesting details from the article:
Fowler suggests a way his crosswalk could be made safer: “a flashing light beacon there could let me flag my intent to both humans and robots.”
The article points out that Waymo is also under investigation by the National Highway Traffic Safety Administration “for driving in an unexpected and disruptive manner, including around traffic control devices (which includes road markings).”
At the same time, Fowler also acknowledges that “I generally find riding in a Waymo to be smooth and relaxing, and I have long assumed its self-driving technology is a net benefit for the city.” His conclusion? “The experience has taught my family that the safest place around an autonomous vehicle is inside it, not walking around it.”
And he says living in San Francisco lately puts him “in a game of chicken with cars driven by nothing but artificial intelligence.”
Read more of this story at Slashdot.