This robot crossed a line it shouldn’t have, because humans told it to

A video of a sidewalk delivery robot crossing yellow caution tape and wandering around a crime scene in Los Angeles went viral this week, garnering more than 650,000 views on Twitter and sparking debate over whether the technology is ready for prime time.

It turns out that the robot’s error, at least in this case, was caused by humans.

The video was taken by William Goode, owner of Film the Police LA, an LA-based police watchdog account, and posted on Twitter. Goode captured the bot on video as it hovered at the street corner during a suspected shooting at Hollywood High School around 10 a.m.; someone then lifted the tape, allowing the bot to continue on its way through the crime scene.

Uber spinout Serve Robotics told Technology Flow that the robot’s self-driving system didn’t decide to cross into the crime scene. That was the choice of a human operator who was remotely operating the bot.

The company’s delivery robots have Level 4 autonomy, meaning they can drive themselves in certain situations without the need for a human. Serve has been piloting its robots with Uber Eats in the region since May.

Serve Robotics’ approach requires a human operator to remotely monitor and assist its bots at every intersection. A human operator can also take control remotely if a bot encounters an obstacle, such as a construction zone or a fallen tree, and can’t figure out how to navigate around it within 30 seconds.

In this case, the bot, which had just completed a delivery, arrived at the intersection and a human operator took over, per the company’s internal operating procedure. Initially, the operator paused at the yellow caution tape. But when bystanders lifted the tape and apparently “waved it through,” the operator decided to continue, Serve Robotics CEO Ali Kashani told Technology Flow.

“The robot would never have crossed (on its own),” Kashani said. “There are just so many systems in place to make sure it would never cross until a human gives the go-ahead.”

The error in judgment here was that someone actually decided to cross, he added.

Regardless of the reason, that should not have happened, Kashani said. Serve has pulled data from the incident and is working on a new set of protocols for the human operators and the AI to prevent it in the future, he added.

Some obvious steps are to ensure employees follow the standard operating procedure (or SOP), which includes proper training, and to develop new rules for what to do if a person tries to wave a robot through a barricade.

But Kashani says there are ways to use software to prevent this from happening again.

Software can be used to help people make better decisions or to avoid an area altogether, he said. For example, the company could work with local law enforcement to send robots real-time information about police incidents so they can route around those areas. Another option is to give the software the ability to identify law enforcement and then alert the human decision makers and remind them of local laws.

These lessons will be critical as the robots progress and expand their operational domains.

“The funny thing is that the robot did the right thing; it stopped,” Kashani said. “So this goes back to giving people enough context to make good decisions until we’re confident enough that we don’t need people making those decisions.”

Serve Robotics’ bots haven’t reached that point yet. However, Kashani told Technology Flow that the robots are becoming more autonomous and typically operate on their own, with two exceptions: intersections and certain types of obstacles.

The scene that unfolded this week contrasts with how many people view AI, Kashani said.

“I think the narrative in general is basically that people are really great at edge cases and AI makes mistakes, or isn’t ready for the real world,” Kashani said. “Funnily enough, we’re learning the opposite: we’re finding that people make a lot of mistakes, and we need to rely more on AI.”
