Hacker News
Waymo exec admits remote operators in Philippines help guide US robotaxis
svat
• (29 seconds long) https://www.youtube.com/watch?v=T0WtBFEfAyo
• (59 seconds long) https://www.youtube.com/watch?v=elpQPbJXpfY
flutas
The description there is:

> In January, an incident took place where a Waymo robotaxi incorrectly went through a red light due to an incorrect command from a remote operator, as reported by Waymo. A moped started coming through the intersection. The moped driver, presumably reacting to the Waymo, lost control, fell and slid, but did not hit the Waymo, and there are no reports of injuries. There may have been minor damage to the moped.
While the description in the official report to the NHTSA (ID: 30270-6981) is:

> On January [XXX], 2024 at 10:52AM PT a rider of a moped lost control of the moped they were operating and fell and slid in front of a Waymo Autonomous Vehicle (Waymo AV) operating in San Francisco, California on [XXX] at [XXX]. Neither the moped nor its driver made contact with the Waymo AV.
>
> The Waymo AV was stopped on northbound [XXX] at the intersection with [XXX] when it started to proceed forward while facing a red traffic light. As the Waymo AV entered the intersection, it detected a moped traveling on eastbound [XXX] and braked. As the Waymo AV braked to a stop, the rider of the moped braked, then fell on the wet roadway before sliding to a stop in front of the stationary Waymo AV. There was no contact between the moped or its rider and the Waymo AV. The Waymo AV's Level 4 ADS was engaged in autonomous mode.
>
> Waymo is reporting this crash under Request No. 1 of Standing General Order 2021-01 because a passenger of the Waymo AV reported that the moped may have been damaged. Waymo may supplement or correct its reporting with additional information as it may become available.
tengbretson
Since this experience, I've just assumed all Waymos have some warehoused human drone pilot actually controlling them.
galkk
Cloud console shows pings between Google data centers in us-west and those in proximity to the Philippines at around 160-200 ms. On top of that you have the inherent lag of the wireless connection itself, plus the connectivity from Google's data center onward to the Philippines.
If you want remote driving in an uncontrolled environment, you can reasonably expect only same-city/county operators.
I'm obviously uninformed, but I'd expect the remote operator's job (from another country) to be more like "the car is safe to proceed, based on the picture I see" or, in the worst case scenario, putting some waypoints in the UI and letting the car drive on its own.
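The latency reasoning above can be sketched as a rough budget. A minimal Python sketch; the only sourced figure is the 160-200 ms us-west-to-Philippines ping mentioned in the comment, and every other number is an illustrative assumption:

```python
# Rough round-trip budget for a hypothetical direct remote-driving link.
# Only the data-center ping is sourced from the comment above; the other
# legs and the reaction time are illustrative assumptions.
ONE_WAY_LEGS_MS = {
    "vehicle <-> cell tower (wireless)": 30,   # assumed
    "cell tower <-> us-west data center": 10,  # assumed
    "us-west <-> Philippines": 180,            # midpoint of the 160-200 ms range
}
OPERATOR_REACTION_MS = 250                     # assumed human reaction time

def control_loop_ms(legs: dict, reaction_ms: int) -> int:
    """Video must travel out and the command back, so each network leg counts twice."""
    return 2 * sum(legs.values()) + reaction_ms

print(control_loop_ms(ONE_WAY_LEGS_MS, OPERATOR_REACTION_MS))  # 690
```

At roughly 0.7 s per control cycle, a car moving 30 mph covers about 9 m before a correction arrives, which is consistent with the comment's expectation that direct remote driving would need same-city operators.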
Marsymars
This seems like it would be fairly straightforward to program, if not for all lights, then at least for a lot of them (say, half).
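As a sketch of what that might look like, here is a minimal and entirely hypothetical decision rule: act on a traffic-light classification only when confidence is high, and fall back to a remote operator otherwise (function names and thresholds are invented for illustration):

```python
# Hypothetical rule: trust the onboard classifier only above a confidence
# threshold; anything ambiguous is escalated to a remote operator.
CONFIDENCE_TO_ACT = 0.99  # assumed threshold, not a real Waymo value

def decide(light_state: str, confidence: float) -> str:
    """Map a (state, confidence) classification to a vehicle action."""
    if confidence >= CONFIDENCE_TO_ACT:
        if light_state == "green":
            return "proceed"
        if light_state == "red":
            return "hold"
    return "escalate_to_remote_operator"
```

This matches the "a lot, but not all, of lights" intuition: unambiguous signals are handled locally, and only the low-confidence remainder needs a human.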
thebruce87m
The least likely possibility is a person controlling the vehicle directly over a variable-latency connection that may fail completely at any time.
thebruce87m
I could see certain situations where it could be authorised, such as when a vehicle is stranded and unable to operate autonomously at all due to an error, but it would have to be at extremely slow speed with a full-stop failsafe on connection drop or high-latency detection.
That said, I bet there are some who do not consider the safety implications and "move fast and break people".
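The full-stop failsafe described above can be sketched as a simple watchdog: stop the car if the control link's heartbeat goes silent or its measured latency spikes. The `Vehicle` interface and both thresholds are hypothetical, invented for illustration:

```python
# Minimal sketch of a connection watchdog for slow-speed remote operation.
# Thresholds and the Vehicle interface are hypothetical.
MAX_SILENCE_S = 0.5   # assumed: no heartbeat for 500 ms => full stop
MAX_LATENCY_S = 0.25  # assumed: measured round-trip above 250 ms => full stop

class Vehicle:
    def __init__(self):
        self.stopped = False

    def emergency_stop(self):
        self.stopped = True

def watchdog_step(vehicle: Vehicle, now_s: float,
                  last_heartbeat_s: float, last_latency_s: float) -> bool:
    """Run once per control tick; returns True if the failsafe fired."""
    if (now_s - last_heartbeat_s) > MAX_SILENCE_S or last_latency_s > MAX_LATENCY_S:
        vehicle.emergency_stop()
        return True
    return False
```

The key design point is that the stop is the default on any doubt: the watchdog fires on silence as well as on lag, so a dead link cannot leave the car executing its last command.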
bryan_w
* Unless it gets super stuck, then a human drives out and gets into the physical driver seat and takes over
MBCook
But it’s still not the impression they’ve been giving. It’s been an impression of full automation (ignoring getting stuck), and if it’s not navigating on its own, that’s disingenuous.
SR2Z
This approach has two benefits: a car can be unstuck without sending out a physical driver (while collecting training data), and it efficiently lets m humans supervise n cars across a wide range of acceptable m and n values.
The intent is for the m:n ratio to shrink smoothly as the software gets better, but m will always be greater than zero.
TheDong
They are effectively answering questions like "is this road closed?" or "is the object in front of me a solid object or a weird shadow?"
These are not the sort of questions a US driver's license really relates to; it's not things like "can I legally turn right on red at this intersection?"
Do we require a driver's license to solve Google reCAPTCHA questions like "which squares have a bike in them"? Because the Waymo stuff is closer to image classification than driving.
TheDong
Driver's licenses are legal constructs. The DMV certifies self-driving cars as able to drive on the road through a different process, and sure, those two processes are different.
I really don't get the point you're trying to make here.
whatever1
Elon's cars are already driving by themselves outside the conditions they are licensed for, causing accidents, and the liability falls on the drivers.
In other words, licensing means absolutely nothing. Especially today.
TheDong
Waymo tells you explicitly that all the microphones inside the car are off unless you press the button to call rider support yourself.
If you'd ever ridden in a Waymo, perhaps you'd recall them telling you that the first time you rode one.
> if you can't think of more perhaps you should keep your comments out of the discussion, because at present you've contributed nothing but ignorance.
You really shouldn't end your comment with that if you're not going to read up on whether a hypothetical scenario you've imagined up is ignorant or not.