Driverless cars are beginning to display human-like behaviors like impatience on the roads, in a sign of increased intelligence in the robotaxis.
A significant observation was made by Professor William Riggs from the University of San Francisco, who has been researching Waymo vehicles since they were first introduced.
During a trip with a journalist from the San Francisco Chronicle, the two noticed the Waymo car they were in rolled forward slightly at a pedestrian crossing before the person had fully crossed to the other side.
This subtle movement resembled typical human driving behavior, a curious sight for a self-driving Waymo, which is renowned for prioritizing safety by avoiding mistakes commonly made by human drivers.
Easing off the brake moments before it should, so the car begins creeping forward at a rolling pace, displays a sense of impatience – a human reaction not previously seen in the robotic cars.
‘From an evolutionary standpoint, you’re seeing a lot more anticipation and assertiveness from the vehicles,’ Riggs said.
Up until this point, Waymo taxis have been known to follow the road rules down to the letter, sometimes causing frustration among motorists.
But robotaxis are designed to constantly gather information about road conditions, and the algorithm is often fine-tuned to ensure the product is the best it can be.

A Waymo recently crept to a rolling start at a pedestrian crossing before the person had reached the other footpath
David Margines, the director of product management at Waymo, said human specialists who drive the cars to train them had to juggle two separate goals: ensuring the Waymo followed every traffic law, whilst simultaneously working to transport customers in a reasonable timeframe.
‘We imagined that it might be kind of a trade-off,’ he told the publication.
‘It wasn’t that at all. Being an assertive driver means that you’re more predictable, that you blend into the environment, that you do things that you expect other humans on the road to do.’
The result is a more ‘humanistic’ way of driving.
As another example of these developments, Margines described a Waymo driving through an intersection, merging into traffic in which it had the right of way.
Another car swerved into the Waymo’s path. The robotaxi hit the brakes and prevented a crash, while simultaneously beeping its horn to let the other driver know of its displeasure.
The act of using its horn is just another example of human-like behavior which serves as a reminder of the intelligence capabilities of the robot.
These small tweaks may be beneficial in getting a passenger from point A to B faster, but they raise the question of whether the car is becoming too similar to humans, to the point of mimicking the poor choices motorists make on the roads out of frustration or emotion.


While Waymo prides itself as the ‘world’s first autonomous ride-hailing service’ and is intended to give riders a safer experience, that has not always been the experience customers have had.
Data suggests there have been 696 crashes involving a Waymo since 2021. This does not mean the Waymo was at fault.
In one tragic accident, a Waymo killed a small dog that was off leash and was not detected by the car’s technology.
The service is available in Phoenix, San Francisco, and Los Angeles. Waymo cars are also coming to Austin, Atlanta, and Miami.
Elon Musk’s Tesla had planned to roll out its own self-driving taxi this month in Austin, Texas, with about 10 models powered by its Full Self-Driving (FSD) program.
His vision suffered a minor hitch last month when the National Highway Traffic Safety Administration (NHTSA) sent the company a letter to gather additional information.
The NHTSA wants to ‘understand how Tesla plans to evaluate its vehicles and driving automation technologies for use on public roads’ before the robotaxis are unleashed on busy Austin streets.
The agency highlighted its investigations into four crashes, including one involving a pedestrian, linked to Tesla’s FSD.

Tesla is targeting June for the launch of its robotaxi service in Austin, Texas, which promises to provide self-driving rides on demand

The automaker is also developing a dedicated autonomous model, dubbed the Cybercab, with production starting next year.
‘I predict that there will be millions of Teslas operating fully autonomously in the second half of next year,’ Musk said.
Musk made a similar prediction six years ago, in 2019, saying ‘next year, for sure, we’ll have over one million robotaxis on the road.’
Tesla also revealed in April that it has completed more than 1,500 trips and 15,000 miles of autonomous driving, which has helped it develop and test FSD networks, the associated mobile app and other supporting technologies.
However, the NHTSA seems alarmed at the idea of Tesla basing the robotaxi service on its FSD program.
Since October 2024, the NHTSA has been investigating Tesla’s FSD software — an advanced driver-assistance system that allows vehicles to operate semi-autonomously — due to concerns about its performance in low-visibility conditions.
Tesla is required to respond to the NHTSA’s information request by June 19.
If Tesla fails to meet this deadline, or the answers it provides are not satisfactory, it could delay the robotaxi launch.