Robotaxis: an increasingly impressive driving experience

We see them hurtling down the streets of San Francisco under the curious eye of pedestrians and cyclists. Two operators have opened a limited robotaxi service in the Californian city: Cruise, a subsidiary of General Motors, and Waymo, the autonomous vehicle division of Google. The vehicles are recognizable by the lidar (for light detection and ranging) mounted on their roof, a rotating laser sensor that provides most of their “vision”. But the most surprising thing remains the absence of a driver. This time, the car is truly autonomous. It handles passenger pickup and drop-off, navigation and traffic in this fairly dense city entirely on its own. For the time being, the robotaxis are barred from freeways, which rules out serving the airports. Given the permanent chaos on San Francisco’s interchanges, no one would risk it anyway…

A Jaguar 4×4 bristling with sensors

Concretely, here is what an hour-long robotaxi ride looks like. The car is summoned the same way as an Uber, by entering a destination. Waiting time: thirty-three minutes. Let’s be magnanimous: the system is still in private beta, and the author of these lines benefited from an access code obtained thanks to a San Francisco journalist. Right on time, the car arrives at the intersection of Clay Street and Fillmore Street, in the pretty neighborhood of Pacific Heights. Reluctant to double-park, it pulls over a little further up the street, on a very steep slope. The vehicle is an imposing Jaguar 4×4, bristling with sensors, cameras and laser units of various sizes. On the roof, the lidar displays the initials of the person who placed the order. The doors unlock, we climb aboard. A voice recalls the safety instructions. On the screen, we tap “start ride”, and off we go. Destination: Ocean Beach, about ten kilometers away.

The experience is disconcerting. The Jaguar behaves exactly like a conventional taxi whose driver is a little more careful than average. It moves with the flow of late-afternoon traffic and does not hesitate to close in on the vehicle ahead. Which is normal: its battery of sensors detects the slightest variation in the speed of the car in front, and the system reacts in half a second, roughly three times faster than a human. For the rest, the steering wheel turns on its own without hesitation, the indicators come on at the right moment, and the robotaxi even seems endowed with a hint of personality when it approaches a four-way stop, common in the United States: Waymo’s vehicle is rather courteous, but it also knows how to assert itself, edging its nose forward to show that it is its turn to go.

A disconcerting ease

Driving a car is a combination of rules – the rules of the road – and countless exceptions, from the most trivial to the most critical, that constantly arise. Most are so thoroughly assimilated by the driver that he no longer notices them. Only a small fraction demand attention, and often action. Disconcertingly, the robotaxi handles the unexpected wonderfully: a cyclist on an erratic trajectory, a child about to cross outside the crosswalk, a jogger absorbed in his effort and his music, a creative driver making a U-turn at the top of a hill, a minor failure to yield, the orientation of the wheels of surrounding vehicles. All of it is spotted in advance and then handled smoothly.

Five years ago, for having experienced it firsthand, the author can attest that a ride in an autonomous car was nothing like this. Back then, two engineers sat in the front seats of a Lexus SUV. One was constantly ready to take over the wheel, while his colleague followed the data collection on a laptop. The car was capable of taking a highway. It, too, drove very close to the other vehicles, but the small streets of Mountain View, the city where Google has its headquarters, were hell: the vehicle literally panicked at the slightest unforeseen event, suddenly swerving in all directions.

100 checks per second

Since then – and a few hundred million dollars later – things have changed dramatically. The computing capacity embedded in the car has exploded, and progress in artificial intelligence and machine learning has made it possible to build models of exceptional granularity that analyze the urban environment in real time. A hundred times per second, the car scans its surroundings with radars, proximity sensors, cameras, and even microphones designed to detect sirens. The lidar “sees” 360 degrees, hundreds of meters away, including at night. All of this feeds powerful predictive models: a pedestrian is identified by shape, size and also posture, which can reveal intentions – crossing without looking, for example. Even the possible movements of a cyclist are taken into account.
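
To make the “a hundred times per second” figure more concrete, here is a deliberately schematic Python sketch of such a perception-and-prediction loop. Everything in it – the class names, the constant-velocity prediction, the sample pedestrian – is a simplifying assumption for illustration, not a description of Waymo’s or Cruise’s actual software.

```python
# Schematic sketch of a 100 Hz perception-and-prediction cycle.
# All names and numbers are illustrative assumptions, not real robotaxi code.
import time
from dataclasses import dataclass
from typing import List, Tuple


@dataclass
class TrackedObject:
    kind: str                      # "pedestrian", "cyclist", "vehicle", ...
    position: Tuple[float, float]  # (x, y) in meters, relative to the car
    velocity: Tuple[float, float]  # (vx, vy) in meters per second


def fuse_sensors() -> List[TrackedObject]:
    """Stand-in for sensor fusion: a real system would merge lidar, radar,
    camera and microphone data into a list of tracked objects."""
    return [TrackedObject("pedestrian", (12.0, 3.5), (-1.2, 0.0))]


def predict_path(obj: TrackedObject, horizon_s: float = 2.0, dt: float = 0.1):
    """Naive constant-velocity prediction; the real models also use shape
    and posture cues to anticipate intentions, as described above."""
    x, y = obj.position
    vx, vy = obj.velocity
    return [(x + vx * i * dt, y + vy * i * dt) for i in range(int(horizon_s / dt))]


if __name__ == "__main__":
    period = 0.01                  # one cycle every 10 ms, i.e. 100 per second
    for _ in range(5):             # a few iterations, for illustration only
        start = time.perf_counter()
        objects = fuse_sensors()
        predictions = {obj.kind: predict_path(obj) for obj in objects}
        elapsed = time.perf_counter() - start
        time.sleep(max(0.0, period - elapsed))
```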

None of this is perfect, obviously. Even if forty minutes aboard a robotaxi are likely to convert the most hardened skeptics of autonomous vehicles, the question, as always in technological matters, is how the system will “generalize”, that is, operate in all circumstances, and go from spectacular prototype to product: in this case, a car capable of driving on all types of roads without having memorized the maps of an area. In other words, moving from level 4 to level 5 autonomy – and, as in the high jump, the last centimeters are the most demanding…

There is still work to do. For now, this type of automated taxi is deployed only in cities like San Francisco, Phoenix or Las Vegas, with relatively simple routes and mild weather, because the navigation systems are disrupted by rain and snow.

In San Francisco, the municipality ruled on August 11 on the expansion of robotaxis in the city: 300 autonomous vehicles for Cruise, the General Motors subsidiary, and 250 for Waymo. Over the coming weeks, the two operators will be able to carry paying passengers via an app, like a ride-hailing service. If this deployment goes well, the current restrictions to certain areas of the city will be lifted. But the lobbying battle remains lively: on one side, the manufacturers, who point to millions of kilometers traveled without incident (the most serious being a dog run over); on the other, pressure groups who highlight – amplified by social networks – persistent bugs that sometimes disrupt traffic, as when a car blocks an intersection. And then there are the neo-Luddites, openly anti-Big Tech, who amuse themselves by placing traffic cones on the hoods of robotaxis to cause maximum disruption.

Profitability still a long way off

Added to these questions of growth and scale is the economic equation. Google launched the development of its autonomous cars in 2009. Since then, the group has invested 11 billion dollars. In 2015, the activity was moved into a dedicated entity, Waymo, which was then able to raise funds from venture capital firms. If Google was alone in making this bet fifteen years ago, the landscape is crowded today. The big carmakers have finally gotten going, and Tesla even presents its FSD (for full self-driving) as operational and reliable – a claim vigorously contested in American courts.

The bottomless pit of required investment shows no sign of filling up any time soon. Despite downsizing, Waymo still employs over 2,500 people and is losing money massively. General Motors, which operates Cruise’s self-driving taxis, has admitted that the venture burns through $5 million a day: nearly $1 billion in the first half of 2022. But the automaker’s CEO, Mary Barra, estimates that Cruise will generate $50 billion a year in revenue by 2030. McKinsey, for its part, sees a market of 300 to 400 billion dollars for autonomous cars. Until then, it will take a few more billion dollars before the streets of the world’s major capitals are colonized by robotaxis.
