
Tuesday, May 20, 2025

Elon Musk’s Tesla Robotaxi Rollout Looks Like A Disaster Waiting To Happen

Illustration By Emily Schrer For Forbes, Photos By Sjoerd Van Der Wal, Achisatha Khamsuwan / Getty

Elon Musk is rolling out a handful of Tesla robotaxis in Austin next month, where up to 20 self-driving electric Model Ys will be unleashed to ferry passengers around the Texas city’s streets. He’s betting the future of Tesla on their success, as the automaker’s electric vehicle revenue tanks thanks to faster-growing Chinese rivals and a political backlash against Musk’s right-wing politics… Continue reading…

By Alan Ohnsman

Source: Forbes


Critics:

The Union of Concerned Scientists defined self-driving as “cars or trucks in which human drivers are never required to take control to safely operate the vehicle. Also known as autonomous or ‘driverless’ cars, they combine sensors and software to control, navigate, and drive the vehicle.” The British Automated and Electric Vehicles Act 2018 defines a vehicle as “driving itself” if the vehicle is “not being controlled, and does not need to be monitored, by an individual”.

Another British government definition stated, “Self-driving vehicles are vehicles that can safely and lawfully drive themselves”. Operational design domain (ODD) is a term for a particular operating context for an automated system, often used in the field of autonomous vehicles. The context is defined by a set of conditions, including environmental, geographical, time of day, and other conditions. For vehicles, traffic and roadway characteristics are included.

Manufacturers use ODD to indicate where/how their product operates safely. A given system may operate differently according to the immediate ODD. The concept presumes that automated systems have limitations. Relating system function to the ODD it supports is important for developers and regulators to establish and communicate safe operating conditions. Systems should operate within those limitations. Some systems recognize the ODD and modify their behavior accordingly.

For example, an autonomous car might recognize that traffic is heavy and disable its automated lane-change feature. Vendors have taken a variety of approaches to the self-driving problem. Tesla’s approach is to allow its “full self-driving” (FSD) system to be used in all ODDs as a Level 2 (hands-on, eyes-on) ADAS. Waymo picked specific ODDs (city streets in Phoenix and San Francisco) for its Level 4 robotaxi service.
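To make the ODD-gating idea concrete, here is a minimal sketch of the heavy-traffic example above: a feature is enabled only while current conditions fall inside its declared ODD. The class, field names, and thresholds are all illustrative, not any vendor’s actual logic.

```python
from dataclasses import dataclass

# Hypothetical ODD descriptor; values would come from perception and the map layer.
@dataclass
class OperatingConditions:
    traffic_density: float    # vehicles per km of lane (illustrative metric)
    visibility_m: float       # estimated visibility in metres
    on_divided_highway: bool  # road class from the map

def lane_change_allowed(odd: OperatingConditions) -> bool:
    """Return True only when conditions fall inside the feature's ODD."""
    return (
        odd.on_divided_highway
        and odd.traffic_density < 40.0   # disable in heavy traffic
        and odd.visibility_m > 100.0     # disable in fog or snow
    )

conditions = OperatingConditions(traffic_density=55.0, visibility_m=300.0,
                                 on_divided_highway=True)
print(lane_change_allowed(conditions))  # False: traffic too dense, feature disabled
```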

Mercedes-Benz offers a Level 3 service in Las Vegas in highway traffic jams at speeds up to 40 miles per hour (64 km/h). Mobileye’s SuperVision system offers hands-off/eyes-on driving on all road types at speeds up to 130 kilometres per hour (81 mph). GM’s hands-free Super Cruise operates on specific roads in specific conditions, stopping or returning control to the driver when the ODD changes. In 2024 the company announced plans to expand road coverage from 400,000 miles to 750,000 miles.

Ford’s BlueCruise hands-off system operates on 130,000 miles of US divided highways.

The perception system processes visual and audio data from outside and inside the car to create a local model of the vehicle, the road, traffic, traffic controls, and other observable objects, along with their relative motion. The control system then takes actions to move the vehicle, considering the local model, road map, and driving regulations. Several classifications have been proposed to describe ADAS technology. One proposal is to adopt these categories: navigation, path planning, perception, and car control.
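The split between perception (build a local model) and control (act on that model) can be sketched as a simple loop. Everything here is a toy illustration under assumed types and thresholds, not any production architecture.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

# Toy perceive -> control loop mirroring the architecture described above.
@dataclass
class TrackedObject:
    distance_m: float      # distance ahead along the lane
    closing_speed: float   # m/s, positive if we are approaching it

@dataclass
class WorldModel:
    objects: List[TrackedObject] = field(default_factory=list)

def perceive(detections: List[Tuple[float, float]]) -> WorldModel:
    """Fuse raw (distance, closing speed) detections into a local model."""
    return WorldModel([TrackedObject(d, v) for d, v in detections])

def control(model: WorldModel, speed_limit: float, speed: float) -> dict:
    """Pick throttle/brake so we keep headway and respect the limit."""
    for obj in model.objects:
        if obj.distance_m / max(obj.closing_speed, 0.1) < 3.0:  # < 3 s to contact
            return {"throttle": 0.0, "brake": 0.8}
    return {"throttle": 0.3 if speed < speed_limit else 0.0, "brake": 0.0}

model = perceive([(25.0, 10.0)])  # one object 25 m ahead, closing at 10 m/s
print(control(model, speed_limit=27.0, speed=25.0))  # brakes: 2.5 s to contact
```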

Navigation involves the use of maps to define a path between origin and destination. Hybrid navigation is the use of multiple navigation systems. Some systems use basic maps, relying on perception to deal with anomalies. Such a map understands which roads lead to which others, whether a road is a freeway or a highway, whether it is one-way, etc. Other systems require highly detailed maps, including lane maps, obstacles, traffic controls, etc.
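A “basic map” of this kind is essentially a graph of road connectivity, where one-way streets become directed edges. The sketch below searches such a graph for a route; the road names and layout are invented for the example.

```python
from collections import deque

# Directed adjacency list: each edge is a one-way road segment.
road_graph = {
    "A": ["B", "C"],  # A -> B, A -> C
    "B": ["D"],
    "C": ["B", "D"],
    "D": [],
}

def find_route(graph, origin, destination):
    """Breadth-first search for the route with the fewest segments."""
    queue = deque([[origin]])
    visited = {origin}
    while queue:
        path = queue.popleft()
        if path[-1] == destination:
            return path
        for nxt in graph[path[-1]]:
            if nxt not in visited:
                visited.add(nxt)
                queue.append(path + [nxt])
    return None  # no route exists

print(find_route(road_graph, "A", "D"))  # ['A', 'B', 'D']
```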

Autonomous cars need to be able to perceive the world around them. Supporting technologies include combinations of cameras, LiDAR, radar, audio, ultrasound, GPS, and inertial measurement. Deep neural networks are used to analyse inputs from these sensors to detect and identify objects and their trajectories. Some systems use Bayesian simultaneous localization and mapping (SLAM) algorithms. Another technique is detection and tracking of other moving objects (DATMO), used to handle potential obstacles. Other systems use roadside real-time locating system (RTLS) technologies to aid localization. Tesla’s “vision only” system uses eight cameras, without LiDAR or radar, to create its bird’s-eye view of the environment.

Path planning finds a sequence of segments that a vehicle can use to move from origin to destination. Techniques used for path planning include graph-based search and variational-based optimization. Graph-based techniques can make harder decisions, such as how to pass another vehicle or obstacle.

Variational-based optimization techniques require more stringent restrictions on the vehicle’s path to prevent collisions. The large-scale path of the vehicle can be determined using a Voronoi diagram, an occupancy grid map, or a driving-corridor algorithm. The latter allows the vehicle to locate and drive within open space that is bounded by lanes or barriers.
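As a concrete instance of graph-based search over an occupancy grid, here is a small A* planner. The grid, costs, and 4-connected movement model are all simplifying assumptions for illustration.

```python
import heapq

def astar(grid, start, goal):
    """A* over a 4-connected occupancy grid (0 = free cell, 1 = occupied)."""
    rows, cols = len(grid), len(grid[0])
    heuristic = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan
    frontier = [(heuristic(start), 0, start, [start])]
    seen = set()
    while frontier:
        _, cost, cell, path = heapq.heappop(frontier)
        if cell == goal:
            return path
        if cell in seen:
            continue
        seen.add(cell)
        r, c = cell
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                nxt = (nr, nc)
                heapq.heappush(frontier,
                               (cost + 1 + heuristic(nxt), cost + 1, nxt, path + [nxt]))
    return None

grid = [[0, 0, 0],
        [1, 1, 0],   # a blocked row forces a detour
        [0, 0, 0]]
print(astar(grid, (0, 0), (2, 0)))  # routes around the obstacles via column 2
```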

Maps are necessary for navigation, and their sophistication varies from simple graphs that show which roads connect to each other, with details such as one-way vs. two-way, to highly detailed maps with information about lanes, traffic controls, roadworks, and more. Researchers at the MIT Computer Science and Artificial Intelligence Laboratory (CSAIL) developed a system called MapLite, which allows self-driving cars to drive with simple maps.

The system combines the vehicle’s GPS position and a “sparse topological map” such as OpenStreetMap (which has only 2D road features) with sensors that observe road conditions. One issue with highly detailed maps is updating them as the world changes. Vehicles that can operate with less detailed maps do not require frequent updates or geo-fencing.
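A rough sketch of the MapLite-style idea, not MIT’s actual code: snap a noisy GPS fix onto the nearest edge of a sparse 2D road graph, then let onboard sensors refine position within the lane. Coordinates and the camera offset below are invented.

```python
def closest_point_on_segment(p, a, b):
    """Project 2D point p onto segment a-b, clamped to the endpoints."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    t = ((px - ax) * dx + (py - ay) * dy) / (dx * dx + dy * dy)
    t = max(0.0, min(1.0, t))
    return (ax + t * dx, ay + t * dy)

# Sparse topological map: road segments as endpoint pairs (metres, local frame).
road_segments = [((0, 0), (100, 0)), ((100, 0), (100, 80))]

gps_fix = (42.0, 3.5)  # noisy position, a few metres off the road centreline

snapped = min(
    (closest_point_on_segment(gps_fix, a, b) for a, b in road_segments),
    key=lambda q: (q[0] - gps_fix[0]) ** 2 + (q[1] - gps_fix[1]) ** 2,
)
lane_offset_m = -0.4  # hypothetical lateral offset sensed by the camera
print(snapped, lane_offset_m)  # (42.0, 0.0) on the road graph, refined by sensing
```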

Sensors are necessary for the vehicle to respond properly to the driving environment. Sensor types include cameras, LiDAR, ultrasound, and radar. Control systems typically combine data from multiple sensors. Multiple sensors provide a more complete view of the surroundings and can be used to cross-check one another to correct errors. For example, radar can image a scene in a nighttime snowstorm that defeats cameras and LiDAR, albeit at reduced precision.
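One standard way such cross-checking works is inverse-variance weighting: each sensor reports an estimate with an uncertainty, and the fused value leans toward the more reliable sensor. The sensor values and variances below are invented for the example.

```python
def fuse(estimates):
    """Inverse-variance weighted fusion of (value, variance) estimates."""
    weights = [1.0 / var for _, var in estimates]
    value = sum(w * v for w, (v, _) in zip(weights, estimates)) / sum(weights)
    return value

# Night-time snowstorm: the camera's range estimate is noisy (high variance),
# while radar still images the scene with low variance.
camera_range = (31.0, 9.0)  # metres, variance in m^2
radar_range = (27.5, 1.0)
print(fuse([camera_range, radar_range]))  # ~27.85 m, dominated by the radar
```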

After experimenting with radar and ultrasound, Tesla adopted a vision-only approach, asserting that humans drive using only vision, and that cars should be able to do the same, while citing the lower cost of cameras versus other sensor types. By contrast, Waymo makes use of the higher resolution of LiDAR sensors and cites the declining cost of that technology.



