A Tesla Model S operating in “Full Self-Driving” (FSD) mode struck and killed a 28-year-old motorcyclist near Seattle in April. The 56-year-old driver, who admitted to looking at his cell phone while using the driver assistance feature, was arrested on suspicion of vehicular homicide, according to local police. This marks at least the second fatal accident involving Tesla’s FSD technology, which Tesla CEO Elon Musk has heavily promoted.
Tesla emphasizes that its “Full Self-Driving (Supervised)” software requires active driver supervision and does not render the vehicle fully autonomous. However, this incident highlights ongoing concerns about the limitations of Tesla’s camera-based system, which relies heavily on artificial intelligence.
The National Highway Traffic Safety Administration (NHTSA) has confirmed that it is aware of the crash and is gathering information from local authorities and Tesla. The agency had previously reported one fatal accident involving Tesla’s FSD software between August 2022 and August 2023.
Experts have expressed skepticism about Tesla’s technology. Sam Abuelsamid, an analyst at Guidehouse Insights, pointed out that Tesla’s camera-only system can struggle with accurately measuring distances to objects, leading to potential errors. Raj Rajkumar, a professor of electrical and computer engineering at Carnegie Mellon University, noted the difficulty of collecting and processing data from various real-world conditions, such as different weather, lighting, and traffic scenarios.
Despite these challenges, Musk remains committed to advancing Tesla’s self-driving capabilities. This year, he deferred plans for an all-new affordable Tesla model, instead doubling down on the development of autonomous vehicles. Musk recently stated that he would be “shocked” if Tesla does not achieve full self-driving capability by next year. He envisions future Tesla vehicles as “tiny mobile lounges” where occupants can watch movies, play games, work, or even sleep.
However, Tesla’s self-driving ambitions have come under increasing regulatory and legal scrutiny. The NHTSA launched an investigation into Tesla’s Autopilot system in August 2021 after identifying more than a dozen crashes involving Tesla vehicles colliding with stationary emergency vehicles. The agency has also reviewed hundreds of other crashes involving Tesla’s Autopilot feature.
In December 2023, Tesla was compelled to recall nearly all its vehicles in the United States to add safeguards to its Autopilot driver-assistance software, underscoring the ongoing challenges and risks associated with the technology.
Source: https://edition.cnn.com/