US launches probe into Tesla’s self-driving over pedestrian death
The U.S. government's road safety agency is once again investigating Tesla's "Full Self-Driving" system, this time after receiving reports of crashes in low-visibility conditions, including one that killed a pedestrian.
The National Highway Traffic Safety Administration said in documents that it opened the investigation on October 17 after the company reported four crashes in which Teslas entered areas of low visibility, including sun glare, fog and airborne dust.
In addition to the pedestrian's death, another crash involved an injury, the agency said.
Investigators will examine the ability of "Full Self-Driving" to "detect and respond appropriately to reduced roadway visibility conditions, and if so, the contributing circumstances for these crashes."
The investigation covers roughly 2.4 million Teslas from the 2016 through 2024 model years.
A message was left early on October 18 seeking comment from Tesla, which has said repeatedly that the system cannot drive itself and that human drivers must be ready to intervene at all times.
Last week, Tesla held an event at a Hollywood studio to unveil a fully autonomous robotaxi without a steering wheel or pedals. CEO Elon Musk said the company plans to have fully autonomous vehicles operating without human drivers next year, and robotaxis available in 2026.
The agency also said it will look into whether any other similar crashes involving "Full Self-Driving" have occurred in low-visibility conditions, and it will seek information from the company on whether any software updates affected the system's performance in those conditions.
"In particular, this review will assess the timing, purpose and capabilities of any such updates, as well as Tesla's assessment of their safety impact," the documents said.
Tesla has twice recalled "Full Self-Driving" under pressure from the agency, which in July sought information from law enforcement and the company after a Tesla using the system struck and killed a motorcyclist near Seattle.
The recalls were issued because the system was programmed to run stop signs at slow speeds and because it violated other traffic laws. Both problems were to be fixed with online software updates.
Critics have said that Tesla's system, which uses only cameras to spot hazards, lacks the proper sensors to be fully self-driving. Nearly all other companies working on autonomous vehicles use radar and laser sensors in addition to cameras to see better in the dark or in poor visibility conditions.
The "Full Self-Driving" recalls came after a three-year investigation into Tesla's less sophisticated Autopilot system crashing into emergency and other vehicles parked along highways, many with warning lights flashing.
That investigation was closed last April after the agency pressured Tesla into recalling its vehicles to bolster a weak system that made sure drivers were paying attention. A few weeks after the recall, NHTSA began investigating whether the recall was working.
The investigation opened on October 17 takes NHTSA into new territory; it previously treated Tesla's systems as driver-assistance features rather than systems that drive themselves. With the new probe, the agency is focusing on the capabilities of "Full Self-Driving" rather than simply making sure drivers are paying attention.
This story was reported by The Associated Press.