Tesla has been instructed by auto regulators to provide Autopilot configuration data in “Elon mode.”

Tesla has received a special order from the federal auto safety regulator requiring the company to provide extensive data about its driver assistance and driver monitoring systems, including a once-secret configuration for them known as “Elon mode.”

Typically, when a Tesla driver uses the company's advanced driver assistance systems, marketed as Autopilot, Full Self-Driving, or FSD Beta, a visual icon flashes on the vehicle's touchscreen to prompt the driver to apply torque to the steering wheel. If the driver leaves the steering wheel unattended for too long, the “nag” escalates to a beeping noise. If the driver still does not take the wheel at that point, the vehicle may disable its advanced driver assistance features for the remainder of the trip or longer.

As CNBC previously reported, with the “Elon Mode” configuration enabled, Tesla can allow a driver to use the company's Autopilot, FSD, or FSD Beta systems without the so-called “nagging.”

The National Highway Traffic Safety Administration sent a letter and special order to Tesla on July 26 asking for details on how this special configuration is used, including the number of cars and drivers Tesla has authorized to use it. The filing was added to the agency's website on Tuesday and was first reported by Bloomberg.

In the letter and special order, the agency's acting chief counsel, John Donaldson, wrote:

“NHTSA is concerned about the safety implications of recent changes to Tesla's driver monitoring system. These concerns are based on available information indicating that vehicle owners may be able to change Autopilot driver monitoring configurations to allow the driver to operate the vehicle while in Autopilot for extended periods of time without the Autopilot prompting the driver to apply torque to the steering wheel.”

Tesla was given until August 25 to provide all of the information the agency requested. The company responded on time, but it asked for, and was granted, confidential treatment of its response by NHTSA. Tesla did not immediately respond to CNBC's request for comment.

Philip Koopman, an automotive safety researcher and associate professor of computer engineering at Carnegie Mellon University, told CNBC after the order was made public: “It appears that NHTSA has a bad opinion of cheat codes that allow safety features such as driver monitoring to be disabled. I agree. Hidden features that compromise safety have no place in production software.”

Koopman also noted that NHTSA has yet to complete a number of investigations into crashes in which Tesla's Autopilot systems may have played a role, including a number of “fatal truck underride accidents” and collisions in which Tesla vehicles struck stationary first responder vehicles. NHTSA Acting Administrator Ann Carlson has hinted in recent press interviews that a conclusion is imminent.

For years, Tesla has told regulators like NHTSA and California's DMV that its driver-assistance systems, including FSD Beta, are only “Level 2” and don't make their cars autonomous, even though they're marketed under brand names that could confuse the issue. Tesla CEO Elon Musk, who also owns and operates the social network X, formerly Twitter, often suggests that Tesla vehicles are self-driving.

Over the weekend, Musk livestreamed a test drive on X in a Tesla equipped with a work-in-progress version of the company's FSD software (version 12). During the demo, Musk streamed from a mobile device he held while driving and chatted with his passenger, Ashok Elluswamy, head of Autopilot software engineering at Tesla.

In the blurry video stream, Musk did not show the full details of his touchscreen, nor did he demonstrate that he had his hands on the yoke and was ready to take over the driving task at any moment. At times he clearly did not have his hands on the yoke.

According to Greg Lindsay, an Urban Tech Fellow at Cornell, Musk's use of Tesla's systems would likely violate the company's terms of use for Autopilot, FSD, and FSD Beta. He told CNBC the entire ride was like “waving a red flag in front of NHTSA.”

On Tesla's website, in a section titled “Using Autopilot, Enhanced Autopilot and Full Self-Driving Capability,” drivers are reminded that “it is your responsibility to stay alert, keep your hands on the wheel at all times, and stay in control of your car.”

Bruno Bowden, managing partner at Grep VC, machine learning expert and investor in autonomous vehicle startup Wayve, said the demo showed Tesla is making some improvements to its technology but still has a long way to go before it offers a safe, self-driving system.

During the drive, Bowden observed, the Tesla system nearly ran a red light, requiring intervention from Musk, who braked in time to avoid any danger.
