
U.S. seeks Tesla driver-assist documents; company hikes capex forecast

WASHINGTON, Jan 31 (Reuters) – Tesla Inc (TSLA.O) disclosed on Tuesday the U.S. Justice Department has sought documents related to its Full Self-Driving (FSD) and Autopilot driver-assistance systems as regulatory scrutiny intensifies.

The automaker said in a filing it “has received requests from the DOJ for documents related to Tesla’s Autopilot and FSD features.”

Reuters reported in October that Tesla was under criminal investigation over claims that the company's electric vehicles could drive themselves. Reuters said the U.S. Justice Department launched the probe in 2021 following more than a dozen crashes, some of them fatal, involving Autopilot.

Tesla did not respond to a request for comment.

Chief Executive Officer Elon Musk has championed the systems as innovations that will both improve road safety and position the company as a technology leader.

Regulators are examining if Autopilot’s design and claims about its capabilities provide users a false sense of security, leading to complacency behind the wheel with possibly fatal results.

Acting National Highway Traffic Safety Administration (NHTSA) chief Ann Carlson said this month the agency is "working really fast" on the Tesla Autopilot investigation it opened in August 2021, which she termed "very extensive." In June, NHTSA upgraded its defect probe into 830,000 Tesla vehicles with Autopilot to an engineering analysis, a step that was necessary before the agency could demand a recall.


Autopilot is designed to assist with steering, braking, speed and lane changes. The function currently requires active driver supervision and does not make the vehicle autonomous. Tesla separately sells the $15,000 Full Self-Driving (FSD) software as an add-on that enables its vehicles to change lanes and park autonomously.

The automaker’s shares rose 2% in early trading.

The Wall Street Journal reported in October that the Securities and Exchange Commission is conducting a civil investigation into Tesla’s Autopilot statements, citing sources.

Tesla also forecast on Tuesday capital expenditure of between $7 billion and $9 billion in each of 2024 and 2025. The midpoint of that range is $1 billion higher than the $6 billion to $8 billion range provided for this year.


Some of the spending will go toward a $3.6 billion expansion of its Nevada Gigafactory complex, where Tesla will mass produce its long-delayed Semi truck and build a 4680 battery cell plant capable of making enough batteries for 2 million light-duty vehicles annually.

Tesla said it recorded an impairment loss of $204 million on the bitcoin it holds, while booking a gain of $64 million from converting the token into fiat currency.

Cryptocurrencies such as bitcoin were hammered last year as rising interest rates and the collapse of major industry players such as crypto exchange FTX shook investor confidence.

Reporting by Akash Sriram in Bengaluru and David Shepardson; Editing by Sriraj Kalluvila and Bernadette Baum



Self-Driving and Driver-Assist Technology Linked to Hundreds of Car Crashes

Over the course of 10 months, nearly 400 car crashes in the United States involved advanced driver-assistance technologies, the federal government’s top auto-safety regulator disclosed Wednesday, in its first-ever release of large-scale data about these burgeoning systems.

In 392 incidents cataloged by the National Highway Traffic Safety Administration from July 1 of last year through May 15, six people died and five were seriously injured. Teslas operating with Autopilot, the more ambitious Full Self-Driving mode or any of their associated component features were in 273 crashes. Five of those Tesla crashes were fatal.

The disclosures are part of a sweeping effort by the federal agency to determine the safety of advanced driving systems as they become increasingly commonplace. Beyond the futuristic allure of self-driving cars, scores of car manufacturers have rolled out automated components in recent years, including features that allow you to take your hands off the steering wheel under certain conditions and that help you parallel park.

In Wednesday’s release, NHTSA disclosed that Honda vehicles were involved in 90 incidents and Subarus in 10. Ford Motor, General Motors, BMW, Volkswagen, Toyota, Hyundai and Porsche each reported five or fewer.

“These technologies hold great promise to improve safety, but we need to understand how these vehicles are performing in real-world situations,” said Steven Cliff, the agency’s administrator. “This will help our investigators quickly identify potential defect trends that emerge.”

Speaking with reporters ahead of Wednesday’s release, Dr. Cliff also cautioned against drawing conclusions from the data collected so far, noting that it does not take into account factors like the number of cars from each manufacturer that are on the road and equipped with these types of technologies.

“The data may raise more questions than they answer,” he said.

About 830,000 Tesla cars in the United States are equipped with Autopilot or the company’s other driver-assistance technologies — offering one explanation why Tesla vehicles accounted for nearly 70 percent of the reported crashes.

Ford, GM, BMW and others have similar advanced systems that allow hands-free driving under certain conditions on highways, but far fewer of those models have been sold. These companies, however, have sold millions of cars over the last two decades that are equipped with individual components of driver-assist systems. The components include so-called lane keeping, which helps drivers stay in their lanes, and adaptive cruise control, which maintains a car’s speed and brakes automatically when traffic ahead slows.

Dr. Cliff said NHTSA would continue to collect data on crashes involving these types of features and technologies, noting that the agency would use it as a guide in making any rules or requirements for how they should be designed and used.

The data was collected under an order NHTSA issued a year ago that required automakers to report crashes involving cars equipped with advanced driver-assist systems, also known as ADAS or Level-2 automated driving systems.

The order was prompted partly by crashes and fatalities over the last six years that involved Teslas operating in Autopilot. Last week NHTSA widened an investigation into whether Autopilot has technological and design flaws that pose safety risks. The agency has been looking into 35 crashes that occurred while Autopilot was activated, including nine that resulted in the deaths of 14 people since 2014. It had also opened a preliminary investigation into 16 incidents in which Teslas under Autopilot control crashed into emergency vehicles that had stopped and had their lights flashing.

Under the order issued last year, NHTSA also collected data on crashes or incidents involving fully automated vehicles that are still in development for the most part but are being tested on public roads. The manufacturers of these vehicles include G.M., Ford and other traditional automakers as well as tech companies such as Waymo, which is owned by Google’s parent company.

These types of vehicles were involved in 130 incidents, NHTSA found. One resulted in a serious injury, 15 in minor or moderate injuries, and 108 didn’t result in injuries. Many of the crashes involving automated vehicles led to fender benders or bumper taps because they are operated mainly at low speeds and in city driving.

Waymo, which is running a fleet of driverless taxis in Arizona, was part of 62 incidents. G.M.’s Cruise division, which has just started offering driverless taxi rides in San Francisco, was involved in 23. One minor crash involving an automated test vehicle made by Pony.ai, a start-up, resulted in a recall of three of the company’s test vehicles to correct software.

NHTSA’s order was an unusually bold step for the regulator, which has come under fire in recent years for not being more assertive with automakers.

“The agency is gathering information in order to determine whether, in the field, these systems constitute an unreasonable risk to safety,” said J. Christian Gerdes, a professor of mechanical engineering and a director of Stanford University’s Center for Automotive Research.

An advanced driver-assistance system can steer, brake and accelerate vehicles on its own, though drivers must stay alert and ready to take control of the vehicle at any time.

Safety experts are concerned because these systems allow drivers to relinquish active control of the car and could lull them into thinking their cars are driving themselves. When the technology malfunctions or cannot handle a particular situation, drivers may be unprepared to take control quickly.

Some independent studies have explored these technologies, but have not yet shown whether they reduce crashes or otherwise improve safety.

In November, Tesla recalled nearly 12,000 vehicles that were part of the beta test of Full Self-Driving — a version of Autopilot designed for use on city streets — after deploying a software update that the company said might cause crashes because of unexpected activation of the cars' emergency braking system.

NHTSA’s order required companies to provide data on crashes when advanced driver-assistance systems and automated technologies were in use within 30 seconds of impact. Though this data provides a broader picture of the behavior of these systems than ever before, it is still difficult to determine whether they reduce crashes or otherwise improve safety.

The agency has not collected data that would allow researchers to easily determine whether using these systems is safer than turning them off in the same situations.

"The question is: What is the baseline against which we are comparing this data?" said Dr. Gerdes, the Stanford professor, who from 2016 to 2017 was the first chief innovation officer for the Department of Transportation, of which NHTSA is part.

But some experts say that comparing these systems with human driving should not be the goal.

“When a Boeing 737 falls out of the sky, we don’t ask, ‘Is it falling out of the sky more or less than other planes?’” said Bryant Walker Smith, an associate professor in the University of South Carolina’s law and engineering schools who specializes in emerging transportation technologies.

“Crashes on our roads are equivalent to several plane crashes every week,” he added. “Comparison is not necessarily what we want. If there are crashes these driving systems are contributing to — crashes that otherwise would not have happened — that is a potentially fixable problem that we need to know about.”

Jason Kao contributed reporting.
