Automakers have been saying for some time that they are working on self-driving technology that will allow for autonomous driving without input from the driver. In early July, BMW announced that it would have a fully self-driving vehicle on the market by 2021, using technology from Intel and Mobileye. Fiat Chrysler has also announced a partnership with Google to manufacture self-driving minivans. These partnerships demonstrate that the technology industry and the auto manufacturing sector need one another to fully achieve the goal of a self-driving car.
However, the real question is whether the public will accept and trust this technology. Earlier this month, it was reported that a Tesla automobile crashed in Florida, killing its driver, while its “autopilot” function was engaged. Tesla’s system is not fully autonomous; it is described as a “traffic-aware cruise control.” The system requires that a driver keep hands on the steering wheel at all times, and it is supposed to slow or stop the vehicle if hands are not detected on the wheel. Nonetheless, it appears that Tesla’s system allowed the vehicle to run under an 18-wheeler that was turning in front of it while the “autopilot” function was engaged.
Tesla’s founder, Elon Musk, has described the autopilot feature as “probably better than humans at this point in highway driving.” Yet in a blog post, Tesla asserts that “Tesla disables autopilot by default and requires explicit acknowledgement that the system is new technology and still in a public beta phase before it can be enabled.” Does this mean it is an experimental system?
The National Highway Traffic Safety Administration (NHTSA) is also investigating a second Tesla crash, the July 1 rollover of a Model X on the Pennsylvania Turnpike. That incident does not appear to have involved serious injury or death, but it is another incident that causes concern.
NHTSA is seeking details about Tesla Motor Inc.’s autopilot system, particularly how it operates in situations with crossing traffic and its emergency braking mechanism. The agency has asked Tesla to supply information about the fatal May 7 crash of a Tesla Model S in self-driving mode in Florida.
Specifically, the agency wants to know how many times the automatic emergency brake function has been activated in Tesla’s cars and wants the company to hand over consumer complaints, crash reports and any lawsuits or arbitration proceedings that could be related to the alleged potential defect. NHTSA also requested details about how the autopilot system works in intersections and how it detects crossing traffic, pedestrians, cyclists and other cars.
In addition, NHTSA wants Tesla to provide its own reconstruction of the crash. The accident has drawn heavy scrutiny from regulators, including the U.S. Securities and Exchange Commission (SEC), which is reportedly investigating whether Tesla should have disclosed the May 7 accident as a “material” event, that is, a development that investors needed to know about or would consider important.
The National Transportation Safety Board (NTSB) team investigating the accident will include a recorder specialist, a reconstructionist, a vehicle investigator and an investigator with collision avoidance expertise. The NTSB said the investigation will more comprehensively examine whether the Florida crash reveals systemic issues that might inform the future development of driverless cars and the investigation of crashes involving autonomous vehicles. Another accident involving a different Tesla model likewise equipped with autopilot is also being examined, but it hasn’t yet been determined whether the self-driving system was engaged at the time of that crash.
Consumer Reports has urged Tesla Motors Inc. to disable the automatic steering function in its semi-self-driving system until it is reprogrammed with additional safety enhancements to keep drivers in control of the car. The magazine, known for rating cars and other consumer products, recommended that Tesla update the Autopilot system to confirm that the driver’s hands remain on the steering wheel at all times before it can be activated.
Consumer Reports also called on the automaker to change the name of the Autopilot feature, saying that it promotes the potentially dangerous assumption that the Model S vehicle is capable of driving on its own, which can contribute to a false sense of security or a laissez-faire attitude among drivers.
Again, the real question is whether the public will accept and trust this new technology. It is clear from recent events, including the Toyota sudden unintended acceleration litigation, that there can be numerous bugs and problems with automobile software that can cause serious and fatal accidents. There is very little room for error with safety system software that will take control of a vehicle away from its human operator. When wrecks occur, it will be interesting to see if the automakers will continue to blame the drivers.
If you need more information on this subject, contact Ben Baker, a lawyer in our firm’s Personal Injury / Products Liability Section, at 800-898-2034 or by email at Ben.Baker@beasleyallen.com. Ben handles product liability litigation for our firm.
Source: theguardian.com and USA Today
If you would like to subscribe to the Jere Beasley Report digital edition, simply visit our Subscriptions page and provide the necessary information or call us at 800-898-2034.
Attorney Advertising - Prior results do not guarantee a similar outcome.