Tesla’s Autopilot technology is under renewed scrutiny after two men died on Saturday, April 17th, when a 2019 Model S crashed into a tree and caught fire.
Though investigations into the cause of the accident are ongoing, a constable confirmed that no one was driving the vehicle when the crash took place: the car was traveling at a “high rate of speed” around a curve at 11:25 p.m. local time when it left the road, continued about 100 feet, and hit a tree.
While Tesla maintains that the technology improves the safety of vehicles on the road, crashes keep happening, especially ones involving Autopilot.
The Autopilot system uses radar and cameras to detect lane markings, other vehicles, and objects on the road. Steering, braking, and acceleration are handled automatically, with little input from the driver.
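As a rough illustration of how such a driver-assist loop works, here is a minimal sketch in Python. This is a hypothetical toy model, not Tesla’s actual implementation: the function names, the proportional gain, and the override logic are all assumptions made for the example.

```python
from typing import Optional

# Hypothetical lane-keeping-assist sketch; gains and names are illustrative
# assumptions, not Tesla's actual control code.

def steering_correction(lane_offset_m: float, gain: float = 0.5) -> float:
    """Proportional steering correction back toward the lane center.

    lane_offset_m: lateral distance from lane center (positive = drifted right).
    Returns a steering command (positive = steer left).
    """
    return -gain * lane_offset_m

def assist_step(lane_offset_m: float, driver_input: Optional[float]) -> float:
    """One control tick: the driver's input, when present, overrides the assist."""
    if driver_input is not None:
        return driver_input  # human takes over at a moment's notice
    return steering_correction(lane_offset_m)

# Car drifts 0.4 m right of center with hands off the wheel: assist steers left.
print(assist_step(0.4, None))   # -0.2
# Driver grabs the wheel: their command wins.
print(assist_step(0.4, 1.0))    # 1.0
```

The key design point the example makes is the last one: in a driver-assist system, the human input always takes priority, which is exactly why the driver is expected to stay ready to intervene.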
Despite what many people think, Tesla’s Autopilot is not an autonomous driving system. It is a comprehensive driver-assistance setup that requires you to be fully alert and ready to take over at a moment’s notice. Even the company acknowledges that the technology is imperfect, warning drivers to stay attentive and be prepared to take control at any time.
Terms like “Autopilot”, “cruise control”, “autonomous”, and “driverless” are often used loosely. In practice, autonomy in cars is a blend of human and computer/software control of the vehicle. It is better understood as a continuum than as an on/off state.
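One widely used way to make that continuum concrete is the SAE J3016 scale, which grades driving automation from Level 0 to Level 5. The summaries below are paraphrased, and the common industry characterization of Autopilot as a Level 2 system is an assumption of this example rather than a claim from the article.

```python
# SAE J3016 driving-automation levels, paraphrased for illustration.
SAE_LEVELS = {
    0: "No automation: the human does everything",
    1: "Driver assistance: steering OR speed is assisted",
    2: "Partial automation: steering AND speed assisted; human must supervise",
    3: "Conditional automation: system drives; human must take over on request",
    4: "High automation: no human needed within a defined operating domain",
    5: "Full automation: no human needed under any conditions",
}

def requires_attentive_driver(level: int) -> bool:
    """At Levels 0-2, the human must continuously monitor the road."""
    return level <= 2

# Systems like Autopilot are commonly characterized as Level 2,
# which is why the driver must stay alert.
print(requires_attentive_driver(2))  # True
print(requires_attentive_driver(4))  # False
```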
Therefore, should Tesla be held responsible for these accidents that keep occurring?
We would like to hear your opinion, but first, make sure you watch this video of a man caught sleeping at the wheel of his Model X.