Conceding That Tesla's Self-Driving Beta Will 'Try To Kill You,' Tesla Stans Reinvent 'Driving'
As various stories of Tesla disasters run parallel with stories of Twitter disasters, it's not uncommon to read harsh criticisms of Elon Musk's marginally self-driving cars like, "If you're trying Tesla Full Self-Driving Beta for the first time, it's important to remember that it will at some point randomly try to kill you. This is a when, not an if."
Statements like these usually come from extreme skeptics of Tesla's cars, but the above warning about Teslas attempting to kill their passengers comes from a prominent and outspoken fan of the company: Twitter user @WholeMarsBlog.
WholeMarsBlog, you may remember, is the person who wanted someone's child to stand in front of a self-driving Tesla to prove the car wouldn't run it over, despite tests showing Teslas repeatedly running over (fake) children.
Later on in their thread warning people about the dangers of the self-driving cars they deeply believe in, WholeMarsBlog acknowledges that if the car runs someone over because the "driver" is not paying attention, the driver could be "charged criminally."
Tesla recently released its full self-driving beta to customers who purchased the service, meaning there is now a significant number of drivers testing a prototype of computer-controlled driving on public roads.
"It's the best thing ever. Used properly, it is safer than driving yourself," WholeMarsBlog added about the new feature.
WholeMarsBlog wasn't the only Tesla supporter to acknowledge that the car's full self-driving system (shortened to FSD) requires drivers to be constantly on alert for the car trying to kill them. A user named "Tesla Patriot and Elon Musk" warned, "Tip: When turning left at an intersection, DON'T trust it not to turn into oncoming traffic. Keep your foot over the break [sic]." Another user replied, "Agreed. More times than not I have to stop it from pulling out into oncoming traffic."
With fans like these, one need only imagine what Tesla's critics are saying (and they are saying quite a lot, by the way).
As these helpful tips spread on social media, many wondered if Tesla drivers would be better served ignoring the FSD beta and just driving the car normally.
Since the appeal of self-driving cars is, hypothetically, a safer and more relaxing experience for riders, a self-driving beta released in a state where drivers must stay constantly alert in case the car careens into oncoming traffic sounds, to many, at least as stressful as ordinary driving.
On a related note, 10 of the 11 deaths involving self-driving cars in 2022 were caused by Tesla vehicles.
PhasmaFelis
A self-driving car that you have to pay complete attention to at all times is worse than a regular car. You can't relax any more than you could normally, and it's harder to stay focused when you're not actively engaged.
A Concerned Rifleman
"If a self-driving vehicle gets into an accident, is the driver or the technology at fault?" In the US, at least, we rule that the driver is in full liability of the vehicle at all times, which is why automakers treat self-driving technology as an extension of cruise control.