A recent investigation by the Wall Street Journal has revealed alarming data about Tesla’s Autopilot crashes. The data, gathered from more than 200 Tesla crashes, shows that concerns about the company’s camera-based technology, which sets it apart from other industry competitors, are playing out on the roads and endangering the public.
The investigation delves into the reasons behind these crashes, shedding light on Tesla’s unique camera-based system. Unlike other companies that rely on a combination of cameras, radar, and lidar sensors, Tesla’s Autopilot solely relies on cameras. This has raised questions about the system’s effectiveness, as the data collected by the Wall Street Journal shows a significant number of crashes involving Teslas equipped with Autopilot.
The process of obtaining Tesla’s video and data was a crucial aspect of the investigation, as it allowed for a deeper understanding of how the system functions and where it may fall short. The analysis revealed how humans interact with Tesla’s systems, highlighting potential issues that may contribute to crashes.
Looking ahead, it is imperative that Tesla addresses these concerns and takes steps to improve the safety of its Autopilot system. As more vehicles equipped with this technology hit the roads, it is paramount that measures are put in place to prevent future accidents.
The Wall Street Journal’s investigation serves as a warning for both Tesla and the public, emphasizing the importance of transparency and accountability in the development of autonomous driving technology. The data gathered sheds light on the hidden dangers of Tesla’s Autopilot system and underscores the need for continued scrutiny and improvement.
Watch the video by The Wall Street Journal
Video “The Hidden Autopilot Data That Reveals Why Teslas Crash | WSJ” was uploaded on 12/13/2024 to the YouTube channel The Wall Street Journal
Autopilot is an ASSIST. YOU need to be ALERT at all times! It pains me to see that many in my Tesla chat group are using defeat devices like weights on the steering wheel to trick the system into thinking they are holding the wheel, or blinding the cabin camera so it doesn’t warn them about using their phone while driving. In the end, drivers are ultimately responsible for their own driving journey.
Any report on auto safety that isn't showing the data in context isn't worth your time. Anyone can sift for isolated anecdotes and dramatize them.
I see three main problems:
1. Autopilot does not mean you leave your car unattended.
2. Saving money by not using LiDAR: yes, they are doing that, and it is risky. Cameras don’t come close to LiDAR’s performance, especially at night or in fog.
3. A very smart but also dangerous guy continues saying his cars can drive autonomously.
I do appreciate what a Tesla can do. But what I don't like is the overselling of the capabilities it does have. I don't like Elon's claim that lidar and radar are too expensive for the car. That statement in effect amounts to saying that a person's life is cheaper than the suite of sensors the car should have. But what mostly annoys me is that he is arrogant and doesn't know how to apologise when he is wrong.
Distracted driving is at an all time high and is extremely dangerous. I'd rather have a distracted driver on autopilot than not. But it really needs to be considered the main problem. Most new cars have lane assist, it makes no sense to attack Tesla for this.
The WSJ is publishing data from 2021 now. How about getting data from 2024?
You must always be attentive when using Autopilot.
It's not a fully autonomous system.
I had a loaner Model Y with Full Self-Driving enabled a few months ago, so this is fairly recent. From my experience with it, it's nowhere near ready for prime time. It made numerous mistakes, and I watched it like a hawk. On the highway during the daytime it seemed fine, except it always wanted to go the construction speed of 55 mph when everyone else was doing 70+ (and construction only occurs at night). But on surface streets it had a lot more problems: it can't see to the sides very well, it crossed solid white lines into non-traffic lanes, and it tried to jump the gun at a metering light. This was only over a few days. It was also extremely hesitant. To me it seemed to drive like an inexperienced driver. The car had a LOT of problems making a left out of my neighborhood onto a major thoroughfare, and a sharp right or left is even worse. I did not let my guard down on it, but I can see how people can get into serious trouble if they do.
lazy journalism…
I hate to sound insensitive… but this wouldn’t have happened if the driver (prayers to his loved ones) had been paying attention to the road. There are people who think that FSD/Autopilot will just always handle whatever the situation is. In this case we can see the driver’s lack of awareness (and probably overtrust in the system) led to his unfortunate death.
I’m a bit confused by the ECE expert. One thing he missed is the fact that Tesla uses two (or more) cameras for every frame of video information. Why? Because two cameras get you depth perception, just like our eyes. Elon is technically correct: there is no need for laser-based depth perception when you have depth-estimation algorithms fusing pixels from multiple cameras.
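For readers unfamiliar with how two cameras yield depth, the sketch below shows the standard triangulation formula for a rectified stereo pair. It is a generic illustration with made-up focal length and baseline numbers, not a description of Tesla's actual vision pipeline.

```python
import numpy as np

def depth_from_disparity(disparity_px: np.ndarray,
                         focal_length_px: float,
                         baseline_m: float) -> np.ndarray:
    """Triangulate depth for a rectified stereo pair: Z = f * B / d."""
    # Mask out pixels where no stereo match was found (disparity <= 0).
    valid = np.where(disparity_px > 0, disparity_px, np.nan)
    return focal_length_px * baseline_m / valid

# Illustrative numbers only: ~1000 px focal length and a 30 cm baseline.
# A 5 px disparity then corresponds to an object roughly 60 m away.
print(depth_from_disparity(np.array([50.0, 10.0, 5.0]), 1000.0, 0.3))
# -> [ 6. 30. 60.]
```

One caveat: for a fixed baseline, stereo depth error grows roughly with the square of distance, which is one reason the camera-versus-LiDAR debate tends to center on detecting distant, stopped objects.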
I would want to know what version of FSD was present during these crashes. Certainly not FSD 13, or maybe even any end-to-end model (v12). The earlier versions were definitely spottier in reliability. Did they use depth-perception fusion? Did that system fail? Even if the ML model does not recognize a stopped semi on a highway, the depth perception should have caught the object. This is all very strange, and I hope the NHTSA can get to the bottom of it. Keep up the good investigative work.
They should include LiDAR or get banned.
The roads will be much safer once people stop blindly believing Elon’s flawed vision approach.
You need a number of sensors (lidar, radar, and cameras) for the car to richly understand the world in front of it. Other OEMs are adopting them and will leave Tesla in the dust.
I find it hard to have sympathy when the car alerted the guy 19 times to take over. He was misusing the technology, being careless, and showing no regard for his own safety, his family, or other people on the road.
He clearly wasn't paying attention to the road and was over relying on self driving.
Moronic hit piece by WSJ. The examples given are from 2021. FSD continues to be updated and gets safer. Of course, in this 11-minute video no actual crash or fatality data is given, just a couple of old fear-mongering edge cases from 2021. Terrible.
AI, which has been improperly named "Artificial Intelligence", should more properly be named "Algorithmic Information". The AI we know today is not intelligent at all; instead, it is a network of algorithms and information derived from human input they call training.
Waste of journalism
This guy is such a sleaze.
Autopilot is safer than a human driver when there’s an attentive person behind the wheel, which is how it’s meant to be used. It’s like having two drivers: Autopilot can handle most situations well, but it might struggle with unusual scenarios like the ones in the clips. That’s where the human needs to step in. So, with both Autopilot and a human working together, it’s safer than just a human driver alone. But if the human isn’t paying attention, I wouldn’t say Autopilot on its own is safer than a human driver.
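The "two drivers" argument above is essentially a redundancy claim. The toy arithmetic below, with entirely made-up miss rates and no connection to the WSJ data, shows why it only holds if the human actually stays engaged.

```python
# Toy arithmetic only; both miss rates are assumptions, not measurements.
p_human_miss = 1e-3      # assume an attentive human misses 1 in 1,000 hazards
p_autopilot_miss = 1e-2  # assume the driver-assist misses 1 in 100 hazards

# If (and only if) the two failure modes were independent,
# a crash would require both to miss the same hazard:
p_both_miss = p_human_miss * p_autopilot_miss
print(p_both_miss)  # ~1e-05, i.e. roughly 1 in 100,000

# But if the human stops watching, the human term approaches 1 and the
# combined rate collapses back toward the system's own 1-in-100.
```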
Just in here enjoying the delusion of all the Elon simps.
All of these crashes could’ve been avoided if the drivers were paying attention lol
Is the default "go", with the car stopping only when an object is confirmed? It should be the opposite.
I drive a Tesla. This is just marketing; you are so dumb to trust your life to Autopilot. It's not Tesla's fault, the lawyers are just making money.
Maybe nighttime isn't the best time for Autopilot.
Dogfooding with lives, Tesla is going to the moon.
If unsupervised autonomy becomes possible, they will need to allow both vision and radar or LiDAR to detect distant objects when the cameras are not viable. It is as simple as that. I love Tesla to pieces because it is paving the way in a huge industry, but we must also acknowledge the reality of driving. No one will rely on an autonomous vehicle that only works when the weather is perfect or the street lights are on, because those conditions aren't consistently true. To be unsupervised, the system must navigate those conditions reliably and, above all, safely. Until that is understood, this technology, and regulatory approval for its use beyond limited regions and times, will never come to fruition, especially for individual consumers.
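As a rough illustration of the redundancy this commenter describes, here is a minimal sketch of how a planner might fuse an obstacle range from a camera with one from radar or LiDAR. The data structures, threshold, and names are assumptions made for this example, not any automaker's real interface.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    range_m: float     # distance to the nearest object ahead, in meters
    confidence: float  # 0.0 to 1.0

def fused_obstacle_range(camera: Optional[Detection],
                         radar_or_lidar: Optional[Detection],
                         min_confidence: float = 0.5) -> Optional[float]:
    """Return the closest credible obstacle range reported by either sensor.

    Taking the minimum credible range is deliberately conservative: if only
    one modality sees something (e.g. radar at night or in fog, where
    cameras struggle), the planner still reacts to it.
    """
    ranges = [d.range_m
              for d in (camera, radar_or_lidar)
              if d is not None and d.confidence >= min_confidence]
    return min(ranges) if ranges else None

# Example: the camera is washed out by glare (low confidence) while radar
# reports a stopped vehicle 80 m ahead; the fused output still flags it.
print(fused_obstacle_range(Detection(999.0, 0.1), Detection(80.0, 0.9)))  # 80.0
```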
The overturned truck in the middle of the road got there because a mishap happened before the Tesla driver rammed it.
Unfortunately, driving a motor vehicle of any kind involves risks. Furthermore, humans and computers will continue to make mistakes, because without them, there can be no perfection.
It is simply called the circle of life.