Tesla has long been known for closely guarding the data collected by its Autopilot system, but a recent investigation by The Wall Street Journal shows just how difficult that data is to access. In a video titled “How Hackers and Mechanics Unearth Tesla’s Hidden Autopilot Data,” the Journal walks through the process of extracting data from deep within a Tesla.
The video starts by showcasing the Tesla Autopilot computers that are responsible for collecting and storing data on the car’s driving behavior. These computers hold valuable information that can shed light on how the Autopilot system operates and potentially provide clues in the event of a crash.
The Journal obtained a set of this data and shows viewers how it was extracted from the car. The process involves removing the computer from the vehicle, a task that requires skill and precision to avoid damaging the delicate components inside. Once the computer is taken out, the next step is to hack into it and access the data stored within.
By walking through the intricate process of extracting Autopilot data from a Tesla, the video shows the lengths to which hackers and mechanics will go to uncover this information, and it underscores how hard it is to pull data out of a high-tech vehicle like a Tesla, as well as what that data could reveal about how the Autopilot system works.
For anyone interested in Tesla’s Autopilot system and the challenges of accessing data from these vehicles, the full video investigation is worth watching. As technology continues to evolve, the battle between hackers and manufacturers over access to vehicle data is likely to become an increasingly important issue in the automotive industry.
Watch the video by The Wall Street Journal
The video “How Hackers and Mechanics Unearth Tesla’s Hidden Autopilot Data | WSJ” was uploaded on 07/30/2024 to The Wall Street Journal’s YouTube channel.
WSJ should understand that Autopilot is not the self driving system on Tesla vehicles! Autopilot is cruise control.
FSD is the self driving system. Confusing? Yes, but that's it.
This comment section is packed full of Tesla glazers, lol
If it's digital, it isn't secure. Someone will sell the source code for money and expose everyone attached to the service
If WSJ thinks there's a major safety issue with Autopilot, then this report should be public. Since it's not, there's nothing to see, nothing to worry about.
Watching the video, I can understand why the car crashed; it was very hard to see in those lighting conditions. One would think that you can't rely solely on video cameras for safe autonomous driving, but Tesla's CEO convinced himself, his employees, investors, and customers that all you need are cameras, since that's how humans see…
1 fatal crash, from 2021!!! This is what it said in the WSJ news when I was checking my TSLA shares on CommSec.
WSJ investigation uncovers alarming flaws in Tesla's Autopilot leading to crashes
Jul 30, 2024
The Wall Street Journal (WSJ) on Tuesday published a critical video about Tesla's driver-assist systems, highlighting “longstanding concerns” regarding the company's camera-based technology.
The video, which represents a part of the Journal’s comprehensive investigation into Tesla’s Autopilot system, suggests that this technology has been a key factor in several crashes, some of which have been fatal.
Specifically, the investigation gathered data and video from more than 200 Tesla Autopilot crashes, revealing how the driver-assist algorithms process inputs from Tesla Vision cameras in real-time.
The video highlights a fatal accident from May 2021 involving Steven Hendrickson, who was driving his Tesla Model 3 in Autopilot mode on his way to work. An overturned double trailer appeared on the highway, which Tesla’s system failed to recognize, resulting in a full-speed collision that killed Hendrickson.
“The kind of things that tend to go wrong with these systems are things like it was not trained on the pictures of an overturned double trailer. It just didn’t know what it was,” said one expert who reviewed the footage.
The WSJ’s video points out similar issues where Autopilot misinterprets the lights of emergency vehicles, leading to crashes.
The investigation notes that Tesla's self-driving crashes result from both hardware and software issues, including slow algorithm updates and inadequate camera calibration. While the findings raise significant concerns, more independent analysis may be necessary to challenge Elon Musk's assertion that Tesla's self-driving feature is ultimately safer than human drivers.
The Wall Street Journal cross-referenced individual state accident reports with the federal database maintained by the National Highway Traffic Safety Administration (NHTSA) and reenacted 222 Tesla crashes. Of these, 44 incidents occurred when Teslas on Autopilot "veered suddenly," and 31 happened when the vehicles "failed to stop or yield," with the latter causing the most severe accidents.
Experts who reviewed the crash footage and the Autopilot system's algorithmic operations indicated that it would take time to train the system to handle all road scenarios.
Copyright (c) 2024 StreetInsider.com
Sentiment: Negative (-0.79). TSLA (Tesla Inc.): 228.60, -3.50 (-1.51%)
I once had a procedure done and I couldn't get the results without scheduling an appointment. I figured they didn't find anything, or else they'd want to tell me immediately for my safety. This is kind of how I feel about this video.
Hahha you fools pay for movie news 🎉😂😂
You can get WSJ access through a local library or your university. Just saying
WSJ = King of Cliffhanger 🙄 Useless video. Thumb down.
Super insightful, always wondered about Tesla's data! 🚗💨
I generally like the WSJ, but the intentional toning and lack of info on this video shows a clear anti-Musk/Tesla tilt.
Just another hit piece from the WSJ. What is the purpose of this video exactly?
clickbait
Nice Hit Piece.
You make it sound like Tesla is doing something wrong, when in this video there is nothing bad being shown.
I’m pretty sure that if the driver did actually crash while on Autopilot, they would say it was the Autopilot and not them
How? By hacking into one of Tesla's websites, or by some Tesla employee leaking data?
Still don’t know what I’ve learned from the video…
Elon is gonna flip out when he sees this and finds out people are doing this to his precious car stuff. lol I hope they keep doing it.
Love my Tesla Model 3 Performance with FSD (supervised). If you haven't experienced it, don't have an opinion on it, because the talking heads are just putting forth narratives because they love to control how you think. THINK for yourself! Experience these things yourself! Also, removing circuit boards and tapping into chips to extract data is not hacking, rather some sort of forensic data mining. In addition, if it is truly "heavily encrypted" it would be impossible to extract the data. Don't quite understand why the narrative is so Luddite-centric and not looking positively into the future. I'm living in the future when I drive my Tesla and it makes me smile. Also, I thought Elon was amazing despite who he voted for and recommends to vote for. I guess that tells you what type of person I am, tolerant of divergent viewpoints.
They should prohibit Tesla from using its customers as guinea pigs for its driving assistant because that does not reach the level of autonomous driving.
First he opens a Gigafactory in China, and then he acts surprised when Chinese EVs reverse engineer his cars and outsell Tesla in yearly units sold and production output in 2023 🙄🙄🙄🙄. Lol, then we lobby governments to tax and "sanction" Chinese EVs
Tesla drivers need to take control in certain situations, such as coming to that road crossing and when the police were stopped on the highway. No telling what the driver was actually doing. On the last one, I think most people would not have seen that truck on the highway with no lights.
Lol Wall Street Journal, there's a thin line between reporting/ethical hacking or leaks and being an accomplice to a crime, IP theft footage, and brand trashing 😂😂😂😂. But hey, everything's blurry nowadays.
But I guess this was done with the Tesla owner's approval, "at least"… And lol, can you criminalize or protect certain parts on vehicles or products if they are ever opened?
I love the ending where the narrator says "Look, this is my face and I did this" 😂😂😂😂. Elon, you in trouble now, lol. Car insurance and life insurance companies are going to be on your tail for this data
This is fixable, TESLA… Increase your FPS, ask @theslowmoguys
Just unsubscribed after this manipulative tactic, such a pathetic choice by the social media team.
Such BS. Tesla hasn't used radar in a long time. Hit piece.
A normal brain (one of the most complex objects in the universe) would be too late to detect that black vehicle. It's called an accident. We can't have a perfect world, or else we wouldn't exist.
So what was the point of the video?
I think most, maybe 90%, of drivers would have done the exact same thing!
Don't let the Chinese see this video
😢 1. The computer doesn't recognise the thing on the road.
2. There is no course of action (braking, steering, warning, etc.)
3. Data is severely limited afterwards as to what the system was designed or supposed to do in terms of action/s. "The unknown".
4. Encrypted proprietary data is possibly useless even for the manufacturer. "The unknown unknown". So, it is highly improbable that the system can improve without new technologies like lidar, night vision, sensors, etc.
tesla autopilot uses video not radar
Whaddya expect when Tesla gets customers to beta test products with their lives.
As if some normal idiot on the road was going to see that. FSD doesn't need to be 100% – just needs to be better than human drivers.
PAYWALL?!!?
Bye, WSJ 👋🏼
Thumbs down
FUD
What a terrible FUD video. You imply accidents are Tesla Autopilot's fault on streets where Autopilot doesn't work, and stop the video without ever showing evidence.
You don't need a hacker to see that the Tesla Autopilot is a death trap. The fault is the US government's for allowing this stupidity to be on our streets! How did it make it to the streets? MONEY!
So much money and brain time invested in this system, yet they forgot to implement an IR camera to see stationary objects at night! Seriously!
Watch the full Tesla autopilot video investigation here: https://on.wsj.com/4dhfcm1