Episode 29 – Tesla Cars Store Unencrypted Data

Welcome to the Data Science Ethics Podcast. My name is Lexy and I’m your host. This podcast is free and independent thanks to member contributions. You can help by signing up to support us at datascienceethics.com for just $5 per month. At the $10 per month level, you’ll get access to the members-only podcast, Data Science Ethics in Pop Culture. You will also be able to attend live chats and debates with Marie and me. Plus, you’ll be helping us to deliver more and better content. Now, on with the show.

Hello everybody and welcome to the Data Science Ethics Podcast. This is Marie Weber and Lexy Kassan, and today we are going to be doing a
quick take on some of the news around the data collection policies that Tesla
currently has on its vehicles. So to kind of cue up this story, we have an article that we’re going to be linking to from The Drive. It talks about a few different people who have been purchasing wrecked Tesla Model 3s, and they’re basically taking
these cars and they’re looking at the data that’s been stored on them. Part of the reason why they’re looking
at this is because Tesla has a bug bounty program, which means that if people can find
bugs that Tesla has in their system, Tesla will pay them money for basically
doing that white hat hacking and then they can use that information
to make their system better. But it also means that the people going
in and looking at these cars were able to uncover some very surprising things
in terms of the amount of data that Tesla has been storing on their vehicles
and the way they’ve been storing it, a.k.a. it’s not encrypted. Yeah. And there’s a bunch of personally identifiable information on there, not only of the people who were in the vehicle, but of everyone they know. It’s not just recording what’s happening as the vehicle’s driving and who’s driving, that type of thing; because it can connect to digital devices, it can then look at information on those devices. So the scope of what was being accessed by these vehicles from the personal devices included contacts from phone books, calendar entries, and
even email addresses. Right. So when you think about the
triangulation of personally identifiable information, you’ve got all of it. You could have where someone was, when, what their phone number is,
their name, their email address, and you have a personally identifiable
link that you could leverage. That’s a very sensitive
piece of information. The cars were also recording where people
were going, and they were keeping this information for quite some time. On the vehicle that they looked at, the researchers were able to discover the last 73 locations that the driver had navigated to. Yeah, yeah. The other thing that was very interesting
about this specific vehicle is that it was a company vehicle, and they noted that 11 different devices had paired with the vehicle, so therefore potentially 11 different drivers of that vehicle. So if you think about this as a fleet vehicle, you now have the personal information of 11 people, their contacts, their calendars, all of the information of all the places they went, everything, and it was unencrypted on the machines within the vehicle. Now, as we mentioned, these were salvage vehicles, so they were vehicles that had been crashed, and the other thing that was recorded was actual videos
of the crash happening. So it is understandable in some regards
why Tesla has incorporated some of these features into their vehicles. They want it to be a great experience. They want people to be able to pair their devices and to be able to take calls while they’re in the car or play music from their phone. Or, in this case, they’re recording the video of the crash, presumably because that will help Tesla understand what potentially went wrong and make sure that they’re improving the safety features on their vehicles so accidents don’t happen in the future. Yeah. The article also spoke about the fact
that Tesla has used these types of videos in other circumstances to prove that their Autopilot feature was not responsible for a crash incident. So this is as much about covering themselves as it is an ability to provide evidence for, let’s say, investigators who are looking at how a crash occurred or what have you, if they receive this information back from the vehicle. It’s also to help improve their algorithms or help improve Autopilot functions in the future for crash avoidance and so forth. Tesla has also built a feature into their vehicles that is called Sentry Mode, which basically allows the car to record
what’s happening around it even when it’s not being driven as a way to help
protect the car from either vandalism or from people that might try to steal it. Yeah. In addition, it has an interior facing camera system
so that it can record who’s in the vehicle. If somebody does get in,
that’s part of century mode also, so that means that if someone were to
get in the vehicle and central mode was activated, you would actually see
what they were doing in the car. If it was an nefarious actor who
was entering the vehicle illegally, you could potentially identify who that
was and provide video of what they were doing and so forth. I could see there being some good
uses for that in other applications. If you think about ways that
you could use that in AI. So let’s say, for example, you wanted to develop an algorithm that could identify when a driver looks tired and try to maybe engage them in something that would keep them alert or wake them up, or what have you, so that they were more attuned to what was going on on the road. You could potentially use an interior-facing camera to do that. But we’re not seeing that
type of technology yet. So having an interior-facing camera just kind of makes me wonder: how much of the drivers’ privacy are you taking away by having this camera, or any of this other data being collected? Well, with the interior-facing camera, that could also be something where they anticipate building that type of functionality in the future. And so if they know they want to be
able to release it in the future, they might be building in the hardware for it now, so that when they have the software, it can just be pushed. Or they could use it for detecting if somebody looks like they’re under the influence of something, and have similar safeguards in place
that would make sure that the car could still be operated safely, even if it looked like the person was under the influence of something. Or maybe it could just turn itself off and stay off if it thought that somebody was under the influence. Sure, potentially until a different driver came in. So I think what’s interesting looking at
this case is if you look at it from the perspective of Tesla, they are trying to anticipate adversaries, which could be people that are trying to
steal a Tesla or trying to vandalize a Tesla or actual drivers that might not
be in the best condition to operate a Tesla. And what can they do to build a safer vehicle that has greater safety features and also helps avoid crash risks? They have looked at developing
these different types of
systems to achieve those goals. But then, on the flip side, to do that they are collecting a lot of information, and are they taking the appropriate steps to protect the privacy of the information they’re collecting? I’m going to say it’s a hard no, because
they’re not taking adequate precautions. It’s one thing to say that we’re going
to sync up to your phone so that you can have a good user experience
with the vehicle. It’s another to say all of the information
is going to be downloaded into the vehicle and stored in an unencrypted
manner without your knowledge. So think of this, again, from the perspective of this being a company vehicle: you as a company employee go into the vehicle, you sync up your phone, which is maybe a company phone. It has all your contacts, and you’re driving around in it. You aren’t even the one that got the briefing from Tesla on what data was going to be collected, because you weren’t the one buying the car. You’re just using it. What happens then if you’re renting a
Tesla, or something like that, where you’re only temporarily using it, but it still has all of your data and it’s not taking any sort of precaution to safeguard that information? It seems to me that Tesla has not been as responsible as perhaps they should have been in protecting the privacy of users, regardless of the fact that they have a very good reason for gathering some data. Other data really should be protected in a much more robust manner. Absolutely.
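To make that concrete, here is a minimal sketch of what at-rest encryption for synced phone data could look like, using Python’s cryptography package. The data model and key handling are illustrative assumptions, not Tesla’s actual implementation; in a real vehicle the key would live in dedicated secure hardware, not in a variable.

import json
from cryptography.fernet import Fernet

# Hypothetical example: in a real vehicle, this key would be provisioned
# per-car inside a hardware security module, never stored beside the data.
key = Fernet.generate_key()
cipher = Fernet(key)

contact = {"name": "Jane Doe", "phone": "+1-555-0100", "email": "jane@example.com"}

# Serialize and encrypt before anything is written to onboard storage.
ciphertext = cipher.encrypt(json.dumps(contact).encode("utf-8"))

# Only code holding the key can recover the plaintext. Someone pulling
# the flash storage out of a salvage car would see only ciphertext.
restored = json.loads(cipher.decrypt(ciphertext).decode("utf-8"))
assert restored == contact

The point isn’t this specific library; it’s that contacts, calendars, and location history stored only as ciphertext would be worthless to whoever buys the wreck.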
Kind of on the flip side, Tesla has said that there are options that people can use to either access their data or to opt out of it. But it’s not giving users a fair choice, because apparently, in order to get access to the data, and it’s not even all of the data, you need to buy a cable that’s almost a thousand dollars. And you can opt out of data being collected on you, but then you don’t get any software updates, which is one of the key features that Tesla owners benefit from: you’re driving a car that’s connected to the system where you get your software updates, so you always have the latest and greatest features. So in order for them to really be offering a true choice, they would need to have an option that says you can still get software updates even if we’re not collecting personal data on you. And there should be an easier way for people to review and/or have
some control over the data that is being collected. It really makes me wonder how Tesla gets away with this in the EU with GDPR, or how it will get away with this in California come next year with the California Consumer Privacy Act. Those regulations are very strict as to what data you collect, how it’s stored, what you can do with it, and how you must remove it if requested to do so. And those types of questions, I don’t think, have been asked. And now that this is coming to light, I’m thinking they’re going to be asked. I just asked it on air.
So if I’ve asked it, somebody else is thinking about it. Very true. Somebody to whom it actually relates, because I’m not in either of those locations. And you don’t have a Tesla, and I don’t have a Tesla, and I’m feeling pretty good about that right now. To be honest, I’ve got to say, as much as I’m a technophile, I don’t want my car or my house or my devices to be that smart sometimes, because this type of stuff worries me. Having control over things makes sense. And I do see the promise of electric
vehicles, and I do see the promise of some of the additional safety features that are being developed. We’ve even talked about, in some of our previous episodes, self-driving cars and the technology that is being developed there. I believe we reviewed in those episodes the upsides, but we also talked about the potential downsides and what needs to be covered there. I think Tesla needs to take
more responsibility in protecting the privacy of their customers and their customers’ contacts. And a lot of the data that is being collected, regardless of the potential good uses, still should be collected carefully, and they need to protect the privacy of that data for the people that they’re collecting it on. And the fact that this is now known means a Tesla vehicle is now a bigger target. And so, yeah, it is now an area where they have to anticipate adversaries. Well, they should be anticipating adversaries anyway, because the article referenced a few other situations in which Tesla has basically been called out for lax security practices. One in which there was somebody who was
able to get access to a cloud server cluster, essentially, and mine cryptocurrency using it. And it, oh, by the way, happened to have a bunch of Autopilot data on it. And another situation in which there was a wireless network list that had usernames and passwords in the clear. So this isn’t their first foray into
some very murky waters around protecting privacy and having data security
policies that were not sufficient. And for a tech company
developing this level of product, you would definitely expect there to be
more consideration around these topics. Exactly. And that they would react much more
quickly to ensure that they’re meeting the privacy needs. And I
don’t see that happening. The article talked about the fact that it took an extended amount of time for each of those other concerns to be dealt with. And in this case, I mean, this just came out in March, but it’ll be very interesting to see how long it takes them to address and/or rectify this situation with unencrypted data in their cars. It’s convenient that they have the wireless update capabilities to change that in their vehicles. But I wonder how
much they’re actually going to do. The other piece of it would be: are there certain things that they have built into the system that they could continue to do, but just retain the data for shorter periods of time? A lot of the data that they’re collecting, they want specifically for if there’s a crash. So if the vehicle does not detect a crash, maybe it drops everything but the last 10 minutes, like, on a rolling basis, the last 10 minutes of data that it had. Or it keeps the latest drive, because maybe there’s something in the drive from the time it starts to the time it ends. Sure, that would be helpful to review, but if you start the car and you end the car and there was no crash detected, then you don’t need to keep that data forever.
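As a sketch of that rolling-window idea, here is one hypothetical shape it could take in Python: telemetry lives only in memory for at most ten minutes, and data is persisted only when a crash is detected. The class and method names are assumptions for illustration, not anything from Tesla’s actual software.

import time
from collections import deque

WINDOW_SECONDS = 10 * 60  # retain at most the last ten minutes

class RollingTelemetryBuffer:
    """Hypothetical in-memory buffer that forgets old telemetry."""

    def __init__(self):
        self._buffer = deque()  # (timestamp, record) pairs, oldest first

    def record(self, record, now=None):
        now = time.time() if now is None else now
        self._buffer.append((now, record))
        # Evict anything older than the retention window.
        while self._buffer and now - self._buffer[0][0] > WINDOW_SECONDS:
            self._buffer.popleft()

    def on_crash_detected(self):
        # Only on a crash does data leave volatile memory to be persisted.
        return list(self._buffer)

    def on_drive_ended_normally(self):
        # No crash detected: discard everything instead of writing to disk.
        self._buffer.clear()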
Right. Or maybe when the car is turned off, it removes all of the personally identifying information. So maybe it would retain that a call came in from a given number, but it wouldn’t associate that number with a name or an email address. All of that data would be removed as soon as the phone is unpaired from the vehicle. That might be a way of removing some of the PII data that could be really problematic that’s currently being stored in the car.
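A hypothetical sketch of that unpairing behavior might look like the following: contacts and calendars are deleted outright, and the call log keeps only a one-way hash of each number (a real system would also salt the hash per vehicle). The storage interface and key paths here are assumptions for illustration only.

import hashlib

def anonymize_call_log(call_log):
    # Keep timing for diagnostics, but replace each number with a
    # truncated one-way hash so it no longer identifies a person.
    return [
        {
            "number_hash": hashlib.sha256(entry["number"].encode()).hexdigest()[:16],
            "timestamp": entry["timestamp"],
        }
        for entry in call_log
    ]

def on_phone_unpaired(store, device_id):
    """store is a hypothetical key-value interface to onboard storage."""
    # Contacts and calendar entries are deleted outright...
    store.delete(f"contacts/{device_id}")
    store.delete(f"calendar/{device_id}")
    # ...while the call log is retained only in anonymized form.
    store.write(f"calls/{device_id}", anonymize_call_log(store.read(f"calls/{device_id}")))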
There are a couple of different things there, and kind of a weird part of that is what happens when there is a crash. Obviously, in this case, these cars were salvage because they had been crashed. If the car crashes, what does it retain, and then how is it safeguarded? That, I think, is the bigger concern: if it can’t automatically drop data, or it needs to retain that information, how do you make sure that it’s retained in a way that does not grant access to anybody who happens to have the right kind of wire to hook up to the car? And like we’ve said before
on these podcasts, we don’t
always have the answers. Sometimes it’s just coming up with the questions, and lots and lots of questions. When it comes to data science ethics, there are a lot of questions to consider. So that’s where we’re at with this one for now, until there are updates from Tesla. It’ll be interesting to see how they respond to this over the next few weeks. So that’s our quick take on
the unencrypted information
that has been found on Tesla Model 3s. Thank you so much for joining us for this episode of the Data Science Ethics Podcast. This is Marie Weber and Lexy Kassan. Thanks so much. Catch you next time. We hope you’ve enjoyed listening to
this episode of the Data Science Ethics Podcast. If you have, please like and subscribe via your favorite podcast app. Join in the conversation at datascienceethics.com, or on Facebook and Twitter at @dsethics. Also, please consider supporting us for just $5 per month. You can help us deliver more and better content. See you next time, when we
discuss model behavior. This podcast is copyright Alexis Kassan. All rights reserved. Music for this podcast is by DJ Shadow Money. Find him on SoundCloud or YouTube as DJ Shadow Money Beats.
