
Internet of Risky Things: When things betray their owners…

Jul 19, 2017

There have already been many privacy problems arising from what IoT things actually do: collect lots of data about the physical world around them, including their owners.


Your things may talk to the police

In March 2015 in Pennsylvania, a woman called 911 to report she had been raped by someone who broke into her house and assaulted her while she was sleeping. However, police investigators concluded she had made the story up, in part because of her Fitbit:

The device, which monitors a person’s activity and sleep, showed [she] was awake and walking around at the time she claimed she was sleeping.

Big Brother may not be watching you, but Little Fitbit is.

In December 2015 in Florida, a driver was allegedly involved in a car accident and then fled the scene. When telephoned by an emergency dispatcher, she responded, “Ma’am, there’s no problem. Everything was fine.” Hit-and-run accidents are nothing new. However, what’s interesting in this case is how the police became involved:

The dispatcher responds: “OK, but your car called in saying you’d been involved in an accident. It doesn’t do that for no reason. Did you leave the scene of an accident?”

The woman’s car, like many new cars enhanced with computing magic, was set up to call 911 if its GPS information indicated a potential crash (e.g., because of a rapid change in direction and momentum). Your car can know where you are and will call 911 to help you, even if that’s not your plan. But what else will happen with this data? The article notes further:

Privacy campaigners concerned that governments might use the technology to keep permanent track of a vehicle’s movements have been told the new rules only allow for GPS information to be collected in the event of a collision, and that it must be deleted once it’s been used.

But will the GPS data really “only” be used for the noble purpose of accident reporting? Indeed, modern IT-enhanced cars collect much data, and police know about this.
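
As an aside on the mechanism above: here is a minimal sketch, in Python and not any vendor’s actual algorithm, of the kind of heuristic an automatic crash-notification system might apply, flagging a potential collision when the deceleration implied by successive speed samples exceeds a threshold. The threshold and sampling interval are illustrative assumptions.

```python
# Hypothetical crash-detection heuristic: flag a potential collision when
# the sample-to-sample deceleration exceeds a threshold (here, 4 g).
G = 9.81  # standard gravity, m/s^2

def looks_like_crash(speeds_mps, sample_interval_s, threshold_g=4.0):
    """Return True if any sample-to-sample deceleration exceeds threshold_g."""
    for v0, v1 in zip(speeds_mps, speeds_mps[1:]):
        decel_g = (v0 - v1) / sample_interval_s / G
        if decel_g > threshold_g:
            return True
    return False

# ~90 km/h (25 m/s) to a standstill within one 0.5 s sample is about 5.1 g:
print(looks_like_crash([25.0, 0.0], 0.5))  # True
```

A real system would fuse accelerometer, airbag, and GPS signals rather than rely on speed samples alone, but the privacy point stands either way: the trigger requires continuously sensing how the car is moving.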

In Vermont in 2015, a cyclist was killed by a car whose driver was allegedly intoxicated:

Scott said investigators obtained a search warrant for the Gonyeau car to download information from its computer. He said once the information from the car’s sensors can be reviewed, police will know more about the crash.

The investigation later concluded the cyclist was mostly to blame. In 2016, Canada’s CBC News reported:

From July 1 to Dec. 31 of last year, there were five fatal vehicle collisions in the parts of Halifax policed by the RCMP. Information from event data recorders was used in two of those investigations, according to an access to information request filed by CBC News.

CBC also noted the various implications:

  • Are the owners aware their cars collect this data?
  • Will the data only speak of things such as car accidents, and not other aspects of driver and passenger identity and behaviour?
  • Will the police only use the data for correct purposes?
  • Is the data actually correct?

It’s worth noting that a colleague of mine who spent a career in law enforcement in the US (a country with constitutional privacy protections for citizens) observed that it’s common practice for police to use illegal means to find out who’s guilty, and then use legal means to obtain evidence for court. It’s also worth noting that just because a computer allegedly measured something doesn’t mean it actually happened; “Things ‘on the witness stand’” [later in the book] will consider the legal implications further.

Your things may phone home

Law enforcement officials aren’t the only people your smart things may talk to.

In February 2013, John Broder wrote an unfavourable review of the Tesla Model S in the New York Times.

Broder was unhappy with the performance of the high-end electric car, and supported this conclusion with his firsthand observations of speeds, charge levels, and the like as he test-drove it. What’s interesting here from the IoT perspective is that the reviewer was not the only witness: the car itself was recording data about its experiences and sending this data back to Tesla.

Unhappy with the review, Tesla chair Elon Musk published a retort using the car’s logs to dispute what the reviewer claimed happened during this “most peculiar test drive.”

For example, one of the diagrams showed a speed-versus-distance graph, with annotations appearing to show that Broder’s claims that he had the “cruise control set at 54 miles per hour” and “limped along at about 45 mph” did not match recorded reality. A back-and-forth ensued, with no clear winner. (Tesla would not give me permission to republish any of these diagrams, but you can see them online.)

This pattern continued in 2016, with high-profile incidents of customers claiming their Teslas did something odd, and Tesla using its logs to claim otherwise. In the IoT, your things are also witnesses to what you witness, and they may see it differently.

Given the computationally intensive engineering challenges of high-tech and high-end cars such as Teslas, the fact that they log data and send it back home would appear reasonable.

The more data is collected, the more the engineers can analyze and tune both the design in general and that car in particular. Tesla is not alone in doing this.
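
For a sense of what “sending it back home” can look like, here is a minimal sketch; every field name and the endpoint URL are hypothetical, not any carmaker’s real API.

```python
# Hypothetical telemetry upload: batch a few readings and POST them to the
# vendor. The identifiers, fields, and endpoint are made up for illustration.
import json
import time
import urllib.request

record = {
    "vehicle_id": "EXAMPLE-0001",   # placeholder identifier
    "timestamp": time.time(),
    "odometer_km": 48213.7,
    "battery_pct": 62,
    "fault_codes": ["P0A80"],       # an example OBD-II hybrid-battery code
}

req = urllib.request.Request(
    "https://telemetry.example.com/v1/upload",  # fictitious endpoint
    data=json.dumps(record).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
# urllib.request.urlopen(req)  # left commented out: the endpoint is fictitious
```

Note how much of even this tiny record identifies the vehicle and, by extension, its owner; that is the privacy cost of the engineering benefit.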

One colleague reported that his BMW decided it needed servicing and told BMW, which then called him while he was driving. (The message was something like “Your car says it needs to be serviced right now.”)

Another colleague who handles IoT security for the company whose machines generate “half the world’s electricity” talks about the incredible utility of being able to instrument complex machines, send the data back home, and feed it into computerised models that the engineers trust more than physical observation.

However, in February 2015, Brian Krebs wrote about a family of IoT devices that appear to phone home for no reason at all:

Imagine buying an internet-enabled surveillance camera…only to find that it secretly and constantly phones home to a vast peer-to-peer (P2P) network run by the Chinese manufacturer of the hardware.

In fact, this is what newer IP cameras from Foscam were doing—which came to light when a user “noticed his IP camera was noisily and incessantly calling out to more than a dozen online hosts in almost as many countries.”

To make things even more interesting, the camera’s UI does let the user tick a box to opt out of P2P, but ticking it doesn’t actually change the camera’s behaviour. In this case, it’s harder to see a reasonable argument for the P2P network; Foscam claims it helps with maintenance.
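
The way such behaviour gets noticed is itself instructive: watch the device’s traffic from another machine on the network. Here is a minimal sketch using scapy (one choice among many packet sniffers; it must run with root privileges), with a placeholder camera address.

```python
# Log every distinct non-local host a LAN device contacts. Requires scapy
# (pip install scapy) and root privileges; the camera IP is a placeholder.
from scapy.all import IP, sniff

CAMERA_IP = "192.168.1.50"  # hypothetical address of the device under test
seen = set()

def log_destination(pkt):
    if IP in pkt and pkt[IP].src == CAMERA_IP:
        dst = pkt[IP].dst
        if not dst.startswith("192.168.") and dst not in seen:
            seen.add(dst)
            print("device contacted:", dst)

sniff(filter=f"host {CAMERA_IP}", prn=log_destination, store=False)
```

A device that keeps printing new foreign addresses while supposedly “opted out” of P2P is exactly the symptom the Foscam user reported.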

Your things may talk to the wrong people

In the preceding cases, IoT things shared data about their experiences in perhaps surprising ways – but at least they were sharing it in accordance with their design (e.g., to authorised law enforcement officers, or to the original vendor for maintenance and tuning).

However, a problem with exposing interfaces is that one may inadvertently provide these services to more parties than one intended. Unfortunately, this has already happened with IoT data collection.

GM brags that its OnStar system for collecting and transmitting car data has “been the Kleenex of telematics for a long time”.

In 2011, Volt owner Mike Rosack enjoyed tracking the telematics his phone received from his car so much that he reverse-engineered the protocol and set up the Volt Stats website, which enabled a broader population of Volt owners to share their telematics.

Unfortunately, doing this required owners to share their credentials with Volt Stats (the “lack of delegation” pattern from Chapter 4). GM decided this was an unacceptable privacy risk and shut down the API, but then provided an alternative that allowed Volt Stats data sharing to continue without this risk.

Unfortunately, in 2015, researcher Samy Kamkar found a way to surreptitiously capture owner credentials (the “easy exposure” pattern from Chapter 4). The resulting OwnStar tool allows unauthorised adversaries to usurp all owner rights.
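
The contrast between the broken and fixed designs is worth making concrete. Here is a minimal sketch of the delegation idea, with entirely illustrative names (this is not GM’s actual API): the vendor issues a scoped, revocable token instead of the owner handing over a password.

```python
# Scoped delegation tokens: a third party gets a revocable token limited to
# one capability, rather than the owner's password (which grants everything).
import secrets

TOKENS = {}  # token -> set of granted scopes

def issue_token(scopes):
    token = secrets.token_urlsafe(32)
    TOKENS[token] = set(scopes)
    return token

def authorized(token, scope):
    return scope in TOKENS.get(token, set())

def revoke(token):
    TOKENS.pop(token, None)

volt_stats_token = issue_token({"read:telematics"})
print(authorized(volt_stats_token, "read:telematics"))  # True
print(authorized(volt_stats_token, "unlock:doors"))     # False
revoke(volt_stats_token)  # the owner can cut off the third party at any time
```

With credential sharing, by contrast, the only way to “revoke” Volt Stats would have been to change the password everywhere it was used.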

In Australia, four shopping malls set up “smart parking” that used license plate readers to track when cars entered and left, and gave users the option of receiving text alerts when their parking time was close to expiring. However, the malls discontinued this service when it was noticed that anyone could request notification for any vehicle (the “no authentication” pattern from Chapter 4).

Chapter 1 discussed the Waze crowdsourcing traffic mapping application. Chapter 4 mentioned the “bad PKI” design pattern that has been surfacing in IoT applications. One place it has surfaced is in Waze: in 2016, scholars at UC Santa Barbara demonstrated that (due to flaws in checking certificates) they could intercept Waze’s encrypted SSL communications, and then introduce “thousands of ‘ghost drivers’ that can monitor the drivers around them – an exploit that could be used to track Waze users in real-time”.

Here, the service being usurped by the unauthorised party (“Where is driver X right now?”) was not really one of the intended services to begin with.
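
One standard defence against this kind of “bad PKI” failure is certificate pinning: in addition to normal TLS validation, the client checks the server’s certificate against a fingerprint baked in at build time. Here is a minimal sketch in Python; the host name and pinned fingerprint are placeholders, and this is not a description of how Waze later addressed the issue.

```python
# Certificate pinning sketch: fetch the server's certificate over TLS and
# compare its SHA-256 fingerprint with a value obtained out of band.
import hashlib
import socket
import ssl

# Placeholder: the real value would be computed from the server's known cert.
PINNED_SHA256 = "0" * 64

def cert_fingerprint(host, port=443):
    ctx = ssl.create_default_context()  # normal validation still happens
    with socket.create_connection((host, port)) as sock:
        with ctx.wrap_socket(sock, server_hostname=host) as tls:
            der_cert = tls.getpeercert(binary_form=True)
    return hashlib.sha256(der_cert).hexdigest()

if cert_fingerprint("api.example.com") != PINNED_SHA256:
    raise ssl.SSLError("fingerprint mismatch: possible interception")
```

Pinning narrows trust from “any certificate a CA will sign” to “this specific certificate”, which is precisely what the ghost-driver attack exploited the absence of.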

As an extreme case of unauthorised access to unintended services, researchers at SRI have been worrying about not just adversarial access to the internal IT of government automobiles, but even mere detection that a particular vehicle is passing by.

For a terrorist or assassin, the ability to build a roadside bomb that explodes when one particular vehicle goes by would be useful indeed. In this case, even the natural solution of “disable all electronic emanations” would not work, since the bomb could simply wait for the car that is suspiciously silent.
