In early 2018, a coffee shop called Shiru Cafe opened its doors near Brown University in the USA. But this was no ordinary cafe: only university staff and students could be customers. What’s more, students could get their coffee for free.
But, of course, there was a catch. To get their free hit of caffeine, students had to complete an online form, supplying their personal data, including name, phone numbers, e-mail, date of birth, study course, and professional interests. In return, they received targeted advertising on their phones, tablets, and even from the baristas.
This is just one example of the increasing value of personal data. Businesses have always wanted to know as much as possible about you, but now they have the technology to collect, analyse and use that information in useful, exciting, creepy and frightening ways.
On the positive side, they tailor their marketing to make it truly relevant to you. Amazon suggests books that ‘other customers who bought this book also bought’, Spotify recommends music based on your listening tastes, Woolies tells you which of your favourite products are on sale, and an online retail store shows you the perfect scarf to go with that jacket you bought last week. Who wouldn’t like that kind of recommendation? After all, it’s exactly what your spouse, mum or best friend would do.
The problem is, these big companies aren’t your friends and family. They may use your personal data to help you, but they also use it to make more money.
We scan our Woolies or Coles card at the checkout to earn frequent flyer points, and might even know these companies will send us targeted offers. But what if they used our data — combined with powerful artificial intelligence (AI) tools — to make creepy and intrusive predictions about us?
That happened in 2012, when Target in the USA sent a teenage girl a catalogue for baby products. Because of her buying patterns in the store, it ‘knew’ she was pregnant.
Even that’s at the low end of the scale when it comes to using personal data. Insurance companies use it to set premiums and assess claims. Employers use AI to analyse your workplace behaviour to assess whether you’re a ‘flight risk’ (likely to quit). Banks use it to assess mortgage applications and deny you a loan. And retailers can predict what you’re likely to buy based on what other people like you bought.
All of these services can be highly beneficial.
For example, Ubicar offers lower car insurance premiums if you download the Ubicar app and let it monitor your driving performance. The app monitors speeding, acceleration, braking, and even phone distraction, and adjusts your premium every month. That’s a valuable service that saves you money, but you’re also giving away mountains of useful personal data. To be clear: there’s no indication Ubicar is misusing this data, but the point is that as soon as you release this data, you can’t predict how it will be used in the future.
On November 1, 2019, Google announced it would pay $2.1 billion to buy fitness-tracking company Fitbit. That gives Google a big piece of the health and fitness technology pie, but more importantly, it also gets highly detailed information about millions of Fitbit users. Think: where they are, when they exercise, their heart rates, how they sleep, and much, much more. Google has already reassured Fitbit users they can delete their personal information if they wish. But few will, because for most of us a Fitbit provides a valuable service.
And that’s the real problem.
As much as we are shocked and outraged when we hear of hackers stealing data from a company’s customer database, the bigger issue is the amount of data we freely share for those companies to use, without understanding the consequences.
This doesn’t mean you should immediately throw away your iPhone and Fitbit, delete your browsing history, and go completely off the grid!
It’s not easy to do most of the things we do every day without sharing some private information. So, to some extent, we have to shrug and accept this loss of privacy.
But be more selective and protective with your personal data, don’t automatically assume a company has your best interests at heart, and make informed decisions about what you choose to share.
Editor’s note: this article was updated on December 4, 2019, to reflect the fact Shiru Cafe recently shuttered its doors.