Big Data – Big Danger or the Next Big Thing for Business?

Imagine a world where your company, however big or small it might be, could market directly to the customers who were likely to buy. Imagine the savings you could make on your marketing budget, imagine the increase in revenue you’d get from directing all your effort into activities that were likely to pay off. Imagine a world where safer drivers paid less for their car insurance, without having to subsidise more dangerous ones. Imagine being able to predict whether your clients will be good or bad debtors by logging in to a database that contains no financial information. Now stop imagining and start believing: that world is already here.

In the words of Alex “Sandy” Pentland, Professor of Media Arts and Sciences at MIT and one of the ‘fathers’ of big data research:

“It’s not about the things you post on Facebook, and it’s not about your searches on Google, which is what most people think about…This sort of Big Data comes from things like location data off of your cell phone or credit card, it’s the little data breadcrumbs that you leave behind you as you move around in the world.”

The premise is this: it’s not what you say about yourself that reveals the kind of person you are, it’s what you actually do. It’s the people you talk to, the things you spend your money on, how often you exercise, how often you visit the doctor or drive your car, and how fast. All this information is now ‘out there’, in the form of smartphone location data, credit card information, apps like Fitbit [link to http://www.fitbit.com], a nifty little device called Snapshot [link to http://www.progressive.com/auto/snapshot/] that plugs directly into your car and, in the near future, driverless cars themselves. Google Glass, Samsung Gear and similar ‘wearable tech’ open up further new frontiers of data collection by offering a ‘heads-up’ display which, in conjunction with the facial recognition technology that Facebook already employs, could provide detailed information about who you speak to, what you say and for how long, or could give you accurate time-and-motion information about your employees.

Large corporations like BP, Caterpillar and storage retailer The Container Store are already leading the way in the use of wearable technology for their staff, using devices such as Fitbit and Theatro to encourage improvements in staff health and activity and to monitor performance on the job. Virgin Atlantic, too, has trialled Google Glass in a customer service setting. Smaller companies are also getting in on the act: Capriotti’s, a sandwich bar in Los Angeles, uses Google Glass to provide first-person training videos and to monitor service during the lunchtime rush.

All this is very exciting and could lead to better productivity for businesses and more relevant products and advertising for consumers, but at what cost? One of the key concerns raised by opponents of the commercial use of Big Data and wearable tech is, naturally, that it constitutes a massive invasion of privacy. Consumers currently have no real choice about the way in which their ‘breadcrumbs’ of data are used, nor, even, about what is actually shared. An algorithm that, for example, relates how often you drive and how hard you brake to your potential credit risk (one such algorithm is currently in development) might well result not just in better outcomes for creditors, but also in baseless discrimination. The risk, as Al Jazeera’s Michael Keller and Josh Neufeld put it [link to http://projects.aljazeera.com/2014/terms-of-service/#1], is that in a world where the use of Big Data is the norm, people who don’t know you, and computers, get to ‘join the dots’, connecting those breadcrumbs of data to build up a picture of who you are, and that picture may be totally false.

Any company that collects user data, Apple, Google, Facebook and Samsung included, must by law require its customers to accept a privacy policy which sets out broadly how that data will be used, and you might argue that this offers some form of protection. It’s worth bearing in mind, however, that:

1) The statements are usually long and complicated – very few people actually read them.

2) They usually refer only to vague risks and benefits to the consumer, such as “improving the user experience” or “information will be shared with carefully chosen third parties”.

3) Accepting the privacy policy is the only way to access the product or service in question (an iOS update, a Facebook account, etc.).

4) The “Unravelling Theory” (Scott Peppet, University of Colorado) highlights the peer pressure that can surround the sharing of information: if all your friends arrange to meet each other on Facebook and you don’t have an account, it’s an annoyance for them and difficult for you to take a stand, even if deep down you feel uncomfortable about giving away your data.

Employees, too, typically have no choice at all about whether to accept the use of wearable tech in their workplace (other than quitting completely). Their data is shared not only with their employers, who can monitor their break times and time spent ‘idle’, but also with the company that designed the technology in the first place. They may even receive pushed advertising to their personal Facebook or email account based on the data collected. Furthermore, according to Harvard Business School’s Ethan Bernstein [link to http://www.hbs.edu/faculty/Pages/item.aspx?num=43639], increasing observation of a workforce may actually decrease productivity and innovation, in what has become known as the ‘transparency paradox’.

On the one hand, Big Data represents an unprecedented opportunity, not just for large multinational corporations but for SMEs as well. Aside from applications in workforce management and marketing, it can inform product design (Progressive’s car insurance offering, for instance) and even open up entirely new marketplaces, something which Facebook has grasped, it seems, very well indeed. On the other hand, the growth of Big Data is causing considerable unease, not just amongst civil rights activists [link to http://stopthecyborgs.org] and the “looney left” [link to http://www.libcom.org/library/moral-data-inc] but amongst serious academics too.

This is only the beginning. As the technology develops, there are likely to be more socially orientated applications targeted at improving health or decreasing unemployment, an approach which, according to Sandy Pentland [link to http://edge.org/conversation/reinventing-society-in-the-wake-of-big-data], could easily extend into attempts to predict and control, for example, political revolutions. Business typically sees itself as amoral, concerned only with good business, but it’s worth taking a moment to consider the wider implications of this kind of ‘Promethean fire’.

Perhaps, in the end, Big Data is controversial because it raises deep questions about our identity. What these emerging technologies are teaching us is that we are far from being the irrational, complicated individuals we’ve come to think we are: large swathes of our behaviour are highly predictable. If we get used to being ‘algorithmatised’ and seen as very small cogs in very big machines, what will that do to our status as free-thinking people? As the entrepreneurs of the future, it will likely fall to us to determine the direction this technology takes over the next few decades, with all that that entails.

Jun 8, 2015