The Big Risks of Big Data Mining

In Big Data, by Shelly Kramer

“Every step you take, I’ll be watching you” – when Sting wrote those lyrics back in the ’80s he most certainly wasn’t thinking about digital data collection. But whether we realize it or not, every digital step we take is indeed being watched—with the resulting data providing a frightening wealth of information about our lives. Those of us who work in the space know: Every card transaction, every website visited, every online social interaction, even our movements and exact location are routinely collected and analyzed to build up a picture of our habits and preferences. While the insights that the data provides can bring benefits for the consumer and for marketers, the mining of big data also poses risks that business leaders would be foolish to ignore.

The collection of so much data has the very real potential to ignite new privacy and ethical firestorms that corporations haven’t needed to pay too much attention to in the past.

In a 2014 report called Big Data’s Big Meaning for Marketing, Forrester highlighted three main areas of risk that businesses should be aware of:

Personal data protection: Existing methods of protecting the identity of individuals may no longer be sufficient in the era of big data. Forrester cited the example of Netflix, which was sued for releasing data after researchers at the University of Texas were able to positively identify individuals from supposedly “anonymous” movie ratings.

Financial liabilities: The full extent of any financial liabilities for big data practices is unknown and, at present, unquantifiable (italics mine). Lawsuits against organizations that have data breaches or are perceived to be misusing data are just beginning. Those who collect and use data need to be aware of relevant legislation and the potential for increased costs if they get it wrong.

Ethical dilemmas: New ethical dilemmas are being created by the analysis of big data. Just because something can be predicted, should that information be used, or that prediction acted upon?

Take, for instance, this example of how healthcare providers are mining data to predict our health needs, then judge for yourself where the ethical boundaries should lie.

Bloomberg Business reported last year that Carolinas HealthCare System, operator of more than 900 care centers, began purchasing data to identify high-risk patients. Why? So they could intervene in an attempt to prevent potential health problems from developing. My alarm bells are already going off!

The data is collected from credit card purchases, store loyalty programs, and other public records. In theory, medical practitioners can learn more about their patients—and their patients’ lifestyles—from their shopping habits than from brief, or sometimes non-existent, consultations. Although the data doesn’t yet identify individual purchases, it does provide a risk score doctors can use to highlight potential problems. Anyone remember the story from a couple of years ago about the dad who discovered his teen daughter was pregnant because Target mined her purchase data and sent her ads for baby products? This stuff, and the capabilities that brands have when it comes to data mining, isn’t new.

While some patients might welcome a proactive approach like the Carolinas HealthCare example cited above, there will be many (me included) who will see it as an invasion of their privacy. Some health advocates also fear an erosion of the patient-doctor relationship if medical professionals begin to interrogate apparently healthy patients about the consequences of their perceived lifestyle choices. Then, when you extrapolate from that and factor in insurance companies having access to this kind of data, you add a whole different level of creepiness to the equation. Because, of course, insurance companies always have our best interests at heart, right? It’s not a giant leap to envision a future where people are turned away from health insurers based solely on risk scores developed from this type of data mining. It’s definitely a slippery slope.

The above highlights some of the ethical and privacy issues facing organizations that leverage big data. The Forrester report suggests that, as well as tackling the technological aspects of handling big data, there is a real need to build data governance competency into those systems—and that’s something you want your IT team or managed service provider to be paying close attention to. In essence, what you want is to have “data stewards” working alongside the “data engineers” who build the big data practices for the organization as a whole. Human error could make for a massive lawsuit.

This, I think, is where a recent addition to the C-Suite, the Chief Data Officer (CDO), needs to come to the fore. This emerging executive position is something that my business partner Daniel Newman has recently been writing about at Forbes. Dan sees the CDO playing an increasingly pivotal role, bridging the gap between the sometimes conflicting responsibilities of the CMO and the CIO. I agree, but I also think a key responsibility of the CDO will be to make sure data collection practices stay on the right side of the ethical and legislative line.

That way the potential for financial loss, as well as the loss of reputation and trust, can be mitigated as data collection strategies are developed.

For those of us in the IT and big data space, it’s easy to see both sides of this coin: The potential benefits from the collection and analysis of data, as well as the potential privacy and security issues. What are the biggest challenges you and your team face when it comes to data collection and what steps do you see as important to ensure privacy and security for your customers? I’d love to hear your thoughts on this.

Other Resources on this Topic:

Big Data’s Big Meaning For Marketing
The Incredible Potential and Dangers of Data Mining Health Records
6 Ways Big Data Will Shape Online Marketing in 2015
How Companies are Mining Data to Mitigate Risks

Photo Credit: Jim Kaskade via Compfight cc

This post was brought to you by IBM for MSPs and opinions are my own. To read more on this topic, visit IBM’s PivotPoint. Dedicated to providing valuable insight from industry thought leaders, PivotPoint offers expertise to help you develop, differentiate and scale your business.

Originally seen on V3 Broadsuite Blog 


Shelly Kramer is a Principal Analyst and Founding Partner at Futurum Research. A serial entrepreneur with a technology centric focus, she has worked alongside some of the world’s largest brands to embrace disruption and spur innovation, understand and address the realities of the connected customer, and help navigate the process of digital transformation. She brings 20 years' experience as a brand strategist to her work at Futurum, and has deep experience helping global companies with marketing challenges, GTM strategies, messaging development, and driving strategy and digital transformation for B2B brands across multiple verticals. Shelly's coverage areas include Collaboration/CX/SaaS, platforms, ESG, and Cybersecurity, as well as topics and trends related to the Future of Work, the transformation of the workplace and how people and technology are driving that transformation. A transplanted New Yorker, she has learned to love life in the Midwest, and has firsthand experience that some of the most innovative minds and most successful companies in the world also happen to live in “flyover country.”
