Big Data and You

January 27, 2014

I’m a businessman, so it is not difficult for me to see the beneficial side of analyzing Big Data. I’m also an individual, a family man, and a citizen. In those roles, it’s easy to understand how the unethical or questionable use of Big Data can be either creepy or harmful. In an interview with Forbes magazine, Kord Davis, co-author (with Doug Patterson) of a book entitled “Ethics of Big Data,” stated, “Big data itself, like all technology, is ethically neutral. The use of big data, however, is not. While the ethics involved are abstract concepts, they can have very real-world implications. The goal is to develop better ways and means to engage in intentional ethical inquiry to inform and align our actions with our values.” [“The Ethics Of Big Data,” 21 June 2012] How organizations collect, use, and safeguard data has been in the headlines a lot this past year.

Mike Wheatley believes that when large companies want to use Big Data analytics to enhance their business, they should ensure the data is used properly by hiring someone versed in ethics to advise them. [“How Ethical is Your Big Data?” Silicon Angle, 10 August 2013] He writes:

“If you’re a huge company dealing in Big Data gathered up from your consumers, you’ll probably want to pay attention to the ethics of its use, but how can you ensure this is done? One answer might be to hire a Big Data ethicist, an emerging and very specialized role which attempts to ensure any data gathering on your organization’s part doesn’t overstep the mark when it comes to your customer’s privacy. That’s if you can find one of course. Gartner analysts have predicted that by 2015, there will be 4.4 million IT jobs created globally to support Big Data. But unfortunately, though opportunities will be created, there might not be enough people to actually fill these positions. Right now, there just aren’t many specific classes you can take to become a Big Data scientist, let alone a Big Data ethicist. We might soon be facing a future wherein we have more data than our experts know what to do with, and if we get to that stage we could very well see angry consumers demanding that their data should not be used for any purpose whatsoever.”

Dan Darnell writes, “The benefits of e-commerce personalization are well known – increased conversion rates, larger order values, and more engaged customers.” [“The 7 deadly sins of personalization,” iMedia Connection, 19 June 2013] But, as his headline clearly states, there are some questionable practices that should be avoided. He calls these “personalization sins” and he insists they are routinely “committed by e-commerce companies both large and small.” Not all of Darnell’s seven sins involve transgressions against the customer; some are sins of implementation. Of the seven sins, only the first two deal specifically with ethics. They are:

  • Collecting personally identifiable information (PII) without permission – One of the fastest ways to alienate customers is to collect PII without permission. Luckily, with advances in big data, machine learning, and real-time analytics, PII is not needed to provide a personalized e-commerce experience to individual customers (a sketch of one PII-free approach follows this list). However, if a customer is willing to offer select PII by filling out a profile, you must use the data in an appropriate way.
  • Using customer data without permission – If a customer willingly provides you with PII, do not take it as an invitation to monetize that data and use it in any way you see fit. Don’t personalize an experience using consumer data without permission. For example, it would be creepy if someone I had never met came up to me on the street and started talking to me about my time in the Peace Corps. This same creepiness applies online. When determining where to draw the line, put yourself in the customer’s shoes and ask yourself, “Would I provide my PII in order to receive a product or service?” If the answer is no, don’t do it.

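To make that first sin concrete, below is a minimal sketch of personalization that never touches PII: shoppers are tracked only by an opaque, randomly generated session ID, and recommendations come from which products tend to be viewed together within those anonymous sessions. The class and method names are illustrative assumptions, not part of any particular e-commerce platform, and the logic is deliberately simplified.

```python
# A minimal sketch of PII-free personalization (illustrative, not a real API).
# Shoppers are identified only by an opaque session ID; recommendations are
# driven by which products are viewed together in anonymous sessions.
import uuid
from collections import defaultdict


class SessionRecommender:
    def __init__(self):
        # session ID -> ordered list of products viewed in that session
        self.session_views = defaultdict(list)
        # product -> {co-viewed product -> count}, built across all sessions
        self.co_views = defaultdict(lambda: defaultdict(int))

    def new_session(self) -> str:
        """Issue a random, opaque session ID; no personal data is stored."""
        return uuid.uuid4().hex

    def record_view(self, session_id: str, product: str) -> None:
        """Log a product view and update co-occurrence counts for the session."""
        for earlier in self.session_views[session_id]:
            if earlier != product:
                self.co_views[earlier][product] += 1
                self.co_views[product][earlier] += 1
        self.session_views[session_id].append(product)

    def recommend(self, session_id: str, k: int = 3) -> list:
        """Suggest unseen products most often co-viewed with this session's items."""
        seen = set(self.session_views[session_id])
        scores = defaultdict(int)
        for product in seen:
            for other, count in self.co_views[product].items():
                if other not in seen:
                    scores[other] += count
        return sorted(scores, key=scores.get, reverse=True)[:k]


# Example: a visitor gets relevant suggestions without ever being identified.
shop = SessionRecommender()
first = shop.new_session()
for item in ["tent", "sleeping-bag", "camp-stove"]:
    shop.record_view(first, item)

second = shop.new_session()   # a different, equally anonymous visitor
shop.record_view(second, "tent")
print(shop.recommend(second))  # e.g. ['sleeping-bag', 'camp-stove']
```

Nothing in this sketch ties a session to a person; as long as the session ID is discarded rather than joined to a profile, the experience stays personalized without ever crossing into PII.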

Admittedly, there are a number of analysts who now insist that real privacy no longer exists for anyone who uses a computer or mobile device. There are software programs that can readily identify who you are and where you’ve been, and predict where you are going. Sina Odugbemi writes, “Modern life is life on the grid: credit cards, smart phones, internet connections, social media presence and so on. And here is the truth: Life on the grid is life in a fishbowl erected on stilts in a bazaar. As a result, something that we once thought was important to us as citizens is not simply lost, it is irretrievably lost: it is the idea of privacy.” [“Privacy is so 20th Century,” People, Spaces, Deliberation, 27 June 2013] Even if that is true, it doesn’t mean that companies have carte blanche to act recklessly and unethically. “The onslaught of real-time social, local, mobile (SoLoMo) technology,” writes Brian Solis, “is nothing short of overwhelming. Besides the gadgets, apps, social networks and appliances that continue to emerge, the pace of innovation is only outdone by the volumes of data that each produce. Everything we share, everywhere we go, everything we say and everyone we follow or connect with, generates valuable information that can be used to improve consumer experiences and ultimately improve products and services.” [“The Human Algorithm: Redefining the Value of Data,” Social Media Today, 21 December 2012] All of that collected data inevitably erodes privacy. With the rise of wearable devices, the amount of data collected about us is only going to grow.

Wearable devices provide a great segue into the other side of the Big Data discussion, namely, data that individuals voluntarily post online as opposed to data gathered by organizations. You’ve probably read about people who have been turned down for jobs because of items they posted on Facebook or some comment they wrote on Twitter. So much personal data is voluntarily provided online that it’s easy to believe people no longer care about privacy. Research conducted by analysts at Coleman Parkes concluded that “almost two thirds of consumers aged between 18 and 34 ‘don’t care about privacy’, with 59 per cent of those aged between 35 and 44 equally unconcerned.” [“Consumers ‘don’t care about privacy’,” by Dawinderpal Sahota, Telecoms, 20 June 2013] The survey involved “3,900 consumers across 13 countries.” Jared Keller reports that younger people do care about privacy, just not in the same way that older groups do. [“Teens Care About Online Privacy—Just Not the Same Way You Do,” Pacific Standard, 22 May 2013] He explains:

“The latest round of research on teenagers and digital privacy is out, this time in the form of a joint study by the Pew Research Center and the Berkman Center for Internet & Society. The results of the study are similar to the results of past studies on youth and the Internet: teens are sharing more information about themselves. Interestingly, however, the report indicates that teens are also taking ‘a variety of technical and non-technical steps to manage the privacy of that information.’ … Today’s teenagers are, in the eyes of Pew, walking contradictions, increasingly open despite their understanding of privacy risks (and mastery of the tools needed to combat them). … Teens care about privacy in a social context, not a big data context. … Teens care less about data privacy and more about more socially oriented forms of privacy, those designed to protect the integrity of a community.”

More and more people are learning that a moment of online indiscretion can come back to haunt them. The latest revelation on this subject is that “more lending companies are mining Facebook, Twitter and other social-media data to help determine a borrower’s creditworthiness or identity, a trend that is raising concerns among consumer groups and regulators.” [“Borrowers Hit Social-Media Hurdles,” by Stephanie Armour, Wall Street Journal, 8 January 2014] The bottom line is that both providers and users of data have a stake in what personal data is available and how it is used. Unfortunately, I agree with Odugbemi that privacy (as we once thought about it) is irretrievably lost. Nevertheless, organizations that analyze Big Data have an obligation to use the data ethically and with a full understanding of the implications of such use. Enacting regulations may help (for the latest on that subject, read “White House Launches Big Data, Privacy Review,” by Elena Malykhina, Information Week, 24 January 2014); but regulators will never manage to stay ahead of technology. The best way to deal with unethical practices is to expose them. That’s where the technological savvy of up-and-coming generations will play its most constructive role in the ongoing conundrum surrounding privacy in the age of information.
