
Big Data ethics: redefining values in the digital world

There is plenty of hype surrounding Big Data, and for good reason: it has the potential for exciting applications. However, it has also raised a number of important questions about how to collect and use all this data, which seems to keep a record of nearly everything we do – from tracking where we’ve been through phone GPS to monitoring personal habits through smart home devices. Even though we’ve always been surrounded by data (kept as paper records before computers), the volume of digital data being collected and stored has reached unprecedented levels. Over the last few years, the demand for Big Data analytics has increased. Big Data analytics has enormous power to reveal seemingly hidden patterns and provide insights that can even predict behavior. People are starting to ask questions, concerned about their privacy and how all this collected data is being used. As we enter this next phase of the digital revolution, it’s vital to consider the values we hold today regarding private information and think about how we will protect those core values without impeding innovation.

Currently, there are laws that protect personal information, but they should be revisited to consider how technology has changed the flow of personal data, its uses, and the decisions derived from that data. We’ll cover some examples of how privacy, confidentiality, identity, and transparency – long-held core values – should be re-examined in our new digital world.

Some would argue that privacy is dead in the digital age. However, we would liken it more to a shift in social norms. Today, it’s become more acceptable to share personal data in the digital space in exchange for convenience – for example, sharing your fingerprint to unlock your phone. It can be argued that the same legal terms that govern privacy – invasion into protected spaces, collection of information, use of information, and disclosure of information – are just as relevant, if not more so, today.

The law has considered privacy in these terms for a long time, and we have a number of laws that protect privacy. Currently, the “Fair Information Practice Principles” govern the collection, use, and disclosure of personal data. Data processors must disclose what they are doing with personal data, and people can opt out of uses of their data. While good in theory, this standard practice is problematic because people don’t have the time or skills to review and comprehend complex terms and conditions. There’s a need for more comprehensive laws that give people control over their personal information and compel those collecting the information to be transparent about their intended uses of that data.

Just because personal information has been shared once doesn’t mean permission has been given to share that information everywhere. Confidential information should be held in trust based on the context of a relationship – such as providing your Social Security number to a new employer or your health information to a physician. Today, information is often shared for the user’s convenience, such as when cell towers track your location to improve call quality. People share this information with the assumption that the receiving party won’t make additional uses of their personal data.

Secondary uses of personal data pose the greatest risk of unintended harm to people. If private information is shared without their knowledge or consent, people lose trust in the companies with whom they’ve shared their information. Historically, we have recognized that shared information can still be private and protected by law. There are laws that protect information shared between lawyers and their clients, as well as confidentiality agreements that prevent employees from revealing trade secrets. These laws are designed to protect information shared in confidence and in the context of a relationship. Expanding current privacy and confidentiality laws to Big Data applications could help protect private information and strengthen trust between people and the institutions collecting that data.

Transparency is key to maintaining trust between the institutions collecting data and the people who own that data. However, there is a lot of mystery around Big Data – people aren’t always aware of what personal information they’re sharing. The power of Big Data comes not from collection itself but from analyzing the collected data. Data brokers have been collecting personal data and sharing it in secondary ways, which has drawn scrutiny from the Federal Trade Commission (FTC). In February 2012, the FTC published a report asking Congress to give people more control over their information.

Companies like Google have made efforts to create more transparency around the personal data they possess. Google Dashboard lets users view and manage the personal data Google holds about them; a user can manage individual items or their entire Google Web History, gaining more control over their personal data. We should expect to see more instances of data collectors and brokers being transparent about how they collect, use, share, or sell data. By building transparency into best practices, institutions improve their accountability and build trust.

There’s concern that Big Data analytics will threaten individual identity. The right to identity comes from the freedom to define ourselves: “I am me; I am anonymous. I am here; I am there. I am watching; I am buying. I am a supporter; I am a critic. I am voting; I am abstaining. I am for; I am against. I like; I do not like.” Historically, we’ve valued the right and the freedom to identify who we are as individuals.

Big Data analytics can make predictions about consumer behavior and determine future behavior before individuals make decisions. “I am” and “I like” risk becoming “you are” and “you will like” – a person defined by Big Data rather than by their own sense of self. Big Data can provide more insights about a person than they know about themselves, which opens up the risk of people being pushed toward the choices data collectors want them to make rather than choices made freely. If we lack the power to say “I am” and our choices are filtered down to exclusively tailored recommendations based on how we’re identified, we lose our unique identities.

In order to protect our long-held core values around privacy, confidentiality, identity, and transparency, current laws should be revisited to consider how all this information is being collected, used, and analyzed. While Big Data offers many opportunities to make services more convenient, we should also ask, “at what cost?” It’s important to think about which kinds of Big Data predictions we want to be cautious of and what unintended consequences they may carry. Giving people more control over their personal information is a good place to start, but it’s also important to acknowledge that by giving too much authority to Big Data, we risk losing the right to our personal identity, which may be too heavy a price.