Think Bigger About Big Data
A White House panel has spent the past few months trying to make sense of the zettabytes of data that the information-industrial complex collects every day. There is no shame in its failure -- no one has yet made sense of all that data, and the panel asks all the right questions -- but its report illustrates how hard it will be for the government to respond to such a powerful new technology.
The panel notes, rightly, that big data is in many ways a modern blessing. Cities are using it to find potholes, prevent fire hazards, and concentrate police resources. The federal government is using it to improve climate models and detect espionage. Companies are using it to improve efficiency, streamline supply chains and increase productivity. Yet big data has also introduced some new and alarming conundrums.
Take those potholes. Last year, Boston hit upon a smart solution to a perennial urban problem: Release an app that uses mobile-phone accelerometers to detect and report divots in the road. It was a splendid idea, except for one thing -- the people most likely to download such an app were the city's young and tech-savvy, whose neighborhoods would have disproportionately benefited from city services. That yields a larger lesson that's going to become more urgent: Users of big data need to be alert to unintended consequences.
Big data can also be deliberately abused. The opaque methods many companies use to collect data and "score" individuals, the report says, could allow them to "digitally redline" unwanted groups in employment, insurance, credit offers and more. And the information is often so refined that predatory companies could easily seek out the vulnerable -- people identified as elderly, chronically in debt, unemployed, bereaved, alcoholic, recently divorced and so on.
Such discrimination is a manifestation of a deeper dilemma. Most Americans don't know very much about how their data is collected and used, and they can't easily opt out. And the idea that your data can be "anonymized" -- the cornerstone of many corporate privacy policies -- is effectively a myth. That means that the standard model for protecting consumers from abuse, called notice and consent (those impenetrable contracts you agree to without reading), will need to be reconsidered.
Many of the prescriptions offered in the White House report are sensible but unlikely to happen. It endorses the Consumer Privacy Bill of Rights, for example -- a well-meaning document that's unlikely to get anywhere. It suggests cracking down on the sleazier elements of the data-brokerage business, which no one should object to but many people will. It proposes giving consumers more power over how their data is used, which is way more complicated than it sounds. And it adds a few other inoffensive tweaks to current policy.
The problem is that these sensible suggestions aren't bold enough to keep pace with the power and speed of big data. (Although a parallel report by the president's science and technology council does a little better on this score.) A better approach might mean rethinking liability law to recognize that when consumers divulge data to technology companies, they're exchanging something of value, and thus should be entitled to civil recourse when that data is abused or stolen. It might mean wider use of ethical review boards, such as those that vet scientific experiments, to do cost-benefit analyses on big-data projects. Or it might mean advanced technological remedies -- the report mentions encryption and "perturbing" data to help anonymize it. It will probably require some combination of innovative approaches.
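The report doesn't spell out what "perturbing" data involves. As a purely illustrative sketch -- not anything the panel specifies -- one common approach is to add random Laplace noise to each record, as in differential privacy, so that aggregate patterns survive while any individual value is obscured. The function names here are hypothetical:

```python
import math
import random

def laplace_noise(scale):
    """Sample from a Laplace(0, scale) distribution via the inverse CDF."""
    u = random.random() - 0.5          # uniform in [-0.5, 0.5)
    return -scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

def perturb(values, scale=1.0):
    """Return a noisy copy of the data: each value is shifted by
    Laplace noise, so no single record can be read back exactly,
    while averages over many records remain roughly accurate."""
    return [v + laplace_noise(scale) for v in values]

# Example: perturb a small list of ages before release.
ages = [34, 67, 45, 23, 51]
print(perturb(ages, scale=2.0))  # noisy values, different on every run
```

Because the noise is zero-mean, statistics computed over a large perturbed dataset stay close to the true values even though each individual record is unreliable -- which is the trade-off such techniques aim for.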
In other words: Think bigger, because big data isn't going away. As algorithms parse that data more minutely, businesses exploit it more effectively, and sensors and software gather it more widely, it's going to define modern commerce as never before.
--Editors: Timothy Lavin, Michael Newman.
To contact the editor on this story:
David Shipley at firstname.lastname@example.org