A Math Nerd Wants to Stop the Big Data Monster

Financial reformer Cathy O’Neil learned the hard way that algorithms can produce terrible consequences.
Illustration: Jay Daniel Wright

The decision to leave her job as a tenure-track math professor at Barnard College and join hedge fund D.E. Shaw in 2007 seemed like a no-brainer. Cathy O’Neil would apply her math skills to the financial markets and make three times the pay. What could go wrong?

Less than a year later, subprime mortgages imploded, the financial crisis set in, and so-called math wizards were targets for blame. “The housing crisis, the collapse of major financial institutions, the rise of unemployment—all that had been aided and abetted by mathematicians wielding magic formulas,” she writes in Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy (Crown, $26).

The book chronicles O’Neil’s odyssey from math-loving nerd clutching a Rubik’s Cube to Occupy Wall Streeter pushing for banking reform; along the way, she learns how algorithms—models used by governments, schools, and companies to find patterns in data—can produce nasty, or at least unintended, consequences (the WMDs of her title).

Her first move to rehabilitate math’s image came in 2009. Dismayed by the banking fiasco, O’Neil quit Shaw—the company traded in a wide range of securities—and vowed to fix the system. She took a job with RiskMetrics Group, which analyzes risk for banks, but soon discovered that clients had little interest in hearing about findings that cautioned against risky practices. By 2011 she’d left finance for good.

O’Neil’s next step was a tad curious. She rebranded herself as a data scientist and joined a startup that built models to predict consumer behavior on travel websites. Belatedly, she realized what should have been obvious from the beginning: Finance and e-commerce companies “replace people with data trails, turning them into more effective shoppers, voters, or workers to optimize some objective.” About the same time, O’Neil started a blog called MathBabe. She wanted to spread the word about WMDs and stop the use of sloppy statistics and biased models that tended to hurt the poorest people most, whether those stats were used to sentence prisoners or target consumers with predatory ads.

When Occupy Wall Street began in late 2011, she heard interviews with protesters who were often ignorant of basic financial concepts. So she decided to join them. “Soon I was facilitating weekly meetings of the Alternative Banking Group at Columbia University, where we discussed financial reform,” she writes. That’s the sole description of her work with the movement, and it’s a lost opportunity: a person with her background would have had unique insight into the group’s inner dynamics.

Some of the “math destruction” O’Neil writes about has been reported elsewhere, but it still provides ammunition for her argument. She describes companies using ZIP codes as a proxy for creditworthiness, which leads to predatory lending and hiring discrimination. And she discusses the model Starbucks created to staff stores that left employees with erratic schedules, causing child-care nightmares and sleep deprivation.

Other examples are more surprising, if only because they illustrate how small decisions can have huge consequences. She says the U.S. News & World Report college rankings have contributed to skyrocketing tuition because of a basic design flaw. The magazine came up with 15 proxies, including SAT scores, student-teacher ratios, and acceptance rates, that seemed to correspond with a successful institution. One factor U.S. News didn’t consider was cost. If it had, administrators might have been more concerned with fees, and a lot of us might be in a lot less debt.

Sometimes O’Neil’s comments on corporations are not as nuanced as you’d expect from a math-prof-turned-hedge-funder. “The model is optimized for efficiency and profitability, not for justice or the good of the ‘team.’ This is, of course, the nature of capitalism,” she writes. Yes, but there are responsible companies and others that behave badly. Still, the book does raise awareness of the need for vigilance among data scientists. The choices they make in constructing models, she says, shouldn’t just be about logistics, profit, and maximizing efficiency. They need to be about fairness, too.
