Impress the Algorithm. Get $250,000
Ashley Carroll has a go-to story that epitomizes how grim things are for female entrepreneurs trying to raise money in Silicon Valley. Earlier this year, Carroll, a partner at the $2.5 billion venture firm Social Capital, was coaching a female startup founder ahead of a meeting with a potential investor. Let’s call the founder Jane Disruptsky and the investor John Ventureman. The pitch session seemed to go well, but afterward, while Disruptsky was waiting to hear whether the fund would back her, Carroll received a screenshot of a private text-message conversation someone else had had with Ventureman.
Ventureman began by praising Disruptsky’s startup on its merits. The gist, Carroll recalls, was: “Great business. I love everything.” Still, he was passing on the deal. The rest of the text chain read like an assessment of a blind date: “She didn’t seem warm,” for example, or, “She was just all about the business.”
Carroll laughs as she recalls the thread, without seeming particularly amused. “I was just like, ‘Says no one about a male entrepreneur, ever.’ ”
Early-stage tech companies raise money through a stubbornly analog process—an irony for an industry based on the conviction that computers will upend every aspect of human existence. Founders vie for personal introductions to venture capitalists, who in turn make decisions based on nebulous criteria. It’s widely understood that the decisions rely heavily on gut instinct; indeed, VCs tout their intuition when they’re out raising money from investors for their funds.
In a system like this, the people who cash the checks tend to look a lot like the people who write them. According to the National Venture Capital Association, 89 percent of partners at venture firms are male. And in 2017, data compiled by PitchBook Data Inc. showed that the industry poured $68.2 billion into companies founded by men, compared with just $1.9 billion for startups with solely female founders. Specific statistics on ethnic background are harder to come by, but the overwhelming majority of venture partners are white, and there’s little disagreement that the pool of entrepreneurs they fund isn’t very diverse, either. One obvious way to offset the clubbiness endemic to Silicon Valley’s investor class would be to replace the “guts” of white men with those of women and people of color. But another option—one that fascinates Carroll—is to eliminate gut instinct altogether.
Carroll, 35, isn’t quite an outsider. She has three degrees from Stanford and a career that includes stints at Amazon.com Inc. and two unicorn-tier startups. She joined Social Capital’s Palo Alto office in 2015, and last year she began building an automated system that would allow the fund to invest in startups that its partners had never met. The companies would upload data about themselves; if the algorithms liked what they saw, the venture fund would back them. The process, in theory, would keep bias from entering the equation. Within the firm, the system is known as Capital as a Service, or CaaS for short.
Similar tactics have brought promising results in other competitive fields. The most famous example comes from the 1970s, when five major orchestras began requiring musicians to stand behind a screen while auditioning. According to a study by researchers at Harvard, the proportion of women performing in those orchestras increased more than threefold from 1970 to 1993. Technologists have lauded automated decision-making as a way to further reduce human fallibility. But the optimism around supposedly objective algorithms has been challenged in recent years by evidence that some automated systems amplify bias because they’re trained on data reflecting historical inequities.
Social Capital kicked off a trial of Carroll’s model last year with a referral program of sorts—the fund asked other venture capital firms to direct promising early-stage companies to apply through the system. Most of the hopefuls came from outside the standard VC stomping grounds of the Bay Area and New York, and many were based overseas. The fund has since assessed 5,000 startups and invested in 60. Eighteen of the companies are run by women, and about 80 percent have nonwhite founders. The checks Social Capital is writing are small by its standards, from $50,000 to $250,000. But the firm plans to throw open the doors to anyone with a company and a few spreadsheets of operational data by the spring of 2019. Its official goal is to make 1,000 investments in 2018 and 10,000 next year. Chamath Palihapitiya, Social Capital’s chief executive, admits those levels are unattainable—that many investments would quickly add up to billions of dollars—and says they’re more of a message to the people building CaaS that the firm is serious. “It forces them to build something that can be mission-grade,” he says. “If I said, ‘Fund 10 companies,’ they could manually jury-rig some system. When I say 10,000, you can’t.”
Palihapitiya has a penchant for dramatic claims—some of which pan out. The snappily dressed former Facebook Inc. executive, who co-founded Social Capital in 2011, made early bets on Bitcoin, started a hedge fund, and raised $600 million to help startups go public through an unconventional vehicle known as a SPAC, or special purpose acquisition company. (The startups have yet to be chosen.)
Not everyone has wanted to come along on Palihapitiya’s adventures. His two co-founders, Mamoon Hamid and Ted Maidenberg, left the firm last year. But to Palihapitiya, the more he breaks the mold, the better. He describes the job of a venture capitalist with some disdain. “This isn’t meant to be pejorative, but you’re a classic middleman,” he says. In Palihapitiya’s view, the fate of middlemen is to reap an enormous profit until the market inevitably finds a way around them. “Venture capital,” he says, “will be no different.”
There’s an economic logic behind CaaS. Evaluating and backing startups is a labor-intensive process, significantly limiting the number of deals a firm can do. By using software to assess tens of thousands of companies annually, a firm can do each individual deal more cheaply. That makes modest successes worth its time, freeing it from the need to bet only on companies that could be worth billions. It also eliminates the personal blind spots of its partners.
Palihapitiya doesn’t mind positioning himself as enlightened—the name of his firm is Social Capital, after all. But he insists he’s making a profit-maximizing move here. When asked how CaaS might help underrepresented groups, he winces, downplaying diversity as simply a “positive byproduct.” He adds, “Look, at the end of the day, our job is to get the most capable people to the starting line. Let’s say [that with] CaaS, it turned out it was all white dudes who got funded. That’d be OK, too.”
The initiative dovetails with work Palihapitiya did at Facebook, where he ran the growth team during a period of eye-popping expansion. His outfit was responsible for an approach that’s now seen as a core aspect of Facebook’s culture: shunning intuitive decision-making in favor of quantitative measurement, then relentlessly choosing whatever option drives the most engagement. Several of Palihapitiya’s Facebook colleagues have taken on key roles at Social Capital.
From the start, the firm sought to mathematically isolate the objective factors responsible for a startup’s success. Investors in publicly traded companies do this by poring over financial statements. But such data are often not available for startups, and when they are, they can be useless. At the stage when a company is seeking venture capital, it’s often intentionally bleeding money to gain a market foothold. Recognizing this, the Facebook veterans began building models that primarily compared how startups attract and retain users.
In 2015 one of Social Capital’s partners visited the offices of SurveyMonkey Inc., the online poll service, and learned that one of its former employees had designed a series of impressively elegant experiments to determine product pricing in different countries. The ex-employee was Carroll, and Social Capital quickly recruited her. As a numbers-first type, she clicked with the firm’s data scientists. Even Carroll’s primary personal activity, distance running, is basically an exercise in obsessive performance measurement and squeezing out tiny improvements in efficiency. At her peak, Carroll ran a marathon in under 2 hours and 45 minutes, fast enough to qualify her for the 2011 Olympic Trials. (She finished in the middle of the pack.)
Carroll became a partner at Social Capital in 2015 and settled into the standard investor’s life: visiting founders, sitting on a handful of boards, leading three investments. One way she evaluated companies was with a tool Social Capital’s analysts had built, called the Magic 8-Ball. The model examined common metrics such as user growth and engagement, as well as more bespoke measures such as “quality of revenue,” which gives extra weight to money from loyal customers. Carroll thought the firm wasn’t making the most of the Magic 8-Ball’s power, as it was implementing the tool relatively late in a process vulnerable to traditional biases. “It was used as validation,” she says. “One of us investors would get an intro, and then go have coffee, and then judge them ourselves, and then have another partner meet them, and then after all those steps do the Magic 8-Ball thing.”
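A model of this kind can be illustrated with a toy sketch. The snippet below is not Social Capital’s actual Magic 8-Ball; the field names, weights, and loyalty cutoff are all hypothetical. It simply shows the idea the article describes: combining standard metrics with a “quality of revenue” figure that counts money from long-tenured customers more heavily than money from new ones.

```python
# Toy illustration of a Magic 8-Ball-style score. All weights and field
# names are hypothetical, invented for this sketch.

def quality_of_revenue(revenue_by_customer, months_active_by_customer,
                       loyalty_threshold=12):
    """Weight each customer's revenue by loyalty: customers active for at
    least `loyalty_threshold` months count fully, newer ones at half."""
    total = 0.0
    for customer, revenue in revenue_by_customer.items():
        months = months_active_by_customer.get(customer, 0)
        weight = 1.0 if months >= loyalty_threshold else 0.5
        total += weight * revenue
    return total

def score_startup(metrics):
    """Blend common metrics (growth, engagement) with the bespoke
    quality-of-revenue figure into one score. Weights are invented."""
    qor = quality_of_revenue(metrics["revenue_by_customer"],
                             metrics["months_active_by_customer"])
    qor_share = qor / max(metrics["total_revenue"], 1.0)
    return (0.4 * metrics["monthly_user_growth"]
            + 0.3 * metrics["engagement_rate"]
            + 0.3 * qor_share)

example = {
    "monthly_user_growth": 0.15,   # 15% month-over-month
    "engagement_rate": 0.4,        # e.g. daily/monthly active users
    "total_revenue": 300.0,
    "revenue_by_customer": {"a": 200.0, "b": 100.0},
    "months_active_by_customer": {"a": 24, "b": 3},
}
print(round(score_startup(example), 2))  # → 0.43
```

In this sketch, the startup whose revenue comes mostly from a two-year-old customer scores better than one with the same top-line revenue from churning new signups, which is the intuition behind weighting revenue by loyalty.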
Last spring, Carroll spent two weekends throwing together a prototype that linked the Magic 8-Ball to a single online form that companies could fill out on their own. It was a kludgy program involving two spreadsheet applications, a Google bot, and a service called If This Then That that creates rudimentary scripts. “It was this totally duct-taped-together product I was pretty ashamed of,” Carroll says. Still, she sent the link to a handful of investors Social Capital had worked with, asking them to share it with companies that might be interested in having their businesses evaluated by software rather than people. She was immediately deluged with applications.
It would be an exaggeration to say CaaS is making purely automated investments. Entrepreneurs apply by going to Social Capital’s website, where they’re greeted by the motto “Raise capital based on the merits of your business, not your network.” A form asks them to choose their business model from a drop-down menu. Then they must disclose information such as how much they spend to acquire users, each customer’s lifetime value, and how much cash they have on hand. Finally, they upload spreadsheets detailing their revenue and engagement metrics.
About half the time, companies get through the process unassisted, and Carroll receives an automated email analyzing their promise. The rest of the time, companies will get in touch to argue that they don’t fit neatly into the provided categories or that their data is formatted incorrectly. Then personal attention is required. In cases where the algorithm recommends funding, Social Capital still has humans do things such as legal vetting and other back-office tasks.
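The triage flow just described can be sketched in a few lines. This is an illustrative approximation, not CaaS itself: the business-model categories, required fields, and the LTV-to-CAC funding rule are all assumptions made for the example. It shows the split the article reports, where well-formed applications get an automated verdict and everything else is routed to a person.

```python
# Hypothetical CaaS-style intake triage. Categories, field names, and
# the decision threshold are invented for this sketch.

BUSINESS_MODELS = {"saas", "marketplace", "consumer", "ecommerce"}

def triage_application(app):
    """Return ("auto", analysis) when the data is well-formed, or
    ("human", reason) when a person needs to step in."""
    if app.get("business_model") not in BUSINESS_MODELS:
        return ("human", "business model doesn't fit the provided categories")
    required = ["customer_acquisition_cost", "lifetime_value", "cash_on_hand"]
    missing = [field for field in required if field not in app]
    if missing:
        return ("human", f"missing or misformatted data: {missing}")
    # Invented rule of thumb: fund only if each customer is worth several
    # times what it costs to acquire them.
    ltv_to_cac = app["lifetime_value"] / app["customer_acquisition_cost"]
    verdict = "recommend funding" if ltv_to_cac >= 3.0 else "pass"
    return ("auto", f"LTV/CAC = {ltv_to_cac:.1f}: {verdict}")

route, detail = triage_application({
    "business_model": "saas",
    "customer_acquisition_cost": 50.0,
    "lifetime_value": 200.0,
    "cash_on_hand": 1_000_000,
})
print(route, "-", detail)  # → auto - LTV/CAC = 4.0: recommend funding
```

An application with an unlisted business model or missing fields would come back tagged "human", mirroring the roughly half of CaaS applicants who, per the article, end up needing personal attention.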
Social Capital has three people working full time to build out CaaS. About a half dozen employees show up at a recent progress meeting in the glass cube that serves as the main conference room in the firm’s Palo Alto office. Someone rolls up a whiteboard to take notes. One employee begins a discussion on a seemingly metaphysical question: “When does a company truly exist?” (Knowing this helps to determine when to apply growth models.) Then the group puzzles over a company that raised a $4 million round from a rival firm shortly after the CaaS system had determined it was unfit for investment.
The idea of automated seed-stage investing isn’t entirely new. Many firms claim to use data-centric analysis, and a handful have adopted strategies based on making lots of small investments without face-to-face meetings. Dave Lambert, who for the last eight years has run Right Side Capital Management LLC, a small San Francisco-based fund, uses a largely automated system to review thousands of companies annually. He’s written about 2,000 checks in amounts up to $100,000. Unlike Social Capital, Lambert doesn’t claim to gain an edge through superior algorithms. “The hardest part is the psychology,” he says. “It’s so easy to make exceptions.”
Neither Palihapitiya nor Carroll aspires to cut humans completely out of the loop. Palihapitiya sees his firm’s overall operations as akin to those of a bank, which can process a credit card application with software but needs people to underwrite million-dollar mortgages. By handing out thousands and thousands of credit cards and small loans, these companies have been able to improve their own risk models. As they’re more confident about who’s likely to be good for the money, banks and credit card networks can make automated decisions on higher and higher dollar amounts. Palihapitiya thinks the same thing will happen at Social Capital. He says its computers could one day be writing multimillion-dollar checks. “We’re Diners Club in the 1950s,” he says, referring to the original credit card network, “and aspiring to be American Express in 2018.”
In November, Palihapitiya caused a stir when he told the audience at a Stanford Graduate School of Business event that he felt “tremendous guilt” about his time at Facebook. “The short-term, dopamine-driven feedback loops that we have created are destroying how society works,” he said.
Repentance aside, Palihapitiya’s experience at Facebook clearly continues to influence Social Capital. He marvels that the dominant global technology platforms all arrived at their current positions by gathering enormous amounts of data and using it to predict people’s behavior. Palihapitiya figures that if it works for social network users and online shoppers, it should work for startups, too.
Social Capital’s staff acknowledges that Facebook’s comeuppance is a cautionary tale about unintended consequences. “Certainly putting an application form out in the wild that says, ‘Enter data, get money!’ has all sorts of negative possibilities,” says Jonathan Hsu, Social Capital’s head of quantitative investing.
Among those possibilities is that automated investment decisions could actually aggravate the inequities of today’s venture capital system. Automated tools developed for criminal sentencing and policing, for instance, have given old wrongs new life by using flawed data to create their models, resulting in a propensity to overstate the dangerousness of black people. “There’s clearly a growing awareness of these problems in the technical community,” says Solon Barocas, an assistant professor at Cornell who studies ethical and policy issues related to artificial intelligence.
In all automated systems, the challenge in dealing with bias is that it’s hopelessly entangled with other factors. At Social Capital, that problem has manifested primarily in an internal debate over whether to try to model not only which businesses will succeed but which entrepreneurs will. Ray Ko, a partner who leads the firm’s tech development, is intrigued by the idea of pulling in information about founders that could serve as proxies for technical expertise, such as their histories in coding communities like GitHub Inc. and Stack Overflow, and seeing if that correlates with success.
The CaaS form already asks applicants for some personal information, such as LinkedIn profiles and educational background. Carroll is resistant to the idea of using such data to glean insights about businesses. Because well-educated white men have the easiest time raising money today, any model using demographics to predict success would favor them—the opposite of her intention. Still, Social Capital is experimenting with building personalized models anyway, though it hasn’t implemented any yet. Despite its high-minded name, the fund’s overriding objective is to achieve the biggest return on its portfolio companies. A more diverse set of CEOs atop those startups would be ideal. But for now—to use Palihapitiya’s words—they’re just a positive byproduct.