Is Wikipedia Woke?
The ubiquitous reference site tries to expand its editor ranks beyond the Comic Con set.
Bisi Adeleye-Fayemi is a 53-year-old British-Nigerian human-rights activist and, it’s fair to say, a person of some note. She’s co-founder of the African Women’s Development Fund, a nonprofit dedicated to promoting women’s rights throughout the continent. It’s not a huge charity, having distributed $26 million since 2001, but it does important work. In 2013, Leymah Gbowee, a Nobel Peace Prize laureate, credited the AWDF with helping to end the Liberian civil war. But until recently, Adeleye-Fayemi didn’t exist on Wikipedia, which meant that as far as many people were concerned, she didn’t exist at all.
The online encyclopedia, founded in 2001 and now published in 295 languages, includes about 40 million articles, all of them free. The site is a source of first resort for students writing term papers, for anyone who makes a bar bet, and—though they’ll deny it to their graves—for Bloomberg Businessweek writers who want to double-check how to spell the names of, say, recent Nobel Peace Prize winners. “It’s a first draft of history,” says Craig Newmark, founder of Craigslist and a longtime donor to Wikipedia who gave $1 million to the website’s new endowment fund in June. (Bloomberg LP, which owns Bloomberg Businessweek, also is a Wikipedia donor.) Newmark was initially taken by the project’s similarity to Isaac Asimov’s fictional Encyclopedia Galactica. “For a nerd like me,” he says, “it was obvious how important it would be in our world.”
Anyone can contribute to a Wikipedia article, which you’d think would open it to all manner of mischief. And you’d be right, although the site’s volunteer editors are remarkably adept at quickly removing vandalism, propaganda, and other falsehoods. Studies have generally shown that Wikipedia entries are about as accurate as professionally written articles in resources such as the Encyclopaedia Britannica or Germany’s 200-year-old Brockhaus Enzyklopädie. The site also remained mostly free of the conspiracy theories that deluged social media during the 2016 election. Edgar Welch, who police say fired an assault rifle inside a Washington pizzeria in early December after attempting to “self-investigate” widely circulating reports that a child sex ring was operating there, would have found the truth on Wikipedia, which identified Pizzagate as a “debunked 2016 conspiracy theory.”
Still, Wikipedia is only as good as its community of editors. About 30,000 people contribute regularly to the English-language version of the site, an additional 45,000 to the other editions. Not surprisingly, given that the organization’s earliest supporters were software geeks, its entries often reflect the concerns and biases of a group that’s overwhelmingly white and, according to several surveys of Wikipedia editors, about 85 percent male. There are hundreds of comprehensive Wikipedia articles on various aspects of the Star Wars universe—a 1,300-word entry on Princess Leia’s home planet, Alderaan; a 2,900-word one on the people of Mandalore, which includes Boba Fett—but precious little about vast swaths of the human world. The makeup of the editorial pool affects classification, too. In 2013 a New York Times columnist noticed that Harper Lee appeared on the page for “American Women Novelists” but not on the one for “American Novelists.”
“It’s written entirely from the point of view of people sitting in the U.S. and Europe,” says Anasuya Sengupta, an activist from Bangalore who, until 2015, served as chief grant-making officer for the Wikimedia Foundation, which funds the operations of Wikipedia. Sengupta’s mandate at the foundation, which employs almost 300 people and has an annual budget of $66 million, included giving money to groups creating content for and about the developing world. Yet such efforts never seemed to amount to much. When Sengupta would think of a topic relating to feminism or global development or human rights, she’d look it up. Every time, either she wouldn’t find an entry, or she would find one that was incomplete or inaccurate.
While attending a conference for African Wikipedia editors in Johannesburg in 2014, Sengupta, who hadn’t written an article for Wikipedia before, decided to try solving the problem more directly. She opened up her laptop and started pecking out an article about Adeleye-Fayemi, whom she’d met at a handful of women’s-rights conferences. “As a philanthropist, she was very notable,” says Sengupta. Wikipedia rules require that all entries be about “notable” topics and every assertion be backed up by a reliable secondary source. This can be problematic for figures who aren’t well-known in the U.S., but Adeleye-Fayemi had been covered in the Nigerian press, and Sengupta included 11 footnotes in her first draft. She hit publish.
A few minutes later, she glanced at her laptop. The entry had been marked for speedy deletion, which means a Wikipedia editor had judged it to be, essentially, trifling. The only people qualified to remove this designation are the encyclopedia’s administrators—some 1,300 volunteers for the English entries who have the power, and the technical ability, to block users and delete articles.
Sengupta took to the article’s Talk page, a forum of sorts, to make her case for inclusion. She won her appeal, but only after drawing in Florence Devouard, a former chair of the Wikimedia Foundation, who happened to be sitting next to her at the conference.
The experience was sobering for Sengupta, who left the foundation the following year to start Whose Knowledge?, a campaign focused on promoting diversity on the internet. If the only way to get an article about the developing world published on Wikipedia was to know a former board member, it was hard to imagine how a random editor from Johannesburg or Bangalore would have any hope. “When your only frame of reference is your world, your language, your context,” Sengupta says, “where does that leave the rest of the world?”
This is the so-called filter-bubble problem. The term, coined by Eli Pariser, co-founder of the viral video site Upworthy, describes the way the internet can contribute to the insularity of certain communities. Filter bubbles have been blamed for the spread of misinformation during the 2016 presidential election and for the failure of pundits in the U.K. to anticipate Brexit. They’ve prompted soul-searching at Facebook over the degree to which the site should personalize its algorithms and caused some to worry that tech companies are losing touch with regular people. Wikipedia’s filter-bubble problem is a particularly acute threat for an organization whose stated mission is “to empower and engage people around the world.” “We need to bring in people who have not traditionally been Wikipedia editors,” says Jimmy Wales, the co-founder. “If we’re not providing a space for women, someone else will.”
Wikipedia’s origins date to 2000, when Wales, who also went by Jimbo then and was known primarily for operating a successful search engine for raunchy photos called Bomis, hired a philosophy Ph.D., Larry Sanger, to oversee Nupedia, a free general-interest online encyclopedia with a handful of articles. The following year, Wales and Sanger adopted wiki software, allowing anyone to edit the site and create articles, and launched Wikipedia, which later came under the control of a nonprofit foundation. By 2013 the site was one of the most popular on the internet, with more than 20 billion page views and 500 million or so unique visitors each month.
Wales didn’t get rich from Wikipedia’s runaway success, but his stature rose. He’s married to Tony Blair’s former assistant, regularly attends the World Economic Forum in Davos, and no longer goes by Jimbo. The Wikimedia Foundation took in $77 million in donations for 2016, up from $45 million in 2013.
Even as Wikipedia’s fundraising machine and respectability have grown, its web traffic has declined. The project now has 16 billion monthly page views. It no longer discloses unique visitors, but its audience in the U.S. hasn’t grown, according to ComScore. The reasons for the drop-off are the subject of some debate among Wikipedia supporters, but they’re partly the result of the organization’s struggle to adapt to changes in how people use the internet. Google incorporates information from Wikipedia, such as birthdates and thumbnail biographies of famous people, into its search results, so users often are able to get information from Wikipedia without visiting its website. Digital assistants, such as Amazon.com’s Alexa and Apple’s Siri, also draw on Wikipedia data without sending users to the encyclopedia.
Wikipedia didn’t release an app for the iPhone until 2011, and even today it’s difficult to write a new entry on a mobile device. This was partly because editors, who mostly contribute to the encyclopedia on desktop computers, questioned the need for mobile apps on the grounds that users editing on their smartphones would be more likely to insert errors into carefully crafted articles. “The hard-core community members think editing should be hard—so they’re sure you’ve done your homework,” says Andrew Lih, a journalism professor at American University and the author of The Wikipedia Revolution: How a Bunch of Nobodies Created the World’s Greatest Encyclopedia. This pose can take on sexist overtones when male editors raise questions about whether a woman merits inclusion in Wikipedia. In addition, female editors, or editors who write articles about women, are frequently subjected to harassment or threats. “How can you get people to participate in an environment that feels unsafe, where identifying yourself as a woman, as a feminist, could open you up to ugly, intimidating behavior?” Lih asks.
This pattern has made it harder for Wikipedia to cover aspects of the world that aren’t of obvious interest to its biggest users. The 2011 wedding of Britain’s Prince William and Kate Middleton was a bonanza for media outlets around the world, attracting by some estimates 2 billion television viewers. Wikipedia failed to fully capitalize on the excitement, in part because its community leaders couldn’t agree on whether Middleton’s dress belonged in Wikipedia at all. A Wikipedia article about the dress, a celebrated gown designed by Alexander McQueen creative director Sarah Burton, was repeatedly deleted and undeleted, as the article’s Talk page devolved into something resembling a shouting match. “The sheer presence of this article is one of the lowest points ever reached by Wikipedia!” one editor declared.
The following year, at the annual Wikimania conference, Wales brought up the incident as evidence of the encyclopedia’s shortcomings. He noted that Wikipedia’s “typical tech geek male” users had biased the site in favor of certain topics at the expense of others, comparing the brouhaha over the dress to the more than 100 articles that had been written about various editions of the open source operating system Linux. “Why don’t we have the top 100 most famous dresses?” he asked. “It’s culturally very important.”
Wales, who holds the only permanent seat on the Wikimedia Foundation board, is sometimes referred to as the project’s “benevolent dictator for life.” Although he objects to the designation—and he has his critics within the Wikipedia community—his comments instantly settled the debate about whether dresses belonged on Wikipedia. The episode coincided with an awakening within the foundation. Talks were given, conferences were organized, and surveys were commissioned that showed editors were overwhelmingly male. They were “an eye-opener,” says Pete Forsyth, an editor and former foundation staff member who runs a Wikipedia consulting business, and “spurred a lot of discussion in the community.”
The impact so far has been halting at best. In July 2015 the activist group Women in Red formed to encourage volunteers to address the site’s gender imbalance. (The name is a reference to the color Wikipedia uses to identify notable people who haven’t had articles written about them.) Since then the project’s 150 members have organized dozens of “editathons”—events where editors get together to add women to the site—and added about 45,000 entries covering notable women. But the share of women featured on Wikipedia’s English-language pages has increased only modestly, from 15 percent to 16.8 percent, since the group formed. (By comparison, 29 percent of the entries in the Gale Biography Resource Center, a database used in many school libraries, are about women, according to a 2011 paper.) “It’s just enough progress to make us feel frustrated,” Wales says. “We underestimated the scale of the problem.”
The scale of the problem was apparent at a Saturday morning editathon in mid-December organized by the arts collaborative Black Lunch Table. The event, held at BRIC, a cultural center in Brooklyn, had been billed as an opportunity to “correct Wikipedia’s pervasive gender bias and inaccuracies.” Several of the dozen people who showed up had come under the mistaken impression that they’d be able to write entries about themselves. Wikipedia prohibits editing or writing articles about “yourself, your family, friends, or foes”—a point that the organizer, Heather Hart, delicately noted.
“Anyone can edit,” Hart, an artist, told the group, “but there are several admin layers, and you have to understand them to do this. There are a lot of rules.” Wikipedia’s administrators closely scrutinize edits from new community members, and new articles are often initially rejected. The editing process itself is not especially intuitive, requiring users to employ special codes to add line breaks or citations.
Three hours later, the participants had made small improvements to eight articles. Kristen Williams, a Brooklyn artist who was new to Wikipedia, added a birthdate to an entry for the visual artist Jonathan Allen. Another participant worked on an entry about the writer June Jordan, adding details about one of her books and tweaking a subhead from “Consciousness of race, class, and gender identity” to “Concepts of race, class, and gender.”
The changes were modest, but Wikipedia executives and community members say that making history more inclusive is a slow process. The encyclopedia’s reliance on outside sources, primarily newspapers, means it will be only as diverse as the rest of the media—which is to say, not very. “We are a reflection of the world around us,” says Katherine Maher, who became the foundation’s third executive director in June 2016. Wikimedia plans to organize more events to bring in new editors and find ways to make the editing process less forbidding. In December the board issued a statement for the first time that condemned harassment of editors. “It establishes a sense within the community that this is a priority,” Maher says. Still, she acknowledges, “it has to be more than words.”
The capacity of Maher—or anyone—to directly change the way Wikipedia operates is limited, however. Although the Wikimedia Foundation controls a substantial budget, it can’t commission articles or do much to regulate the content of the encyclopedia. Wikimedia’s previous executive director, Lila Tretikov, left last year after trying unsuccessfully to establish a feature, “superprotect,” that would have allowed the foundation to overrule editors. “Wikipedia has no real person in charge—it’s sort of like a tragedy of the commons problem,” says William Beutler, a contributor who runs a marketing consulting firm in Washington and advises clients such as Verizon and engineering giant Bechtel on interacting with Wikipedia. “The person willing to make the biggest jerk out of himself oftentimes wins.”
Sengupta has continued to edit her article about Adeleye-Fayemi. It has since received contributions from 16 other Wikipedia editors and now includes 18 footnotes. So far she’s created 28 pages, most recently for the Brazilian feminist Sonia Corrêa. Like many people in the “free knowledge movement,” as some in the Wikipedia world describe themselves, Sengupta has been discouraged by the rise of nationalism and anti-immigrant sentiment in the U.S. and Europe. But she sees Wikipedia as a potential bulwark against those tides—if it can live up to its own ideals. “Making Wikipedia more plural and diverse in terms of who edits and what they edit is one of the most effective ways in which we can move beyond the stereotypes that exist all around us,” she says. “There is something very, very meaningful about this moment in time.”