Can Jigsaw’s Designers Make the Internet Safer?

Creating tools to protect people from online harassment is trickier than it sounds.

At Jigsaw—an incubator in New York City that’s part of Google parent Alphabet Inc.—a team of designers and engineers has a sweeping mission: to create applications and promote technology that protects Internet users from online censorship, harassment, and extremist attacks. “The high-level strategy is, if we can make products that are robust for users in these sorts of situations, then that can make the internet stronger,” says CJ Adams, a product manager at Jigsaw.

Illustration: Tomi Um

It’s not as simple as it sounds, as Jigsaw staffers well know. Adams has focused on such issues within Google since 2012, when he began working at Jigsaw’s forerunner, Google Ideas. About three years ago, Adams and a team started to think about filter bubbles—the way that algorithms in search engines and social media platforms serve up personalized news results based on where you are and past searches or clicks. They released Unfiltered in March 2016. The tool is an interactive map of the world that highlights the stories most missed by the local media where a user is based.

The tool hasn’t caught on, Adams says, because people in the midst of the U.S. presidential election and Brexit in the U.K. were more concerned with local, domestic filter bubbles than the country-level differences Unfiltered explores. “If we had stayed focused on the users’ problems, the project would have adapted with them, but it didn’t,” says Adams.

The site is one example of how easy it is to miss the mark with the sort of project Jigsaw takes on. “What people don’t see about the design process is how often we were wrong,” says Jigsaw designer Dana Steffe. “Trying to figure out what is most right,” she says, is a process.

Steffe’s first assignment after joining Jigsaw in January was a redesign of Project Shield, a service that protects independent media, human-rights organizations, and election-monitoring websites from being shut down by digital attacks. This summer, Steffe traveled to Kenya with experience researcher Dalila Szostak and Jigsaw engineers to gather data. Kenya was about to hold national and local elections, and tensions were running high—ethnic clashes, intimidation, and fraud have marred previous votes in the East African nation. The Jigsaw team wanted to talk to human-rights activists on the ground about how they experience these threats online, including the kinds of website attacks Shield is meant to stop. “Designers spend a lot of time thinking about the mindset of the user,” Steffe says. “How does somebody feel when they come to Shield? Do they feel confident? We want people to trust what we do.”

But potential users in Kenya weren’t thinking about digital safety—they were preoccupied with physical danger. “A lesson that we learned is that going to Kenya during an election, to learn about digital security, might not be the best thing to do,” says Szostak. The team returned to New York with insights, just not about Shield. Conversations on hate speech and the toxic language fouling up online debates about the elections were potential fodder for a separate machine-learning research project.

This isn’t surprising, says Nathan Freitas, who works on mobile security for human-rights defenders and activists through the Guardian Project and has collaborated with Google in the past. “They realize both that there are many little improvements that need to be made, but also that it may not even be Google technology” that’s needed.

Some tools have caught on, thanks to careful user research. Adams and a team traveled to Ukraine in May 2015 to meet with independent media organizations and publishers under pressure from a government intent on avoiding media scrutiny. The team asked the journalists, “Show us what’s happening, show us what you’re up against,” Adams recalls. Many of the journalists mentioned phishing emails. In the midst of violence, they relied on tips and information from fellow reporters and activists to help keep them safe from physical threats, and attackers were posing as trusted contacts to compromise accounts and steal information.

The team presented a recently released tool, Password Alert, to the journalists. The Chrome browser extension displays a warning when a website is trying to steal your credentials. Adams and his team explained that it could provide protection from the phishing threat. “It was that validating moment when the user-first focus paid off,” he says. Since then, more than half a million people have downloaded the tool, which is free and available to anyone.

“The real thing that Jigsaw does that’s important is look at people who are experiencing some sort of insecurity,” says Sara Sinclair Brody, who worked at Google Ideas for a year before becoming executive director of Simply Secure, a nonprofit that promotes secure technology design. “Security is only defined as a function of a particular threat,” she says. “So if you’re talking about a piece of software being secure, you have to say, secure for whom, in what context?”

    BOTTOM LINE - An Alphabet technology incubator is analyzing the needs of internet users around the world and applying its research to projects promoting security and free speech.