Source: Getty Images, Photo Illustration: Tom Hall/Bloomberg

First Firms Blocked Porn. Now They Scan for Child Sex Images

Only Ericsson wanted to talk about it.

The first alarm came within a week. It meant an Ericsson AB employee had used a company computer to view images categorized by law enforcement as child sexual abuse.  

“It was faster than we would have wanted,” says Nina Macpherson, Ericsson’s chief legal officer.

In 2011, in a bid to ensure none of its 114,000 staff worldwide was using company equipment to view illegal content, the Swedish mobile-networks pioneer installed scanning software from Netclean Technologies AB. While many companies have since adopted similar measures, few have been willing to discuss their experience publicly.

Ericsson’s move may have made it the first big company to scan employees’ computers for indecent images of children rather than just blocking online pornography, according to Michael Moran, a director at Interpol's child exploitation unit. The key difference is that child-abuse material depicts a crime being carried out, and notifying police helps them find the people making it and prosecute those viewing it.

“You can actually save a kid from the abuse they are experiencing by recognizing, reporting and removing it,” says Moran.

Netclean’s software scans web searches, e-mail, hard drives and memory sticks for specific images or videos already classified as child pornography. It uses image fingerprinting so that it recognizes blacklisted photos even after they are moved around the internet, copied between computers, or modified. Netclean says its software finds illegal images on about one in 1,000 computers across the hundreds of clients it now serves.
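Image fingerprinting of this kind is generally built on perceptual hashing, which reduces a picture to a compact signature that survives resizing, recompression and small edits, unlike an ordinary file checksum. Netclean’s actual algorithm is proprietary and not described in the article, so what follows is only a minimal “difference hash” sketch in Python. The toy grayscale images (2D lists of pixel intensities) and the function names are invented for illustration.

```python
# Illustrative sketch of perceptual ("difference") hashing, the general
# family of techniques behind image fingerprinting. This is NOT
# Netclean's implementation, which is proprietary.

def dhash(pixels):
    """Fingerprint a grayscale image given as a 2D list of intensities.

    Each bit records whether a pixel is brighter than its right-hand
    neighbour. Because only relative brightness matters, the hash is
    stable under uniform brightness shifts and small pixel-level edits.
    """
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left > right else 0)
    return bits

def hamming(a, b):
    """Count differing bits; a small distance suggests the same image."""
    return sum(x != y for x, y in zip(a, b))

# Toy 4x5 "images": the second is the first with slight brightness noise,
# standing in for a recompressed or lightly edited copy.
original = [[10, 50, 30, 90, 20],
            [80, 40, 60, 10, 70],
            [25, 35, 95, 15, 55],
            [60, 20, 80, 40, 100]]
tweaked  = [[12, 48, 31, 88, 22],
            [79, 42, 58, 12, 69],
            [27, 33, 96, 14, 57],
            [58, 22, 81, 39, 99]]

h1, h2 = dhash(original), dhash(tweaked)
print(hamming(h1, h2))  # prints 0: the fingerprint survives the edits
```

In a real deployment the fingerprints of known illegal images would sit in a blacklist database, and a scanner would flag any file whose hash falls within a small Hamming distance of a blacklisted entry.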

Since installing the system, Ericsson says it has been dealing with around one alarm each month – each one flagging an act that could lead to prosecution. 

“Our aim is not to become a law enforcement agency, but we want to get this stuff out of our system. It’s an unacceptable use of our networks, illegal, and against our policies,” Macpherson adds.

Netclean's ProActive software in use.
Source: Netclean

‘No False Positives’ 

The alerts – invisible to the person who triggers them – are sent via e-mail and text message to Ericsson’s group security adviser, Patrik Håkansson, a former detective chief inspector from Sweden’s National Police IT Crime Squad. He’s confident that the digital fingerprint system means the software only raises the alarm when it detects images already on an international child abuse blacklist.

“There are no false positives; the technology won’t show up any pictures of children on the beach,” says Håkansson. 

His job is to confirm that the illegal pictures have indeed been handled on company equipment, and by whom. In the U.S. the FBI must be called immediately. In other markets Ericsson can carry out some internal investigations before involving law enforcement. 

The perpetrator is typically fired, unless digital forensic investigators can establish there has been a genuine mistake, Macpherson says.

 


Child Sexual Abuse Content – The Numbers
Sources: Internet Watch Foundation, U.K. National Crime Agency


Netclean, a private Swedish tech firm established in 2003 and still owned by its founders along with original and angel investors, says few firms initially accept its assertion that one in 1,000 corporate machines is used to view illegal images.

“A lot of companies don’t believe us when we say that number. But if you don’t look you won’t find it,” founder Christian Berg says.

Alarm, Raid, Conviction 

In late August 2012 an Ericsson network engineer in Texas remotely logged onto the firm’s network using a company laptop, triggering a Netclean alarm. Ericsson reported the incident to local law enforcement on Sept. 7, according to court documents from a prosecution in the U.S. District Court for the Western District of Texas.

FBI officers (file picture)
Source: Eric Theyer/Redux

On Dec. 14, 2012, 17 law enforcement officers, including FBI agents, arrived at the engineer’s house. He refused to let them in, so they broke down the door and began searching the building while he and his wife lay on the floor, the court documents show. 

Told that his employer had traced child pornography to his laptop, the engineer said they would find more images on his flash drive. Even as the search continued, he prepared a written statement: yes, he said, he had visited child sexual abuse websites, but he had never harmed any children. He planned to beg his family for forgiveness, he said.

Court documents show that the engineer later told the court he had been coerced into making the statement. The judge rejected this and denied his request to have the evidence struck out. The engineer’s attorney didn’t respond to e-mails and phone calls seeking comment. 

In March this year the engineer – no longer an Ericsson employee – pleaded guilty to receipt of child pornography. On Aug. 26 he was sentenced to five years in federal prison at Bastrop, Texas.

When he’s released he will have to attend a sex offender treatment program. He faces 20 years of supervision by probation officers.  

No Privacy

Netclean says its software is unique and effective, but British lawyer Myles Jackman urges caution, warning that companies could trigger a “witch hunt” based on what he sees as potentially flawed evidence. Jackman, an obscenity law specialist, denounces genuine indecent material, calling it “reprehensible.” But while the technology may not deliver false positives, he says he knows of numerous occasions where police have misclassified images as illegal. Misclassified images, once in the police database, would still trigger a Netclean alarm. 

“The reality is that the basic data-entry inputting is done by humans and humans get things wrong. It happens in every database,” says Jackman.

He also highlights a “grey area” in countries such as the U.K., Canada and parts of the U.S., where the age of consent is lower than the age of majority.

“Between 16 and 18 a child has the right to have sex, but as soon as they create an image of that act they are in possession of an illegal image and are committing an offence by distributing it. That disparity is a problem and our legal system simply has not caught up with morality and social values,” Jackman adds.

New technology giving companies the power to monitor employees’ online behavior can raise uncomfortable questions for managers who spot their workers viewing illegal content.

The Internet Watch Foundation, a British industry body set up with a global remit to tackle criminal content, says misuse of corporate equipment is a “genuine issue” – particularly now that more people work from home. 

Ericsson employees sign a form consenting to being observed. Does that equate to spying on staff? As long as companies are upfront and explain to employees they are being monitored, there “can’t be any expectation of privacy,” says Stuart Neilson, a London-based employment lawyer.

That’s important, because there are also risks for any company that knows its equipment is being used illegally and doesn’t act. “If the organization has evidence that an employee has been accessing these sites but has done nothing with that evidence, then the employer might be liable,” Neilson says.
