Ride the Lightning

Cybersecurity and Future of Law Practice Blog
by Sharon D. Nelson Esq., President of Sensei Enterprises, Inc.

Government Tosses Child Porn Cases Rather Than Talk About its Torrent Tracking Software in Court

April 22, 2019

A new investigation by ProPublica revealed earlier this month that more than a dozen cases were dismissed after defense attorneys asked to examine, or raised doubts about, computer programs that track illegal images to internet addresses.

Using specialized software, investigators traced explicit child pornography to Todd Hartman’s internet address. A dozen police officers raided his Los Angeles-area apartment, seized his computer and arrested him for files including a video of a man ejaculating on a 7-year-old girl. But after his lawyer contended that the software tool inappropriately accessed Hartman’s private files, and asked to examine how it worked, prosecutors dismissed the case.

Near Phoenix, police with a similar detection program tracked underage porn photos, including a 4-year-old with her legs spread, to Tom Tolworthy’s home computer. He was indicted in state court on 10 counts of committing a “dangerous crime against children,” each of which carried a decade in prison if convicted. Yet when investigators checked Tolworthy’s hard drive, the images weren’t there. Even though investigators said different offensive files surfaced on another computer that he owned, the case was tossed.

At a time when at least half a million laptops, tablets, phones and other devices are viewing or sharing child pornography on the internet every month, software that tracks images to specific internet connections has become a vital tool for prosecutors. Increasingly, though, it’s backfiring.

Drawing upon thousands of pages of court filings as well as interviews with lawyers and experts, ProPublica found more than a dozen cases since 2011 that were dismissed either because of challenges to the software’s findings, or the refusal by the government or the maker to share the computer programs with defense attorneys, or both. Tami Loehrs, a forensics expert who often testifies in child pornography cases, said she is aware of more than 60 cases in which the defense strategy has focused on the software.

Defense attorneys have long complained that the government’s secrecy claims may hamstring suspects seeking to prove that the software wrongly identified them. But the growing success of their counterattack is also raising concerns that, by questioning the software used by investigators, some who trade in child pornography can avoid punishment.

“When protecting the defendant’s right to a fair trial requires the government to disclose its confidential techniques, prosecutors face a choice: Give up the prosecution or give up the secret. Each option has a cost,” said Orin Kerr, an expert in computer crime law and former Justice Department lawyer. “If prosecutors give up the prosecution, it may very well mean that a guilty person goes free. If prosecutors give up the secret, it may hurt their ability to catch other criminals. Prosecutors have to choose which of those outcomes is less bad in each particular case.”

In several cases, like Tolworthy’s, court documents say that the software traced offensive images to an Internet Protocol address. But, for reasons that remain unclear, those images weren’t found on the defendant’s computer. In others, like Hartman’s, defense lawyers said the software discovered porn in areas of the computer it wasn’t supposed to enter, and they suggested the police conducted an overly broad search.

These problems are compounded by the insistence of both the government and the software manufacturers on protecting the secrecy of their computer code, so as not to imperil other prosecutions or make trade secrets public. Unwilling to take the risk that the sensitive programs could leak publicly, they have rejected revealing the software even under strict court secrecy.

Nevertheless, the software is facing renewed scrutiny: In another case where child pornography identified by the software wasn’t found on the suspect’s computer, a federal judge in February allowed a defense expert to examine it. And recently, the nonprofit Human Rights Watch asked the Justice Department to review, in part, whether one suite of software tools, the Child Protection System, had been independently tested.

“The sharing of child-sex-abuse images is a serious crime, and law enforcement should be investigating it. But the government needs to understand how the tools work, if they could violate the law and if they are accurate,” said Sarah St.Vincent, a Human Rights Watch researcher who examined the practice.

“These defendants are not very popular, but a dangerous precedent is a dangerous precedent that affects everyone. And if the government drops cases or some charges to avoid scrutiny of the software, that could prevent victims from getting justice consistently,” she said. “The government is effectively asserting sweeping surveillance powers but is then hiding from the courts what the software did and how it worked.”

The dismissals represent a small fraction of the hundreds of federal and state child pornography prosecutions since 2011. More often, defendants plead guilty in exchange for a reduced sentence. (Of 17 closed cases brought since 2017 by the U.S. attorney’s office in Los Angeles, all but two resulted in plea deals, ProPublica found.) Even after their charges were dropped, Tolworthy and Hartman are both facing new trials. Still, the dismissals are noteworthy because challenges to the software are spreading among the defense bar and gaining credence with judges.

In cases where previously flagged porn isn’t turning up on a suspect’s computer, investigators have suggested the files have merely been erased before arrest, or that they’re stored in encrypted areas of a hard drive that the police can’t access. Defense attorneys counter that some software logs don’t show the files were ever downloaded in the first place, or that they may have been downloaded by mistake and immediately purged.

Defense lawyers are given a bevy of reasons why porn-detection software can’t be handed over for review, even under a protective order that limits disclosure to attorneys and their experts. Law enforcement authorities often say that they’re prohibited from disclosing software by their contracts with the manufacturer, which considers it proprietary technology.

Prosecutors are also reluctant to disclose a coveted law enforcement tool just to convict one defendant. A Justice Department spokeswoman referred ProPublica to a government journal article, which argued peer-to-peer detection tools “are increasingly targeted by defendants through overbroad discovery requests.”

“While the Department of Justice supports full compliance with all discovery obligations imposed by law,” wrote lawyers for the Justice Department and the FBI, “those obligations generally do not require disclosure of sensitive information regarding law enforcement techniques which, if exposed, would threaten the viability of future investigations.”

One former Justice Department prosecutor said the government has shielded software in criminal cases for fear that disclosure could expose investigators’ capabilities or classified technology to criminals.

“They don’t want to reveal that in a case because it can be the last time they use it,” said the lawyer, who requested anonymity because of the sensitive nature of the topic. “It sounds like they may, in some circumstances, be using programs that are never intended to see the light of day in the criminal justice system.”

The government’s reluctance to share technology with defense attorneys isn’t limited to child pornography cases. Prosecutors have let defendants monitored with cellphone trackers known as Stingrays go free rather than fully reveal the technology. The secrecy surrounding cell tracking was once so pervasive in Baltimore that Maryland’s highest court rebuked the practice as “detrimental.” As was first reported by Reuters in 2013, the U.S. Drug Enforcement Administration relied in investigations on information gathered through domestic wiretaps, a phone-records database and National Security Agency intercepts, while training agents to hide those sources from the public record.

“Courts and police are increasingly using software to make decisions in the criminal justice system about bail, sentencing, and probability-matching for DNA and other forensic tests,” said Jennifer Granick, a surveillance and cybersecurity lawyer with the American Civil Liberties Union’s Speech, Privacy and Technology Project who has studied the issue.

“If the defense isn’t able to examine these techniques, then we have to just take the government’s word for it — on these complicated, sensitive and non-black-and-white decisions. And that’s just too dangerous.”

The software programs used by investigators scan for child porn on peer-to-peer networks, decentralized systems in which computers on the internet share files directly with one another. These networks work much like Napster, the popular file-sharing program used to download music in the early days of the commercial internet.

Although Napster may have faded, the trading of child pornography on peer-to-peer networks hasn’t. To keep up, police rely on modified versions of popular peer-to-peer programs that flag the IP addresses of computers suspected of sharing child pornography, enabling investigators to subpoena the internet provider and identify the subscriber. They then obtain a search warrant for computers at the physical location they say is involved in sharing porn.
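The general technique behind such tools is widely understood to be hash matching: a shared file's cryptographic digest is compared against a list of digests of files already known to investigators, and a match triggers a record of the sharing peer's IP address. The sketch below is a simplified, hypothetical illustration of that idea only; the function names, the use of SHA-256, and the known-hash value are my assumptions, not details of the Child Protection System or any actual law enforcement tool, whose inner workings remain undisclosed.

```python
import hashlib
from typing import Optional

# Hypothetical set of digests of files already known to investigators.
# (This example value is simply the SHA-256 of the bytes b"test".)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of(data: bytes) -> str:
    """Return the hex SHA-256 digest of a file's contents."""
    return hashlib.sha256(data).hexdigest()

def flag_shared_file(peer_ip: str, file_bytes: bytes) -> Optional[dict]:
    """If a peer shares a file whose digest matches a known entry,
    record the peer's IP address for follow-up (subpoena, then warrant).
    Returns None when the file is not on the known list."""
    digest = sha256_of(file_bytes)
    if digest in KNOWN_HASHES:
        return {"ip": peer_ip, "sha256": digest}
    return None
```

Note what this simplification makes visible: the match ties a file to an IP address, not to a person or a particular hard drive, which is exactly the gap defense attorneys exploit when flagged images never turn up on a seized computer.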

One common suite of software tools, the Child Protection System, is maintained by the Florida-based Child Rescue Coalition. Although the coalition says it’s a nonprofit, it has ties to for-profit data brokers and the data company TLO. (TransUnion, the major credit-reporting agency, has acquired TLO.) CRC has hosted some of its computer servers at TransUnion since 2016, according to a review of internet records collected by the firm Farsight Security.

A redacted user manual filed in a federal case, portions of which were un-redacted by Human Rights Watch and confirmed by ProPublica, indicates that the Child Protection System draws on unverified data gathered by these firms. It says TLO “has allowed law enforcement access to data collected on internet users from a variety of sources,” with enhanced information that includes “marketing data that has been linked to IP addresses and email accounts from corporate sources.”

“No logs are kept of any law enforcement query of corporate data,” the manual continued. It cautioned that subscriber data was unconfirmed, and that it should “be confirmed through other investigative means that are acceptable with your agency and prosecuting attorney.”

Software that relies on unconfirmed information from big data brokers, civil liberties advocates say, may not only point police to the wrong internet address owner, but it also enables them to gather a mountain of personal details about a suspect without a court order, sidestepping constitutional protections.

The software’s makers have resisted disclosure of its coding. In May 2013, TLO asked a federal court in El Paso, Texas, to quash a subpoena to reveal the software known as the Child Protection System in a child-porn case. The materials sought, the company said, “are protected under the law enforcement privilege and trade secrets laws.” After the judge ordered the software produced, prosecutors instead agreed to a plea deal that favored the defendant; he was sentenced to the three years he had already served for “transportation of obscene material.”

CRC says on its website that its software is used in every state and more than 90 countries, and has tracked more than 54 million offenders. CRC President William Wiltse, a former Oregon police officer, has testified for the prosecution in cases in which investigators relied on the Child Protection System.

CRC did not respond to phone and email inquiries from ProPublica this month about its software. It told Human Rights Watch this year, “As a policy, we do not publicly share details of how we identify sex offenders online, as we do not want predators to learn better ways to hide their illegal activity.” A spokesman for TransUnion, which now owns TLO, said the company “supports Child Rescue Coalition in its work with law enforcement to protect children from sexual exploitation online.”

I've said it before, but it bears repeating. "Black box" technology is simply "trust me" technology. Again and again, keeping technology secret has been shown to conceal problems, problems which might well reveal a defendant's innocence or establish reasonable doubt. And where we see "profit" and "proprietary" in the same sentence involving the manufacturer, it is perfectly clear that justice is not really the driver here.

E-mail:    Phone: 703-359-0700
Digital Forensics/Information Security/Information Technology
https://www.senseient.com
https://twitter.com/sharonnelsonesq
https://www.linkedin.com/in/sharondnelson
https://amazon.com/author/sharonnelson