Is Your Phone Safe? The Dangers of Police Access to Private Digital Data



A recent report from the Washington, D.C. nonprofit Upturn highlights the privacy and civil rights concerns arising from the use of mobile device forensic tools (MDFTs) by law enforcement agencies in the United States.

“Mass Extraction: The Widespread Power of U.S. Law Enforcement to Search Mobile Phones” documents the use of MDFTs by more than 2,000 state and local law enforcement agencies.

By combing through purchase orders, search warrant affidavits, and other documents obtained through public records requests, researchers documented:

    • How law enforcement uses MDFTs as a standard investigative tool in both major and minor cases.
    • The technical details of how MDFTs work and the types of data they capture (illustrated in the sketch following this list).
    • How search warrants and consent searches enable police to acquire an intimate picture of an individual’s life from their device.
    • The lack of detailed policies that govern the use of MDFTs in many agencies.
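
To give a concrete, deliberately simplified sense of what that extraction and parsing can look like: many mobile apps store messages and similar records in SQLite database files, which extraction tools copy off the device and decode into readable reports. The Python sketch below assumes a hypothetical extracted database and schema; it is not drawn from the report or from any particular MDFT.

    import sqlite3

    # Hypothetical scenario: an extraction has produced a copy of an app's
    # SQLite database. The file name and the table/column names below are
    # illustrative assumptions, not the schema of any real app or tool.
    conn = sqlite3.connect("extracted_app_messages.db")

    rows = conn.execute(
        "SELECT sender, timestamp, body FROM messages ORDER BY timestamp"
    )

    # Turning rows like these into a readable timeline is, in essence, what
    # an MDFT's reporting layer does at a much larger scale.
    for sender, timestamp, body in rows:
        print(f"{timestamp}  {sender}: {body}")

    conn.close()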

Observing that many lower-income Americans, and Americans of color, rely on their mobile devices for internet access, the report argues that such a powerful investigative tool could disproportionately impact the privacy of citizens from those communities.

Amid continued calls in some quarters to “defund the police,” the report also recommends five preliminary steps toward reducing the use of MDFTs.

Among its recommendations:

    • Ban consent searches of mobile devices, and abolish the “plain view” exception to the Fourth Amendment for digital searches.
    • Require easy-to-understand audit logs that are communicated transparently to the public (a hypothetical example follows this list).
    • Enact robust requirements for the retention, deletion, and sealing of data.
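
What might an “easy-to-understand” audit log entry contain? The report doesn’t prescribe a format, so the following Python sketch is purely illustrative: the field names and values are assumptions about the kinds of information such a log could record (who searched, under what legal authority, and which categories of data were taken).

    import json
    from datetime import datetime, timezone

    # Purely illustrative audit-log entry for a phone extraction. None of
    # these field names come from the Upturn report; they are assumptions.
    log_entry = {
        "case_number": "2020-000123",                  # hypothetical identifier
        "examiner": "examiner-42",
        "legal_authority": "search warrant",           # versus a consent search
        "device": "seized smartphone, evidence tag 7",
        "data_categories_extracted": ["call logs", "text messages", "location history"],
        "extraction_started": datetime.now(timezone.utc).isoformat(),
        "retention_review_due": "2021-06-01",          # supports deletion/sealing rules
    }

    print(json.dumps(log_entry, indent=2))

A machine-readable record like this could then be aggregated and published as part of the transparent reporting the recommendation calls for.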

The Gray Area Between Forensics and Investigations

Mobile forensics evolved as an offshoot of computer forensics—a scientific process that had already been deemed admissible in courts. Admissibility depends on authenticity: the ability of a witness to say that evidence is what it’s claimed to be.

When it comes to scientific methods, an expert’s ability to authenticate evidence is grounded in a process that collects the evidence without changing it.
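
In digital forensics, that “collect without changing” principle is commonly demonstrated with cryptographic hashes: the examiner hashes the acquired data at collection time and re-hashes it before analysis or testimony, and matching digests support the claim that nothing was altered. A minimal Python sketch, with the file name as an illustrative assumption:

    import hashlib

    def sha256_of_file(path: str, chunk_size: int = 1 << 20) -> str:
        """Compute the SHA-256 digest of a file, reading it in chunks."""
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    # Hypothetical workflow: record the hash when the extraction image is
    # acquired, then recompute it later; a mismatch would indicate the
    # evidence changed somewhere along the way.
    acquisition_hash = sha256_of_file("device_extraction.bin")
    verification_hash = sha256_of_file("device_extraction.bin")
    assert acquisition_hash == verification_hash, "evidence image has changed"
    print("SHA-256:", acquisition_hash)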

“An analyst must compare DNA samples and determine, based on the similarity of the samples, whether the samples are similar enough that they could have come from the same individual,” said Alicia Loy, an attorney with the National White Collar Crime Center (NW3C).

By contrast, she said, “MDFTs produce a binary result: the evidence either exists or it does not exist.”

That result, Loy continued, has led some investigators to take the technology for granted rather than dig deeper to determine the authenticity of digital evidence: whether the suspect actually put the evidence on the device.

Just as DNA found at a crime scene must be shown to have been deposited during the commission of the crime, suspicious location or photographic evidence found on a device doesn’t necessarily mean the device’s owner put it there.

Historically, making that determination has been the province of digital forensic examiners, who use the principles of computer science to generate results that are repeatable and reproducible by other forensic experts.

However, digital evidence doesn’t carry only the forensic value of other traces like DNA or fingerprints. Because mobile devices are used in the planning and commission of crimes, their data also has intelligence value: obtaining it could help prevent much more serious crimes.

A classic example is the child predator who grooms a victim via text or app messaging; the prospect of intercepting that predator before the abuse becomes physical is a compelling argument for extracting the data.

In another example, data from the mobile phone of an opioid overdose victim could help identify their dealer. Taking the dealer out of operation could, in turn, prevent additional deaths.

However, those are largely investigative functions—not forensic responsibilities.

Moreover, even as digital technology’s real or perceived value to criminal investigations has increased, its pace of advancement has outstripped forensic experts’ ability to keep up.

‘Democratization’ of Forensic Technology

That has led to the “democratization” of digital forensic technology, in which MDFTs have found their way into the hands of investigators rather than dedicated forensic examiners.

Again, authenticating evidence—using scientific processes to ascertain the likelihood that a particular person put the evidence on the device—is forensic: meant to withstand a challenge in a court of law.

Quality assurance standards help to ensure that forensic scientific processes are repeatable and reproducible. To that end, standards govern tool and methodology testing (and how often it’s performed); personnel training, certifications, and proficiency; written procedures that guide the processes; the documentation of any deviations or exceptions from those procedures; and managerial oversight. Peer technical review can additionally help to validate processes.
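
Tool and methodology testing of that kind is often done by running a tool against a reference data set whose contents are known in advance and comparing what the tool reports to that ground truth. The Python sketch below illustrates the idea; the file names and the flat-JSON record format are assumptions, not a description of any specific testing program.

    import json

    def load_records(path: str) -> set:
        """Load a JSON list of flat records as a set of comparable tuples."""
        with open(path) as f:
            # Assumes each record is a flat dict of strings/numbers.
            return {tuple(sorted(record.items())) for record in json.load(f)}

    reference = load_records("reference_dataset.json")   # known ground truth
    tool_output = load_records("tool_extraction.json")   # what the tool reported

    missed = reference - tool_output      # records the tool failed to recover
    spurious = tool_output - reference    # records the tool reported in error

    print(f"Missed records: {len(missed)}")
    print(f"Spurious records: {len(spurious)}")

Repeating a test like this after every tool or operating-system update is one way to show that results remain repeatable and reproducible.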

These standards all exist for digital forensics, too. It’s just that law enforcement agencies don’t appear to be applying them: one of the “Mass Extraction” report’s key findings is the lack of consistent policies regarding MDFT use across law enforcement agencies.

“Many departments have no policies at all — despite using these tools for years,” the report reads. “Nearly half of the departments that responded to our records requests (40 out of 81) indicated they had no policies in place. Even when policies exist, they are often remarkably vague….”

Echoing the Upturn researchers, the NW3C’s Loy believes individual agencies’ policies should determine how and in which types of cases MDFTs are used. “These policies should balance privacy with the utility of the data,” she added.

However, even that could be fraught.

“Should we only use MDFTs to retrieve the ‘useful’ information, or should we use MDFTs to retrieve ALL information, pick out what we need, and then dump the rest?” Loy asked.

She added that both approaches have their problems: determining what’s “useful,” both inculpatory and exculpatory, first demands that an investigator know what’s on the device to begin with, raising privacy concerns.

On the other hand, “cherry-picking” data could result in questions around whether exculpatory data was properly disclosed—potentially making collection of all data the “safer” option, said Loy.

Forensic science isn’t without its controversies. The Innocence Project exists, and has exonerated numerous wrongfully convicted people, because forensic evidence, DNA included, can be incomplete, misinterpreted, even misapplied.

When it comes to digital forensics, the kind of transparency called for in the “Mass Extraction” report is difficult because the technology is difficult to explain to laypeople. While digital forensic fundamentals remain the same, mobile technology—and resulting forensic processes—change constantly.

But these shortcomings don’t mean digital forensics can’t improve. Take policies, for example. In court, Loy said, policies can help “to increase the credibility of the forensic examiner because the examiner can point to the fact that he followed the policies to the letter.”

As American law enforcement grapples with its public image, its propensity to treat the personal, private data of American citizens as “intelligence” only feeds the perception of a militarized force that’s suspicious of the people it’s sworn to serve.

Standards and policies aren’t sexy. But they’re the foundation of the kind of credibility many law enforcement leaders purport to seek.

Christa Miller is a journalist who covers privacy issues and technology.
