Search engines, social networks and cloud storage services are rife with opportunities for traders of child sexual abuse imagery and for child abusers, but the tech industry has consistently failed to take aggressive steps to shut that activity down, The New York Times reports. The companies have the technical tools to stop the recirculation of abuse imagery by matching newly detected images against databases of known material. Yet the industry does not take full advantage of those tools.
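The matching step described above can be sketched as a simple database lookup. Real deployments such as Microsoft's PhotoDNA use perceptual hashes that survive resizing and re-encoding; the exact-hash version below, with a hypothetical `known_hashes` set, is only a minimal illustration of the idea.

```python
import hashlib

# Hypothetical database of hashes of known flagged images.
# (PhotoDNA uses perceptual hashing; plain SHA-256, shown here for
# simplicity, only matches byte-identical files.)
known_hashes = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def flag_upload(data: bytes) -> bool:
    """Return True if the uploaded bytes match a known image hash."""
    return hashlib.sha256(data).hexdigest() in known_hashes
```

A service that ran this check at upload time, rather than only when a file is shared, would catch matches the moment the material enters its systems.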
Amazon, whose cloud storage services handle millions of uploads and downloads every second, does not even look for the imagery. Apple does not scan its cloud storage, according to federal authorities, and encrypts its messaging app, making detection virtually impossible. Dropbox, Google and Microsoft's consumer products scan for illegal images, but only when someone shares them, not when they are uploaded. And other companies, including Snapchat and Yahoo, look for photos but not videos, even though illicit video content has been exploding for years.

Facebook, the largest social network in the world, thoroughly scans its platforms, accounting for over 90 percent of the imagery flagged by tech companies last year, but the company is not using all available databases to detect the material. And Facebook has announced that the main source of the imagery, Facebook Messenger, will eventually be encrypted, vastly limiting detection.

The Times created a computer program that scoured Bing and other search engines. The automated script found dozens of images that Microsoft's own PhotoDNA service had flagged as known illicit content. Bing even recommended other search terms when a known child abuse website was entered into the search box.