EDITORIAL: Microsoft Turns in Child Predator in Penn., Internet Critics Cry
August 8, 2014 12:21 AM
Failing to compare anonymous image fingerprints protects only one group of customers -- child predators
A little over a week ago, a tip from Google Inc. led to the arrest of a previously convicted child rapist in Texas, who police say had been hoarding child pornography in his Gmail account and secretly videotaping children at the Denny's Corp. restaurant in Pasadena, Texas, where he worked.
I. Microsoft Helps Authorities Catch a Predator in Pennsylvania
Now just days later, Microsoft Corp. has helped authorities catch another child predator, this time in Pennsylvania. And as in the Google incident, the arrest has triggered outrage among some internet commenters who claim that Microsoft, Google, and Apple, Inc. are violating their "rights" and privacy.
The Smoking Gun first reported the recent arrest and published a copy of the affidavit.
The affidavit reveals the identity of the defendant: Tyler James Hoffman, 20. He was arrested on April 24 by the Pennsylvania State Police Computer Crime Task Force and the Pennsylvania Internet Crimes Against Children Task Force (ICAC), after an investigation determined he had solicited and obtained multiple pornographic images of pre-pubescent children aged 7 to 13 engaged in sex acts or being sexually assaulted.
A search of Pennsylvania's sex offender registry did not indicate that Mr. Hoffman had been previously convicted.
The case has been widely reported on.
II. Police: Suspect Sought Out Images of Girls Aged 7 to 13
Pennsylvania State Police Trooper Christopher Hill confirmed the arrest and reported that a Microsoft tip triggered it. In a sworn affidavit, police investigators say they were alerted to the man's behavior after a Microsoft script that scans uploaded files for child pornography detected not one but two uploads of known child pornography.
Pennsylvania State Police picked up the accused child predator in late April. [Image Source: CCJ Magazine]
The first image involved a girl between the ages of 7 and 10 performing a sex act. Police officers indicated to Philly.com that Mr. Hoffman also obtained an image of a slightly older underage female -- between the ages of 10 and 13 -- having intercourse.
Microsoft detected Mr. Hoffman uploading these images to SkyDrive -- its cloud storage service, which in February was rebranded as "OneDrive".
After being questioned, Mr. Hoffman admitted to officers, according to court documents, that he obtained the images via contacts on Kik, a cross-platform messenger service. After uploading the pictures to Microsoft's cloud storage service, he then tried to send the images via his Microsoft live.com email account to another user, further establishing his intent to possess and redistribute the illegal images.
[Image Source: Penn. State Police (left) / Facebook (right)]
According to police, he was arrested on July 31 at work and, after being questioned, admitted that he had been "trading and receiving images of child pornography on his mobile cellular device".
III. The Boy Who Cried Wolf
The report has yet again provoked some internet commenters to explosions of outrage, such as commenters on a Slashdot post about the arrest. "Jane Q. Public" writes:
It isn't so much a matter of "Look! they did something great!" (and they did)... it's more a matter of: look at the shitty (sic) privacy intrusion they've committed on hundreds of thousands, if not millions, of people, in order to accomplish that one great thing.
Another user, "Ol Olsoc", seems to liken it to government officials raping someone's wife, commenting:
I worked with a guy who once said. "I don't care if they come into my bedroom and f*ck my wife, as long as they keep the country secure". He was willing to give up any semblance of freedom for his "security".
Slashdot users were among those suggesting that child predators be afforded special privacy protections that help hide their crimes.
Another commenter, "thieh", says they're fine with Microsoft and Google reading their emails for profit (advertising), but that performing a much less intrusive scan to catch child predators crosses the line. According to this commenter, Microsoft should have warned the defendant that molesting children or sharing illegal pictures of them might result in jail time. The commenter writes:
The problem usually comes down to that "personally relevant product features, such as customized search results, tailored advertising, and spam and malware detection" didn't include "days in court" nor "jail time" as their catalog of "personally relevant product features".
But such comments ignore the realities of cloud storage and email services.
IV. Critics Appear Ignorant of Reality: Hashing is Part of the Normal Storage Process
First, top online firms like Apple, Google, and Microsoft virtually all had hashing projects that predated their anti-child-pornography efforts. These hashing technologies were originally adopted to improve power efficiency, storage, and access times. Many images on the internet (think popular memes) are regularly downloaded, reuploaded, and sent via email by users.
By recognizing images already stored on their servers, top internet firms enjoy cost savings and can provide better (faster) service. Private images are typically new, so the temporary hash created at upload provides little identifiable insight that could be used to "target" users outside of very specific scenarios, despite the current rash of uninformed fear mongering. Digital fingerprinting is primarily used to improve the efficiency of cloud storage and has little use to law enforcement for most crimes, as it does not provide personally identifiable data.
[Image Source: Microsoft]
The only current scenario in which images are checked for illegal activity is in the case of child pornography.
Much of the limitation on the intrusiveness of these checks is inherent in the technical details. Unlike techniques employed by the U.S. National Security Agency (NSA) and other sophisticated police entities, hash checks are unable to identify wholly new images, as they don't use any type of shape-based recognition algorithm such as facial recognition.
This means that to flag a user for individual inspection, the image must match a database of known offensive images.
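That matching step is essentially a set-membership test against a list of known fingerprints. A minimal sketch in Python illustrates the idea; the hash list and function names here are illustrative assumptions, not Microsoft's actual code, and real systems use robust perceptual hashes rather than a plain cryptographic hash like SHA-256:

```python
import hashlib

# Hypothetical fingerprint list of known illegal images. In reality the
# list comes from NCMEC; this placeholder entry is for illustration only.
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"known-offending-image-bytes").hexdigest(),
}

def fingerprint(data: bytes) -> str:
    """Return an anonymous, fixed-length fingerprint of an upload."""
    return hashlib.sha256(data).hexdigest()

def should_flag(upload: bytes) -> bool:
    """Flag an upload only if its fingerprint matches a known image.

    A brand-new private photo hashes to a value seen nowhere else,
    so the check reveals nothing about it and matches nothing.
    """
    return fingerprint(upload) in KNOWN_BAD_HASHES
```

The key property is visible in the sketch: the fingerprint of a never-before-seen image carries no personally identifiable information, so only matches against the known database ever trigger further review.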
V. Dangers are Mitigated by Transparency
There are potential applications in a dystopian future -- such as a police state "detecting" and arresting those who distribute known anti-government imagery, or government agents hunting down those who leak images that show illegal government behavior. But Google and Microsoft actively oppose such uses.
In other words, there is valid cause to be watchful and vigilant to avoid abuse. But there's a difference between vigilance on the one hand and baseless paranoia and false accusations on the other.
Emma Carr, a director of the non-profit privacy advocacy group Big Brother Watch, appears to be a reasonable voice. While she warns of dangers (such as the aforementioned political suppression), she implies that in the current context the efforts appear appropriate -- something the most vitriolic critics appear to be ignoring. She says:
Microsoft must do all that it can to inform users about what proactive action it takes to monitor and analyze messages for illegal content, including details of what sorts of illegal activity may be targeted.
It is also important that all companies who monitor messages in this way are very clear about what procedures and safeguards are in place to ensure that people are not wrongly criminalized, for instance, when potentially illegal content is shared but has been done so legitimately in the context of reporting or research.
While some internet critics aren't placated, privacy advocates say Microsoft is doing the right thing in explaining its policies and protecting the privacy of the average consumer. [Image Source: Bloomberg]
Microsoft and Google fortunately appear to be trending increasingly toward transparency regarding this hash-scanning effort. Microsoft, like Google, already explicitly warned users that their data would be subject to automatic detection algorithms to combat child pornography:
Microsoft [reserves the right to utilize] automated technologies to detect child pornography or abusive behaviour that might harm the system, our customers, or others.
Following the two recent arrests, Microsoft and Google have come forward to shed more light on the technology and to explain why they don't believe it poses any significant threat to customer privacy in its current form.
VI. Limited Tools, Joint Effort Protects Privacy and Children Alike
Compared to richer analysis techniques like facial recognition and examination by human analysts (whose brains are equipped with advanced facial and shape recognition algorithms), this is a relatively crude tool that requires a very specific and obvious focus.
In Google and Microsoft's case, the supporting data set comes courtesy of the National Center for Missing and Exploited Children (NCMEC). NCMEC, an anti-child-abuse group that runs the "Missing Kids" campaign, received government permission in 2011 to compile a database of signatures (hashes) of known child pornography images.
When child predators are caught (typically after engaging in real-world sex crimes or attempts at such crimes), search warrants are often obtained to examine their devices. Such searches face few ethical objections if the defendant left the devices unprotected by a password or offers investigators access. Based on these kinds of inspections, law enforcement officials in the U.S. daily acquire a large amount of child pornography imagery that is then passed to NCMEC, which has exclusive permission to legally keep signatures of these images.
VII. In-Depth: How Microsoft Technology Preserves Anonymity
Microsoft's technology is called "PhotoDNA". Developed by digital forensics firm Netclean and Microsoft, this technology aims to protect the privacy of customers while preventing the transmission of images of child sexual abuse. To do that it uses familiar technologies to protect anonymity, while preserving the ability to search for a small, targeted group of images.
Microsoft's Digital Crimes Unit chief, Mark Lamb, comments on the recent cases:
Child pornography violates the law as well as our terms of service, which makes clear that we use automated technologies to detect abusive behaviour that may harm our customers or others. In 2009, we helped develop PhotoDNA, a technology to disrupt the spread of exploitative images of children, which we report to the National Center for Missing and Exploited Children as required by law.
To efficiently analyze a massive amount of information, Microsoft's algorithm first converts a temporary copy of the upload in memory to grayscale or black and white. It then breaks it into a grid of cells and uses histograms of the color intensity (brightness) of pixels in each cell to assign a "base" value to each cell. Together the cells in an image make up its "DNA" -- a unique signature.
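The grid-and-histogram idea can be shown with a toy sketch. This is not Microsoft's actual PhotoDNA algorithm, which is considerably more sophisticated; it is a drastically simplified stand-in, assuming per-cell mean brightness as the signature value:

```python
def grid_signature(pixels, grid=4):
    """Compute a crude grid signature for a grayscale image.

    `pixels` is a list of rows of 0-255 intensity values.  The image is
    divided into grid x grid cells; each cell's mean brightness becomes
    one entry of the signature -- the image's "DNA".
    """
    h, w = len(pixels), len(pixels[0])
    ch, cw = h // grid, w // grid
    sig = []
    for gy in range(grid):
        for gx in range(grid):
            cell = [pixels[y][x]
                    for y in range(gy * ch, (gy + 1) * ch)
                    for x in range(gx * cw, (gx + 1) * cw)]
            sig.append(sum(cell) // len(cell))
    return sig

def signature_distance(a, b):
    """Sum of per-cell differences; small means 'probably the same image'."""
    return sum(abs(x - y) for x, y in zip(a, b))
```

Because uniformly lightening or darkening an image shifts every cell by roughly the same offset, the distance between an original and its altered copy stays small compared to the distance between unrelated images -- which is why threshold-based matching can survive crude edits, as the article notes below.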
Microsoft's PhotoDNA technology does not compromise customer privacy (see images). The only customers who suffer inspection are those who have an image that appears extremely likely to be child pornography.
This approach has a few advantages.
First, as mentioned, it does not allow for more intrusive forms of analysis like facial recognition that pose thornier privacy risks. Second, it's faster and cheaper than more complicated algorithms. Third, by using histograms based on per-cell color intensity, Microsoft can potentially detect images even if pedophiles make crude attempts to disguise them by applying blur filters to part of the image or by lightening/darkening it (as the relative values would all shift by an offset, or possibly remain unchanged under a blur, depending on the blur type and cell size).
Google appears to be using PhotoDNA for its own efforts, although it too had worked on similar efforts in-house. Facebook, Inc. and Twitter also use PhotoDNA and the NCMEC database to watch for child pornography. Apple, a Microsoft licensee, may use it as well, but did not respond to requests for comment about its involvement.
While it's tempting to jump to conclusions and field slippery-slope arguments, it's important to view these efforts in context. First, Google, Microsoft, Facebook, Apple, and others are not automatically alerting authorities when a user receives an offensive image. Generally the policy at all of these firms appears to be to scan uploaded images using the hashing technology, and then to inspect the user's account contents if the scripts raise an alert.
While this industry-wide effort may be viewed as a ubiquitous form of limited surveillance, it is a voluntary one, as the companies clearly warn against such uploads in their contracts -- legally binding documents.
VIII. Microsoft Spending its Own Money to Fight Unconstitutional Mass Spying
Thus, the initiation of the process is quite different from NSA spying and other ubiquitous surveillance campaigns. When the NSA collects data from Americans or American companies, it is in effect robbing them, as the targets never authorized the agency to collect or store their personal text or images. In dealings with the government, privacy is supposed to mean "don't look at my stuff unless I commit a crime and you have a warrant". Citizens never gave the government permission to spy on them and abandon due process.
[Image Source: Nation of Change]
Microsoft, Google, and Facebook have all filed suits against the government looking to block overreaching data requests. In fact, the condemnation of Microsoft is particularly ironic given that it is perhaps the corporation fighting hardest in court, spending its own money to protect users from NSA spying.
This contrasts sharply with other companies like Oracle Corp. and Amazon.com, Inc., which not only condone government spying on U.S. citizens, but pay politicians lobbying money in favor of it. Microsoft's corporate enemies profit off violating users' privacy on a massive scale for the government; Microsoft is spending its hard-earned profit to try to protect that privacy.
And yet still some condemn it for cooperating with one extremely reasonable and limited law enforcement effort.
Surely Microsoft managers and engineers must be throwing their hands up in frustration at the ingratitude. If only customers could see the difference, they must think -- if only they could see how this campaign has no impact on the privacy of non-child predators.
Microsoft -- and unbiased, technically aware observers -- will perhaps first recognize a startling dissimilarity between the NSA's data-dominance campaigns and Microsoft et al.'s PhotoDNA. PhotoDNA does its best to protect anonymity and privacy. In that regard it's closer to the ACLU than to the NSA.
Unsurprisingly, like many criminal elements, the NSA has already been caught regularly in its own web of lies. Not only do its agents break the law; its leaders lied to Congress with no remorse. It has made every effort to fight transparency.
The NSA is anti-transparency, its agents break the law (admittedly) thousands of times a year, and its leaders lie to Congress. Microsoft's anti-predator program is pro-transparency, obeys U.S. laws, and is forthright to the citizens it protects. [Image Source: tumblr.com]
By contrast, Microsoft has made every effort to promote transparency and clearly publicize the technology behind its child protection efforts to anyone who cares to bother to learn about it.
Another key difference? Microsoft's program is clearly catching criminals in a lawful way that respects due process. By contrast, the NSA can't seem to provide solid evidence that its mass spying has succeeded in catching a single criminal, yet it did admit to Congress that its agents were committing thousands of "accidental" crimes against Americans every year. If you can't see the difference there, then you need to reread this paragraph.
IX. Customers Told Google, Microsoft, et al. to Store and Process Their Data
Where the NSA takes without permission, internet firms participating in the PhotoDNA effort are only using what users give them. Users explicitly authorize Microsoft, Google, and other service providers to access, process, and store their images. Without such permission, no cloud storage or email services would be possible.
There's no such thing as a completely secret email service in the modern context: since the 1980s, messages and attachments have been shuffled into long-term storage that allows the user to read them at their convenience. So to some degree Google, Microsoft, Facebook, and others have to "scan" your images, run them through scripts, and store them in order to fulfill your requests.
Users may be unaware of the technical realities (likely many are). But when you sign up for Dropbox, OneDrive, Gmail, Outlook.com, iCloud, Facebook, or any other major service, you are authorizing the company to store and process your images. Its behavior is generally limited by the contract you sign with it -- your terms of service agreement.
While some have confused image fingerprinting with more advanced surveillance tactics, image fingerprinting is very different, as it's done on the fly and is anonymous. In a legal context it has only been successfully used to combat one crime -- sexual abuse of children. [Image Source: CSherpa]
But that document is a two-way street. It also binds the user to not engage in certain practices deemed unacceptable (e.g., uploading child pornography).
If a user accepts the contract and then breaches it, they've committed a civil infraction against the tech company, which absolutely has every right to complain to authorities if it decides that's the best thing to do.
Second, it should be said that Microsoft, Google, and others offer most of their services to individual users free of charge. Generally, close inspection and scrutiny of your data should be frowned upon, but in the name of monetization such inspection inevitably occurs and is traditionally warned of in the user contract (ToS).
X. Even Most Pro-Privacy Nations Support Hash-Based Checking for Child Porn
Probably the most reasonable restriction is to forbid the service provider from sharing personally identifiable information with third parties. This is the perspective that pro-privacy EU states such as Germany and the Netherlands have pushed Google, Facebook, Microsoft, and others to adopt.
But even these adamantly pro-privacy nations have supported local efforts similar to the NCMEC/U.S. law enforcement campaign against pedophiles. That support is not inconsistent, for several reasons already partially outlined.
First, widely distributed child pornography is perhaps the easiest crime to detect. As mentioned, a company already runs lossless compression and anti-duplication scripts on your data in order to store it efficiently. It's easy to inject one more simple script into that pipeline -- one that produces no personally identifiable info.
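The piggybacking described here can be sketched as a toy content-addressed store. This is an illustrative assumption, not any vendor's real pipeline: the store already computes a fingerprint for deduplication, so the check against known images adds just a single set lookup:

```python
import hashlib

class DedupStore:
    """Toy content-addressed store: identical uploads are kept once."""

    def __init__(self, known_bad_hashes=()):
        self.blobs = {}                      # fingerprint -> bytes
        self.known_bad = set(known_bad_hashes)

    def put(self, data: bytes):
        """Store `data` and return (fingerprint, flagged).

        The fingerprint is computed anyway for deduplication, so the
        child-abuse check costs only one extra set membership test.
        """
        digest = hashlib.sha256(data).hexdigest()
        self.blobs.setdefault(digest, data)  # a duplicate stores nothing new
        return digest, digest in self.known_bad
```

Uploading the same popular meme twice stores one copy, and nothing personally identifiable leaves the fingerprint step unless a match against the known-image list occurs.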
Second, while a handful of internet commenters may disagree, the vast majority of citizens even in strongly pro-privacy states believe it should be illegal to possess and distribute pornographic images of children. There are gray areas (e.g. photos of babies that are artistically posed, but nude), but when it comes to images that depict violent sex crimes against children (as many of these child pornography images do), few would condone them.
Third, detection can be done in a way so as to protect the anonymity, political freedoms, and free speech of law abiding citizens, hence it is not inconsistent with the end goals of even the most aggressive pro-privacy efforts.
Fourth, detection is only the first of many steps that could eventually lead to prosecution. Detection leads the company that found the match to inspect the data you authorized it to store and process. Google, Microsoft, and others are aware that sometimes people are sent these kinds of images as malicious pranks or as spam.
In most, if not all, of the cases where they've alerted authorities, there were easily spotted patterns of egregious behavior. For instance, Google indicated in the recent Texas case that the user was identified as a registered sex offender and had discussed in his Gmail messages sexual fantasies about performing illegal sex acts on children. Law-abiding users generally have nothing to worry about, since the process is so well controlled and rigorous.
XI. Innocent Typically Go Free in Rare Cases of Elaborate Framing
Finally, even if some innocent individual is fingered by Google or Microsoft due to an elaborate framing effort or some rare and catastrophic failure of the companies' analysis protocols, at worst the authorities have only been given a tip to investigate the user, with supporting evidence. The user still has every right to plead their case both before any trial and in court should they be charged. In most, if not all, such cases, wrongfully accused individuals have successfully pled their innocence, even in a nation like the U.S., where courts can be rather technically ignorant and biased at times.
For example, a couple in the state of New York was implicated in child pornography in early 2011 but later found innocent. While it is true they went through quite an ordeal, investigators were able to find the true culprit in the end -- a 25-year-old college student who lived in a nearby apartment and was squatting on their Wi-Fi.
Likewise, in a wilder and well-publicized incident, NBA player Christopher "Birdman" Andersen was targeted in a scheme that not only appeared to attempt to frame him as possessing child pornography, but also to extort the child involved -- an underage Californian girl.
Free bird: Chris Andersen's case shows even elaborate framing efforts struggle to convict innocent men of child sex crimes. Convictions typically come in light of glaring, overwhelming evidence.
[Image Source: First to Flyy/NBA]
After an extensive investigation in which authorities struggled with the inconsistencies of the case, police finally traced IP logs from the communications in the case to a woman living in the Canadian province of Manitoba. The woman, who had conducted the entire outlandish scheme remotely, was arrested and charged with various crimes.
There's always the possibility of an innocent person being accused, but so far the Google- and Microsoft-driven partnership -- which has been embraced by Facebook, Twitter, and others -- has been quite effective in limiting false accusations. To date we were unable to locate any report of a case where a user was accused based on a report from Google or Microsoft in which the evidence did not sound compelling enough to warrant a police investigation.
Further, the ultimate onus in both the Texas and Pennsylvania cases will lie on the judges and juries in the courts where the defendants are charged. It is up to them to make sure criminals are found guilty and punished and that the innocent go free. Google and Microsoft's job is comparatively simple -- they just have to run one extra script on the data that their users willingly provide them and ask them to process and store.
XII. Businesses and Governments -- Not the Same Legally, Not the Same in Terms of Privacy Obligations
A bit of fear is understandable -- it's often hard for people to wrap their brains around the difference between responsible law enforcement and Orwellian spying in the digital age. But as a general rule of thumb, it's always best to consider non-digital examples and analogies when trying to assess these new scenarios with a fair and levelheaded gaze.
There is no real pre-digital analogy to the NSA spying -- a government collecting, storing, and programmatically examining every single piece of communication its citizens write, and attempting to open and inspect every item they send.
But there's a long history of business owners tipping authorities when they see one of their customers engaging in questionable behavior. After all, sensible business owners even in the pre-digital era traditionally did precisely what Microsoft did -- have customers sign contracts to limit their liability should customers use their services for illegal behavior.
As a business owner -- whether online or offline -- it is wise to respect consumer privacy as you depend on your customers' trust to stay in business. On the other hand trust is a two way street. Liability is only so limited for nonprofits and corporations alike, should they turn a blind eye to sex crimes against minors.
Companies legally endanger themselves when they purposefully ignore child pornography being trafficked by their users. [Image Source: WordPress]
The U.S. was founded on the premise that governments are subject to privileges that businesses are not, but also that those privileges come with restrictions. The government, within the confines of due process (with warrants), can force you to be at a specific location, to give it your effects, and to share information with it. But all those privileges come with great demands for respect for due process and privacy.
By contrast a private business or nonprofit cannot force you to do business with it. It cannot force you to give up your property or give it sensitive information. But it is also not traditionally legally bound to protect your privacy, particularly if you're committing a crime. In fact, if the services it's providing are involved in the commission of a crime it can be held civilly liable, if not criminally liable.
XIII. Failing to Compare Image Fingerprints Would Only Protect Predators' Privacy
Liability is often based on "obviousness".
If a person uses Google's Gmail to plan a robbery, or buys zip ties and duct tape on Amazon to kidnap someone, the business that provided the service is unlikely to be in any legal trouble, even in a civil sense, because the pending crime was not obvious. It would only be detectable by engaging in complex surveillance of every single customer, often by human eyes.
On the other hand, child pornography is a far different story. Because most cloud storage providers already hash and perform crude analysis on files to fingerprint them and eliminate duplication, Google, Microsoft, and others are already analyzing the files via scripts. Unlike nearly any other crime, identification here is typically painfully simple -- comparing the fingerprints of widely distributed illegal images with the fingerprint of the uploaded file.
Protecting privacy is necessary in the cloud; but Microsoft's effort respects privacy as it produces no identifiable results for law-abiding users.
If Microsoft and Google were to follow the policy that some suggest -- not comparing the non-identifiable fingerprints of known child pornography files to those they already create when storing uploaded images -- they would not be protecting the privacy of the majority of their customers, as these image fingerprints are not human-inspected and are not personally identifiable unless they match a known image of child sexual abuse.
Thus if Google, Microsoft, Facebook, Twitter, Apple, and others were to bow to their critics, they would only be protecting one very small group of users -- those who prey on children sexually. And any organization that made such a choice would be willfully turning a blind eye to crime, and likely liable on the basis of obviousness.
In digital form, the issue of liability when cloud services are used in crimes against children has never been truly tested, but the threat is certainly there, particularly given non-digital precedents.
XIV. What's the Cost of Complicity in Sex Crimes Against Children? Ask Penn State
The question of businesses willfully ignoring obvious signs of child sex abuse and the legality of such silence is a salient one, particularly given that the case in question occurred in the state of Pennsylvania. The state's most prominent institution -- Penn State University -- was recently marred by an ugly sex crime investigation and trial.
In this case the offline business did exactly what critics say Google should have done online -- chose to ignore easy-to-detect signs that a customer or employee was engaging in sex crimes against children. It didn't stick its nose into the business of football defensive coordinator Gerald Arthur "Jerry" Sandusky, even after employees relayed disturbing accounts. It "protected" Mr. Sandusky's "privacy".
Penn State did exactly what Microsoft's critics wish it did -- ignore sex crimes against children.
[Image Source: Deadspin]
Eventually the truth came out. Mr. Sandusky was a serial sexual predator who had raped and molested at least ten young boys he had carefully chosen based on their vulnerable emotional or financial status. He had committed dozens of sex crimes, at least 20 of which were believed to have occurred on Penn State's campus.
Reports indicate that officials are conducting ongoing investigations into allegations that Mr. Sandusky distributed pornography both online and to contacts in other states via the U.S. Postal Service. Authorities reportedly found child pornography images on Mr. Sandusky's computer, as well as photographs he had possibly taken and was intending to mail.
The onus isn't really on the U.S. Postal Service. Mr. Sandusky used the service via a sealed envelope -- the pre-digital equivalent of an encrypted email attachment. That's very different from someone who hands a business an image that's obviously child pornography -- an image that would be easy to notice both in the online and offline case without compromising customer privacy.
To make an analogy, the recent Google and Microsoft cases would be analogous to if Mr. Sandusky had handed local FedEx employees a handful of pictures depicting sex crimes against children, and then the employees without question packaged those photos in an envelope and mailed them to his requested destination.
At that point the business has crossed the line from protecting privacy to being complicit in sexual abuse of children.
XV. No Need for Emotional Arguments -- Microsoft's Actions are Ethical
Mr. Hoffman reportedly made no attempt to disguise his uploads. He brazenly posted them to his account and tried to send them to friends with similar interests. Microsoft did what it needed to do to make sure the privacy of more young children was not violated.
How well does it work out to follow the critics’ advice and turn a blind eye to easily detectable abuse? Perhaps ask Penn State.
Even as Mr. Sandusky sits in prison, having been found guilty of dozens of sex crimes against children, the university he invested so many years in -- but also committed crimes at -- is reeling from punishments for what a judge recently called "a conspiracy of silence".
Virtually every staff member involved in the cover-up was fired or forced to resign. Joe Paterno, who prior to NCAA sanctions was the winningest NCAA football coach in history, was fired. Athletic director Tim Curley was fired. University president Graham Spanier and vice president Gary Schultz resigned.
But the consequences for those involved in the stunning serial abuse may yet run far deeper than mere career loss. For Coach Paterno the shame ended quickly, as he succumbed to lung cancer months after his firing. But today his colleagues -- Messrs. Curley, Schultz, and Spanier -- are awaiting trial on numerous charges, including perjury, obstruction, endangering the welfare of children, failure to properly report suspected abuse, and conspiracy. After losing their jobs, they may soon find themselves losing their freedom, if a jury chooses to send them to the same place it sent Mr. Sandusky, whose crimes they stand accused of turning a blind eye to.
Allowing Jerry Sandusky to commit crimes at its facilities may cost Penn State hundreds of millions, if not billions, of dollars. The effort to cover up his acts ruined the careers of those involved and may leave them sentenced to prison time. [Image Source: AP]
The case was also a huge blow to the university's finances and reputation. Penn State was fined $60M USD by the NCAA (a fine earmarked, incidentally, for child abuse prevention programs such as those run by NCMEC) and lost hundreds of millions in TV revenue after being banned from the post-season for four years. At least one victim has settled for a "sizeable sum"; meanwhile, an even bigger settlement is still being negotiated with other victims who rejected the university's initial offers. These settlements, too, are expected to reach hundreds of millions of dollars.
Critics be damned, Microsoft is clearly doing the right thing here. It doesn't need to rely on emotional "think of the children" arguments: upon closer inspection its policies are respectful of its customers' privacy and ethically sound. It would be insane to abandon this effort and return to turning a blind eye to child sex crimes, as some are advocating.
Microsoft is protecting customers' privacy by using ethical tools to purge child predators from its networks and from the streets. It's caught a predator and deserves praise, not condemnation. [Image Source: Dateline]
What's the cost of granting special privileges that protect only the privacy of pedophiles? Ask Penn State, perhaps.
Source: The Smoking Gun
RE: Hashing for Signatures is a Violation of Privacy
8/8/2014 5:38:43 PM
In this case MS does have more authority than the legal system. The picture was stored on an MS server, and since the server is MS property, it is LEGAL for them to search it, process it, and do whatever the hell they want with it. It's you who are at fault for USING their server.
RE: Hashing for Signatures is a Violation of Privacy
8/11/2014 4:49:26 PM
Let me make three analogies for what is happening, because you are not getting it.
1) Government puts out a hash list of "sensitive content" and requires vendors to report all users who have content on that list. The system is ripe for abuse under any pretext. A journalist gets documents about high-level federal corruption, the hash matches, and the journalist is on a watch list.
2) You receive unsolicited email that contains "sensitive content" in your spam folder. You backup all your emails onto a service provider's system. The service provider sends notification to the government about your "sensitive content" and a warrant is provided for all of your data.
3) A Terms of Service is not the equivalent of a notarized contract. By uploading files to Microsoft's server, I granted access to store them; I did not transfer ownership of the files to Microsoft. Microsoft can try to legally enforce its TOS, but an overreaching provision will most likely be tossed out of court.
In this specific case, sensitive content is child porn. However, that does not mean it will always be the case. It merely sets precedent that this kind of system is allowed.
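For readers unfamiliar with how this kind of scanning works, the hash-list scheme being debated above can be sketched roughly as follows. This is a simplified illustration only: the hash list, function names, and use of SHA-256 here are all stand-ins. Microsoft's actual system, PhotoDNA, uses a perceptual image hash that survives resizing and re-encoding, whereas a cryptographic hash like SHA-256 matches only byte-identical files.

```python
import hashlib

# Stand-in list of fingerprints of known illegal images, as would be
# supplied by a clearinghouse such as NCMEC. (Hypothetical entry.)
KNOWN_BAD_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def fingerprint(data: bytes) -> str:
    """Compute an opaque fingerprint of an uploaded file.

    Simplified: SHA-256 only matches byte-identical files; real
    systems use perceptual hashes robust to re-encoding.
    """
    return hashlib.sha256(data).hexdigest()

def scan_upload(data: bytes) -> bool:
    """Return True if the upload matches a known fingerprint.

    Note that the provider never 'looks at' the image itself; it
    only compares the fingerprint against the list, which is the
    crux of the privacy argument on both sides.
    """
    return fingerprint(data) in KNOWN_BAD_HASHES
```

Under this design, an upload that matches nothing on the list reveals nothing about its content to the provider; only a positive match against the curated list triggers a report.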
"It seems as though my state-funded math degree has failed me. Let the lashings commence." -- DailyTech Editor-in-Chief Kristopher Kubicki