
Failing to compare anonymous image fingerprints protects only one group of customers -- child predators

A little over a week ago, a Google Inc. (GOOG) tip led to the arrest of a previously convicted child rapist in Texas who police say had been hoarding child pornography in his Gmail account and secretly videotaping children at a local Denny's Corp. (DENN) restaurant in Pasadena, Texas, where he worked.
 
I. Microsoft Helps Authorities Catch a Predator in Pennsylvania
 
Now, just days later, Microsoft Corp. (MSFT) has helped authorities catch another child predator, this time in Pennsylvania.  And as in the Google incident, the arrest has triggered outrage among some internet commenters who claim that Microsoft, Google, and Apple, Inc. (AAPL) are violating their "rights" and privacy.
 
The Smoking Gun first reported the recent arrest.  It published a copy of the affidavit:
Pennsylvania affidavit

The affidavit reveals the identity of the defendant -- Tyler James Hoffman, 20.  He was arrested on April 24 by the Pennsylvania State Police Computer Crime Task Force and Pennsylvania Internet Crimes Against Children Task Force (ICAC), after an investigation determined he had solicited and obtained multiple pornographic images of pre-pubescent children aged 7 to 13 engaged in sex acts or being sexually assaulted.
 
A search of Pennsylvania's sex offender registry gave no indication that Mr. Hoffman had previously been convicted.
 
The case has been widely reported on, even receiving coverage from the BBC News.
 
II. Police: Suspect Sought Out Images of Girls Aged 7 to 13
 
Pennsylvania State Police Trooper Christopher Hill confirmed the arrest and reported that a Microsoft tip triggered it.  In a sworn affidavit police investigators say they were alerted to the man's behavior after a Microsoft script that scans uploaded files for child pornography detected not one, but two uploads of known child pornography.

Penn State Police
Pennsylvania State Police picked up the accused child predator in late April. [Image Source: CCJ Magazine]

The first image involved a girl between the ages of 7 and 10 performing a sex act.  Police officers indicated to Philly.com that Mr. Hoffman also obtained an image of a slightly older underage female -- between the ages of 10 and 13 -- having intercourse.
 

Microsoft detected Mr. Hoffman uploading these images to SkyDrive -- its cloud storage service, which was rebranded as "OneDrive" in February.

OneDrive Microsoft

After being questioned, Mr. Hoffman admitted to officers, according to court documents, that he obtained the images via contacts on Kik, a cross-platform messenger service.  After uploading the pictures to Microsoft's cloud storage service, he then tried to send the images via his Microsoft live.com email account to another user, further establishing his intent to possess and redistribute the illegal images.

Tyler Hoffman
[Image Source: Penn. State Police (left) / Facebook (right)]

According to police, he was arrested on July 31 at work and, after questioning, admitted that he had been "trading and receiving images of child pornography on his mobile cellular device".
 
III. The Boy Who Cried Wolf
 
The report has yet again provoked explosions of outrage from some internet commenters, such as those on a Slashdot post about the arrest.  "Jane Q. Public" writes:

It isn't so much a matter of "Look! they did something great!" (and they did)... it's more a matter of: look at the shitty (sic) privacy intrusion they've committed on hundreds of thousands, if not millions, of people, in order to accomplish that one great thing.

Another user, "Ol Olsoc", seems to liken it to government officials raping someone's wife, commenting:
 
I worked with a guy who once said. "I don't care if they come into my bedroom and f*ck my wife, as long as they keep the country secure". He was willing to give up any semblance of freedom for his "security".

Microsoft predator

Slashdot users were among those suggesting that child predators be afforded special privacy protections that help hide their crimes.

Another commenter, "thieh", says they're fine with Microsoft and Google reading their emails for profit (advertising), but that performing a much less intrusive scan to catch child predators crosses the line.  According to this commenter, Microsoft should have warned the defendant that molesting children or sharing illegal pictures of them might result in jail time.  The commenter writes:

The problem usually comes down to that "personally relevant product features, such as customized search results, tailored advertising, and spam and malware detection" didn't include "days in court" nor "jail time" as their catalog of "personally relevant product features".

But such comments ignore the realities of cloud storage and email services.
 
IV. Critics Appear Ignorant of Reality: Hashing is Part of the Normal Storage Process
 
First, virtually all of the top online firms -- Apple, Google, and Microsoft among them -- had hashing projects that predated their anti-child-porn efforts.  These hashing technologies were originally adopted to improve power efficiency, storage, and access times.  Many images on the internet (think popular memes) are regularly downloaded, reuploaded, and sent via email by users.
 
By recognizing images already stored on their servers, top internet firms enjoy cost savings and can provide better (faster) service.  Private images are typically new, so the temporary hash created at upload provides little identifiable insight that could be used to "target" users outside of very specific scenarios, despite the current rash of uninformed fear mongering.
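To make the storage-side mechanics concrete, here is a minimal sketch of hash-based deduplication, assuming a toy in-memory store (blob_store, user_files, and upload are illustrative names for this sketch, not any vendor's actual code):

    import hashlib

    # Toy in-memory "blob store": content hash -> file bytes.
    blob_store = {}
    # Per-user file table: (user, filename) -> content hash.
    user_files = {}

    def upload(user: str, filename: str, data: bytes) -> str:
        """Store each unique file only once, however many users upload it."""
        digest = hashlib.sha256(data).hexdigest()
        if digest not in blob_store:
            blob_store[digest] = data          # first copy: actually store the bytes
        user_files[(user, filename)] = digest  # later copies: just record a pointer
        return digest

    # Two users uploading the same meme consume storage for only one copy.
    upload("alice", "meme.jpg", b"fake jpeg bytes")
    upload("bob", "funny.jpg", b"fake jpeg bytes")
    assert len(blob_store) == 1

No human ever needs to look at anything in this process; the hash is simply a storage key.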

Microsoft data center
Digital fingerprinting is primarily used to improve the efficiency of cloud storage and has little use to law enforcement for most crimes, as it does not provide personally identifiable data.
[Image Source: Microsoft]

The only current scenario in which images are checked for illegal activity is in the case of child pornography.
 
Many of the limits on the intrusiveness of these kinds of checks stem from inherent technical details.  Unlike techniques employed by the U.S. National Security Agency (NSA) and other sophisticated police entities, hash checks are unable to identify wholly new images, as they don't use any shape-based recognition algorithm such as facial recognition.  This means that to flag a user for individual inspection, an image must match a database of known offensive images.
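A toy illustration of that limitation, using a plain cryptographic hash and a fabricated digest standing in for an entry in a known-image database (production systems use robust perceptual signatures such as PhotoDNA, described below, rather than SHA-256):

    import hashlib

    # Hypothetical digests of known illegal images (a stand-in for the
    # NCMEC-derived list; this digest is made up for illustration).
    KNOWN_DIGESTS = {"3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b"}

    def is_known(data: bytes) -> bool:
        """Flag an upload only if it exactly matches a known image."""
        return hashlib.sha256(data).hexdigest() in KNOWN_DIGESTS

    # A wholly new image -- or even a one-pixel edit of a known one -- yields
    # a completely different digest, so nothing novel is ever "recognized".
    print(is_known(b"a brand new photo"))  # False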
 
V. Dangers are Mitigated by Transparency
 
There are potential applications in a dystopian future -- such as a police state "detecting" and arresting those who distribute known anti-government imagery, or government agents using the technology to hunt down those who leak images showing illegal government behavior.  But Google and Microsoft actively oppose such uses.
 
In other words, there is valid cause to be watchful and vigilant to avoid abuse.  But there's a difference between vigilance on the one hand and baseless paranoia and false accusations on the other.
 
Emma Carr, a director of the non-profit privacy advocacy group Big Brother Watch, appears to be a reasonable voice.  While she warns of dangers (such as the aforementioned political suppression), she implies that in the current context the efforts appear appropriate -- something the most vitriolic critics appear to be ignoring.
 
She tells BBC News:

Microsoft must do all that it can to inform users about what proactive action it takes to monitor and analyze messages for illegal content, including details of what sorts of illegal activity may be targeted.

It is also important that all companies who monitor messages in this way are very clear about what procedures and safeguards are in place to ensure that people are not wrongly criminalized, for instance, when potentially illegal content is shared but has been done so legitimately in the context of reporting or research.

Microsoft new sign
While some internet critics aren't placated, privacy advocates say Microsoft is doing the right thing in explaining its policies and protecting the privacy of the average consumer. [Image Source: Bloomberg]

Microsoft and Google fortunately appear to be trending towards greater transparency about this hash-based detection effort.  Microsoft, like Google, already explicitly warns users that their data is subject to automatic detection algorithms aimed at combating child pornography:

Microsoft [reserves the right to utilize] automated technologies to detect child pornography or abusive behaviour that might harm the system, our customers, or others.

Following the two recent arrests, Microsoft and Google have come forward to shed more light on the technology and to explain why they don't believe it poses any significant threat to customer privacy in its current form.
 
VI. Limited Tools, Joint Effort Protects Privacy and Children Alike
 
Compared to richer analysis techniques like facial recognition and examination by human analysts (whose brains are equipped with advanced facial and shape recognition algorithms), this is a relatively crude tool that requires a very specific and obvious focus.
 
In Google and Microsoft's case, the supporting data set comes courtesy of the National Center for Missing and Exploited Children (NCMEC).  NCMEC, an anti-child-abuse group that runs the "Missing Kids" campaign, received government permission in 2011 to compile a database of signatures (hashes) of known child pornography images.

Missing children logo


When child predators are caught (typically after engaging in real-world sex crimes or attempts at such crimes), search warrants are often obtained to examine their devices.  Such searches face few ethical objections if the defendant left the devices unprotected by a password or offers investigators access.  Based on these kinds of inspections, law enforcement officials in the U.S. daily acquire large amounts of child pornography imagery, which is then passed off to NCMEC, which has exclusive permission to legally keep signatures of these images.
 
VII. In-Depth: How Microsoft Technology Preserves Anonymity
 
Microsoft's technology is called "PhotoDNA".  Jointly developed by Microsoft and digital forensics firm NetClean, this technology aims to protect the privacy of customers while preventing the transmission of images of child sexual abuse.  To do that it uses familiar technologies to protect anonymity, while preserving the ability to search for a small, targeted group of images.
 
Microsoft's Digital Crimes Unit chief, Mark Lamb, comments on the recent cases:

Child pornography violates the law as well as our terms of service, which makes clear that we use automated technologies to detect abusive behaviour that may harm our customers or other.  In 2009, we helped develop PhotoDNA, a technology to disrupt the spread of exploitative images of children, which we report to the National Center for Missing and Exploited Children as required by law.

PhotoDNA logo

To efficiently analyze a massive amount of information, Microsoft's algorithm first converts a temporary in-memory copy of the upload to grayscale or black and white.  It then breaks the image into a grid of cells and uses histograms of the color intensity (brightness) of the pixels in each cell to assign a "base" value to each cell.  Together the cells make up the image's "DNA" -- a unique signature.

PhotoDNA
Microsoft's PhotoDNA technology does not compromise customer privacy (see images). The only customers who suffer inspection are those who have an image that appears extremely likely to be child pornography.
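As a rough sketch of that pipeline in Python (loosely modeled on the published description; the grid size, cell size, and use of a per-cell mean instead of a full histogram are simplifications for illustration, not Microsoft's actual parameters):

    from PIL import Image  # pip install pillow

    GRID = 6   # 6x6 grid of cells -- illustrative, not PhotoDNA's real value
    CELL = 16  # each cell is 16x16 pixels after normalization

    def signature(path: str) -> list[int]:
        """Grayscale the image, normalize its size, split it into a grid of
        cells, and record a "base" intensity value for each cell."""
        img = Image.open(path).convert("L").resize((GRID * CELL, GRID * CELL))
        px = img.load()
        sig = []
        for gy in range(GRID):
            for gx in range(GRID):
                total = sum(px[x, y]
                            for y in range(gy * CELL, (gy + 1) * CELL)
                            for x in range(gx * CELL, (gx + 1) * CELL))
                sig.append(total // (CELL * CELL))  # the cell's "base" value
        return sig  # together the cells form the image's "DNA"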

This approach has a few advantages.  
 
First, as mentioned, it does not allow for more intrusive forms of analysis, like facial recognition, that pose thornier privacy risks.  Second, it's faster and cheaper than more complicated algorithms.  Third, by using histograms of per-cell color intensity, Microsoft can potentially detect images even if pedophiles make crude attempts to disguise them by applying blur filters to part of the image or by lightening/darkening it (as the relative values would all shift by an offset, or possibly remain unchanged in a blur, depending on the blur type and cell size).
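That robustness comes from comparing signatures by distance rather than exact equality.  A sketch continuing the example above, with an arbitrary threshold (the real matching threshold is not public):

    def distance(a: list[int], b: list[int]) -> int:
        """Sum of per-cell differences between two signatures."""
        return sum(abs(x - y) for x, y in zip(a, b))

    THRESHOLD = 250  # fabricated for illustration; tuning governs false matches

    def matches_known(sig: list[int], known: list[list[int]]) -> bool:
        # A mild blur or brightness shift moves every cell value only a little
        # (or by a near-constant offset), so the distance to the known
        # signature stays small while unrelated images land far away.
        return any(distance(sig, k) <= THRESHOLD for k in known)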

Microsoft Digital Crimes unit

Google appears to be using PhotoDNA for its own efforts, although it too had worked on similar technology in-house.  Facebook, Inc. (FB) and Twitter also use PhotoDNA and the NCMEC database to watch for child pornography.  Apple, a Microsoft licensee, may use it as well but did not respond to requests for comment about its involvement.
 
While it's tempting to jump to conclusions and field slippery-slope arguments, it's important to view these efforts in context.  First, Google, Microsoft, Facebook, Apple, and others are not automatically alerting authorities when a user receives an offensive image.  The general policy at all of these firms appears to be to scan uploaded images using the hashing technology, then inspect the user's account contents only if the scripts raise an alert.
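In code terms, that policy is a two-stage gate.  This sketch reuses signature() and matches_known() from the examples above; KNOWN_SIGNATURES and flag_for_human_review are hypothetical stand-ins for the real database and alerting path:

    KNOWN_SIGNATURES: list[list[int]] = []  # populated from NCMEC data in reality

    def flag_for_human_review(user: str, path: str) -> None:
        print(f"review queue: {user}: {path}")  # stub; the real path alerts staff

    def handle_upload(user: str, path: str) -> None:
        sig = signature(path)                     # stage 1: automatic, anonymous scan
        if matches_known(sig, KNOWN_SIGNATURES):  # only a database match escalates
            flag_for_human_review(user, path)     # stage 2: human inspection

No human sees anything unless stage one produces a match against the known-image database.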
 
While this industry-wide effort may be viewed as a ubiquitous form of limited surveillance, it is a voluntary one, as the companies clearly warn against such uploads in their contracts -- legally binding documents.
 
VIII. Microsoft Spending its Own Money to Fight Unconstitutional Mass Spying
 
Thus the initiation of the process is very different from NSA spying and other ubiquitous surveillance campaigns.  When the NSA collects data from Americans or American companies, it is in effect robbing them, as the targets never authorized the agency to collect or store their personal text or images.  In government terms, privacy is supposed to mean "don't look at my stuff unless I commit a crime and you have a warrant".

Government spying
Citizens never gave the government permission to spy on them and abandon due process.
[Image Source: Nation of Change]

Microsoft, Google, and Facebook have all filed suits against the government seeking to block overreaching government data requests.  In fact, the condemnation of Microsoft is particularly ironic given that it is perhaps the corporation fighting hardest in court, spending its own money to protect users from NSA spying.
 
This contrasts sharply with other companies, like Oracle Corp. (ORCL) and Amazon.com, Inc. (AMZN), that not only condone government spying on U.S. citizens but pay politicians lobbying money in favor of it.  Microsoft's corporate enemies profit off violating users' privacy on a massive scale for the government; Microsoft is spending its hard-earned profits to try to protect that privacy.


And yet still some condemn it for cooperating with one extremely reasonable and limited law enforcement effort.
 
Surely Microsoft managers and engineers must be throwing their hands up in frustration at the ingratitude.  If only customers could see the difference, they must think -- if only they could see how this campaign has no impact on the privacy of non-child predators.
 
Microsoft -- and unbiased, technically aware observers -- will perhaps first recognize a startling dissimilarity between the NSA's data-dominance campaigns and Microsoft et al.'s PhotoDNA.  PhotoDNA does its best to protect anonymity and privacy.  In that regard it's closer to the ACLU than the NSA.
 
Unsurprisingly, like many criminal elements, the NSA has already been caught regularly in its own web of lies.  Not only do its agents break the law; its leaders have lied to Congress, seemingly with no remorse.  It has made every effort to fight transparency.

Yes we Scan
The NSA is anti-transparency, its agents break the law (admittedly) thousands of times a year, and its leaders lie to Congress.  Microsoft's anti-predator program is pro-transparency, obeys U.S. laws, and is forthright to the citizens it protects. [Image Source: tumblr.com]

By contrast, Microsoft has made every effort to promote transparency and to clearly publicize the technology behind its child protection efforts to anyone who cares to learn about it.
 
Another key difference?  Microsoft's program clearly is catching criminals in a lawful way that respects due process.  By contrast the NSA can't seem to provide solid evidence that its mass spying has succeeded in catching a single criminal, yet it did admit to Congress that its agents were committing thousands of "accidental" crimes against Americans every year.  If you can't see the difference there, then you need to reread this paragraph.
 
IX. Customers Told Google, Microsoft, et al. to Store and Process Their Data
 
Where the NSA takes without permission, internet firms participating in the PhotoDNA effort are only using what users give them.  Users explicitly authorize Microsoft, Google, and other service providers to access, process, and store their images.  Without such permission, no cloud storage or email services would be possible.
 
There's no such thing as a completely secret email service in the modern context: since the 1980s, messages and attachments have been shuffled to long-term storage that allows the user to read them at their convenience.  So to some degree Google, Microsoft, Facebook, and others have to "scan" your images, run them through scripts, and store them in order to fulfill your requests.
 
Users may be unaware of the technical realities (likely many are).  But when you sign up for Dropbox, OneDrive, Gmail, Outlook.com, iCloud, Facebook, or any other major service, you are authorizing the company to store and process your images.  Its behavior is generally limited by the contract you sign with it -- your terms of service agreement.
 
Google privacy
While some have confused image fingerprinting with more advanced surveillance tactics, image fingerprinting is very different, as it's done on the fly and is anonymous.  In a legal context it's only been successful in combating one crime -- sexual abuse of children. [Image Source: CSherpa]
 
But that document is a two-way street.  It also binds the user to not engage in certain practices deemed unacceptable (i.e. uploading child pornography).  
 
If a user accepts the contract and then breaches it, they've committed a civil wrong against the tech company, which has every right to complain to authorities if it decides that's the best course of action.
 
Second, it should be said that Microsoft, Google, and others offer most of their services to individual users free of charge.  Generally, close inspection and scrutiny of your data by these companies should be frowned upon, but in the name of monetization such inspection inevitably occurs, and it is traditionally warned of in the user contract (ToS).
 
X. Even Most Pro-Privacy Nations Support Hash-Based Checking for Child Porn
 
Probably the most reasonable restriction is to forbid the service provider from sharing personally identifiable information with third parties.  This is the perspective that pro-privacy EU states such as Germany and the Netherlands have pushed Google, Facebook, Microsoft, and others to adopt. 
 
But even these adamantly pro-privacy nations have supported local efforts similar to the NCMEC/U.S. law enforcement campaign against pedophiles.  That support is not inconsistent, for several reasons already partially outlined.
 
First, widely distributed child pornography is perhaps the easiest crime to detect.  As mentioned, a company already runs lossless compression and anti-duplication scripts on your data to store it efficiently.  It's easy to inject one more simple script into that pipeline.  That script produces no personally identifiable info.
 
Second, while a handful of internet commenters may disagree, the vast majority of citizens even in strongly pro-privacy states believe it should be illegal to possess and distribute pornographic images of children.  There are gray areas (e.g. photos of babies that are artistically posed, but nude), but when it comes to images that depict violent sex crimes against children (as many of these child pornography images do), few would condone them.
 
Third, detection can be done in a way that protects the anonymity, political freedoms, and free speech of law-abiding citizens; hence it is not inconsistent with the end goals of even the most aggressive pro-privacy efforts.
 
Fourth, detection is only the first of many steps that could eventually lead to prosecution.  Detection leads the company that found the match to inspect the data you authorized it to store and process.  Google, Microsoft, and others are aware that sometimes people are sent these kinds of images as malicious pranks or as spam.
 
In most, if not all, of the cases where they've alerted authorities, it's been because there were easily spotted patterns of egregious behavior.  For instance, Google indicated in the recent Texas case that the user was identified as a registered sex offender and had discussed in his Gmail messages sexual fantasies about performing illegal sex acts on children.  Law-abiding users generally have nothing to worry about, since the process is so well controlled and rigorous.
 
XI. The Innocent Typically Go Free in Rare Cases of Elaborate Framing
 
Finally, even if some innocent individual is fingered by Google or Microsoft due to an elaborate framing effort or some rare and catastrophic failure of the companies' analysis protocols, at worst the authorities have only been given a tip to investigate the user, with supporting evidence.  The user still has every right to plead their case both before any trial and in court should they be charged.  In most, if not all, such cases, wrongfully accused individuals have successfully pleaded their innocence, even in a nation like the U.S., where courts can at times be rather technically ignorant and biased.
 
For example, a couple in the state of New York was implicated in child pornography in early 2011 but later found innocent.  While it is true they went through quite an ordeal, investigators were able to find the true culprit in the end -- a 25-year-old college student who lived in a nearby apartment and was squatting on their Wi-Fi.
 
Likewise, in a wilder and more publicized incident, NBA player Christopher "Birdman" Andersen was targeted in a scheme that appeared to attempt not only to frame him as possessing child pornography, but also to extort the child involved -- an underage Californian girl.

Chris Andersen
Free bird: Chris Andersen's case shows that even elaborate framing efforts rarely succeed in getting innocent men convicted of child sex crimes.  Convictions typically come in light of glaring, overwhelming evidence.
[Image Source: First to Flyy/NBA]

After an extensive investigation in which authorities struggled with inconsistencies of the case, police finally traced IP logs from the communications in the case to a woman living in the Canadian province of Manitoba.  The woman, who had conducted the entire outlandish scheme remotely, was arrested and charged with various crimes.
 
There's always the possibility of an innocent person being accused, but so far the Google- and Microsoft-driven partnership -- which has been embraced by Facebook, Twitter, and others -- has been quite effective in limiting false accusations.  To date we have been unable to locate any report of a case where someone was accused based on a tip from Google or Microsoft in which the evidence did not sound compelling enough to warrant a police investigation.
 
Further, the ultimate onus in both the Texas and Pennsylvania cases will lie with the judge and jury in the courts where the defendants are charged.  It is up to them to make sure criminals are found guilty and punished and that the innocent go free.  Google and Microsoft's job is comparatively simple -- they just have to run one extra script on the data their users willingly provide them and ask them to process/store.
 
XII. Businesses and Governments -- Not the Same Legally, Not the Same in Terms of Privacy Obligations
 
A bit of fear is understandable -- it's often hard for people to wrap their brains around the difference between responsible law enforcement and Orwellian spying in the digital age.  But as a general rule of thumb, it's always best to consider non-digital examples/analogies when trying to assess these new scenarios with a fair and levelheaded gaze.
 
There is no real pre-digital analogy to the NSA spying -- a government collecting, storing, and programmatically examining every single piece of communication its citizens write, and attempting to open and inspect every item they send.
 
But there's a long history of business owners tipping authorities when they see one of their customers engaging in questionable behavior.  After all, sensible business owners even in the pre-digital era traditionally did precisely what Microsoft did -- have customers sign contracts to limit their liability should customers use their services for illegal behavior.
 
As a business owner -- whether online or offline -- it is wise to respect consumer privacy, as you depend on your customers' trust to stay in business.  On the other hand, trust is a two-way street.  Liability protection only goes so far for nonprofits and corporations alike should they turn a blind eye to sex crimes against minors.

sex offenders
Companies legally endanger themselves when they purposefully ignore child pornography being trafficked by their users. [Image Source: WordPress]

The U.S. was founded on the premise that governments were subject to privileges that businesses were not, but also that those privileges came with restrictions.  The government, within the confines of due process (with warrants), can force you to be at a specific location, to give it your effects, and to share information with it.  But all those privileges come with great demands for respect for due process and privacy.
 
By contrast a private business or nonprofit cannot force you to do business with it.  It cannot force you to give up your property or give it sensitive information.  But it is also not traditionally legally bound to protect your privacy, particularly if you're committing a crime.  In fact, if the services it's providing are involved in the commission of a crime it can be held civilly liable, if not criminally liable.
 
XIII. Failing to Compare Image Fingerprints Would Only Protect Predators' Privacy
 
Liability is often based on "obviousness".
 
If a person uses Google's Gmail to plan a robbery, or buys zip ties and duct tape on Amazon to kidnap someone, the business that provided the service is unlikely to be in any legal trouble, even in a civil sense, because the pending crime was not obvious.  It would only be detectable by engaging in complex surveillance of every single customer, often by human eyes.
 
On the other hand, in the case of child pornography it's a far different story.  Because most cloud storage providers already hash files and perform crude analysis to fingerprint them and eliminate duplication, Google, Microsoft, and others are clearly already analyzing files in scripts.  Unlike with nearly any other crime, identification here is typically painfully simple -- comparing the fingerprints of widely distributed illegal images with that of the file involved.
Cloud privacy
Protecting privacy is necessary in the cloud, but Microsoft's effort respects privacy, as it produces no identifiable results for law-abiding users.

If Microsoft and Google were to follow the policy some suggest -- not comparing the non-identifiable fingerprints of known illegal child porn files to the fingerprints they create when storing uploaded images -- they would not be protecting the privacy of the majority of their customers, as these image fingerprints are never human-inspected and are not personally identifiable unless they match a known image of child sexual abuse.
 
Thus if Google, Microsoft, Facebook, Twitter, Apple, and others were to bow to their critics, they would only be protecting one very small group of users -- those who prey on children sexually.  And any organization that made such a choice would be willfully turning a blind eye to crime, and likely liable on the basis of obviousness.
 
In digital form, the issue of liability when it comes to cloud services used in crimes against children has never been truly tested, but the threat is certainly there, particularly given non-digital precedents.
 
XIV. What's the Cost of Complicity in Sex Crimes Against Children? Ask Penn State
 
The question of businesses willfully ignoring obvious signs of child sex abuse and the legality of such silence is a salient one, particularly given that the case in question occurred in the state of Pennsylvania.  The state's most prominent institution -- Penn State University -- was recently marred by an ugly sex crime investigation and trial.
 
In this case the offline business did exactly what critics say Google should have done online -- chose to ignore easy-to-detect signs that a customer or employee was engaging in sex crimes against children.  It didn't stick its nose into football defensive coordinator Gerald Arthur "Jerry" Sandusky's business, even after employees relayed disturbing accounts.  It "protected" Mr. Sandusky's "privacy".

Pedobear Penn State
Penn State did exactly what Microsoft's critics wish Microsoft would do -- ignore signs of sex crimes against children.
[Image Source: Deadspin]

Eventually the truth came out.  Mr. Sandusky was a serial sexual predator who had raped and molested at least ten young boys he had carefully chosen based on their vulnerable emotional or financial status.  He had committed dozens of sex crimes, at least 20 of which were believed to have occurred on Penn State's campus.
 
Reports indicate that officials are conducting ongoing investigations into allegations that Mr. Sandusky distributed pornography both online and to contacts in other states via the U.S. Postal Service.  Authorities reportedly found child pornography images on Mr. Sandusky's computer and photographs he had possibly taken and was intending to mail.
 
The onus isn't really on the U.S. Postal Service.  Mr. Sandusky used the service via a sealed envelope -- the pre-digital equivalent of an encrypted email attachment.  That's very different from someone who hands a business an image that's obviously child pornography -- an image that would be easy to notice both in the online and offline case without compromising customer privacy.
 
To make an analogy, the recent Google and Microsoft cases would be akin to Mr. Sandusky handing local FedEx employees a handful of pictures depicting sex crimes against children, and the employees packaging those photos in an envelope without question and mailing them to his requested destination.
 
At that point the business has crossed the line from protecting privacy to being complicit in sexual abuse of children.
 
XV. No Need for Emotional Arguments -- Microsoft's Actions are Ethical
 
Mr. Hoffman reportedly made no attempt to disguise his uploads.  He brazenly posted them to his account and tried to send them to friends with similar interests.  Microsoft did what it needed to do to make sure the privacy of more young children was not violated.
 
How well does it work out to follow the critics’ advice and turn a blind eye to easily detectable abuse?  Perhaps ask Penn State.
 
Even as Mr. Sandusky sits in prison, having been found guilty of dozens of sex crimes against children, the university he invested so many years in -- but also committed crimes at -- is reeling from punishments for what a judge recently called "a conspiracy of silence".
 
Virtually every staff member involved in the cover-up was fired or forced to resign.  Joe Paterno, who prior to NCAA sanctions was the winningest NCAA football coach in history, was fired.  Athletic director Tim Curley was fired.  University president Graham Spanier and vice president Gary Schultz resigned.
 
But the consequences for those involved in the stunning serial abuse may yet run far deeper than mere career loss.  For Coach Paterno the shame ended quickly: he succumbed to lung cancer months after his firing.  But today his colleagues -- Messrs. Curley, Schultz, and Spanier -- are awaiting trial on numerous charges, including perjury, obstruction, endangering the welfare of children, failure to properly report suspected abuse, and conspiracy.  After losing their jobs they may soon find themselves losing their freedom, if a jury chooses to send them to the same place it sent Mr. Sandusky, whose crimes they stand accused of turning a blind eye to.

Jerry Sandusky
Allowing Jerry Sandusky to commit crimes at its facilities will potentially cost Penn State hundreds of millions, if not billions of dollars.  The effort to cover his acts up ruined the careers of those involved and may leave them sentenced to time in prison. [Image Source: AP]

The case was also a huge blow to the university's finances and reputation.  It was fined $60M USD by the NCAA (a fine which, incidentally, was set aside for child abuse prevention programs like those run by NCMEC) and lost hundreds of millions in TV revenue from being banned from the post-season for four years.  At least one victim has settled for a "sizeable sum"; meanwhile, an even bigger settlement is still being negotiated with other victims who rejected the university's initial offers.  These settlements are expected to reach hundreds of millions as well.
 
Critics be damned, Microsoft is clearly doing the right thing here.  It doesn't need to rely on emotional "think of the children" arguments, because upon closer inspection its policies are perfectly respectful of its customers' privacy and are ethically sound.  It would be insane to abandon this effort and return to casting a blind eye to child sex crimes, as some are advocating.

To catch a Predator
Microsoft is protecting customers' privacy by using ethical tools to purge child predators from its networks and from the streets.  It's caught a predator and deserves praise, not condemnation. [Image Source: Dateline]

What's the cost of giving special privileges that only protect the privacy of pedophiles?  Ask Penn State, perhaps.

Sources: The Smoking Gun, Philly.com, BBC News, via Slashdot



Comments

Liability
By Samus on 8/8/2014 3:56:23 AM , Rating: 2
quote:
Microsoft should have warned the defendant that molesting children or sharing illegal pictures of them might result in jail time.


That's a ridiculous slashdot comment. Microsoft is in no position to take legal authority to enforce state or federal law by threatening jail time. This person clearly has no idea how the legal system works. In fact, anybody condemning Microsoft or Google for tipping off authorities has no idea how the legal system works.

Possessing Child Porn is like possessing an unregistered firearm used in a crime. Everyone who comes into contact with that weapon is put at risk. Microsoft is in the same boat here. If there is illegal content stored on their servers, they are at risk. This is how Kazaa and Kimdotcom got in trouble.

The interesting thing about Microsoft and Google TOS is they don't seem to care about most illegal content (even I have some "illegal" stuff in Onedrive such as MP3's) but only the content that would implicate them as an accessory to a crime.




RE: Liability
By Omega215D on 8/8/2014 6:02:05 AM , Rating: 2
Slashdot has become less about real tech discussion and instead a geek status quo circle jerk. Kinda like reddit. In some cases many of the posters sound like "liberals" which are far from liberal, unless you go along with their ideology.


RE: Liability
By Reclaimer77 on 8/8/14, Rating: -1
RE: Liability
By karimtemple on 8/8/2014 10:15:45 AM , Rating: 2
This has nothing to do with the "scroogled" ad campaign lmfao. PhotoDNA doesn't check your e-mails for ways to advertise to you. It just maintains the anonymity of your photos while making sure they don't look like known child porn.


RE: Liability
By Reclaimer77 on 8/8/14, Rating: -1
RE: Liability
By ACE76 on 8/8/14, Rating: -1
RE: Liability
By JasonMick (blog) on 8/8/2014 3:46:44 PM , Rating: 5
quote:
Your missing the point..In the end Microsoft is obtaining evidence that the police would need a search warrant for
UMM you are in fact missing the point. It's "obtaining" your illegal kiddie porn because you gave it to Microsoft to process and store.

If I offer you a free locker to store stuff but I say "wait as an agreement you have to give me all the stuff you want me to put in it ... I'll keep it safe, but I got to check it's not illegal."

And you say "Oh okay."

And we shake on it and sign a contract....

.... then I have every right to enforce that contract.

Let's say you hand me your kilo of crack you're planning on smoking next month after you use up this month's supply and then I turn you in to the cops.

I did not "obtain" your crack in the sense that law enforcement "obtains" illegal items. You gave me your crack! I'm not in the wrong!

I'm protecting myself from liability AND I even told you I was going to do this.

This is precisely the situation Microsoft is in. It's a shame you can't see it.

Microsoft didn't get a warrant, invade Google's network, and find the offensive image. It simply looked at the image this foolish criminal GAVE IT.

You really think a business should be unable to turn you in if you give it obvious pictures depicting sexual abuse of children??

Oh brother...


RE: Liability
By NellyFromMA on 8/8/2014 10:15:49 AM , Rating: 4
You're turning this into something it isn't. That campaign doesn't regard the topic at hand. The issues it does target are valid.

Just because MS highlights valid privacy concerns doesn't prohibit it from taking action in cases like this, which are entirely different as elaborated on endlessly by the editor. Nor does Google's similar action yesterday give it a free pass to commit its actual violations of personal privacy.

Let's not turn this into a who's worst troll-a-thon, let's learn more about the particular practice cited in the editorial and then we can all resume our days at what is more or less not news in the first place.


RE: Liability
By Labotomizer on 8/9/2014 11:48:42 AM , Rating: 1
It's funny that he's trying to turn it around. It was all about how awesome Google was for turning the guy in (which they were). Now because Microsoft did the same thing the backlash is karma for the Scroogled campaign? Why wouldn't you defend Microsoft taking the same actions here? I think most sane people believe that both Microsoft and Google did the right thing in these situations. I'm glad both companies did. Neither has anything to do with privacy.

Oh well, if there was ever even the slightest question R77 tried to be unbiased between Google and Microsoft this pretty much ends that discussion. Crazy. Not even I expected him to turn this into MS vs Google.


RE: Liability
By Reclaimer77 on 8/11/2014 11:22:08 AM , Rating: 1
The OP was wondering why someone, a so-called journalist, could post such a ridiculous thing. I simply gave a possible motivation.

You guys have horrible reading comprehension. I didn't "turn" this into anything.

People are gullible and stupid. They literally believe that Google scans your mail and "invades" your privacy, but Microsoft doesn't. Microsoft even said so in an ad campaign. They couldn't say it it wasn't true, right? That's what the average idiot thinks.

If you read between the lines, even though Google and Microsoft did the same thing (rightly), you see a completely different approach to how these are being covered by the bloggers and "Internet advocacy" groups.

Honestly it's so sad that you took my post and tried to turn it into some goddamn nonsense with the logic and IQ of a 5 year old. But what did I expect from Daily Tech?


RE: Liability
By Labotomizer on 8/12/2014 9:19:25 AM , Rating: 2
Oh yes, you're the victim here... Give me a break. You turned this into karma from running the Scroogled campaign. It's as simple as that. You can try to say you were commenting on the difference, or the lack of it, between Google and Microsoft but that's not what you said.

Both were proper actions. But it would absolutely KILL you to say something positive here so instead you try to spin this in a negative light. How biased can you possibly be? And how can you be so blind as to not see it?


RE: Liability
By p05esto on 8/8/2014 11:53:55 AM , Rating: 3
What in the world does this have to do with the ScrooGooled campaign???? Answer: nothing!!


RE: Liability
By kleinma on 8/8/2014 2:39:34 PM , Rating: 2
Sure take one thing, compare it to something totally different, call them the same to make a ridiculous point. Although that is your specialty.


RE: Liability
By EricMartello on 8/8/14, Rating: -1
RE: Liability
By inighthawki on 8/8/2014 12:56:10 PM , Rating: 3
You do not get a reasonable expectation of privacy on a third party's server, which they provide to you free of charge in exchange for agreeing to their terms of service. You agree, by using that service, to allow them to scan your content for the purposes of advertising, et al. If you want privacy for your content, you should find a service that does not do this. period. Nobody is entitled to free email accounts by Google, Microsoft, or anyone else.


RE: Liability
By EricMartello on 8/8/14, Rating: 0
RE: Liability
By chripuck on 8/11/2014 1:56:20 PM , Rating: 2
If you store your belongings at my house I not only have the right to look at them, I have the right to turn you in if you store a kilo of coke there. Your property has zero right to privacy in my home, especially if we agree beforehand that I can look at all of it.

Nobody has the right, without a warrant, to look at the kiddie porn you keep on your personal computer. But cloud storage is not your computer, it's someone else's, period. And when you sign up for it you explicitly give them the right to look at it, period.

Don't like that, get another service.


RE: Liability
By Rukkian on 8/8/2014 1:00:15 PM , Rating: 2
I almost don't know where to begin.

At the end, are you truly saying that Being Gay is equivalent to being a pedophile? If you are, I feel very sorry for you and anybody who knows you. One is a choice of something to do in private with another person who chose the same thing, while the other is exploitation of a person that has no choice in the matter.

You also suggest that having and distributing child porn should not be illegal, which imho is just wrong. I don't agree with the op saying it is the same thing as owning a gun, not even close, but to say it should not be against the law is idiotic.

As for scanning private data - First thing is they do this to keep from having the same image saved 1 million times (which would happen with some photos, videos, etc). It is a process called deduplication and is the only way storage could be given for the price (free in some cases) that is available. Second thing is that it is not a private service, you authorize them to scan the items when you sign up for the service. In some cases, they use the scanned info to make money, other cases (like this one) it is simply a way to cut storage space(in some cases 90+%). If you want privacy, run your own server, and fully secure it, and don't send your data out to a service that you agree will scan it.


RE: Liability
By Samus on 8/8/2014 1:55:17 PM , Rating: 2
I didn't say its the same thing as "owning a gun" I said it's like coming into contact with one that is illegal (unregistered - every state I've lived in required you to register every firearm with the state police) and was used in a crime.

The analogy being child pornography is illegal and a crime was committed to produce it. I think the analogy is sound but easily taken out of context if one reads it as a violation of their rights; a firearm used in a crime must be turned over to authorities, otherwise you are withholding evidence from the legal system; a felony.


RE: Liability
By EricMartello on 8/8/14, Rating: 0
RE: Liability
By inighthawki on 8/8/2014 7:29:51 PM , Rating: 2
quote:
Why should it be against the law? If you have an image of people being murdered, does it mean you committed those murders? What if said pictures of dismembered bodies gets you off...is that "evil" if you are doing it in a private setting and not actually murdering people?

I mean, if people were getting murdered solely to take those pics, then yeah, it's just as wrong on many levels. If you're simply arguing "I like to look at gruesome dismembered things" - As digusting as it is, it's different. People and animals die and get dismembered all the time from nobody's fault. The images can be obtained through legal means without ever committing a crime.
Young children do not get abducted and forced to pose nude by accident. Someone HAD to have committed a crime in order to obtain the pictures.


RE: Liability
By EricMartello on 8/8/2014 9:17:19 PM , Rating: 1
quote:
I mean, if people were getting murdered solely to take those pics, then yeah, it's just as wrong on many levels. If you're simply arguing "I like to look at gruesome dismembered things" - As digusting as it is, it's different. People and animals die and get dismembered all the time from nobody's fault. The images can be obtained through legal means without ever committing a crime.


Murder is illegal in most civilized societies that I know of. The question is, if someone has pictures of an illegal act should they face criminal charges even if they had no involvement with said criminal act being committed? All they have are copies of photos that the perpetrator took. Should people face criminal charges for writing stories or song lyrics that depict illegal acts?

In my view, no to both. I don't think possessing information - whatever it may be - warrants any criminal charges, aside from special scenarios like stolen state secrets.


RE: Liability
By inighthawki on 8/8/2014 9:54:55 PM , Rating: 2
I'm not sure why you view these similarly. People do not murder someone to take their picture and feed into weird fetishes. The only images of murder victims out there are images taken after the fact by bystanders, law enforcement, military personnel, etc. Such things occur all the time and are the norm in the world.

With child pornography the photographer is ALWAYS knowingly committing a pretty heinous act involving forcing an under-aged child to partake in sexual acts against their will, and take photographs specifically to fulfill a fetish based entirely on that illegal act. The existence of the market for such media drives more abductions of young children for this purpose.

quote:
Should people face criminal charges for writing stories or song lyrics that depict illegal acts?

That is just *so* far from the point, I don't even know how to reply to that.


RE: Liability
By EricMartello on 8/9/2014 11:39:18 PM , Rating: 3
quote:
I'm not sure why you view these similarly. People do not murder someone to take their picture and feed into weird fetishes. The only images of murder victims out there are images taken after the fact by bystanders, law enforcement, military personnel, etc. Such things occur all the time and are the norm in the world.


I'm not sure why you're having such a hard time following... I asked you a pretty specific question: IF someone records video or photographs themselves in the act of murdering someone, and then a third party who had nothing to do with said murder acquires the media, should the third party be as guilty as the murderer?

It's really not that hard to play along, so answer the question.

quote:
With child pornography the photographer is ALWAYS knowingly committing a pretty heinous act involving forcing an under-aged child to partake in sexual acts against their will, and take photographs specifically to fulfill a fetish based entirely on that illegal act. The existence of the market for such media drives more abductions of young children for this purpose.


We're not talking about the pornographer; we're talking about the pedophile who simply downloaded a photo depicting such an act.

You don't seem to get that what you are saying because you are distracted by the details of the issue rather than assessing it from a broader perspective.

You are saying that the mere possession of information or media that depicts and illegal act should come with criminal consequences, even if the possessor had nothing to do with the crime.

I can assure you that pedophiles and homosexuals existed long before there was any kind of "porn market" so your suggestion that there is some justification in criminalizing the possession of pedo porn is without merit.

The people who perpetrate these acts suffer from a mental disorder and will continue their behavior regardless of whether or not there is some underground network for sharing pedo pics and videos.

quote:
That is just *so* far from the point, I don't even know how to reply to that.


- It is a crime to possess pedo porn media, regardless of whether or not you participated in the act depicted in the media.

- By that logic, it also be a crime to write songs, stories, books or create any other kind of media that depicts illegal acts.

Why? Because in both cases, you have people with media depicting illegal acts and in both cases neither possessor committed the crimes themselves. True that in one instance a crime was committed, but it doesn't mean criminal charges should be assigned to anyone who downloads the media if they had nothing to do with the crime.

Surely you're not that dense that you fail to see the slippery slope we're on.


RE: Liability
By inighthawki on 8/10/2014 1:55:07 PM , Rating: 1
quote:
I'm not sure why you're having such a hard time following... I asked you a pretty specific question: IF someone records video or photographs themselves in the act of murdering someone, and then a third party who had nothing to do with said murder acquires the media, should the third party be as guilty as the murderer?

I did answer your question, just maybe not directly. Yes they should be guilty. Maybe not charged with the murder itself, but you were an accomplice to the crime and you actively participated.

The rest of your post is just so absurd I won't bother. The sheer fact that you are advocating the idea that obtaining and owning child pornogrtaphy is OK suggests that you don't understand the differences in your own examples.


RE: Liability
By EricMartello on 8/10/2014 5:36:35 PM , Rating: 3
quote:
I did answer your question, just maybe not directly. Yes they should be guilty. Maybe not charged with the murder itself, but you were an accomplice to the crime and you actively participated.


Accomplice means that you indirectly enabled the crime to happen, or were aware of a crime taking place and did not intervene or report it. Merely owning video or photos of a crime in progress does not make someone an accomplice to anything, especially if said media was obtained AFTER the crime was committed and the perpetrator has been locked up.

quote:
The rest of your post is just so absurd I won't bother. The sheer fact that you are advocating the idea that obtaining and owning child pornogrtaphy is OK suggests that you don't understand the differences in your own examples.


What, exactly, is absurd about anything I said? I'm not advocating pedophilia; if advocacy is what you read then it's possibly a freudian slip on your part.

My point is quite obvious - possession of information or media depicting a crime should not in itself be a crime if the possessor had nothing to do with committing said crime. The reason has nothing to do with the advocacy of pedophilia (or any crime) - which is your line of thinking - it has to do with the protection of civil liberties from arbitrary criminal prosecution.

If someone printed out some pedo porn photos and put them in your bag - without you knowing - and then proceeded to call the cops and report that he saw you looking at these photos then stash them in your bag, you'd likely have criminal charges brought against you when the cops suddenly show up and find the photos in YOUR possession...but hey, keep tooting the "save the children" horn while you're getting reamed up the buttox by bubba for that heinous "crime" you committed.

Due process exists for a reason - and no matter how wretched a crime may be, there is never any valid reason for due process to be circumvented or abbreviated just to appease a figurative lynch mob.


RE: Liability
By inighthawki on 8/11/2014 12:53:10 AM , Rating: 2
quote:
Merely owning video or photos of a crime in progress does not make someone an accomplice to anything, especially if said media was obtained AFTER the crime was committed and the perpetrator has been locked up.

That was my entire point. You can't have child pornography that happened "after the fact." It requires taking photography during the act of the crime. I guarantee you that having pictures of a murder in progress would land you with just as much questioning. Ever hear of a snuff film? They're highly illegal in almost every corner of the world.

quote:
What, exactly, is absurd about anything I said? I'm not advocating pedophilia; if advocacy is what you read then it's possibly a freudian slip on your part.

Are you actually aware of what a fruedian slip is?

quote:
My point is quite obvious - possession of information or media depicting a crime should not in itself be a crime if the possessor had nothing to do with committing said crime.

Child pornography exists due to the demand for it. Why else would people take and distribute or sell pictures of it? So by all means, having copies of child pornography is producing demand in the industry for it to happen.

quote:
If someone printed out some pedo porn photos and put them in your bag - without you knowing - and then proceeded to call the cops and report that he saw you looking at these photos then stash them in your bag, you'd likely have criminal charges brought against you when the cops suddenly show up and find the photos in YOUR possession...but hey, keep tooting the "save the children" horn while you're getting reamed up the buttox by bubba for that heinous "crime" you committed.

Yeah, because everyone conveniently carries around printed copies of child porn in their backpacks at all times. Very believable scenario. It would take all of an hour at the police station to sort out that the person is highly unlikely to be guilty.

quote:
Due process exists for a reason - and no matter how wretched a crime may be, there is never any valid reason for due process to be circumvented or abbreviated just to appease a figurative lynch mob.

When did I say it shouldnt?


RE: Liability
By EricMartello on 8/12/2014 2:46:48 PM , Rating: 2
quote:
That was my entire point. You can't have child pornography that happened "after the fact." It requires taking photography during the act of the crime. I guarantee you that having pictures of a murder in progress would land you with just as much questioning. Ever hear of a snuff film? They're highly illegal in almost every corner of the world.


I don't think you're picking up what I'm putting down. The photos of the crime are not in and of themselves the act of crime.

The pornographer commits a crime and records it, makes the photos available, is caught and prosecuted...ends up in jail. Case close...however, the media he created still remains in the hands of others who have downloaded it. This is AFTER the fact, and so explain to me how possession of the media should be held in the same regard as the crime. You have yet to establish this concept that you claim to support.

quote:
Are you actually aware of what a fruedian slip is?


In this situation you are reading what you want to see rather than what's actually written.

quote:
Child pornography exists due to the demand for it. Why else would people take and distribute or sell pictures of it? So by all means, having copies of child pornography is producing demand in the industry for it to happen.


No it doesn't exist because of some nebulous demand. It exists because there are mentally ill people who have an urge to sexually abuse and assault children which they are unwilling or unable to control. These abuses will continue to happen regardless of whether or not there is media portraying acts of pedophilia.

Your basic premise, that the availability of pedo porn is increasing the frequency at which children are victimized, is inherently flawed. It's going to happen at the same rate regardless of whether or not media of the crime is illegal.

quote:
Yeah, because everyone conveniently carries around printed copies of child porn in their backpacks at all times. Very believable scenario. It would take all of an hour at the police station to sort out that the person is highly unlikely to be guilty.


Really? You're having trouble understanding the concept of criminal framing, where someone deliberately wants to implicate you in a crime you did not commit? Even this concept you struggle with?

How are you going to prove the photos are not yours when they are in your possession? Remember, you just got done arguing that the pictures should be illegal and that merely having them in your possession warrants criminal charges against the possessor.

You have them in your bag, you know they are illegal and the cops expect you to say something like, "These aren't mine; I never saw them before." If all it takes is having the illegal photo in your possession for a crime to have been committed, then you're guilty.

You're not really getting how much of a slippery slope it is to criminalize the possession of information - whatever it may be.

quote:
When did I say it shouldn't?


Due process was circumvented in this instance: the person's private information was "spied" on without a warrant or any legal justification and reported to the police, resulting in his arrest.

MS would have been perfectly fine keeping logs of his activity, which the cops could later request to support their case should they come to suspect him of assaulting children...but the way it happened in this instance runs counter to the 4th Amendment procedure of obtaining a warrant first.


RE: Liability
By Rukkian on 8/11/2014 3:41:33 PM , Rating: 2
You can claim there is no precedent for getting busted after the fact, but this is the same as receiving stolen goods. You may not have stolen them, but that does not mean you have the right to receive/possess them, and you can be convicted if you are found with stolen goods.

Your argument about having somebody stuff pedo porn in your bag while walking down the street would be the same as somebody walking by and throwing some stolen goods in your bag.

In this case, it was not somebody else uploading it and trying to frame him unless they had his username and password. In that case, the prosecutor's job is to show that he did upload it, and it is the defense's job to show that he did not and that somebody else did.


RE: Liability
By EricMartello on 8/12/2014 2:58:46 PM , Rating: 2
quote:
You may not have stolen them, but that does not mean you have the right to receive/possess them, and you can be convicted if you are found with stolen goods.


This is not the same as having media of a crime in progress, because stolen goods are a direct component of the crime of theft. Stolen goods do not become "unstolen" even if the original thief is caught, and if someone unknowingly buys stolen goods there may be reduced or no criminal charges filed against them.

In the case of media depicting an illegal act, if the criminal case related to the illegal act is closed - the perpetrator(s) in jail - then the media is only documentation of an event that occurred and not a direct component of the event.

quote:
Your argument about having somebody stuff pedo porn in your bag while walking down the street would be the same as somebody walking by and throwing some stolen goods in your bag.


No it's not; see above.

quote:
In this case, it was not somebody else uploading it and trying to frame him unless they had his username and password. In that case, the prosecutor's job is to show that he did upload it, and it is the defense's job to show that he did not and that somebody else did.


This is exactly why criminalizing privately held information is a bad practice. There was no due process leading to the apprehension of this person. The service provider was snooping on his private files, in which he DOES have a reasonable expectation of privacy, and reported him to the police without any kind of verification in place.

This is completely backwards from how the legal system should be working. If the police suspected this person of committing illegal acts such as those related to pedophilia, they could have obtained a warrant from a judge and then searched his home as well as any online storage service he uses.

Now, IF this person had been PUBLICLY DISTRIBUTING this material using MS's storage/sharing services, I would fully support MS's actions, because by making his information publicly available he forfeits his reasonable expectation of privacy...but that's not what happened, is it?


RE: Liability
By kickoff on 8/10/2014 8:42:55 PM , Rating: 1
quote:
Pedophiles are in the same boat as homosexuals as far as mental disorders go.


Congratulations, eric, you are the biggest KHUNT on the internet. And that is really saying something.

And your "slippery slope" argument? Yes, definitely a KHUNT if you think that's more important than protecting children from sexual abuse. Seriously, please feel free to kill yourself. Really, please do.


RE: Liability
By MZperX on 8/8/2014 1:34:24 PM , Rating: 2
While I agree with the general point you made in your post about the slashdot comment, the above quote is a terrible analogy. Possession of firearms is a Constitutionally protected right (here in the US). Even if the gun was used in a crime at some point, the person possessing the gun may not be aware of that, and be completely innocent of any crime themselves. Say they purchased a revolver at a pawn shop that, unbeknownst to them, was at some point used in a crime that was never solved. There is no way for the buyer to know that just by looking at the gun. It is not even close to the case of child pornography.

You cannot be in possession of child pornography and not break the law. The mere act of possession is illegal. Also the material itself is such that there is no way to be oblivious to its criminal nature. There is no way to possess, copy, store, reproduce, or share child pornography all the while not knowing what it is.

Back on topic: IMO anyone complaining about this specific use of PhotoDNA or similar technologies to expose child pornography either does not understand the process or is willfully ignorant. The right to privacy protects the individual against GOVERNMENT searches and intrusion without due process. When you hand over your files/photos to Microsoft or Google to store on their corporate servers you voluntarily sign up to their terms of service. These ToS clearly state that you are not allowed to use their services for illegal activities. Why is this so hard for people to understand?


RE: Liability
By DT_Reader on 8/11/2014 1:07:59 PM , Rating: 2
quote:
Microsoft is in no position to take legal authority to enforce state or federal law by threatening jail time. This person clearly has no idea how the legal system works. In fact, anybody condemning Microsoft or Google for tipping off authorities has no idea how the legal system works.

You're right, if Microsoft or Google have evidence of a crime, they should report it. But they are not the police and should not go looking for it. That's not their job, unless they have a court order making it their job.


Hashing for Signatures is a Violation of Privacy
By dsx724 on 8/8/14, Rating: 0
By amanojaku on 8/8/2014 10:06:51 AM , Rating: 2
Don't be an idiot and READ the terms of service. Not only did users authorize MS to scan their data, they ALSO authorized the ability to remove that data, to block receipt and transmission of that data, to ban the user from the service, and to share that data with law enforcement.

Such policies are standard among free service hosts, and even some private hosts.

Hashing also predates the crackdown on child porn and legal action. It was originally used to keep pirated files off of file hosts. No one alerted the authorities; they just tossed your file in the garbage. I prefer prosecution when it comes to pedophiles, however.


RE: Hashing for Signatures is a Violation of Privacy
By ACE76 on 8/8/2014 2:24:34 PM , Rating: 1
I think you are the one being an idiot...you think Microsoft's TOS will mean anything in a criminal trial that has the potential to put someone in prison? You honestly think Microsoft or Google has the right to obtain evidence against an individual that the police or any law enforcement agency in the country would need a search warrant for? Where was the probable cause in this case? What reason did the police have to be looking at this guy's data? When his lawyer starts asking this in a courtroom and says the evidence was illegally obtained, you can bet it gets thrown out...unless you are stupid enough to believe Microsoft has more authority than the police.


By superPC on 8/8/2014 5:38:43 PM , Rating: 2
In this case MS does have more authority than the legal system. The picture was stored on an MS server, and as it is MS property, it is LEGAL for them to search it, process it, and do whatever the hell they want with it. It's you who are at fault for USING their server.


By dsx724 on 8/11/2014 4:49:26 PM , Rating: 2
Let me make three analogies of what is happening because you are not getting it.

1) The government puts out a hash list of "sensitive content" and requires vendors to report all users who have content on that list. The system is ripe for abuse under any pretext. A journalist gets documents about high-level federal corruption, a hash matches, and the journalist is on a watch list.

2) You receive an unsolicited email that contains "sensitive content" in your spam folder. You back up all your emails onto a service provider's system. The service provider sends a notification to the government about your "sensitive content" and a warrant is issued for all of your data.

3) A Terms of Service is not the equivalent of a notarized contract. By uploading files to Microsoft's server, I provided access to the files for storage; I did not transfer ownership of the files to Microsoft. Microsoft can try to legally enforce its TOS, but it will most likely be tossed by the court as overreach.

In this specific case, the sensitive content is child porn. However, that does not mean it will always be the case. It merely sets a precedent that this kind of system is allowed.


By ritualm on 8/8/2014 7:41:39 PM , Rating: 2
quote:
I think you are the one being an idiot...you think Microsoft's TOS will mean anything in a criminal trial that has the potential to put someone in prison?

It does. The person being incriminated agreed to a CIVIL, legally binding contract with another party. That person violated certain terms within that contract and was thus held legally liable.

You are in effect advocating that the person in question should not face responsibility for their own willful actions. Remember, every time you agree to buy a phone from your carrier on a 2-year contract and sign on the dotted line, you have an obligation to fulfill the terms of the contract. You cannot simply switch from one carrier to another and decide that, since you're not on your previous carrier anymore, you don't have to honor the contract. It just does not work that way, sir.


By GatoRat on 8/8/2014 11:43:36 AM , Rating: 2
I am aware of the PhotoDNA project. Microsoft (or any other organization) doesn't submit hashes to the government. Rather, the FBI publishes a list of hashes of known child pornography files, and Microsoft checks the hashes it calculates against this list. Only if there is a positive hit do they notify authorities (I suspect they have a list of formal legal steps to follow first; based on the story, it appears that even after they notify authorities, they require the government to produce a warrant before letting them access the actual image in question).
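
To make that concrete, here's a minimal sketch of that kind of hash-list check in Python. The file names, the reporting hook, and the use of SHA-256 are illustrative assumptions on my part; PhotoDNA itself uses a proprietary perceptual hash, not a cryptographic one.

import hashlib

# Hypothetical: load a law-enforcement-supplied list of known-bad digests.
with open("known_bad_hashes.txt") as f:
    KNOWN_BAD = {line.strip() for line in f if line.strip()}

def digest(path):
    # Hash the file in chunks so large uploads don't exhaust memory.
    h = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def report_to_authorities(path):
    # Hypothetical reporting hook; a real service would file a formal
    # report rather than print.
    print("match found, flagging %s for review" % path)

def scan_upload(path):
    # Only list membership is tested; nothing about a non-matching
    # file's content is inspected or retained beyond its digest.
    if digest(path) in KNOWN_BAD:
        report_to_authorities(path)

The point is that a non-matching file reveals nothing about itself -- the scanner never "sees" it.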


RE: Hashing for Signatures is a Violation of Privacy
By ACE76 on 8/8/2014 2:37:20 PM , Rating: 2
And this warrant that you're talking about gets issued how? This is looking like a classic "fruit from the poisonous tree" scenario...they know he has child porn and have already gotten the evidence from Microsoft or Google or whoever...so now they go to a judge and say "hey, we have probable cause that this guy has child porn because Microsoft told us." The judge hands down the warrant and the guy subsequently gets arrested.

Except what no one seems to realize is that the police had already gotten the evidence before ever going to a judge. They absolutely knew he had child porn, and the very place they eventually get their evidence from was already given to them by Microsoft...that, folks, is illegal search and seizure, and it's a violation of the 4th Amendment.

Any lawyer will easily be able to say that the hashing Microsoft is using is no different than going to his Skydrive and looking at the data...which is exactly what Microsoft and Google are doing...lawyers are going to have a field day with this. Perhaps Microsoft and Google should stop trying to do law enforcement's job for them and simply delete any child porn they think they have on their servers?


By superPC on 8/8/2014 5:54:29 PM , Rating: 2
1. You are using their server. Therefore you agree to use it in accordance with its terms of service (TOS). Think of it like a rental agreement. Just because you rent a house doesn't give you the right to tear down a wall or replace the windows with a stained glass masterpiece. It depends on your rental agreement, right?

2. IF email providers deleted child porn, they would become accomplices to a crime, because SIMPLY HAVING child porn is ILLEGAL; any cover-up of that crime would mean obstruction of justice (read the Penn State example in the article).


By chripuck on 8/11/2014 2:05:16 PM , Rating: 2
Dude, if you don't like it DO NOT USE THEIR SERVICE.

This is not rocket science... Microsoft was NOT hashing files stored on someone's home computer or phone. They were only hashing files stored on Microsoft's own physical machines.

To put it in "old time" terms: if I confessed to a crime in writing and mailed it to the wrong address, they could obtain the letter and use it against me to obtain a warrant. It's irrelevant that I intended it to be private; I sent it to the wrong person, who made it public.


Warning: NSFW Graphic Description
By zirk65 on 8/8/2014 8:27:07 AM , Rating: 2
Dear Jason Mick,

Please pixelate the part of the affidavit JPG that describes the offending uploaded image, or take the JPG down but keep The Smoking Gun link with a NSFW or similar warning.

Thank you.




RE: Warning: NSFW Graphic Description
By inighthawki on 8/8/2014 11:13:39 AM , Rating: 2
Sorry, this is not considered NSFW material. The whole point of the NSFW tag is to warn people before they access materials that are inappropriate to view in the workplace because others around them might see it. An image of an affidavit -- aka a piece of paper with text -- does not fall under that category. There is no need to censor a text image because it contains words like "vagina" or simply describes something that you may not be comfortable with.


RE: Warning: NSFW Graphic Description
By karimtemple on 8/8/2014 11:28:59 AM , Rating: 2
It is pretty hilarious to see an official document use the word "dildo" though, LOL. They could've said, like, "sexual implement" or something.


RE: Warning: NSFW Graphic Description
By GatoRat on 8/8/2014 11:50:03 AM , Rating: 2
Law enforcement and DAs prefer not having their cases thrown out of court for use of misleading or overly vague terms (which does, and should, happen to protect our rights).


By karimtemple on 8/8/2014 1:40:57 PM , Rating: 2
I'd LOVE to see one example in the history of everything of a judge throwing out a case because a document said "sex toy" instead of "dildo." lmfao.


Logic fail
By CZroe on 8/8/2014 3:03:09 PM , Rating: 2
"Failing to compare anonymous image fingerprints protects only one group of customers -- child predators"

I'm glad this guy was caught, but this logic fails on every level.

"Failing to compare anonymous video/audio fingerprints protects only one group of customers -- copyright pirates"

"Failing to compare anonymous text fingerprints protects only one group of customers -- Terrorist who dare to use flagged words"

"Failing to compare anonymous state criticism fingerprints protects only one group of customers -- People exercising their right to dissent"

"Failing to compare anonymous [whatever you send] fingerprints protects only one group of customers -- [whatever group you belong to"

Who cares if the ruling political party doesn't have the context of your political dissent or criticism before they identify it and act with tyranny? You aren't "anonymous" once they turn you in, are you?




RE: Logic fail
By JasonMick (blog) on 8/8/2014 3:37:26 PM , Rating: 2
quote:
"Failing to compare anonymous video/audio fingerprints protects only one group of customers -- copyright pirates"
You come the closest here, but there's one fundamental problem with your logic -- it all comes down to fair use and backup copies. If I own an Apple IIe Odell Lake disc (which I do), I can legally download and use a backup copy for Android. It's perfectly legal for someone to send me that copy as well.

Yes, you can do relatively similar fingerprinting of copyrighted audio or video (based on histogramming colors in frames for video, or sound pitch, etc., in short snippets for audio). But because of the fair use exemptions, I would think Microsoft and Google would tread carefully to avoid public backlash.

I haven't heard horror stories yet, and I myself have never gotten flagged on my Gmail or my Live.com mail despite being sent numerous copies of copyrighted academic papers (PDFs) (which I have legitimate access to via a university I'm affiliated with; however, I didn't always download them myself).

Again logically your example fails somewhat, though.

I agree copyrighted works and child porn are both easy to identify.

HOWEVER legal owners of copyrighted works (at least those without DRM) are LEGALLY allowed to own copies of the work and the copyright owner can LEGALLY send copies to individuals of their choosing. That means that many people are legally allowed to receive or send such files.

By contrast NO ONE is legally allowed to own child pornography in the U.S. When I say child pornography, I'm talking graphic images of children being raped or otherwise sexually abused.

The situations are similar in some ways, but entirely different in that regard. There is no moral gray area, as unlike the copyright detection, every single person who transmits child porn images is breaking the law.
quote:
"Failing to compare anonymous text fingerprints protects only one group of customers -- Terrorist who dare to use flagged words"

"Failing to compare anonymous state criticism fingerprints protects only one group of customers -- People exercising their right to dissent"

"Failing to compare anonymous [whatever you send] fingerprints protects only one group of customers -- [whatever group you belong to"
And here you just fall into ridiculousness.

First off, this would be pretty darn obvious if the top service providers began engaging in mass political suppression. Maybe in China it would be accepted, but in any nation with historic freedoms like the U.S., it would be met with massive backlash and likely a revolt.

Second, the technology involved is entirely different. An audio/image/video fingerprint obfuscates and anonymizes information in such a way that it can detect only the narrow subset of media whose fingerprints are stored in a database.

By contrast, speech is organic. Your results would either be mostly garbage OR you would have to resort to techniques like natural language processing or human inspection. This is on the extreme opposite end of the privacy spectrum in terms of surveillance.

In terms of technology generally:

Least intrusive >>>>>>>>>>>>>>>>>>>>>>>>>>>>>> Most intrusive

Histogram Fingerprinting >>>> Keyword Searching >>>> Human inspect/Natural language processing
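
For what it's worth, here's a toy sketch of the least intrusive end of that spectrum (it assumes the Pillow imaging library; production systems like PhotoDNA use far more robust perceptual features, but the principle -- reduce an image to a compact numeric signature and compare distances -- is the same):

from PIL import Image

def fingerprint(path, size=(32, 32)):
    # Normalize scale, then reduce the image to a color histogram.
    img = Image.open(path).convert("RGB").resize(size)
    hist = img.histogram()  # 768 bin counts: 256 per RGB channel
    total = float(sum(hist))
    return [count / total for count in hist]

def distance(fp_a, fp_b):
    # L1 distance between fingerprints; 0.0 means identical histograms.
    return sum(abs(a - b) for a, b in zip(fp_a, fp_b))

# Hypothetical usage, flagging near-duplicates of one known image:
# if distance(fingerprint("upload.jpg"), known_fp) < 0.1:
#     flag_for_review("upload.jpg")

Note the fingerprint keeps nothing human-readable: you cannot reconstruct the picture from 768 normalized bin counts.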

So your point may indeed be a concern, but it has no bearing on the CURRENT discussion, as it involves an entirely different technology (natural language parsing) from a technical standpoint, and further, it's admittedly much more concerning from a privacy standpoint.

Of your four examples, only one is even valid; the rest use entirely different technology (fingerprinting would be useless). Even the one that technically could use fingerprinting is limited, as the material is not universally illegal to own. So really you gave no valid scenarios where THIS particular technology is a concern.

Note I absolutely oppose mass government surveillance and keyword hunting. But that's an entirely different technology. We want our government to be aware of these nuances, right? So we had better try to recognize them ourselves.

What Microsoft did is almost universally acceptable unless you're biased against it or are yourself a child pornography consumer. The material it targets is universally illegal in the U.S. and can be detected without complex inspection and without storing everyone's data.

Please try to reread and understand the nuances here before posting more false hypotheticals.


RE: Logic fail
By CZroe on 8/8/2014 4:15:17 PM , Rating: 2
You seriously believe that only media file attachments can have "fingerprints" that are looked for?

I can calculate CRCs for the word "bomb" followed by various proper nouns and have that "fingerprint" checked against the content of the email.
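
Something like this, say -- a rough sketch using Python's zlib.crc32, with a made-up watchlist of exact three-word phrases:

import zlib

# Hypothetical watchlist: CRCs of exact flagged phrases.
FLAGGED = {zlib.crc32(p.encode()) for p in ("bomb the bridge", "bomb city hall")}

def contains_flagged_phrase(text, phrase_len=3):
    words = text.lower().split()
    # Slide a three-word window over the text and checksum each window.
    for i in range(len(words) - phrase_len + 1):
        window = " ".join(words[i:i + phrase_len])
        if zlib.crc32(window.encode()) in FLAGGED:
            return True
    return False

print(contains_flagged_phrase("they plan to bomb the bridge at dawn"))  # True
print(contains_flagged_phrase("that movie was a real bomb"))            # False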

As I said, I am glad this person is in jail. Your attempt to shame me into agreeing with you is laughable. It has nothing to do with how ludicrous it would be for service providers to start doing this now and everything to do with it being a slippery slope.

SunTrust just canceled the accounts of a gun/pawn shop owner because his was a "Prohibited Business Type." Wanna know who prohibited it? Look up "Operation Choke Point." If you can't stop those you politically oppose legally, just influence a bunch of private partners to put them out of business. That's what's going on here.


RE: Logic fail
By JasonMick (blog) on 8/8/2014 4:48:35 PM , Rating: 2
quote:
You seriously believe that only media file attachments can have "fingerprints" that are looked for?

I can calculate CRCs for the word "bomb" followed by various proper nouns and have that "fingerprint" checked against the content of the email.
First off, that's NOT histogram fingerprinting, which is what this article is about. You're talking about matching regular expressions. Technically that's an entirely different topic, so you can't just compare apples and oranges.

I get your point in a way, because the NSA did propose to the DOD/White House to data mine and scan based on keywords, which may sound similar to a layman. Again, in reality those are two fundamentally different algorithms. Their only common thread is that they're search algorithms.

But let's be frank. I think the NSA knew it would produce meaningless results and proposed it anyway for two reasons -- first, the data contracts are a special-interest payout, and second, it offers an excuse to collect everyone's data for targeted political suppression.

Aside from your latest example having little technical relation to what Microsoft is doing, it's also quite a stretch on legality grounds. It's illegal for ANYONE to possess child pornography. It's not inherently illegal in all cases to use even the most "provocative" words, such as bomb.
quote:
That movie was a real bomb.
<-- NOT ILLEGAL.

I'm limiting my discussion to scenarios where fingerprinting produces actual meaningful data.

If your goal is to produce false positives, yeah, search for occurrences of "bomb" and proper nouns. You do know most actual criminals -- whether terrorists or street criminals -- use codewords to avoid detection? Your scheme would produce meaningless data.

So again, aside from the technical difference, you're illustrating why in most scenarios outside this very narrow one, data mining is useless for fair, efficient law enforcement. You have no argument from me there.

But don't try to cast Microsoft's very specific usage as something entirely different technically, and very different too in terms of ethics.

Microsoft's technique is actually able to reliably catch child pornography consumers. Copyright fingerprinting is at best 75-25 in terms of true positives to false positives, given fair use. Word mining to try to find criminal behavior is laughably useless, as it will produce basically all false positives.

The concern you're expressing over NSA surveillance is a concern over mass data collection. By buying into the NSA's false logic (that its goal is regex searches to predictively find some group), you're just playing into their game, as they know all too well that tactic has never worked.

Their real goal (and the reason they are able to engage in targeted political suppression) is to collect everyone's data in personally identifiable form while knowing who they're looking for.

They aren't just finding random folks who feel a certain way, as virtually no one has successfully managed to do that yet with data mining.
quote:
SunTrust just canceled the accounts of a gun/pawn shop owner because his was a "Prohibited Business Type." Wanna know who prohibited it? Look up "Operation Choke Point." If you can't stop those you politically oppose legally, just influence a bunch of private partners to put them out of business. That's what's going on here.
Sure, you're talking about the problem of the U.S. slipping away from being a free market by turning a blind eye to corporate collusion, bribery of politicians, and overly protectionist laws/policies.

But what does that have to do with Microsoft?

Microsoft's issue is that a handful of customers are trying to put illegal materials on its servers. That's clearly an issue between a company and its client, not multiple companies ganging up on another company.

I guess if you want to cast a wide net and just bring up random examples of societal corruption sure by all means mention that, but you're kind of all over the place with your points....


RE: Logic fail
By tng on 8/8/2014 5:10:51 PM , Rating: 2
quote:
But what does that have to do with Microsoft?


Nothing on the face of it, but I can understand where he is coming from. While MS really is out there defending customer data from corrupt politicians, rogue government agencies and the like, there are plenty of companies that just give in because they don't have the resources that MS does.

Also, as time moves on, things change, and some day soon MS may not believe it is in the shareholders' interest to continue to fight against what they still know are illegal requests for customer data.

Not saying that I condone the acts of the guy in the story, but as the poster said, it is a slippery slope. This could be used in the future as precedent.


Pointless "outrage"
By bsim50 on 8/8/2014 3:22:59 AM , Rating: 3
No idea what the "outrage" is all about. If your e-mail account is free - then every e-mail you've sent or read through it from day one is scanned anyway for "contextual advertising", and you openly agreed to that from Day One in the T&C's.

Cracking down on child porn is neither immoral nor "illegal" since there are also clauses in your e-mail provider's T&Cs allowing it to share various types of info with law enforcement (which you also agreed to). That most people don't bother reading the T&Cs is beside the point. In fact, section 3.6 of Hotmail's T&Cs specifically declares:

"3.6. What type of Content or actions aren't permitted? In order to protect our customers and the Services, we have established this Code of Conduct governing the use of the Services. Content or actions that violate this Agreement aren't permitted.

i. Don't use the Services to do anything illegal.

ii. Don't engage in any activity that exploits, harms, or threatens to harm children "


As well as several other things (stalking, Internet fraud, spam, etc.). Some people need to go back to school if they think that, just because the Internet isn't run by one country, service providers cannot impose their own rules that happen to coincide with national laws.




RE: Pointless "outrage"
By NellyFromMA on 8/8/2014 10:25:18 AM , Rating: 2
You're generally right, but your approach is wrong.

People aren't stupid for not knowing the ins and outs of the TOS/T&Cs for every service they use; if they did, they would likely be paralegals or above. If you are trying to say you know the ins and outs of all of the services you use, I'd be skeptical.

There are some general concepts, of course. Like, in this case, you aren't entitled to a right to use infrastructure to commit felonies.

The core issue here isn't that "people are stupid" as you suggest crudely. Rather, the issue is that the majority of users of the internet continue to have a deficit of "street smarts" when it comes to the internet.

Instead of bashing people for not being on your level, bring some people up to speed.

Just because someone has a HS diploma or a degree doesn't mean they know a thing about the topic at hand. And, interestingly, many of the people that DO have a clue tend to exhibit several behaviors attributed to low IQ or mental deficiencies in various areas, like compassion, for example.


RE: Pointless "outrage"
By tng on 8/8/2014 4:13:02 PM , Rating: 2
quote:
The core issue here isn't that "people are stupid" as you suggest...


Well, let's be honest, the 20-year-old that they caught here probably is not the brightest bulb. They caught him early and hopefully stopped him while he was still just looking at pictures.

The problem is that not all of them are stupid, and how do you catch the ones that use snail mail or FedEx to send pics (or worse)?


RE: Pointless "outrage"
By Solandri on 8/8/2014 2:27:58 PM , Rating: 2
quote:
No idea what the "outrage" is all about. If your e-mail account is free - then every e-mail you've sent or read through it from day one is scanned anyway for "contextual advertising", and you openly agreed to that from Day One in the T&C's.

Folks, this isn't just true of free email accounts. It's true for everything you do on the Internet that isn't encrypted. When you send an email (or grab a pic from a website or an FTP site or by any other means over the Internet), the sending mail server does not establish a direct connection to the receiving mail server. It establishes a connection across the Internet. That is, your email is sent through usually a dozen or two dozen computers and their networks before reaching its final destination.

All of this is done for "free" in the sense that someone else has paid for it, or your payment to your ISP is partially paying for it indirectly. Any of the computers along the route the mail takes can scan it for "contextual advertising." Many do not, out of principle or for financial reasons (it would take a massive amount of CPU power to scan everything traveling over some of the tier 1 networks). But there is no legal reason prohibiting them from doing this. If ISPs were classified as common carriers, they might be prohibited (and net neutrality would be a foregone conclusion). But they're currently classified as information services.

The old adage when I got my first email account in the 1980s was that email isn't like mailing a letter. It's like mailing a postcard. Anyone along the route the postcard takes to its destination can read what it says. That's still true, and it's true for all unencrypted Internet traffic, not just email. That's why things like Google encouraging https for all web sites (encrypts the traffic between the site and your browser) is a lot more important than them handing someone emailing child porn over to the feds.
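
You can actually see the path a message took for yourself: every relay that handles it prepends a Received: header, so the hops are recorded right in the email. A quick sketch using Python's standard email module (the file name here is just a placeholder):

import email
from email import policy

# Each mail relay prepends a Received: header, so reading the headers
# bottom-to-top retraces the hops the message took across the Internet.
with open("message.eml", "rb") as f:
    msg = email.message_from_binary_file(f, policy=policy.default)

for n, hop in enumerate(reversed(msg.get_all("Received", [])), start=1):
    print("hop %d: %s" % (n, hop.split(";")[0].strip()))

Every machine named in those headers handled your message in the clear unless it was encrypted end to end.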


Wish MS could hide that it is the source
By stm1185 on 8/8/2014 1:11:17 AM , Rating: 3
Then these criminals wouldn't have the chance to wise up.




RE: Wish MS could hide that it is the source
By NellyFromMA on 8/8/2014 10:19:36 AM , Rating: 2
Well, I think the desired outcome is to instill fear in offenders so as to curtail or eliminate the hardship of the victims (the children) in the first place.

It's not like a drug sting where you basically set up a micro-economy for the police to justify their jobs by baiting dealers and users endlessly and then busting them, and "no innocents" are perceived to be hurt in the process.

Unlike with drug offenses, the primary goal here isn't to endlessly bust the "bad guys" so much as it is to stop the activity from taking place altogether.


By stm1185 on 8/8/2014 1:28:20 PM , Rating: 2
I doubt being unable to use major e-mail providers will stop pedophiles from pursuing their desires. I am sure they will find some e-mail service that will gladly take their business.

Isn't Yahoo supposedly doing heavy encryption soon? So it might even be free.


Scary implications
By djdjohnson on 8/8/2014 11:45:05 AM , Rating: 2
I'm absolutely for scanning for child pornography and reporting it to law enforcement. But this has scary implications.

Considering how heavy handed the MPAA and RIAA are in trying to protect copyrighted content, how long do we think it will be before they start to fight for the ability to have these scanning systems identify copyrighted material in email and in online file storage services? Wouldn't they just love to get lists of people who email copyrighted files, or upload them to their Google Drive, DropBox, or OneDrive account?

It's too tempting a target for them to ignore.




RE: Scary implications
By karimtemple on 8/8/2014 11:50:09 AM , Rating: 2
"I'm absolutely for scanning for child pornography" --djdjohnson

lol Anyway, your argument sucks. First of all, it's not 'scary' that someone would try to protect their intellectual property. That's their right as Americans. Secondly, they already do what you're talking about lol. It only applies to media in which files are shared.


RE: Scary implications
By The Von Matrices on 8/8/2014 7:33:58 PM , Rating: 2
There's a clear difference between the two types of content. There's no way you can legally be in possession of child pornography, so when it is found, a crime has always been committed by someone. This is true even if, as described in the article, someone framed the holder of the pornography, in which case the framer would be guilty of a crime.

In the case of music or films, in many cases the content is legal, and in the cases where it is not, the illegality is hard to prove. You can't just sue everyone who uploads a movie or song. It's very likely that the upload was a legally ripped or legally downloaded copy of a music or video file for personal use, since these legal copies would be indistinguishable from the pirated copies. In that case, the burden of proof would be on the RIAA or MPAA to prove that you did indeed pirate the file, and they know they don't have a strong case. This is why you rarely see the MPAA and RIAA going to court; they always scare people into settlements before it ever gets to court and rarely pursue people who refuse settlements, because they know that they have little chance of winning against a good lawyer.


Huh?!?
By SDBud on 8/8/2014 2:11:56 AM , Rating: 3
Only a TOTAL MORON would complain about such a damnable criminal being taken off the streets!




RE: Huh?!?
By kamk44 on 8/8/2014 3:45:21 PM , Rating: 2
Absolutely agree. This is basically a long article to say that a criminal was caught, as he should have been, and that some don't like it, but they are wrong.


A big concern
By mike8675309 on 8/8/2014 4:51:09 PM , Rating: 2
My biggest concern coming from all of this is what happens if something somehow causes files in my account to trigger a note to the police saying I have something that I really do not. We already have Americans who cannot fly because somehow their names ended up on a government list. How do we as American citizens make sure our names don't mistakenly end up on a new government list?




RE: A big concern
By Milliamp on 8/9/2014 3:23:05 AM , Rating: 2
If it got a false positive, it would just flag some file, and when someone looked at it they would say "Oh, this is a false positive," and that would be the end of it.


This is actually good
By Milliamp on 8/9/2014 3:21:23 AM , Rating: 2
In short, this works the same way as a virus scan: it looks for signatures of known files, which means it's not really much of an invasion of people's privacy.

At first I was thinking some employees were running image recognition software to find nudes and going, "Hey Bob, how old does this chick look to you?"

Just like with virus scanning, I am sure there are occasional false positives, but because it's just looking at sequences of 1's and 0's, those false positives are not necessarily even other nudes or anything highly personal.
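
A minimal illustration of that, assuming exact-match hashing such as SHA-256 (PhotoDNA itself is a perceptual hash that deliberately tolerates small edits, so this applies to plain signature matching):

import hashlib

original = b"holiday photo bytes go here"
tweaked = bytearray(original)
tweaked[0] ^= 0x01  # flip a single bit

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(bytes(tweaked)).hexdigest())
# The two digests share no resemblance, illustrating that a hash
# collision between unrelated files says nothing about their contents
# looking alike.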




heres a tip
By HostileEffect on 8/9/2014 6:05:30 PM , Rating: 2
How about people stop leaking information about themselves and everything they do on the internet? Encrypt your dirt with GnuPG before storing it online. Log-free VPNs for the dark side of the web.

The only privacy you have anymore is what you take into your own hands while telling the rest of the world to shove off.




Biased attitude
By DT_Reader on 8/11/2014 1:05:53 PM , Rating: 2
quote:
Slashdot users were among those suggesting that child predators be afforded special privacy protections that help hide their crimes.

Man, your bias is showing. A more accurate caption would be: "Slashdot users were among those suggesting that everyone deserves privacy protections, a side effect being that it might help criminals hide their crimes, and stating that is a price they are willing to pay for their privacy."

The question is "How far are you willing to go?" You are clearly OK with Microsoft scanning everyone's files and email for evidence of child pornography. Would you be willing to let them scan your email and files for evidence of jaywalking? Running red lights? Movie piracy? Embezzling? Drug use? Drug trafficking? Murder? Just where do you draw the line? Because the very act of drawing it at all means you condone them reading all your emails and scanning all your files.

If anyone finds evidence of a crime they should report it, but Microsoft and Google are not the police and should not go looking for it without a court order.




The Vatican
By roykahn on 8/8/14, Rating: -1
"So if you want to save the planet, feel free to drive your Hummer. Just avoid the drive thru line at McDonalds." -- Michael Asher














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki