
Stunning new filtering plan contradicts its “Your World” marketing campaign

Speaking at the World Economic Forum in Davos, Switzerland, AT&T CEO Randall Stephenson confirmed that the telecom and internet giant is “very interested” in a “technology based solution” to monitor data passing through its networks for rogue peer-to-peer traffic.

“It’s like being in a store and watching someone steal a DVD,” said Stephenson. “Do you act?”

Such a move would affect more than just AT&T’s subscribers, as the company’s network investments represent a sizable chunk of the internet’s backbone – meaning that almost all Internet data passes through its network at some point. Given that AT&T has so far been tight-lipped about the scope of such a project, many are assuming the worst.

More importantly, AT&T may forfeit its end of what Slate’s Tim Wu calls “the grand bargain of common carriage”: legal immunity from whatever claims might arise from the data its network transports, in exchange for offering network service to anyone in a nondiscriminatory fashion. “AT&T's new strategy reverses that position and exposes it to so much potential liability that adopting it would arguably violate AT&T's fiduciary duty to its shareholders,” writes Wu.

In the absence of any official word on why AT&T wants to implement such a project, many observers suspect the primary motivator is alarm at the growing share of traffic attributable to P2P activity; various surveys claim that anywhere from 30 to 90 percent of all internet traffic is P2P related. Lately, ISPs large and small have been testing the waters with a variety of traffic-shaping initiatives – including Comcast, which last year found itself in the middle of a scandal over how it handles BitTorrent traffic.
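
In practice, spotting P2P traffic in flight typically means deep packet inspection: matching payloads against known protocol signatures. As a rough sketch of the idea (not AT&T's actual system, whose design has not been disclosed), the documented BitTorrent handshake, a length byte of 19 followed by the string "BitTorrent protocol", can be matched in a few lines of Python:

    # Sketch only: flag a TCP payload that begins with the documented
    # BitTorrent handshake. Real traffic classifiers are far more involved.
    BT_HANDSHAKE = b"\x13BitTorrent protocol"  # length byte 19 + protocol string

    def looks_like_bittorrent(payload: bytes) -> bool:
        """Return True if the payload opens with the BitTorrent handshake."""
        return payload.startswith(BT_HANDSHAKE)

    print(looks_like_bittorrent(b"\x13BitTorrent protocol" + b"\x00" * 8))  # True
    print(looks_like_bittorrent(b"GET / HTTP/1.1\r\n"))                     # False

Evading such signature matching is as simple as encrypting the stream, which is one reason any large-scale filtering effort would be an arms race rather than a one-time deployment.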

According to AT&T – as well as anecdotal reports and commentary from employees of other ISPs – Internet users should expect a more managed Internet experience in the near future, as technology is finally becoming sophisticated enough to allow for such large-scale projects.

“We recognize we are not there yet but there are a lot of promising technologies,” said AT&T executive James Cicconi, “but we are having an open discussion with a number of content companies … to try to explore various technologies that are out there.”

If anyone has the expertise to deploy such a large filtering project, it would be AT&T: the company was already caught red-handed with powerful data-mining hardware, which it used to gather information on the nation’s web traffic for the NSA.

“The volume of peer-to-peer traffic online, dominated by copyrighted materials, is overwhelming. That clearly should not be an acceptable, continuing status,” said NBC Universal’s general counsel, Rick Cotton. “The question is how we collectively collaborate to address this.”



Comments



Studies that do NOT know technology....
By Fallen Kell on 1/27/2008 8:24:48 PM , Rating: 2
quote:
various surveys claim that anywhere from 30 to 90 percent of all internet traffic is P2P related


I just love it when some "study" makes a claim like this, and then it gets reported and mis-reported. Too bad that 100% (ONE HUNDRED PERCENT) of the "internet" is point to point: I, at one point (my local computer), request this very website from DailyTech, hosted at another point (picked by DNS and/or load sharing across a server farm, but in the end just another point), which then transmits it back to my local computer. The very implementation of the internet is a POINT TO POINT system. All packets have a "to address," which is the point the information is meant to go to (some will even carry a "from address," or the last location they came from).

THE ENTIRE INTERNET is POINT TO POINT! "Consumers" are only just now getting in on the game by hosting their own information. But that was the norm 20-25 years ago when networks were just starting to be built: everyone on the network hosted information for other people to use. It wasn't really until AOL took off and people got internet access at home that the majority of computers connected to the internet stopped hosting information for others to look at. The trend is finally going back the other way as people figure out that you can do a lot more by hosting information and services for the rest of the internet, even if only for your own personal use. Being able to connect to your own personal computer from any other computer on the internet is a benefit people are finally realizing and starting to use, be it hosting files that others are downloading or running their own private FTP service.
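
The addressing described above is easy to demonstrate in code. Below is a minimal Python sketch, built around made-up RFC 5737 test addresses, that pulls the source and destination ("to") addresses out of a standard 20-byte IPv4 header with no options:

    # Sketch: every IPv4 packet names both endpoints in its header. The source
    # address sits at bytes 12-15 and the destination ("to address") at 16-19.
    import socket
    import struct

    def addresses(ipv4_header: bytes) -> tuple:
        """Return (source, destination) dotted-quad strings from an IPv4 header."""
        src, dst = struct.unpack("!4s4s", ipv4_header[12:20])
        return socket.inet_ntoa(src), socket.inet_ntoa(dst)

    # Hypothetical 20-byte header built for illustration: version/IHL, TOS,
    # total length, ID, flags+fragment offset, TTL, protocol (6 = TCP), checksum,
    # then the two addresses.
    header = struct.pack(
        "!BBHHHBBH4s4s",
        0x45, 0, 20, 0, 0, 64, 6, 0,
        socket.inet_aton("192.0.2.1"),
        socket.inet_aton("198.51.100.7"),
    )
    print(addresses(header))  # ('192.0.2.1', '198.51.100.7')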




By notfeelingit on 1/28/2008 12:55:40 PM , Rating: 2
P2P stands for peer-to-peer, as in client computer to client computer. Your downloading this article from DailyTech is a server-to-client transmission; generally, a server's primary job is to host data.


By TomCorelis on 1/28/2008 2:25:52 PM , Rating: 2
Generally "P2P" (at least here at Dailytech) refers to "peer-to-peer," specifically meaning peer-to-peer filesharing protocols such as BitTorrent or FastTrack.

I understand where you are coming from (I used to be a sysadmin), so I apologize for the confusion.
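
To make the thread's distinction concrete, here is a toy Python sketch (node names are invented for illustration): in the client-server model every client fetches from one server, while in a peer-to-peer swarm each peer exchanges data with every other peer.

    # Toy topology sketch; "alice", "bob", and "carol" are made-up node names.
    def client_server(clients):
        """Every client fetches from the single server."""
        return {c: ["server"] for c in clients}

    def peer_to_peer(peers):
        """Every peer exchanges pieces with every other peer."""
        return {p: [q for q in peers if q != p] for p in peers}

    print(client_server(["alice", "bob", "carol"]))
    # {'alice': ['server'], 'bob': ['server'], 'carol': ['server']}
    print(peer_to_peer(["alice", "bob", "carol"]))
    # {'alice': ['bob', 'carol'], 'bob': ['alice', 'carol'], 'carol': ['alice', 'bob']}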


"And boy have we patented it!" -- Steve Jobs, Macworld 2007

Related Articles













botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki