
AMD and NVIDIA say Intel won't share its USB 3.0 open host controller specs

The USB 3.0 specification is expected to be out in 2009 and will significantly increase the bandwidth over the current USB 2.0 ports and products that all computer users are familiar with. The body responsible for supporting and promoting the USB specifications, going back to USB 1.1, is the USB Implementers Forum (USB-IF).

The USB-IF was founded by Intel in 1995 along with other industry players including Microsoft, HP, Texas Instruments, NEC and NXP Semiconductors. Currently, the USB-IF and its members are working to bring the USB 3.0 specification to market. USB 3.0 is also being called “PCI Express over cable” because the USB 3.0 specification uses intellectual property that was sourced from the PCI SIG. USB 3.0 will offer roughly ten times the bandwidth of USB 2.0, with a raw data rate of about 5 gigabits per second.
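For a rough sense of what those figures mean, here is a back-of-the-envelope sketch using the commonly quoted nominal signaling rates (not measured throughput, which protocol overhead reduces further):

    # Back-of-the-envelope comparison of nominal USB signaling rates.
    usb2_rate_mbps = 480               # USB 2.0 Hi-Speed signaling rate
    usb3_rate_mbps = 5000              # USB 3.0 SuperSpeed raw signaling rate

    print(usb3_rate_mbps / usb2_rate_mbps)     # ~10.4x the USB 2.0 rate

    # USB 3.0 uses 8b/10b line coding, so the usable data rate is lower:
    usable_mbps = usb3_rate_mbps * 8 / 10      # 4000 Mbit/s
    print(usable_mbps / 8)                     # roughly 500 MB/s of payload bandwidth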

Despite the fact that much of the intellectual property behind the USB 3.0 specification wasn’t developed by Intel, AMD and NVIDIA both assert that Intel is keeping crucial information concerning the open host controller to itself. According to NVIDIA and AMD, Intel has working silicon, meaning the open host controller portion is mature and working, yet Intel is refusing to give the specifications to other processor and chipset makers.

AMD and NVIDIA say that by withholding the open host controller specification, Intel is effectively giving itself a market advantage of six to nine months: the time it would take other CPU and chipset makers to go from receiving the host controller specification to getting products to market.

An Intel source told News.com, “Intel only gives it [open host controller specifications] out once it's finished. And it's not finished. If it was mature enough to release, it would be released. If you have an incomplete spec and give it out to people, these people will build their chipsets and you'll end up with chipsets that are incompatible with devices. That's what (Intel) is trying to avoid."

The Intel source continued saying, “[Intel is] a little bit behind and that's what might be causing some of the resentment. You could take the opinion that Intel is giving stuff out for free and people are complaining because (Intel) isn't giving it out fast enough.”

If Intel feels that AMD and NVIDIA aren’t willing to do the hard work of developing the open host controller for USB 3.0 themselves, it may be very mistaken. AMD and NVIDIA say they are going to develop their own open host controller for USB 3.0. Both firms point out that developing a separate open host controller could very well mean incompatibilities between USB 3.0 controllers and products.

An AMD source told News.com, “We are starting development on it [open host controller] right now.” An NVIDIA source says the first meeting of the alternate open host controller specification is set for next week and adds, "We fully intend to productize this spec.”

Intel maintains that it is not withholding the specification and that it will provide the details for the open host controller when it is complete.

Intel is in hot water already for some of its business practices. The FTC announced last week that it will investigate whether Intel has abused its market position to stifle competition.



Comments



If nvidia and amd were smart...
By Locutus465 on 6/9/2008 11:48:49 AM , Rating: 2
They'd develop the open host controller in tandem, ensuring that devices work properly between nVidia and AMD products and leaving Intel products as the "odd man out". I think it's the best way for them to ensure they're not the ones getting burned in this situation.




RE: If nvidia and amd were smart...
By Homerboy on 6/9/2008 12:05:21 PM , Rating: 5
Even if they did that, they would still be the odd men out. The number of Intel-based motherboards out there, no matter what, is going to GREATLY outnumber those of NV/AMD combined. Think of the millions and millions of business PCs from Dell and HP alone, not to mention those sold to homes for personal use.

Intel holds the cards by sheer numbers alone.


RE: If nvidia and amd were smart...
By Locutus465 on 6/9/2008 12:15:33 PM , Rating: 2
AMD proved this isn't necessarily true; look at the unofficial Itanic vs. x64 battle. It is possible to beat Intel with a very desirable product... Of course I'm not suggesting they should *try* to make their product incompatible, but with a number of different companies working on a host controller spec, it's going to happen.

Another beneficial side effect of AMD + nVidia working together is that it should reduce the number of incompatibilities customers will have to deal with.


RE: If nvidia and amd were smart...
By MPE on 6/9/08, Rating: 0
RE: If nvidia and amd were smart...
By Locutus465 on 6/9/2008 12:26:03 PM , Rating: 2
When Itanic was released, Intel announced its full intention to bring the technology to the mainstream; the best way to do this was to start in the enterprise world due to its abysmal x86 compatibility. Until x64 came out it seemed to be going Intel's way: they got a Windows OS to support it, they started getting some enterprise software on it... And then x64 came out, giving consumers and enterprise customers alike 64-bit without all the hassle.

Now yes, the situation is different here a smidge, but the fact remains that having the same host controller spec puts AMD and nVidia in the best possible place. What they do have going for them is the fact that their products are usually used for premium systems (in the home sector), i.e. if you want a little more powerful "pedestrian" desktop you go with AMD or nVidia and their integrated solutions. If they share host controller functionality they might be able to convince manufacturers to just start using their chipsets more in lieu of Intel's.


RE: If nvidia and amd were smart...
By Strunf on 6/9/2008 1:06:49 PM , Rating: 2
Itanic was a failure from day 1... we didn't need Windows x64 to know that. No one expected a product far inferior in real-world tasks to change all our habits; Intel just learned the hard way that people don't like change.

This is a completely different matter. Intel is part of the USB-IF, and a big part of it; do you really think some outsiders that aren't even the major players in the OEM market would have enough power to shift it towards them rather than toward Intel? Besides, as I read it, Intel would release its product at the same time as them, and NEC, HP and TI are other key players that will use the same specs as Intel.


RE: If nvidia and amd were smart...
By Locutus465 on 6/9/2008 1:19:47 PM , Rating: 5
No it wasn't; in fact I remember when it was a hotly anticipated product, and I also remember a lot of industry buzz about finally getting rid of the complicated x86 instruction set (many thought the wrong side won in RISC vs. CISC). There were a lot of factors involved in the demise of Itanic, but the fact is x64 made the failure inevitable, when otherwise it may well have been the long-term success Intel had been planning for since day one.

Anyway, back on topic... AMD and nVidia are much better off developing their host controller together. You can either have 2 separate host controllers that are potentially incompatible with each other, or you can have 3, leaving AMD and nVidia each on their own to absorb the impact of a broken controller. At least together they can share the burden, which is a great thing for both companies, as neither has Intel's resources to deal with USB implementation issues on top of all the products they actually *WANT* to be developing.


RE: If nvidia and amd were smart...
By DigitalFreak on 6/9/08, Rating: -1
By eye smite on 6/9/2008 3:44:25 PM , Rating: 2
Like I said a few weeks ago, any partnership between nVidia and AMD, whether maintained or newly developed, will benefit both companies. I guess we'll just have to watch and see.


RE: If nvidia and amd were smart...
By RamarC on 6/9/08, Rating: 0
RE: If nvidia and amd were smart...
By Locutus465 on 6/9/2008 3:57:26 PM , Rating: 3
Incorrect. Intel from the outset planned on replacing x86 with the Itanium architecture; I followed this chip from the first rumors about it.


By RamarC on 6/9/2008 4:25:14 PM , Rating: 1
Get your facts straight. HP and Intel PARTNERED to produce Itanium to compete in the enterprise realm. As with any new technology, they *speculated* that it could replace any class of processor, but Intel NEVER stopped their bread-and-butter x86 research/design/development. Intel pursued the IA-64 architecture because HP was splitting funding costs and got other RISC manufacturers to dump their chip development for IA-64.

"HP determined that it was no longer cost-effective for individual enterprise systems companies such as itself to develop proprietary microprocessors, so HP partnered with Intel in 1994 to develop the IA-64 architecture, which derived from EPIC. Intel was willing to undertake a very large development effort on IA-64 in the expectation that the resulting microprocessor would be used by the majority of the enterprise systems manufacturers. HP and Intel initiated a large joint development effort with a goal of delivering the first product, codenamed Merced, in 1998.[4]

During development, Intel, HP, and industry analysts predicted that IA-64 would dominate in servers, workstations, and high-end desktops, and eventually supplant RISC and complex instruction set computer (CISC) architectures for all general-purpose applications. Compaq and Silicon Graphics decided to abandon further development of the Alpha and MIPS architectures respectively in favor of migrating to IA-64."
http://en.wikipedia.org/wiki/Itanium


By omnicronx on 6/9/2008 12:40:18 PM , Rating: 2
You all seem to be missing that AMD and Nvidia can benefit much more from such a controller. I don't think AMD and Nvidia would be making such a big deal about this if they did not have external products already in mind. External video cards come to mind. I could easily see everyone dropping the Intel spec in favor of something that is actually going to see use on a massive scale, instead of just increased external hard drive speeds, which are still limited by the hard drives themselves =P.


By jconan on 6/9/2008 8:33:17 PM , Rating: 2
That's if AMD and NV publish their specifications for the OHC before Intel does, for early adoption. That would bring early adopters to develop for the AMD/NV spec rather than Intel's, leaving Intel as the odd man out.


RE: If nvidia and amd were smart...
By tastyratz on 6/9/2008 12:26:37 PM , Rating: 2
Exactly.
The last thing AMD needs right now is another reason for people to single them out. They wouldn't get a leg up by making a proprietary interface limited to a small market percentage. They would simply kill sales when only a low number of hardware peripherals are released that are compatible with AMD.


RE: If nvidia and amd were smart...
By Locutus465 on 6/9/2008 12:36:54 PM , Rating: 2
On the other hand, by making themselves (similarly for nVidia) the odd men out compared to Intel, they're just asking for trouble... Either they don't have USB 3.0 (AMD Phenoms have enough market challenges as it is) or they have a USB 3.0 implementation with potential side effects between the Intel and nVidia implementations.

nVidia and AMD minimize both development-cost risks and market-perception risks by using the same host controller, ensuring that whether customers buy AMD or nVidia, the same set of devices works properly without issue.


RE: If nvidia and amd were smart...
By omnicronx on 6/9/2008 12:53:29 PM , Rating: 3
All that AMD and NVidia said is that they are developing their own implementation; for all we know this could mean it will go above and beyond the USB 3.0 spec while still being fully compatible with Intel's implementation, once it is complete of course.


By Locutus465 on 6/9/2008 1:21:52 PM , Rating: 2
In theory; in practice I expect bugs and other issues to rear their ugly heads... This will cause real-world incompatibilities where in theory none should exist. If they share, then at least AMD and nVidia share the same bug set and split the costs of fixing them.


RE: If nvidia and amd were smart...
By Klober on 6/10/2008 12:14:31 PM , Rating: 3
So you can get your facts straight: just because the PCs are sold by Dell/HP/whoever doesn't mean they're all Intel-based. And even the ones that are Intel processor-based aren't necessarily Intel chipset-based. Look at many of the Dell systems - XPS in particular - you guessed it, nForce motherboards. I used to work for Dell; trust me, there are tons of non-Intel motherboards in the market, both business and consumer. AMD and nVidia are doing this to force Intel's cooperation, and most likely they'll end up succeeding. If for no other reason, it will be because of pressure from OEM PC builders who don't want their products, built with varying chipset motherboards, to have separate compatibility issues with peripherals.


By FITCamaro on 6/9/2008 12:50:15 PM , Rating: 2
Uh, that sounds exactly like what the article is describing.


By TimTheEnchanter25 on 6/9/2008 1:10:42 PM , Rating: 2
quote:
They'd develop the open host controller in tandum ensuring that devices work properly between nVidia and AMD products leaving Intel products as the "odd man out".


The new Intel motherboards will greatly outsell any AMD/Nvidia motherboard with their alternate standard, just like they do now. So, even though it is 2 companies against one, it won't make Intel the odd man out.

It's hard to tell if Intel's story is true or not. It does make sense for them to finish it before releasing it, but they could also just say "here's what we have so far."

It is just foolish for AMD and Nvidia to make a standard that most likely won't be supported by any device companies.


By IntelNick on 6/11/2008 7:35:45 PM , Rating: 2
Hi, I work for Intel. For Intel's side of the story, pls read this: http://blogs.intel.com/technology/2008/06/usb_30_f...

Nick


Intel got it right...
By Strunf on 6/9/2008 12:48:44 PM , Rating: 2
How many of you didn't grow tired of the draft-n crap? For almost a year companies kept throwing out draft-n, spec x.x and whatnot Wi-Fi products, and in the end they either didn't work with each other or they did, but at g speeds...

Besides, USB 3.0 is arguably a selling point in its first 6 to 12 months.




RE: Intel got it right...
By FITCamaro on 6/9/2008 12:52:33 PM , Rating: 2
Honestly for what I use USB for, USB 2.0 is more than enough. I didn't even know they were developing USB 3.0.

I guess one potential use of this new spec is a new way to connect an external graphics card.


RE: Intel got it right...
By daftrok on 6/9/2008 2:05:06 PM , Rating: 1
Here are plenty of areas where USB 3.0 could be used:
1) External Hard/Flash Drives (imagine the blazing speed)
2) Monitors (if the monitor had low enough power usage, that USB cable would be all you need for data and power)
3) Wired routers (no need for Ethernet cables)

Imagine USB 3.0 finally replacing Ethernet and VGA cables. Granted, HDMI is still faster and I doubt USB 3.0 can handle 2560x1600 monitors, but still, it's a step in the right direction.


RE: Intel got it right...
By PrinceGaz on 6/9/2008 2:48:11 PM , Rating: 2
Imagine USB 3.0 replacing Ethernet? No thanks. USB is fine for many tasks, but for reliable and robust network connections I'll stick with Ethernet, which is designed specifically for that purpose.

Honestly, is there anyone here who would willingly choose to connect a router or modem using USB instead of ethernet, if they had the option of using either?


RE: Intel got it right...
By JonnyDough on 6/10/2008 6:14:21 AM , Rating: 2
To answer your stupid question, if it was faster than Ethernet (10Gb Ethernet anyway) and just as secure and reliable? Youbetcha.


RE: Intel got it right...
By mallums on 6/9/2008 2:58:58 PM , Rating: 3
USB of any stripe won't replace Ethernet. Ethernet works for long distances, USB only for short ones. USB 3.0 is an optical cable standard (with a copper alternative allowed) and, while you could hook a router to a computer within three feet of each other, you still need Ethernet for the whole building.


RE: Intel got it right...
By larson0699 on 6/9/2008 3:08:19 PM , Rating: 2
I think your expectations are a bit high.

ATA drives will always be bottlenecked by the USB adapter; just go eSATA (if it's that important to you, you'll have the motherboard for it... and more come equipped every quarter)

Likewise, flash will only perform to the abilities of NAND, and while the sky is falling with USB 2.0, I wouldn't expect anything near 5 Gbps in flash.

The monitor idea is just crazy. A fractional amp won't power anything but the smallest LCD... like a PSP screen. Most embedded and server applications already integrate a dated ATI or XGI GPU through VGA, so good luck replacing that.

10-gigabit Ethernet? xBASE-T is all too prevalent in the industry for these server rooms to just toss out their Ethernet cables and start over with USB. Not only would less experienced admins be confused to hell between network cables and peripherals, but IIRC there is no defined networking standard for USB outside of ad hoc, which is OS-dependent anyway.

I simply don't understand the USB-IF's decision in scaling bandwidth between revisions of their own standard.

The AGP and PCI-E buses (respectively) were always scaled relatively, i.e., 2x, 4x, 8x / x1, x2, ... x32, which made it simple to determine which fit the application best (AGP 8x didn't catch on for a while for this reason)

Then we have USB 1.1, *20 = USB 2.0, *10 = USB 3.0. Why not just another 20 times for consistency? There's never too much bandwidth, unless of course the difficulty in designing the host controller rides on instability at high speeds. Then it's just a matter of "why bother?" There were a few high speed buses before USB 3.0, but it survives on its dominance if nothing else.
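For reference, plugging in the commonly quoted nominal signaling rates gives the actual step sizes between revisions (a quick sketch; 12 Mbit/s, 480 Mbit/s and 5 Gbit/s are the nominal figures, not real-world throughput):

    # Nominal signaling rates per USB revision (Mbit/s) and the step between them.
    rates = {
        "USB 1.1 (full speed)": 12,
        "USB 2.0 (high speed)": 480,
        "USB 3.0 (SuperSpeed)": 5000,
    }
    versions = list(rates)
    for prev, cur in zip(versions, versions[1:]):
        print(f"{prev} -> {cur}: x{rates[cur] / rates[prev]:.1f}")
    # USB 1.1 (full speed) -> USB 2.0 (high speed): x40.0
    # USB 2.0 (high speed) -> USB 3.0 (SuperSpeed): x10.4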


RE: Intel got it right...
By omnicronx on 6/9/2008 3:19:45 PM , Rating: 2
Most of your post seems bang on, except for this
quote:
ATA drives will always be bottlenecked by the USB adapter; just go eSATA (if it's that important to you, you'll have the motherboard for it... and more come equipped every quarter)
My guess is they are not calling it "PCI Express over cable" for no reason. Previously USB had to travel through the motherboard's chipset (usually the southbridge), which is what crippled the transfer speeds for ATA drives over USB. It sounds like USB 3.0 gets past this issue and has some sort of direct access to the PCI Express lanes, which would all but break any barriers previous versions of USB had.


RE: Intel got it right...
By larson0699 on 6/9/2008 3:41:20 PM , Rating: 2
Yeah, I think I overlooked that.

Bang on!


RE: Intel got it right...
By nitin213 on 6/9/2008 10:55:53 PM , Rating: 2
Well, bang on, as said in the other post...
Though I don't think a fractional amp would power the PSP screen either... or, for that matter, any screen with a backlight...

Cheers,
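For a rough sense of the power budget being debated here, a small sketch: the current limits are the nominal per-port spec values, and the display figure is only an assumed ballpark.

    # Nominal per-port bus power vs. an assumed ballpark for a backlit desktop LCD.
    usb2_power_w = 5.0 * 0.5   # USB 2.0: 5 V at 500 mA -> 2.5 W
    usb3_power_w = 5.0 * 0.9   # USB 3.0: 5 V at 900 mA -> 4.5 W (SuperSpeed devices)
    lcd_power_w = 20.0         # assumed ballpark draw for a modest backlit desktop monitor
    print(usb2_power_w, usb3_power_w, lcd_power_w)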


RE: Intel got it right...
By Vinnybcfc on 6/10/2008 3:31:08 PM , Rating: 2
quote:
Then we have USB 1.1, *20 = USB 2.0, *10 = USB 3.0. Why not just another 20 times for consistency? There's never too much bandwidth, unless of course the difficulty in designing the host controller rides on instability at high speeds. Then it's just a matter of "why bother?" There were a few high speed buses before USB 3.0, but it survives on its dominance if nothing else.


Does it matter that it is consistent? The fastest possible speed at the lowest possible price is all that counts.

So you would not bother making a new bus if it isn't exactly the same leap as before?

quote:
ATA drives will always be bottlenecked by the USB adapter; just go eSATA (if it's that important to you, you'll have the motherboard for it... and more come equipped every quarter)


That's assuming the USB 3.0 adapter will be bottlenecking the system; it is possible that they will improve it.


USB 2.0 was great, but will 3.0 really benefit us?
By bupkus on 6/9/2008 12:52:57 PM , Rating: 3
Will USB flash drives, even if designed for 3.0, be able to use the additional speed? The 2.0 drives appear to be rather slow judging by the reviews of these products at Newegg.
USB flash drives and older external hard drives with USB 2.0 ports are the only devices I use that could stress any USB standard. Today, any new external HDD I would buy would be eSATA.




By FITCamaro on 6/9/2008 1:19:12 PM , Rating: 3
Many new PCs still don't have eSATA though. And you can only connect one drive to it. With USB, you could get a 3.0 hub and connect many drives. Just like people do with 2.0 today.


By larson0699 on 6/9/2008 3:26:28 PM , Rating: 2
You're right in that it is much more economical to opt for the USB hub and array the drives from there. I do hope that the efficiency of those ATA/USB adapters improves in the USB 3.0 era, but for my own applications I'd rely on as few adapters as possible. Thus eSATA is more to my liking. (If only it supplied power to smaller drives like USB does...)

Similarly, you could assemble a small external workstation (much in the context of NAS, just without Ethernet) with multiple drives in RAID, channeled through the single eSATA link to the host. Of course we have the disadvantage of a single link's bandwidth (as well as the additional SATA controller), but also a ton of space consolidated into one reliable native ATA volume. I would give that my vote for externally streamed media / mass storage over USB any day, even if USB 3.0 and the conversion therein surpasses it in raw numbers.

Unless of course there were ever a native USB hard drive.

Until then, I'll leave USB to my input devices, printers/MFD's, and flash drives.


By mallums on 6/9/2008 3:03:36 PM , Rating: 2
Flash seems slow because it is slow. It gets faster all the time, but USB 3.0 is not yet necessary.


Just checked....
By meewok on 6/9/2008 12:57:36 PM , Rating: 2
...the USB-IF website and both nvidia and amd are members. Isn't the point of a group like this to collaborate in the design and development of open standards and the "open" host controller that goes with it?

If Intel isn't happy with the distribution of work or how the group is being run, perhaps they need to bring it up with the group, or leave the group and develop something on their own without marketing it under the guise that it is being developed in conjunction with the USB-IF.

Sure, we don't have much of the backstory, but isn't this somewhat logical?

Just my .02.




RE: Just checked....
By finalfan on 6/9/08, Rating: -1
RE: Just checked....
By rudolphna on 6/10/2008 8:29:50 AM , Rating: 2
That doesn't tell you jack.


RE: Just checked....
By Vinnybcfc on 6/10/2008 3:37:36 PM , Rating: 1
And if you actually read it you would see this:

quote:
The Universal Serial Bus (USB) 3.0 Promoter Group is looking for additional contributors to its initial draft of the group’s proposed specification with a goal to have it completed by the first half of 2008.


RE: Just checked....
By Vinnybcfc on 6/10/2008 3:39:53 PM , Rating: 2
1 rating on posting? I thought that only happened with swearing?


Bad
By B3an on 6/9/2008 11:52:07 AM , Rating: 5
This is very bad for all computer users. It could be USB 1.0 all over again, where some devices would work and some wouldn't because there were 2 standards.




Intel Monopoly
By GoatCheez666 on 6/9/2008 12:19:35 PM , Rating: 5
This is just another example of Intel taking advantage of its position to distance itself from the competition. Ask the Intel rep why they haven't been giving out their working drafts of the spec. Intel should be releasing what it has so far, even before the OHC spec is complete, so other companies can begin work on producing their parts. By not releasing anything at all, Intel is most definitely giving itself an unfair advantage in the market.




Odiogo is nice
By Einy0 on 6/11/2008 1:14:22 AM , Rating: 2
I love the built-in text-to-speech. It's great, and it comes in handy too. One issue that drives me nutz... Someone needs to teach the ass clown reader software to pronounce chipset right. It says chutset...




RE: Odiogo is nice
By BarkHumbug on 6/11/2008 10:23:48 AM , Rating: 2
Agreed, although I think it sounds more like "chussets"...


external gaming cards
By cotdt on 6/9/2008 4:46:05 PM , Rating: 3
You guys forget that external video cards will only work on AMD/nVidia USB 3.0 ports, and will not be compatible with Intel's implementation. This is a huge incentive to go for AMD/nVidia chipsets, especially for people who want to game on their laptops.




Damn!
By amanojaku on 6/9/2008 11:59:47 AM , Rating: 2
Creating a second, potentially incompatible specification is the wrong way to go. Manufacturers and consumers alike will suffer, whether it's incompatibilities or higher prices due to licensing fees. Since AMD and NVIDIA are both part of the USB-IF, can't they sue for money or, better yet, for their share of the development effort? There are 700+ members of the USB-IF; someone can help Intel get 'er done.




By Targon on 6/9/2008 1:16:21 PM , Rating: 2
Remember the whole issue of Microsoft OOXML and how Microsoft managed to get it listed as a standard, even though it wasn't really open and Microsoft had a head-start on development because they were the ones driving the development of the standard?

This is a similar situation: something that is supposed to be open is being pushed by one company, but that company refuses to work with others to make sure things are properly interoperable. A normal "standard" doesn't see products in development until much closer to the release of the standard, rather than having the standard follow the work of the one "leading" company. This sort of thing tends to upset people and cause a LOT of compatibility problems. It is one thing to help coordinate efforts, but it is another to develop a product and then push to make it into an open standard.

In this case, it seems more like Intel decided to come up with a new technology based on stuff they do not hold the rights to. So, they do the development work, but then put it forward into an "open" standard in the hopes that it will give them an edge. I would love to see this "USB 3.0" get developed for another year, fully in the open, in order to nullify the advantage Intel has made for itself.




"A lot of people pay zero for the cellphone ... That's what it's worth." -- Apple Chief Operating Officer Timothy Cook














botimage
Copyright 2015 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki