
The struggle between DisplayPort and HDMI continues

This week marks an important step forward for the DisplayPort special interest group, as the Video Electronics Standards Association (VESA) has officially approved DisplayPort version 1.1 as an industry standard. Despite the approval, a struggle continues in the graphics industry over which technology will become the de facto high-definition PC interface: DisplayPort or HDMI.

According to VESA, the DisplayPort standard has come quite a long way. "DisplayPort 1.1 gives manufacturers of LCD panels, monitors, graphics cards, PC chipsets, projectors, peripherals, components, and consumer electronics a next generation digital interface that is designed to replace LVDS, DVI, and eventually VGA," said the statement.

VESA indicates that the benefits of DisplayPort are significant, and that the group expects DisplayPort to be integrated into many next-generation PCs. "Our task groups and committees within VESA worked very hard to ensure that DisplayPort 1.1 satisfies the important objectives it is designed for, and as a result, this new version has widespread support among all the leading computer and consumer electronics suppliers."

Major developers like AMD, NVIDIA, HP, Intel, Lenovo and Samsung have said that they will fully support DisplayPort. According to the release:
Available throughout the industry as a free to use, open and extensible standard, DisplayPort is expected to accelerate adoption of secure digital outputs on PCs, enable higher levels of display performance, and introduce high volume digital displays that are simpler, thinner, and easier to use than VGA.
On the other end of the spectrum, the groups backing HDMI argue that while DisplayPort has valid features, HDMI can do everything DisplayPort can and more. The most prominent factor, however, is that DisplayPort does not have solid definitions for licensing. Although the DisplayPort group claims that fees are little to none, the HDMI group points out that there are also no restrictions on adding fees at a later date.


Comments



By hmurchison on 4/3/2007 7:48:57 PM , Rating: 5
HDMI is a debacle.

At CES 2007, how many HDMI-equipped AVRs were there? Sherwood Newcastle is the only one I know of.

Search Google for HDMI problems and marvel at the endless pages of info denoting the clusterfck that HDMI is.

Handshaking issues, plugs that fall out because they have no locking mechanism, a proliferation of formats (is it HDMI 1.1, 1.2, 1.2a, 1.3, or 1.3a?)...

Or what about useless HDMI 1.3, which supports native TrueHD and DTS-HD, yet no AVR is smart enough to handle them right, so the HD DVD or Blu-ray players just mix the audio down and send out PCM.

What a useless connectivity standard, fraught with flaky DRM and weak performance. DisplayPort shouldn't have to improve much to usurp the stinkfest that HDMI is.




By WayneG on 4/3/2007 7:56:57 PM , Rating: 2
I see it this way: HDMI is a format designed primarily for home entertainment devices, which is evident from the fact that it carries an audio signal as well. The numerous revisions make it nearly impossible to implement on a PC, since PC users want the best all the time and can't afford to change monitors and graphics cards to suit the latest HDMI revision.

DisplayPort, on the other hand, is designed primarily for graphics cards, with none of that silly audio routing (DTS, etc.), which to be fair is more of an annoyance to the graphics, monitor, and sound card manufacturers, since they have to find a way to route the sound from the sound card through the graphics card to your speakers, which adds latency and causes unnecessary problems. It's fairly easy to see that the most majestic method of transporting video on the PC is simply a dedicated cable, in this case DisplayPort. HDMI should stay with media and entertainment, since those consumers like simplicity: they don't care what HDMI revision they have or how the audio is handled, they just want it to work. Simple as ;)


By KristopherKubicki (blog) on 4/3/2007 10:11:47 PM , Rating: 3
DisplayPort has HDCP and another copy protection scheme called DisplayPort Content Protection. Of course, UDI, HDMI and DisplayPort can all operate fine without copy protection schemes as well.


By hmurchison on 4/4/2007 1:07:52 AM , Rating: 5
I don't mind that it has DRM. I mind that it has flaky performance because of DRM. If consumers have to deal with DRM it should be as transparent as possible. I shouldn't have to memorize a power on sequence just to get my HDMI products to sync right. hmurchison getting gone baby.


By BillyBatson on 4/3/2007 10:42:33 PM , Rating: 2
SOOO true. As stated in the article, DP (at this point in time, anyway) is supposed to be free or very cheap to license, while HDMI is supposed to be expensive, especially depending on which version and features you support, and that cost is no doubt passed down to the consumer, no matter how small it is.
Anyone know roughly how much ends up coming out of the consumer's pocket for purchasing a device with HDMI support?


By hmurchison on 4/4/2007 1:10:33 AM , Rating: 2
Billy, HDMI seems to be a licensing boondoggle. One of the problems is that a lot of the cool stuff is "optional" in the spec. I've heard that HDMI is not that cheap, and the chipsets aren't made by many companies, so pricing stays high. Look how long it has been around, and its penetration is still relatively low.


By Lakku on 4/4/2007 12:30:56 AM , Rating: 2
I don't see the need for TrueHD and DTS-HD when I already have lossless linear PCM audio right now. It is a bit-for-bit match of what the filmmakers/studio intended, and it's hard to get any better than that. If you care about TrueHD or DTS-HD, I say you are buying into hype and just allowing them to collect licensing fees for something that is essentially not needed. Besides, ultimately, PCM is just the digital representation of an analog wave, and all Dolby and DTS do is allow for the compression of that wave, but it is still technically PCM audio. So to conclude: linear PCM is lossless, and therefore I put forward that DTS-HD and TrueHD are not needed, or are just marketing hype so these companies can continue to make money in a world where compression is no longer needed. Who cares if any output method supports them?
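
To make the "digital representation of an analog wave" point concrete, here is a minimal Python sketch of what a linear PCM stream is; the 48 kHz / 24-bit / 5.1 parameters are illustrative assumptions, not figures from any particular disc:

    import math

    SAMPLE_RATE = 48_000   # samples per second (assumed, common on Blu-ray)
    BIT_DEPTH = 24         # bits per sample (assumed)
    CHANNELS = 6           # 5.1 surround (assumed)

    def pcm_samples(freq_hz, seconds):
        """Quantize a sine wave into signed-integer PCM samples."""
        max_val = 2 ** (BIT_DEPTH - 1) - 1
        n = int(SAMPLE_RATE * seconds)
        return [round(max_val * math.sin(2 * math.pi * freq_hz * t / SAMPLE_RATE))
                for t in range(n)]

    # One millisecond of a 1 kHz test tone:
    print(pcm_samples(1000.0, 0.001)[:4])

    # Raw bitrate of the stream -- no codec involved, hence "lossless":
    print(SAMPLE_RATE * BIT_DEPTH * CHANNELS / 1e6, "Mbps")  # 6.9 Mbps

A lossless codec like Dolby TrueHD or DTS-HD Master Audio compresses exactly this stream and reproduces it bit for bit on decode, which is the sense in which the commenter calls it "still technically PCM audio."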


By hmurchison on 4/4/2007 1:13:32 AM , Rating: 2
DTS-HD and Dolby TrueHD are superfluous features if you have enough space to include uncompressed PCM. I won't argue against PCM, but sadly both HD DVD and Blu-ray are constrained at some point by the size of the movie, and concessions have to be made. If space is tight, then DTS-HD or TrueHD is a way to claw back some data rate. If you have an ocean of storage available, then PCM is the "free beer" choice.
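
A rough back-of-envelope sketch (in Python) of the storage tradeoff described above; the 2:1 lossless compression ratio is an assumed ballpark for illustration, not a published figure:

    RUNTIME_S = 2 * 60 * 60        # a two-hour film
    PCM_BPS = 96_000 * 24 * 8      # 96 kHz / 24-bit / 7.1 uncompressed (assumed)
    ASSUMED_RATIO = 2.0            # hypothetical lossless compression ratio

    pcm_gb = RUNTIME_S * PCM_BPS / 8 / 1e9
    print(round(pcm_gb, 1), "GB uncompressed")          # ~16.6 GB
    print(round(pcm_gb - pcm_gb / ASSUMED_RATIO, 1),
          "GB handed back to the video encoder")        # ~8.3 GB

On a 25 GB single-layer Blu-ray disc, that reclaimed space is exactly the data-rate concession being described.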


By Visual on 4/4/2007 5:02:29 AM , Rating: 5
But storage space and connection bandwidth are two different things. If you lack storage space, you're free to store the audio as DTS-HD or TrueHD on the media. The player should still decompress it and send it as PCM through the cable. There is no logical reason to want compressed audio going over the cable, unless you're trying to save on bandwidth. The video is sent uncompressed, after all; why should the audio be different?

As it is now, HDMI 1.3 has 10.2 Gbps of bandwidth. Uncompressed 1080p60 video at 8 bits per channel is 60 x 1920 x 1080 x 3 x 8 = 2.8 Gbps, and even Deep Color (10, 12, even 16 bits per channel) is at most twice that, so at most around half the available bandwidth. In comparison, audio is absolutely insignificant. 96 kHz, 16-bit PCM is less than 0.0015 Gbps per channel, and even an excessive 192 kHz, 24-bit, no-such-surround 22.1-channel stream would be a mere 0.1 Gbps... why the hell would I want to send it compressed?
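
Those figures check out; a short Python sanity check (the quoted 2.8 Gbps corresponds to binary gigabits, i.e. 2^30 bits):

    GIB = 2 ** 30  # the comment's "Gbps" are binary gigabits

    video = 60 * 1920 * 1080 * 3 * 8     # 1080p60, 8 bits per channel
    print(round(video / GIB, 2))         # 2.78 -> the "2.8 Gbps" above

    audio_ch = 96_000 * 16               # one 96 kHz / 16-bit PCM channel
    print(round(audio_ch / GIB, 4))      # 0.0014 -> "less than 0.0015 Gbps"

    extreme = 192_000 * 24 * 23          # hypothetical 22.1-channel stream
    print(round(extreme / GIB, 2))       # 0.1 -> audio stays insignificant

    # Even doubled for 16-bit Deep Color, video fits in ~5.6 of the
    # 10.2 Gbps HDMI 1.3 link, leaving ample room for uncompressed audio.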


By vanka on 4/4/2007 3:06:33 PM , Rating: 2
quote:
If you lack storage space, you're free to store the audio as DTS-HD or TrueHD on the media. The player should still decompress it and send it as PCM through the cable. There is no logical reason to want compressed audio going over the cable, unless you're trying to save on bandwidth.


Most home theater geeks will disagree with you on that one. Say I spend several thousand on a state-of-the-art A/V receiver that supports DTS-HD and TrueHD, with algorithms that recover the bits lost in compression (kind of like what the X-Fi does for MP3s). Under your proposal I pay extra for an HD player that has to decompress the audio stream itself, and the player doesn't use an optimal algorithm.


By Zorlac on 4/4/2007 4:30:27 PM , Rating: 2
"DTS-HD" is not lossless.

"DTS-HD Master Audio" is lossless.

I made this mistake on AVSForum once and labeled myself as a n00b without even realizing it! ;)


By vanka on 4/4/2007 5:52:34 PM , Rating: 2
I was working on the assumption (which you just confirmed) that DTS-HD is lossy. The whole point of my post was that there is a logical reason for wanting HDMI to transfer compressed audio without the HD player decompressing it first: namely, if I have invested several grand in an A/V receiver with advanced algorithms to restore the lost bits of lossy codecs.


By fic2 on 4/4/2007 2:21:10 PM , Rating: 2
According to a friend of mine who is doing HDMI connection programming, this is SO true. I was talking to him a couple of weeks ago and he went on and on about the problems he had encountered: he would get one device to work, and another that had been working would quit.

And I thought my job doing programming in hex was bad...


One cable to rule them all...
By Lazarus Dark on 4/4/2007 6:50:29 AM , Rating: 2
Okay, lame jokes aside, I have wondered for some time: why don't we just use one cable standard for everything? What I mean is, when I read that HDMI has 10-whatever Gbps, I think: my Ethernet doesn't even have that. I understand high-def video needs that bandwidth, but why can't I use that bandwidth for other devices?

USB has been great in that it has been universally accepted, but 2.0 won't last forever; eventually we will need more for our external devices. eSATA is starting to be more widely adopted. External PCIe is coming. But all of these only do some things. Why not make one cable that transfers any data? Take the bandwidth of HDMI and adopt protocols so that any data can be transmitted, whether it's uncompressed audio/video to a display, data from an external hard drive to the PC, or traffic between networked PCs. One cable standard that does everything, with proper protocols and standards: I would think this would be ideal. Just add in some power for powered devices and you eliminate 90 percent of the cable mess we have to deal with.

Am I just daydreaming? Is there some reason this is not doable? Other than royalties on frickin cable designs, that is. Seriously, what do you guys think?




RE: One cable to rule them all...
By Mitch101 on 4/4/2007 11:33:32 AM , Rating: 2
We wouldn't have 20 different power cords either. Amazing how many variations we can come up with for 2- and 3-prong connections. And it's not like any of them are that much smaller than the others, so it's not a design requirement.


RE: One cable to rule them all...
By Mitch101 on 4/4/2007 11:35:17 AM , Rating: 2
USB deserves a kick in the nuts as well with how many different connections it can have. I have 3 different USB cables hanging off my PC because one device after another just had to do it differently than everyone else.


RE: One cable to rule them all...
By Hawkido on 4/4/2007 6:15:55 PM , Rating: 2
quote:
Why don't we just use one cable standard for everything?


That's a good question. I had the same thought, and found a bunch of experienced CSEGs (computer systems engineers) to give me the answer. The more types of data protocols and formats you transmit over an interface, the more overhead and latency you create. The overhead is added with every packet or cell in a stream of data, to identify what type of data it is, and the latency comes from dealing with the varying packets and deciding what to do with the data based on the type of packet or cell. Plus you add contention to any media used by more than one device, causing additional latency and jitter (deviation in latency caused by bad luck in the backoff roll of the dice).
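
A toy illustration of that overhead argument; the header and payload sizes below are invented for the example and don't come from any real interface spec:

    def efficiency(payload_bytes, header_bytes=16):
        """Fraction of the wire actually carrying data once every
        packet is tagged with a type/routing header."""
        return payload_bytes / (payload_bytes + header_bytes)

    print(f"{efficiency(4096):.1%}")  # bulk transfer: 99.6% efficient
    print(f"{efficiency(8):.1%}")     # 8-byte mouse report: 33.3% efficient

Bulk traffic amortizes the tag, but the small, latency-sensitive packets a universal cable would also have to carry do not, and interleaving the two kinds of traffic is where the jitter comes from.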

To add to the muck, high-speed, high-density data transmission requires shielding to prevent signal corruption, making the cables more expensive to produce and requiring much more QA at production.

Switching to one cable interface would mean that your mouse would cost an additional $15-$20 and have a cable half an inch in diameter that is about as flexible as a stick.

So one cable, no. Perhaps 3 cables (USB 2.0 for light local comm, a variant of CAT5 (very inexpensive) for long-haul comm, and a heavily shielded cable for short-range, high-volume data), which is pretty much what we have now, although we have different flavors or brands (licensing, ya' know) for each of the 3 basic classes of comm cable.

I hope this sheds some light, and as always I am mostly repeating hearsay that may have been dumbed down to the level my sources think I can handle... LOL


By Lazarus Dark on 4/5/2007 12:29:12 PM , Rating: 2
That actually does make some sense, I suppose. Then again, like someone said, eventually we'll probably go mostly wireless. I'm already using Bluetooth for my keyboard/mouse. Wireless USB is coming out. 802.11n could eliminate the network cable for many. Maybe WiMAX cards could take that further. And some companies, I believe, are even working on wireless HDMI. I'd much rather see that. So I guess eventually we'll eliminate most of the cables.


By thecoolnessrune on 4/4/2007 7:08:47 PM , Rating: 2
VGA: ""Thats what we are TRYING to do, if these other companies wouldn't keep getting in the way!"

DVI: "Thats what we are TRYING to do, if these other companies wouldn't keep getting in the way!"

HDMI: "Thats what we are TRYING to do, if these other companies wouldn't keep getting in the way!"

UDI: "Thats what we are TRYING to do, if these other companies wouldn't keep getting in the way!"

DP: "Thats what we are TRYING to do, if these other companies wouldn't keep getting in the way!"


UDI?
By Chriz on 4/3/2007 7:55:04 PM , Rating: 2
Um, isn't UDI still competing with HDMI and DisplayPort also? That was the one I was interested in, but I guess nothing happened with it, since it wasn't even mentioned in this article.




RE: UDI?
By hmurchison on 4/3/2007 8:20:20 PM , Rating: 2
CES 2007 saw Samsung and Intel leaving UDI behind. Apple will likely not stick with UDI either (my guess).


RE: UDI?
By KristopherKubicki (blog) on 4/3/2007 10:09:56 PM , Rating: 2
Yeah, I just got confirmation from Intel the other day -- UDI is looking kinda dead in the water.


If there's a bright side...
By MonkeyPaw on 4/3/2007 10:52:34 PM , Rating: 2
I wonder how hard it would be to produce graphics cards that have both ports? Cards have shipped with D-sub and DVI for years, and HDMI and DisplayPort are both smaller connectors -- it looks like you could have 2 of each on the backplate with room to spare. If that were the case, you wouldn't have to worry about which standard has more support.

Has DVI reached its limits? Is this why new ports are in the making, or is it just for DRM and all-in-one (video+audio) connections?

I guess what disappoints me is how these video connections and HD-content wars don't even have final standards. PCIe and SATA arrived with relatively few revisions, and backwards compatibility doesn't appear to be a problem when revisions are announced. HDMI/DP and HD-codecs don't appear to be so set-in-stone. Why would I want to adopt any of it when I have no confidence in its long-term value? Is it any wonder the tech sector is so volatile?




RE: If there's a bright side...
By Visual on 4/4/2007 5:25:52 AM , Rating: 2
DVI has reached its limits, yes -- but dual-link DVI comes to the rescue, so there's no real reason to abandon DVI yet.

HDMI started out as just DVI plus audio over the same cable, with a smaller, nicer connector -- definitely a win for CE devices, and not bad to have on PCs too. Later revisions 1.1, 1.2, and 1.2a changed nothing in the physical connection; they only further specified things like content-protection metadata and the Consumer Electronics Control features and command sets. Then 1.3 more than doubled the single-link bandwidth (i.e., it gets more bandwidth not by adding more wires but by increasing the frequency).
So the latest HDMI is a good deal better than DVI, and given that it's already winning ground in the CE market, I would choose it as the best for PCs as well. Licensing fees are the only reason alternatives like DisplayPort are even considered, but I've never seen actual price differences quoted, so I can't say whether it's worth it... anyone have a clue how much we're talking about?
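
For what it's worth, that bandwidth increase can be sketched from the link's clock rates; a simplified Python calculation assuming the usual 3-channel, 10-bits-per-symbol TMDS signaling:

    def tmds_gbps(clock_mhz, channels=3, bits_per_symbol=10):
        # TMDS sends each 8-bit value as a 10-bit symbol per clock tick,
        # so the raw link rate scales directly with the clock frequency.
        return clock_mhz * 1e6 * channels * bits_per_symbol / 1e9

    print(tmds_gbps(165))   # 4.95 Gbps -- single-link DVI / early HDMI
    print(tmds_gbps(340))   # 10.2 Gbps -- HDMI 1.3: same wires, faster clock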

I really don't like the idea of having to deal with yet another non-standard standard just to save a dollar on my video card; I'd rather stick with HDMI.


RE: If there's a bright side...
By jtesoro on 4/4/2007 7:28:27 AM , Rating: 2
I'm connecting my video card to my 1280x1024 LCD via a standard VGA cable. Somehow I feel that I'm not really missing anything (video-wise) compared to DVI, HDMI, or whatever. Am I wrong here? In case it's relevant, I use my PC for gaming and web access. No movies on this setup.


UDI is dead
By iwod on 4/3/2007 9:53:14 PM , Rating: 2
I think Apple is more likely to be in alliance with Intel over this issue. Since Intel has pulled out, I suppose Apple will as well.

But wasn't DisplayPort 2.0 in the making as well? I thought 1.1 was old news?




RE: UDI is dead
By KristopherKubicki (blog) on 4/4/2007 1:34:54 AM , Rating: 2
It takes a long time to ratify these types of standards. 2.0 was just proposed; it might be a year or more before it's even ratified.


do we even want cables?
By gorobei on 4/4/2007 7:52:07 AM , Rating: 2
Next-gen phones and MP3 players are going wireless for data transfer, and so are printers and NAS. With Bluetooth, Wireless USB, and 802.11n coming out, more and more of our devices are getting rid of wired connections, or only using them for charging batteries.

If you extrapolate, the next items would be TVs, home theater speakers, and PC monitors. In such a wireless world, data encryption would be pretty critical. But at least the standard would use the same hardware and let the manufacturers fight it out in software.

Either way it's a small price to pay for not having to worry about wires and cables and my cat chewing through them and being electrocuted.




RE: do we even want cables?
By bldckstark on 4/4/2007 12:47:40 PM , Rating: 2
Having a cat chew through them and get electrocuted is reason enough for laying 480V cable throughout your house.

(yes, this is a joke, I have a cat)


Why, Why
By andrinoaa on 4/4/2007 5:58:42 PM , Rating: 2
Guess what us plebs want?
One frigin cable to send and receive. HOW HARD IS THAT??
Standards are fine, but what a waste of resources we have at the moment. Just try buying a TV and hooking up the DVD player, set-top box, VCR, and computer. There is just too much crap.
All those connectors cost us money and time to figure out.
Guys, this is the year 2007; surely we are smarter than the current setup? This is a classic case of the competition mantra being HUMBUG!




"It looks like the iPhone 4 might be their Vista, and I'm okay with that." -- Microsoft COO Kevin Turner

Related Articles
The Future of HDMI
February 19, 2007, 12:47 AM
DisplayPort 1.1 Adds HDCP Support
November 7, 2006, 7:17 AM













botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki