


Yet another digital display connectivity standard

DVI-I, DVI-D, UDI, HDMI -- a confusing group of abbreviations for many. Interestingly, all of them do similar things, and the two latter ones attempt to address the same issues, including backwards compatibility, while still differing from each other. As far as standards go, computer and digital displays have pretty much settled on one big standard, DVI. However, industry supporters say that connectivity is still too confusing and will now launch a newer standard, called DisplayPort.

DisplayPort, designed by the VESA group, attempts to do one thing: unify digital display connection interfaces. Like UDI and HDMI, DisplayPort will be backwards compatible with DVI. The specification claims, however, that DisplayPort offers greater bandwidth for HD video while also offering a connection interface that's simple and easy to use. Dell's press release claims:

The DisplayPort specification also addresses the industry need for a ubiquitous digital interface standard with a compact connector, as well as optional content protection, that can be deployed widely at low cost. A protected digital interface that can be easily deployed on a PC enables broad access to premium content sources such as high-definition movies.

The DisplayPort interface is designed to be used for all types of digital display connections, including internal connections in a notebook, monitor, or TV. This capability makes it possible to avoid the costly signal translation from one display format to another that is required with today's display interfaces.
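For a rough sense of the bandwidth claim, the back-of-the-envelope sketch below compares the raw pixel payload of DVI against the first DisplayPort specification. The 165 MHz DVI pixel-clock ceiling, the 2.7 Gbit/s lane rate, and the 8b/10b encoding overhead are commonly published figures rather than numbers from this article, so treat it as an approximation:

```python
# Back-of-the-envelope link payloads (standard published figures, assumed here):
# single-link DVI tops out at a 165 MHz pixel clock; the first DisplayPort spec
# allows up to 4 lanes at 2.7 Gbit/s each with 8b/10b encoding (80% efficient).

def dvi_payload_gbps(pixel_clock_mhz, bits_per_pixel=24):
    """Raw pixel payload a DVI link carries, in Gbit/s."""
    return pixel_clock_mhz * 1e6 * bits_per_pixel / 1e9

def displayport_payload_gbps(lanes=4, lane_rate_gbps=2.7):
    """Usable payload after 8b/10b encoding overhead."""
    return lanes * lane_rate_gbps * 0.8

print(f"single-link DVI: {dvi_payload_gbps(165):.2f} Gbit/s")       # ~3.96
print(f"dual-link DVI:   {dvi_payload_gbps(330):.2f} Gbit/s")       # ~7.92
print(f"DisplayPort:     {displayport_payload_gbps():.2f} Gbit/s")  # ~8.64
```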


Interestingly, the original supporters of UDI and HDMI also vowed to offer the same things. What consumers hate most, however, is too many competing standards that just confuse the purchasing and learning process. Too many standards also drive up manufacturing costs: some manufacturers end up implementing an entire range of standards just so their customers won't be left out. This drives up costs, and in the end, the consumer is the one left paying. HDMI cables, for example, still cost upwards of $30 for even short lengths.

DisplayPort, however, is now receiving major industry support from Dell, HP and Lenovo -- three PC giants that together account for most of the world's desktop PC shipments. The HDMI forum recently said that HDMI would replace DVI by 2008. DisplayPort's supporters say, however, that their standard will be superior to existing and emerging ones.

On the other side of the industry, Blu-ray and HD DVD have yet to come to terms. Drive manufacturers are now implementing both standards in their drives, and it looks like we will have another DVD-R versus DVD+R situation on store shelves. For now, it is uncertain which digital interface will succeed, but DisplayPort definitely has significant industry backing.


Comments



It all comes down to money
By AppaYipYip on 5/5/2006 5:57:22 PM , Rating: 2
DVI has MORE than enough bandwidth for HD content; the manufacturers each want a piece of the profit. The only need for HDMI (which is actually terrible once the cable length goes beyond 12 feet) was to integrate digital sound and video into one connection.




RE: It all comes down to money
By Trisped on 5/5/2006 6:47:44 PM , Rating: 2
I thought HDMI was just supposed to add DRM to the DVI connection...

Still, I doubt that we will really be replacing dual-link DVI any time soon.


RE: It all comes down to money
By brystmar on 5/5/2006 9:48:13 PM , Rating: 2
Incorrect -- it is already possible to implement HDCP over DVI, although it is *optional* for the manufacturer to implement HDCP on a device with DVI. HDMI connections *require* HDCP protection on every port, in addition to integrating digital audio and control information over the same cable.


RE: It all comes down to money
By Trisped on 5/8/2006 1:35:00 PM , Rating: 2
Oh, I had HDMI and HDCP switched. Stupid companies with their dumb acronyms for useless tech just so they can make more money. If I had a flame thrower and some butter they would be so sorry.


RE: It all comes down to money
By ViRGE on 5/5/2006 7:59:56 PM , Rating: 2
The OP is correct. To put things in perspective, a single-link DVI connection can drive a panel up to 1920 x 1200 (that's 1080p, folks), and dual-link can go all the way up to 2560 x 1600. Single-link is enough for HD, dual-link is overkill; what would you possibly do with even more video bandwidth?
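A quick sanity check on those limits, assuming CVT reduced-blanking timings (the blanking figures below are rough assumptions; real monitor timings vary slightly):

```python
# Pixel clock needed for a mode = total pixels per frame x refresh rate.
# Single-link DVI caps at 165 MHz; dual-link doubles that to 330 MHz.

def pixel_clock_mhz(h_active, v_active, refresh_hz=60,
                    h_blank=160, v_blank=35):
    """Approximate pixel clock for a mode with reduced blanking, in MHz."""
    return (h_active + h_blank) * (v_active + v_blank) * refresh_hz / 1e6

print(pixel_clock_mhz(1920, 1200))  # ~154 MHz -- fits single-link's 165 MHz cap
print(pixel_clock_mhz(2560, 1600))  # ~267 MHz -- needs dual-link (2 x 165 MHz)
```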


RE: It all comes down to money
By SunAngel on 5/5/2006 9:57:23 PM , Rating: 1
The same reason knuckleheads want video cards with framerates greater than 30fps: bragging rights. Human eyes cannot notice anything greater, but the so-called enthusiast will definitely tell you they can (ROFL).

Why can't we just have HDMI with HDCP content protection? We would have a one-cable solution for audio and video, a protected path from PC to display, and backwards compatibility with DVI.


RE: It all comes down to money
By Ralph The Magician on 5/5/2006 10:41:13 PM , Rating: 2
Dude, 30FPS in a game is pretty shitty. Most human eyes can't notice anything above 30FPS when you are viewing FLUID MOTION.

It's very different when you are viewing independently rendered static images. There is a natural... I guess you could call it a "blurring" effect that happens when you record live motion, that doesn't happen in, say, a game.

The standard for things that are rendered by way of independent static images is actually more like 60FPS. That's the point where most people no longer benefit from any more FPS. That's also pretty much the "limit" for an LCD, because most LCDs only run at 60Hz, so they can only accept 60FPS. With CRTs, that limit is slightly raised to about 85FPS if you have a good CRT because the refresh rate is 85Hz.

Can you tell the difference between 60FPS and 85FPS? Some can. I can't. Can you tell the difference between 30FPS and 60FPS? Everyone can.


RE: It all comes down to money
By SunAngel on 5/6/2006 6:58:55 AM , Rating: 2
My bad. I keep forgetting about the first-person shooter folks. I do not play those types of games, and it did not occur to me that higher frame rates make the game SEEM more enjoyable.

I will hold to my statement for all other games. I would even go as far as to still say 30 fps in games is all that is necessary, because with the video card rendering at 30 fps and the refresh rate of a monitor or LCD set at 60+ Hz, that would in effect equal 90+ fps. At 90+ fps, if you do not see fluid motion then you really have more of a mental issue than a physical one.


RE: It all comes down to money
By gersson on 5/6/2006 10:12:41 AM , Rating: 2
http://amo.net/NT/02-21-01FPS.html

I logged in only to combat your trolling.
Just because you are uninformed doesn't mean we're nuts. In fact, quite the contrary. Look, do me a favor and put your refresh rate @ 60Hz and stare @ it all day. Receive your punishment.


RE: It all comes down to money
By lethalchronic on 5/6/2006 10:25:22 AM , Rating: 2
The fact is the human eye can see WELL BEYOND 60fps or 90fps. In fact, technology has not yet been invented that can produce image speeds the human eye cannot detect. For reference, frames per second translate directly to fractions of a second, i.e. 60fps equals one frame every 1/60th of a second (very simple). The following is a test currently employed by the USAF: they seat a pilot in a dark room with a black screen, then flash an image of an airplane (a single "frame") for 1/220th of a second. Not only can nearly all pilots see the image, they can also identify the aircraft type. This is proof that 220fps is nowhere near the threshold of the human eye.


RE: It all comes down to money
By hakan on 5/7/2006 4:17:52 AM , Rating: 2
You are comparing how the eye interprets successive frames (fluid motion) to single flashed images. They do not have much in common. Sure, perhaps a lot of pilots can pass that test, but I don't expect they would be able to identify the aircraft in successive images. At 220fps you're not likely to see much but blur. 220 images in one second. *brrrt* Now identify these aircraft, please.


RE: It all comes down to money
By Ralph The Magician on 5/6/2006 10:29:03 PM , Rating: 2
Wow. Just... wow. You have no idea what you are talking about. You don't add the refresh rate to the frame rate; that's not how it works. If the monitor is set to refresh at 60Hz and the input frame rate from the video card is 30FPS, the frame rate is still 30FPS. All that means is that the monitor refreshes every frame twice, even though it hasn't changed. The refresh rate is the "limit" of how many frames the monitor can handle from the video card. If you have a refresh set to 60Hz and a frame rate of 120FPS, you don't have 180FPS. You still see 60FPS. Every other frame is just dropped.

You really don't know what you are talking about. You should probably learn a little before you keep talking.
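A toy sketch of the behavior described above: a fixed-rate display shows whichever frame the card finished most recently, so frames get repeated when rendering is slower than the refresh rate and dropped when it is faster. The function and numbers are illustrative only:

```python
# Map each refresh tick to the most recently completed rendered frame.

def frames_shown(render_fps, refresh_hz, seconds=1):
    """Return which rendered frame each refresh tick displays."""
    shown = []
    for tick in range(int(refresh_hz * seconds)):
        t = tick / refresh_hz
        shown.append(int(t * render_fps))  # latest frame finished by time t
    return shown

print(frames_shown(30, 60)[:8])   # [0, 0, 1, 1, 2, 2, 3, 3] -- each frame shown twice
print(frames_shown(120, 60)[:8])  # [0, 2, 4, 6, ...] -- every other frame dropped
```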


RE: It all comes down to money
By SunAngel on 5/8/2006 7:51:53 AM , Rating: 1
I believe I follow you now. So, let me get this straight. I take my 1080i DVI TV and hook up my video card, which is capable of 30+ fps in any game. Just for the sake of argument, let's choose Doom 3 as the game and the Nvidia 6800 Ultra as the graphics card. This card executes well over 80 fps in this game. But because my TV only refreshes at 30 Hz, the game will be jerky and tearing because of the "limit" of the refresh rate of the TV. Yet it's tearing because the card is rendering faster than the refresh rate of the TV.

I may be mistaken, but tearing occurs because of ADDITIONAL frames. Any video card rendering faster than the refresh rate creates this effect. In other words, it would be impossible for me to game at 1080i because, as you write, I would be limited to only 30 fps. Anyone who games at this resolution will tell you they witness no such issues. This was part of my basis for saying 30 fps output from a video card is all you really need. Furthermore, why purchase a 6800 Ultra and V-sync it to 30 Hz? Wouldn't a card with high memory bandwidth for producing texturing solve your issues of "eye candy"?
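For what it's worth, a toy model of the tearing being argued about here: with vsync off, the display scans out top to bottom, and a buffer swap that lands mid-scan leaves the top of the screen showing one frame and the bottom showing the next. Purely illustrative, not any driver's actual behavior:

```python
# Model a single refresh where the GPU swaps buffers partway through scanout.

def scanout(lines, swap_at_line, old_frame, new_frame):
    """Which frame each scanline comes from when a swap lands mid-refresh."""
    return [old_frame if line < swap_at_line else new_frame
            for line in range(lines)]

screen = scanout(lines=8, swap_at_line=5, old_frame="A", new_frame="B")
print(screen)  # ['A', 'A', 'A', 'A', 'A', 'B', 'B', 'B'] -- the tear is at line 5
```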


RE: It all comes down to money
By Zoomer on 5/8/2006 4:46:35 AM , Rating: 2
It is more the minimum fps that counts. Avg 30fps, min 8fps will be very noticeable to just about everyone.

Besides, I can set my monitor to 150Hz at XGA. I'd like to see an LCD beat that. ;)


RE: It all comes down to money
By Trisped on 5/8/2006 1:45:59 PM , Rating: 2
I think you are still confused. If the display device is working at 60 Hz, it is showing 60 frames (pictures) a second. No matter how fast your video card can render frames (fps), you will never see more than 60 a second.

I run my CRT at 1600x1200 at 75 Hz because anything lower makes the screen look like it flickers (so some people will be able to use higher fps and still get a benefit).

The reason gamers go for insane fps is that most games run well until you get 3+ players in the same area trying to kill each other with big flashy weapons. Then the game will slow down from something around 75fps to 15fps (or, for me, from 30fps to 0.5fps), which makes it hard to hit your target because the frame where you spun the target into your crosshairs was never rendered. Yes, I think it is stupid to pay $1200+ for your graphics solution when you can wait 12 months and get the same performance for $500 or less, but some people work full time, share an apartment with their buddies, and have no kids. They have so much money it burns holes in their pockets.


By AnnonymousCoward on 5/8/2006 8:08:43 PM , Rating: 2
Sorry, you're wrong. There's a big difference between 30 and 80 fps. It doesn't have to be for first person shooters.

I wonder if you have any experience with this, especially because of your first post claiming that anything beyond 30 for any game is overkill. I think if you did an experiment, you'd notice the big difference. 'Til then, don't go around making these claims.


By peternelson on 5/5/2006 11:31:48 PM , Rating: 2

Well, since the data format on single-link DVI is commonly interpreted as 8 bits per colour channel, a very valid (and standards-approved) way of signalling more than 8 bits -- e.g. 10 bits per colour channel, which is broadcast HD as in HD-SDI -- is to carry the extra bits on the second link of a dual-link connection, at the same pixel resolution.

DVD is limited to 8 bits, but some other sources are not -- particularly scans of movie film, which have a bigger dynamic range. Now we just need TFT displays capable of imaging that kind of colour depth. CRT can do it, of course.
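A rough sketch of the packing idea being described, with the caveat that this shows only the concept; it is not the actual bit layout that any DVI or SDI standard defines:

```python
# Carry a 10-bit colour sample over two 8-bit channels: the 8 high bits on
# link 1 (which looks like an ordinary 8-bit value to legacy gear) and the
# 2 low bits of extra precision on link 2. Hypothetical layout for illustration.

def split_10bit(sample):
    """Split a 10-bit sample into (high 8 bits, low 2 bits)."""
    assert 0 <= sample < 1024
    high8 = sample >> 2    # link 1: backward-compatible 8-bit value
    low2 = sample & 0b11   # link 2: extra precision bits
    return high8, low2

def join_10bit(high8, low2):
    """Reassemble the original 10-bit sample on the receiving end."""
    return (high8 << 2) | low2

h, l = split_10bit(0b1011011110)
assert join_10bit(h, l) == 0b1011011110
print(h, l)  # 183 2
```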


Enthusiasts
By Questar on 5/6/2006 1:30:17 PM , Rating: 2
Wow, for technology "enthusiasts" you guys sure do bitch a lot.

What's wrong with more than double the bandwidth?




RE: Enthusiasts
By wyefye on 5/6/2006 11:21:15 PM , Rating: 3
Think of it as one of those products that advertises a new feature... one where you think, "wow, holy shit, I could really use that -- that will make my life so much better," when in reality (kind of like SLI) it's not doing jack shit for you at all, because current technology cannot use it to its full advantage. DisplayPort is to DVI as SATA is to EIDE... you'd think that since the older standard has more conductors, you could cram more bandwidth in, right? Wrong, according to manufacturers.

Just a thought.


vga
By Missing Ghost on 5/5/2006 7:25:58 PM , Rating: 2
I suppose that won't include analog signals? 'Cause I would not buy a video card with digital-only signals.




this is so useless
By fliguy84 on 5/5/2006 9:40:26 PM , Rating: 2
Just when people are starting to use DVI for their computers, HDTVs and consoles...




what this is
By Wwhat on 5/7/2006 9:51:30 PM , Rating: 2
What this is, is a standard that bypasses the move to HDMI, which was designed to milk companies for license fees: they have to pay a standard fee plus a per-connector fee to the HDMI group, which is a nice scam if they can get away with it.
This, then, is a way to have the same thing without being milked by the HDMI group -- a worthwhile thing, and better than HDMI because of it. That's also why they must release it now, so that it might win out over HDMI.




Shouldn't it be DisplayPort?
By mikecel79 on 5/5/06, Rating: 0
"Well, there may be a reason why they call them 'Mac' trucks! Windows machines will not be trucks." -- Microsoft CEO Steve Ballmer

Related Articles













botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki