
Windows users can connect up to six of the adapters

Here's a quick product announcement out of Las Vegas: ZOTAC today announced a new USB 3.0 to HDMI adapter for desktop and notebook computers.
 
The USB 3.0 to HDMI adapter, which uses DisplayLink USB graphics technology, lets you add an extra display to your computer at resolutions up to 1920x1200. The adapter also carries full 6-channel digital audio.
 
"Enjoy videos, photos, movies, web content and more on multiple HD displays with the ZOTAC adaptor," said John Cummins, VP of Sales and Marketing for DisplayLink. "With DisplayLink technology, you can expect high performance and crisp visual quality all with the ease of a USB accessory."

 
ZOTAC indicates that a desktop or notebook computer running Windows can connect up to six of the adapters; those running OS X will have to make do with just four. Another consideration for OS X users is that no current Mac (desktop or notebook) supports USB 3.0, so it's unclear how well the device will perform over USB 2.0, or whether it will reach the maximum 1920x1200 resolution or fall back to lower supported resolutions.
 
ZOTAC has not yet announced pricing or availability for the USB 3.0 to HDMI adapter.

Source: ZOTAC



Comments

lag
By GreenEnvt on 1/10/2012 6:31:56 PM , Rating: 1
I hope the USB 3.0 ones work better than the USB 2.0 ones; those are horrid. They claim they're good enough for simple browsing, but I can't stand them even for that. Trying to run even YouTube videos is a mess.
USB 3.0 should have plenty of bandwidth, so DisplayLink's implementation will be key.

At work they had put in several USB-based "2nd monitor" setups for desktops, and staff hated them. I ripped them all out and put in cheap ATI 2350/3350/4350/5350-type cards, which did the job so much better, for only $10 more per desk.




RE: lag
By DanNeely on 1/10/2012 7:17:24 PM , Rating: 5
USB2 has enough theoretical bandwidth to support uncompressed 640x480, 24-bit, 60 Hz video. With a junky 1280x1024 monitor you need >4x compression (probably 5x in the real world) to squeeze your video out the USB2 port; 5.75x theoretical for a mid-range 1680x1050 screen. USB2 video can't help but fail when it's that severely bottlenecked.

USB3 has theoretical bandwidth similar to HDMI/single-link DVI. While the adapter might have to add another frame or three of input lag to deal with USB's loose QoS requirements, there's no reason not to expect decent (non-gaming) video quality from it, unless you're sharing the controller with an external SSD doing lots of IO.
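For reference, a rough back-of-the-envelope version of that math (a sketch only: it assumes the nominal bus rates, 24-bit colour and 60 Hz refresh, and real-world USB throughput is meaningfully lower):

# Rough bandwidth sketch for uncompressed video over USB (nominal rates assumed).
USB2_MBPS = 480 / 8      # USB 2.0: 480 Mbit/s ~ 60 MB/s theoretical
USB3_MBPS = 5000 / 8     # USB 3.0: 5 Gbit/s ~ 625 MB/s theoretical

def raw_mbps(width, height, bits_per_pixel=24, refresh_hz=60):
    """Uncompressed frame-buffer bandwidth in MB/s."""
    return width * height * (bits_per_pixel / 8) * refresh_hz / 1e6

for w, h in [(640, 480), (1280, 1024), (1680, 1050), (1920, 1200)]:
    raw = raw_mbps(w, h)
    print(f"{w}x{h}: {raw:6.1f} MB/s raw | "
          f"needs {raw / USB2_MBPS:.1f}x compression on USB 2.0 | "
          f"{raw / USB3_MBPS:.2f}x of USB 3.0 capacity")

The exact ratios depend on which overheads you count, but the point stands: USB 2.0 needs several-fold compression for any modern panel, while a 1920x1200 stream fits within USB 3.0's nominal capacity even before compression.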


RE: lag
By MGSsancho on 1/11/2012 4:26:44 AM , Rating: 2
The bandwidth we see advertised for USB is the total. Divide that by two for each direction. Encoding and other overhead, along with other USB devices (webcams, mass storage, mice, keyboards, USB audio, who knows what else), also take up bandwidth.


RE: lag
By DanNeely on 1/11/2012 6:29:06 AM , Rating: 2
That's clearly wrong. The 480 Mbps (60 MBps) is shared between reads and writes, but it isn't required to be split 50/50. USB2 hard drives can get >40 MBps in sequential reads, roughly two-thirds of the theoretical bandwidth in a unidirectional transfer.

http://reviews.cnet.com/external-hard-drives/seaga...


how does it work?
By Visual on 1/11/2012 5:36:52 AM , Rating: 2
Is it using your normal video card to render and then sending the frame buffer through the USB port?
Or is the USB device seen as a separate video card, with its own video drivers, doing the rendering itself? If so, does it have any 3D capabilities at all?




RE: how does it work?
By neogrin on 1/11/2012 11:31:25 AM , Rating: 2
Good Question.

Anyone know?


RE: how does it work?
By Trisped on 1/11/2012 4:53:05 PM , Rating: 2
It looks like the computer's internal GPU renders the images, a virtual graphics card (VGC) converts the output into a stream that can be sent over a link (USB 3.0 in this case), and the adapter hardware at the other end converts that stream into an HDMI signal.

That is based on what is found at http://en.wikipedia.org/wiki/DisplayLink
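A minimal sketch of that flow (illustrative only; the function names, the Frame layout and the trivial header "encoding" are stand-ins, not DisplayLink's actual design):

from dataclasses import dataclass

# Illustrative pipeline: GPU renders, a virtual adapter captures the frame,
# the frame is serialised for USB, and the dongle decodes it for HDMI output.

@dataclass
class Frame:
    width: int
    height: int
    pixels: bytes    # raw RGB data rendered by the real GPU

def capture_from_gpu(width=1920, height=1200) -> Frame:
    # Stand-in for the virtual graphics card grabbing the rendered frame.
    return Frame(width, height, bytes(width * height * 3))  # placeholder black frame

def encode_for_usb(frame: Frame) -> bytes:
    # Stand-in for compressing/serialising the frame into a USB stream.
    header = frame.width.to_bytes(2, "big") + frame.height.to_bytes(2, "big")
    return header + frame.pixels    # real hardware would compress here

def decode_to_hdmi(stream: bytes) -> Frame:
    # Stand-in for the DisplayLink chip in the adapter rebuilding the frame.
    w = int.from_bytes(stream[0:2], "big")
    h = int.from_bytes(stream[2:4], "big")
    return Frame(w, h, stream[4:])

hdmi_frame = decode_to_hdmi(encode_for_usb(capture_from_gpu()))
print(f"Adapter scans out {hdmi_frame.width}x{hdmi_frame.height} over HDMI")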


RE: how does it work?
By Trisped on 1/11/2012 4:58:18 PM , Rating: 2
From http://www.displaylink.com/technology/technology_o...

1. DisplayLink software is installed on the PC and uses resources available in the CPU and GPU to process the graphical information from your USB connected display.
2. Updates to the screen are automatically detected and compressed using the DisplayLink compression technology (DL2+ or DL3). This adaptive compression technology automatically balances the compression methods based on the content, available CPU power, and USB bandwidth, providing the best possible USB graphics experience at any given moment.
3. Compressed data packets are sent over the standard USB 2.0 cable as quickly as possible to maintain a very interactive user experience.
4. A high speed DisplayLink chip embedded in the monitor, docking station, projector or adapter decodes the compressed data back into video or graphics data.

So it seems the software is merely redirecting the video output from your GPU, compressing it, and sending it through the USB connection to your additional monitor.
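Step 2 is the interesting part. Here is a toy illustration of what that adaptive balancing could look like (the thresholds and mode names are invented for illustration; the real DL2+/DL3 logic is proprietary):

def pick_mode(raw_mbps: float, usb_budget_mbps: float, cpu_headroom: float) -> str:
    # Toy policy: decide how hard to compress a screen update given the
    # uncompressed size, the USB bandwidth available, and spare CPU capacity.
    ratio_needed = raw_mbps / usb_budget_mbps
    if ratio_needed <= 1.0:
        return "near-lossless update"                          # bus has headroom
    if cpu_headroom >= 0.5:
        return f"software compression, ~{ratio_needed:.1f}x"   # spend CPU to save bus
    return "lossy fallback mode"                               # constrained on both sides

print(pick_mode(raw_mbps=415, usb_budget_mbps=60, cpu_headroom=0.7))   # USB 2.0-ish case
print(pick_mode(raw_mbps=415, usb_budget_mbps=625, cpu_headroom=0.7))  # USB 3.0-ish case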

There are some nice pictures on the site which may help non-techies understand.


Don't worry...
By DPigs on 1/10/2012 6:30:08 PM , Rating: 3
quote:
Another thing to consider for OS X users is the fact that no Mac (desktop or notebook) supports USB 3.0...


I'm sure it will be available as a $750 upgrade soon enough.




RE: Don't worry...
By vol7ron on 1/11/2012 12:52:19 AM , Rating: 2
Actually, Apple announced that its next upgrade will include USB 3.0.


RE: Don't worry...
By wysiwyg009 on 1/11/2012 2:54:42 PM , Rating: 2
And since when did Apple start announcing things?


"We shipped it on Saturday. Then on Sunday, we rested." -- Steve Jobs on the iPad launch











botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki