



[Image caption: Work at the Intel Visual Computing Institute could lead to a UI as seen in Minority Report]

[Image caption: The best the competition can do, according to Intel]

[Image caption: Larrabee graphics, according to Intel]

Intel is thinking long-term with its "Larrabee" strategy

Intel Corporation will invest $12 million over the next five years in the new Intel Visual Computing Institute, which opened yesterday. Located at Saarland University in Saarbrücken, Germany, the Institute is Intel's largest collaboration project with a European university.

The Intel Visual Computing Institute is a new research center that will explore advanced graphics and visual computing technologies. Visual computing is the analysis, enhancement, and display of visual information to create real-time, realistic imagery that enhances interactivity with computers and other devices. Anti-aliasing can be thought of as an application of visual computing, as are advanced holographic display systems.

Games make up most applications so far, but interactive three-dimensional data models used for scientific research, geology, financial services, and medical imaging are increasingly being explored. Intel's visual computing vision is "to realize computer applications that look real, act real and feel real". New visual computing and parallel computing algorithms are needed to achieve this vision.

The lab will conduct basic and applied research in interactive computer graphics and realistic user interfaces. One of the major foci of research will be Intel's terascale program, which examines how multiple computing cores can be used to produce higher-performance computing and life-like graphics. This will help Intel develop its Larrabee x86 many-core GPU and its follow-on products; Larrabee is expected to launch in the first quarter of 2010 and will be built on a 45nm process. Although the first Larrabee products are expected to use up to 32 cores, a faster version built on Intel's 32nm process could feature up to 64 cores.

"Intel has collaborated with the world-class researchers at Saarland University in visual computing for a number of years," said Justin Rattner, Intel Senior Fellow and Chief Technology Officer. "Given the growing importance of visual computing technology, it made perfect sense to expand our relationship and form this new institute. We are confident that it will become an internationally recognized center and a driver for European leadership in the visual computing field."

The institute will employ a dozen researchers by the end of this year, drawn from a diverse group including Saarland University, the Max Planck Institute for Informatics, the Max Planck Institute for Software Systems, and the German Research Center for Artificial Intelligence. Intel wants to expand that group to sixty researchers over the next five years as Larrabee and its follow-on products start to appear.
 
One of the Institute's goals is to actively solicit other academic and industry partners to join the research activities over time. It will also partner with Intel's European hardware design labs in Barcelona, Spain and Braunschweig, Germany to optimize Larrabee designs. 
 
New software tools and driver-based optimizations are also expected to come out of the research at the Institute. This will be important for Larrabee, since Z-buffering, clipping, and blending will be done in software using a tile-based rendering approach. Order-independent transparency, irregular Z-buffering, and real-time ray tracing are also rendering features that could be implemented with Larrabee, but would require a lot of software development.
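
To make the tile-based software approach above concrete, here is a minimal C++ sketch of binning triangles into screen tiles and depth-testing each tile against its own small Z-buffer. It is an illustration only, assuming a flat per-triangle depth, a 64-pixel tile size, and no real coverage test; it is not Intel's actual Larrabee renderer.

// Minimal sketch of tile-based binning with a per-tile software Z-buffer.
// Tile size, data layout, and the coverage shortcut are illustrative
// assumptions, not Intel's actual Larrabee pipeline.
#include <algorithm>
#include <cstdint>
#include <vector>

struct Vec2 { float x, y; };
struct Triangle { Vec2 v[3]; float z; uint32_t color; }; // flat depth for brevity

constexpr int kTile = 64; // pixels per tile edge (assumption)

struct Tile { std::vector<const Triangle*> bins; };

// Bin each triangle into every tile its screen-space bounding box touches.
void BinTriangles(const std::vector<Triangle>& tris, std::vector<Tile>& tiles,
                  int tilesX, int tilesY) {
    for (const Triangle& t : tris) {
        float minX = std::min({t.v[0].x, t.v[1].x, t.v[2].x});
        float maxX = std::max({t.v[0].x, t.v[1].x, t.v[2].x});
        float minY = std::min({t.v[0].y, t.v[1].y, t.v[2].y});
        float maxY = std::max({t.v[0].y, t.v[1].y, t.v[2].y});
        int tx0 = std::max(0, int(minX) / kTile);
        int tx1 = std::min(tilesX - 1, int(maxX) / kTile);
        int ty0 = std::max(0, int(minY) / kTile);
        int ty1 = std::min(tilesY - 1, int(maxY) / kTile);
        for (int ty = ty0; ty <= ty1; ++ty)
            for (int tx = tx0; tx <= tx1; ++tx)
                tiles[ty * tilesX + tx].bins.push_back(&t);
    }
}

// Each tile keeps its own small Z-buffer, so the depth test stays in software.
void ShadeTile(const Tile& tile, uint32_t* colorOut, float* depthOut) {
    std::fill(depthOut, depthOut + kTile * kTile, 1.0f); // clear to far plane
    for (const Triangle* t : tile.bins) {
        for (int i = 0; i < kTile * kTile; ++i) {
            // Real code would test pixel coverage via edge equations here;
            // this sketch only shows the software depth test itself.
            if (t->z < depthOut[i]) {
                depthOut[i] = t->z;
                colorOut[i] = t->color;
            }
        }
    }
}

Because each tile's color and depth block is small enough to stay in a core's cache, each tile could in principle be handed to a different core, which is the usual argument for binned rendering on a many-core part.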

The Intel Visual Computing Institute will become a part of Intel Labs Europe (ILE). Formally opened on March 2, 2009, the Munich-based organization represents Intel's European lab network, consisting of 19 labs that employ more than 800 research professionals.

The Director of Intel Labs Europe is currently Professor Martin Curley, Professor of Technology and Business Innovation at the National University of Ireland, Maynooth. He is also the Global Director of IT Innovation at Intel. The company has five research labs located in Ireland, as well as Fab 24.



Comments

just me or?
By BuckinBottoms on 5/13/2009 11:59:47 AM , Rating: 3
Is it me, or does anyone else like the competition's image better? The Larrabee image has bad motion blur and a hazy look to it. Not something I would like to see. Maybe they should invest a lot more than 12 mil.




RE: just me or?
By GaryJohnson on 5/13/2009 12:05:50 PM , Rating: 2
Both are pretty horribly fuzzy, and I think both of those are photoshop composites and not 3D renders. Is there some comparison video somewhere that those are captured from?


RE: just me or?
By Jansen (blog) on 5/13/2009 12:19:55 PM , Rating: 5
Those are screencaps from an Intel presentation. As stated, they are from Intel, so a little bias is expected.

During the presentation, they alternated back and forth, and the result was pretty dramatic. The smoke and dust effects, as well as the shadow detail, can still be seen.

I guess you could use Windows Photo Gallery to switch back and forth to compare?


RE: just me or?
By akugami on 5/13/2009 12:52:13 PM , Rating: 2
You can just open each JPEG in a new tab and click back and forth. Now, motion blur aside (which could be the result of the screencaps being taken at different points in the animation), the dramatic difference is in the details. There is a huge bias, since I'm sure the "other" shot could be spruced up quite a bit and come very close to, if not equal, the Intel shot in terms of having smoke and extra villagers running around.

I'm also curious how similar Larrabee is to the Kyro cards from PowerVR, which also used tile-based rendering.


RE: just me or?
By StevoLincolnite on 5/13/2009 1:48:34 PM , Rating: 2
Intel's GMA used a variation of tile-based rendering in some of its IGP parts, like the GMA 900; they called it Zone Rendering Technology.


RE: just me or?
By Chocobollz on 5/13/2009 3:49:04 PM , Rating: 2
Ok, I might've believed you if what you're saying didn't have the words "GMA 900" in it :p

j/k


RE: just me or?
By GaryJohnson on 5/13/2009 12:58:55 PM , Rating: 4
I used browser tabs to flip back and forth. But that's not what I'm talking about; what I'm talking about is some video that shows movement through the scenes. Those don't look like 3D scenes, they look like 2D composites.

Either way, being still images, they have little relevance to the actual or even theoretical capabilities of any realtime 3D hardware.


RE: just me or?
By wushuktl on 5/13/2009 2:42:01 PM , Rating: 2
I really hope Intel can come out with something awesome, because lately we haven't seen anything too groundbreaking from Nvidia or ATI. But there is nothing to take from these images. I agree with another poster who said it just looks like Photoshop work, not actual screenshots of real-time rendering. When you flip back and forth from shot to shot, you see the tower on the left and the soldiers in the foreground are exactly the same. Why should they look exactly the same if there's all this extra work done by the rendering to account for motion blur and volumetric dust clouds?


RE: just me or?
By MRwizard on 5/14/2009 6:34:20 AM , Rating: 2
Probably from the animators not giving enough detail to those?


RE: just me or?
By Jansen (blog) on 5/13/2009 12:10:50 PM , Rating: 2
The smoke effect was supposed to make things hazy. I don't know about the motion blur, but that might have been intended as well.

Either way, this is not the final result. I've been told that there will be a lot more Larrabee info at IDF this year. They are also putting a lot of work into drivers.


RE: just me or?
By Aloonatic on 5/13/2009 1:19:20 PM , Rating: 1
Seems to me that the Larrabee screen shot was taken with a camera in "sepia" mode to make the scene even more olde-worlde perhaps?

I'm with you though; the non-Larrabee image looks better, and those cards actually exist too, whereas the Larrabee cards are as easy to find in the wild as the dragon in the very image they chose to demonstrate it.

It's all very promising, but until Anand (etc) get some actual cards to test they can show all the images, screen shots and demonstrations they like. We've seen this all before.


RE: just me or?
By B3an on 5/13/2009 6:55:33 PM , Rating: 1
...Err what @ all above posts?? I'm actually LMAO.
These images are CLEARLY airbrush drawings with some Photoshop editing/filters. They are not 3D renders of any sort, or rendered on any graphics hardware. They're mere examples.

I would have thought this would be instantly obvious to ANYONE, regardless of whether they do this sort of thing like I do.


RE: just me or?
By Shuri on 5/13/2009 10:35:57 PM , Rating: 2
That's what I was thinking before even clicking on the images. Both images seem to have some bad compression, and that makes Intel's composite look very bad.

In defense of Intel though, they are trying to give people an idea of what an image rendered on their hardware might look like; they just didn't do a great job of it :)

Also, it is a well known fact that 3D rendered images tend to be unnaturally sharp, so when implemented correctly, a bit of blur in the right places is not necessarily a step back.


RE: just me or?
By Aloonatic on 5/14/2009 1:35:39 AM , Rating: 2
*sound of palm slapping into face*

Why is it that people here find it hard to realise that something that is
quote:
obvious to ANYONE
is, erm, obvious to anyone? Therefore their comments take this into account, and they don't feel the need to point out what is obvious to everyone in every comment. Well, unless they think that it makes them feel big and clever, but those people probably have issues of their own to deal with.

Clearly the images are blurred due to some poor image compression/resizing, and they do seem to have been filtered quite a lot too; the source is less than clear as well.

The point that people are making is that the image that Intel have chosen to show as "inferior" to theirs looks better to many.

Here's what I was saying, edited for the hard of thinking.

The Larrabee shot is a weird brown colour that makes it look older; not really sure why they want that.

Intel can show any image of what they think they can do compared to the competition; the only problem is that the competition exists and their card does not.

There is no point getting excited about all of this until the cards have been tested by independent reviewers who will show us genuine screen shots, not these ridiculous images.

I hope that's clear enough for you?


RE: just me or?
By Lightnix on 5/13/2009 2:46:57 PM , Rating: 3
The point is more to illustrate the amount of computational horsepower required to produce the particular screenshot, which could potentially be put to better effect: for example, the greater number of people walking around, the dust, the motion blur, the more pronounced bloom effect, etc. It might not necessarily look 'nicer' in your opinion, but it requires more calculation.

How good it looks is purely subjective and is more of a design criticism, which isn't really what the graphics card is responsible for. It's more of a visual representation of the ability of the graphics cards, as opposed to an implication that Larrabee inherently makes things look nicer.


RE: just me or?
By BuckinBottoms on 5/13/2009 3:17:38 PM , Rating: 4
Is it truly computational power, or are they using an Nvidia card as the competition and using Havok visuals to misrepresent that Larrabee has some great advantage? I don't really know, as I can't see what card they used in the comparison. Maybe when it reaches a third party it can be validated by running the demo against an ATI card that supports Havok and checking the results.

No, I think the coding is the real point here. Intel should be pushing the C/C++ coding aspect of Larrabee at all times. If they get into a raw-power argument they will lose big time with hardly any effort on the part of the competition. Intel has to add another 16-32 cores for every single core the competition adds, and that will definitely cause them problems fast.

Oh... and I hardly think looks are subjective in the graphics world. That is what the graphics world is all about, isn't it?


RE: just me or?
By stmok on 5/13/2009 4:15:02 PM , Rating: 2
quote:
Intel should be pushing the C/C++ coding aspect of Larrabee at all times.


It's already started...

A First Look at the Larrabee New Instructions (LRBni) by Mike Abrash
http://www.ddj.com/hpc-high-performance-computing/...
or
http://isdlibrary.intel-dispatch.com/isd/2495/drdo...
(This is the PDF version)

C++ Larrabee Prototype Library
http://software.intel.com/en-us/articles/prototype...

Game Physics Performance on the Larrabee Architecture
http://isdlibrary.intel-dispatch.com/isd/2499/Game...
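
For a rough feel of what 16-wide vector code looks like, the sketch below is a plain C++ emulation of a masked 16-lane multiply-add, approximately the kind of operation LRBni exposes as a single 512-bit instruction. The type aliases and function name are invented for illustration and are not the API of the C++ Larrabee Prototype Library linked above.

// Plain C++ emulation of a masked 16-lane multiply-add, the kind of operation
// LRBni exposes as one 512-bit vector instruction. Names here are invented
// for illustration; this is not the C++ Larrabee Prototype Library API.
#include <array>
#include <cstdint>

using Vec16f = std::array<float, 16>; // one 512-bit register: 16 x 32-bit floats
using Mask16 = std::uint16_t;         // one write-mask bit per lane

// dst[i] = a[i] * b[i] + c[i] for lanes whose mask bit is set;
// other lanes keep their previous value (predicated execution).
Vec16f madd_masked(Vec16f dst, const Vec16f& a, const Vec16f& b,
                   const Vec16f& c, Mask16 mask) {
    for (int lane = 0; lane < 16; ++lane) {
        if (mask & (1u << lane)) {
            dst[lane] = a[lane] * b[lane] + c[lane];
        }
    }
    return dst;
}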


RE: just me or?
By Lightnix on 5/13/2009 4:41:21 PM , Rating: 2
On the computational power front, who knows? We don't know that much detail; we'd just be using conjecture. My point is that Intel was trying to illustrate that the 'competition' can do these effects all at once whilst rendering these models, etc., and Larrabee can do these extra things on top of that. "Competition" is vague, I agree; they want us to assume that means a current-generation high-end graphics card from either Nvidia or ATI, but that's not stated.

Also, on the 'has to add another 16-32 cores for every core the competition has' point, do you have any proof or reliable sources for that or is this just conjecture? Is this an argument about the software rendering approach Larrabee takes? How are you defining 'cores' exactly? I mean Larrabee is rumoured to have 32 'cores', but they've publicly stated that each of these cores will definitely have a 16-wide, 512-bit vector unit, meaning each Larrabee 'core' would effectively have the equivalent of 8-16 of what Nvidia calls 'cores' but are really just arithmetic logic units. That, however, is still a totally invalid comparison given the extreme differences in their architectures. I'm curious as to where you drew your numbers from, though.

Also, appearance is entirely subjective in the graphics world, because it's the opinion that somebody has, as defined in the Oxford English Dictionary:

http://www.askoxford.com/concise_oed/subjective?vi...

You might really like the sepia colours of the supposed Larrabee version, you might not. You might appreciate the motion blur, you might think it just makes the picture look nasty. Some people think that 2D sprites look better than 3D models, that's not a widely held opinion but some people certainly have it. It's an opinion, it's subjective. It's a relevant opinion, sure, but still just an opinion.
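
To put rough numbers on the lane counting in the comment above, here is a back-of-the-envelope sketch; the 32-core figure is the rumoured count from the article, not a confirmed spec.

// Back-of-the-envelope lane counting for the core-vs-ALU comparison above.
// The 32-core figure is the rumoured number from the article, not a spec.
#include <cstdio>

int main() {
    const int vector_bits    = 512;                      // LRBni vector width
    const int float_bits     = 32;                       // single precision
    const int lanes_per_core = vector_bits / float_bits; // = 16
    const int cores          = 32;                       // rumoured first part
    std::printf("%d lanes per core, %d lanes in total\n",
                lanes_per_core, cores * lanes_per_core); // prints 16, 512
    return 0;
}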


RE: just me or?
By BuckinBottoms on 5/13/2009 5:31:28 PM , Rating: 2
quote:
Also, on the 'has to add another 16-32 cores for every core the competition has' point, do you have any proof or reliable sources for that or is this just conjecture?
Well, it's most certainly conjecture at this point for me, as I don't seem to have a Larrabee on hand. :)

I think I am mostly going on the verbiage that Intel is using here. They are calling them "cores", whereas Nvidia/ATI refer to their tech as logic units. I suppose it will take proper analysis to determine the strength of one versus the other. Given that they need 32 cores to reach the same throughput, I am indeed taking a leap in saying an increase in one is proportional to the other through sheer math reasoning. But that leaves scaling, which nobody knows about at the moment. I concede I could be way off the mark there.

I realize that visuals are subjective. That is why I asked everyone's opinion. While it is subjective, there is a way to determine which is better, and that would be polling people. If more people like one over the other, then you sort of have a consensus. That is all I was asking.


RE: just me or?
By Harmik on 5/14/2009 8:52:13 AM , Rating: 2
That's exactly what I thought as soon as I saw it; then I read your comment.


If Intel was a contender from the beginning....
By Tegrat on 5/13/2009 11:54:56 AM , Rating: 2
... I would have some faith in this Larrabee project. Up until now their IGPs have been subpar, from processing power to poor drivers.

Prove me wrong with Larrabee, Intel!




By stmok on 5/13/2009 4:10:08 PM , Rating: 1
The Larrabee and Larrabee 2 projects are NOT being developed by the IGP team. In fact, they're two completely different teams!

The project so far has been interesting. It can do both rasterization and ray tracing, as well as physics (not to mention GPGPU roles).

Of course, all this won't mean much if it isn't competitive with ATI or Nvidia. Time will tell on that.


By robertisaar on 5/13/2009 4:33:20 PM , Rating: 2
Notice my user ID? Notice where all of this is happening?

OK, downrate me now...




come on. give me a F**king break
By rainyday on 5/14/2009 1:48:30 AM , Rating: 2
To me it looks like the competition is rendered at a DX9c level and Intel's at DX10 with HDR and bloom effects. They made the Intel image first, then dumbed down the competition's to make a comparison. Do you really believe otherwise? I don't, because I've seen better smoke and lighting effects in Crysis DX10 at Very High than in the competition's image (which looks like DX9 Crysis).




larrabee will be amazing, but...
By lycium on 5/14/2009 5:31:45 AM , Rating: 2
Intel's marketing is as criminally misleading as ever; they're fond of highly imaginative BS like the Pentium III making your Internet faster (anybody remember that? Even as a young teenager I laughed heartily...).

As a graphics programmer I absolutely cannot wait for my Larrabee card. The instruction set was co-designed by a great programmer (Mike Abrash, PC graphics veteran and co-developer of Quake 1), not the clueless designers responsible for the utter mess that is SSE1-4. The people doing the fixed-function logic are not the developers of their poopy laptop integrated graphics chips. And Intel makes great software tools.

I try to support AMD as much as possible, but man oh man, this changes everything, and they'd better hurry up with their Fusion thingy (whatever that may be).




Minority Report?
By Jacerie on 5/14/2009 10:04:56 AM , Rating: 2
quote:
Work at the Intel Visual Computing Institute could lead to a UI as seen in Minority Report


I guess none of the teams at Intel have seen the work being done at oblong.com.




By haukionkannel on 5/14/2009 1:22:54 PM , Rating: 2
I suppose that Larrabee will be fast enough. The problem is whether there will be support for Larrabee's best abilities.
We know what happened to DX10.1: not well supported, even though it could make graphics faster and thus better looking. Or how about PhysX? Almost the same thing.
Larrabee will rise or fall depending on game developers' support for the architecture. We will see some "Plays like Intel means it to be played" titles, but is that enough?
Intel has the knowledge to make fast hardware, but in graphics it is in the same situation as when they released their first 64-bit Xeons: a good CPU, but no support, and really slow backward 32-bit support. I think they have learned from that, so there should be very good support for older products. How fast and how good it is with them will tell a lot about how it will fare in the future.




"Young lady, in this house we obey the laws of thermodynamics!" -- Homer Simpson
