
GT300 DX11 chips using 40nm process expected to be revealed

The inaugural GPU Technology Conference (GTC) will be hosted by NVIDIA over the next three days at the Fairmont San Jose. It replaces the company's NVISION love-in with a more formalized approach focusing on three tracks.

The Emerging Companies Summit is supposed to help start-ups that are basing their business models around GPU computing. They will have the opportunity to present their company to entrepreneurs and venture capitalists involved in the GPU computing ecosystem.

The GPU Developer Summit is the traditional series of technical presentations, tutorials, and panels aimed at developers of GPU-based computing applications. Advanced programming techniques using industry-standard languages such as C/C++ and Fortran will be covered, as well as the use of APIs such as Direct3D, DirectX Compute, OpenCL, and OpenGL.

The NVIDIA Research Summit is targeted at researchers and academics using GPUs in science and engineering research who are seeking information on how GPU computing can increase computational power and reduce time-to-discovery.

President and CEO Jen-Hsun Huang will lead off the opening keynote at 1300 PST. He is expected to give an update on NVIDIA technology and reveal details about the GT300 series of 40nm DirectX 11 GPUs. The company is facing sales pressure from ATI, which launched its Radeon HD 5800 series ahead of Windows 7's release.

GTC was originally supposed to be capped at 1000 attendees, but an NVIDIA spokesperson said that the total number of attendees will be almost 1500 due to "overwhelming demand".

“The GPU has quickly become one of the world’s most important processors,” stated Bill Dally, Chief Scientist at NVIDIA. “This event will bring together some of the brightest minds who will share the tools and techniques they used to leverage the enormous parallel processing power of the GPU.”

NVIDIA will be webcasting keynote speeches from its website. DailyTech will be providing coverage of the GTC, with the occasional update provided through Twitter.


Point can away.
By SavagePotato on 9/30/2009 10:22:46 AM , Rating: 5
Every time I see a picture of that guy I think that they should start putting directions on cans of whoop ass.

Hold this end AWAY from the user when opening...

RE: Point can away.
By kattanna on 9/30/2009 11:55:21 AM , Rating: 2
It will be interesting to see how they spin the utter failure of their current test run of the new chips.

7 whole working chips from 4 full wafers.

RE: Point can away.
By FITCamaro on 9/30/2009 12:22:15 PM , Rating: 2
Oh? I missed this? Link? Would just like to read it.

RE: Point can away.
By kattanna on 9/30/2009 12:50:38 PM , Rating: 2
here ya go

the tasty part is this:

The first hot lot of GT300s have 104 die candidates per wafer, with four wafers in the pod Nvidia got back a week and a half ago. There is another pod of four due back any day now, and that's it for the hot lots. How many worked out of the (4 x 104) 416 candidates? Try 7

RE: Point can away.
By ClownPuncher on 9/30/2009 2:35:35 PM , Rating: 2
At best, that information is dubious. Charlie is hard to trust.

RE: Point can away.
By tviceman on 9/30/2009 3:23:19 PM , Rating: 2
RE: Point can away.
By kattanna on 10/1/2009 12:12:40 PM , Rating: 2
While I have known about Charlie's "reporting" for years, this time he does seem to be on target.

Especially with quotes like this from the NVIDIA convention yesterday:

Jensen added, Silicon of Fermi is "in house" and "we are bringing it up." Time to market is likely "a few short months."

A few short months? It's October already, so that places actual hardware in February, LOL, that's IF they can keep it on track. Far cry from Q4 2009.

Also, others are feeling similar:

If you are waiting for NVIDIA to jump out of the GPU closet with a 5800 killer and put the fear into you for making a 5800 series purchase for Halloween, we suggest paper dragons are not that scary. We feel as though it will be mid-to-late Q1’10 before we see anything pop out of NVIDIA’s sleeve besides its arm. We are seeing rumors of a Q4’09 soft launch of next-gen parts, but no hardware till next year and NVIDIA has given us no reason to believe otherwise.

And I will take the word of HardOCP over any NVIDIA company exec any day, especially since NVIDIA has a vested interest in lying.

RE: Point can away.
By cmontyburns on 9/30/2009 12:28:06 PM , Rating: 3
just like those ACME missiles with red warning labels that say "AIM AWAY FROM FACE".

RE: Point can away.
By FITCamaro on 9/30/2009 12:49:38 PM , Rating: 2
Actually real ones say "This end towards the enemy." or something like that.

RE: Point can away.
By BruceLeet on 10/1/2009 4:15:46 AM , Rating: 2
At least he doesn't look like Jerry Yang; that guy's physical appearance is just freaky. Carnival freaky.

Is GTC neutral, or exclusively Nvidia?
By Amiga500 on 9/30/2009 10:20:26 AM , Rating: 1
Will, say, ATi be present and allowed to talk at the conference? Or (whisper it) Intel?

Or is it just another name for an Nvidia PR day, where they are basically trying to imply to the gullible that "we (nvidia) are the only GPU makers around"?

RE: Is GTC neutral, or exclusively Nvidia?
By bighairycamel on 9/30/2009 10:36:10 AM , Rating: 4
Well since NVidia is the host and it is essentially their conference, I'm guessing the ATI and Intel invitations were conveniently lost in the mail.

RE: Is GTC neutral, or exclusively Nvidia?
By Amiga500 on 9/30/2009 12:16:28 PM , Rating: 4
So it is not really a conference at all, then.

It's a glorified PR stunt?

RE: Is GTC neutral, or exclusively Nvidia?
By Taft12 on 9/30/2009 4:19:19 PM , Rating: 2
Yes, it is a PR stunt, but I don't know what ATI or Intel would have to contribute to a NVidia conference....

By StevoLincolnite on 10/1/2009 4:04:34 AM , Rating: 2
Yes, it is a PR stunt, but I don't know what ATI or Intel would have to contribute to a NVidia conference....

Chipsets and Processors that allow us to use nVidia hardware in the first place? I don't know... just throwing ideas around...

By Silver2k7 on 10/2/2009 5:34:18 PM , Rating: 2
"So it is not really a conference at all then.
Its a glorified PR stunt? "

Intel have their own PR stunts too... it's called IDF.
I'm sure AMD have them too.

nVidia GT300's architecture unveiled
By SerafinaEva on 9/30/2009 4:35:49 PM , Rating: 2
* 3.0 billion transistors
* 40nm TSMC
* 384-bit memory interface
* 512 shader cores [renamed into CUDA Cores]
* 32 CUDA cores per Shader Cluster
* 1MB L1 cache memory [divided into 16KB Cache - Shared Memory]
* 768KB L2 unified cache memory
* Up to 6GB GDDR5 memory
* Half Speed IEEE 754 Double Precision
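As a rough illustration of what the leaked 384-bit GDDR5 interface implies, here is a quick bandwidth sketch in Python. Note the per-pin data rate is a hypothetical assumption; the spec list above does not include memory clocks.

```python
# Rough peak-bandwidth estimate for a 384-bit GDDR5 memory interface.
# ASSUMPTION: 4.0 Gbps effective per-pin data rate (hypothetical; the
# leaked specs do not state memory clocks).
bus_width_bits = 384
data_rate_gbps = 4.0  # hypothetical effective rate per pin

# bandwidth = (bus width in bytes) x (per-pin transfer rate)
bandwidth_gb_s = bus_width_bits / 8 * data_rate_gbps
print(f"Peak memory bandwidth: {bandwidth_gb_s:.0f} GB/s")
```

At that assumed rate the math works out to 192 GB/s; a faster or slower GDDR5 bin would scale the figure proportionally.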

RE: nVidia GT300's architecture unveiled
By dastruch on 9/30/2009 6:16:48 PM , Rating: 2
any release dates?

RE: nVidia GT300's architecture unveiled
By Major HooHaa on 10/2/2009 5:21:59 AM , Rating: 2
Any word on power consumption?

There are rumours that GT300 means 300 watts. I am not sure I believe that (it wouldn't be something Nvidia would want to advertise), and being built on a 40nm process should help.

By Major HooHaa on 10/2/2009 5:29:56 AM , Rating: 2
Oh I found some power consumption info, here is the link and a quote.

“According to information we have at hand, the GT300 board [yeah, featuring "Fermi" CUDA architecture] barely missed 225W cut-off for the 6+6 pin if the board comes with 6GB of GDDR5 memory.”
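The 225 W figure in that quote is not arbitrary: it is the ceiling the PCI Express spec allows for a board fed by the x16 slot plus two 6-pin auxiliary connectors. A minimal sketch of the arithmetic:

```python
# PCI Express power budget for a "6+6 pin" graphics board.
PCIE_SLOT_W = 75   # max draw from the x16 slot per the PCIe spec
SIX_PIN_W = 75     # max draw per 6-pin auxiliary connector

# Slot plus two 6-pin connectors:
budget_w = PCIE_SLOT_W + 2 * SIX_PIN_W
print(f"6+6 pin power ceiling: {budget_w} W")
```

Exceeding that would force a move to an 8-pin connector (150 W), raising the ceiling to 300 W.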



Copyright 2016 DailyTech LLC.