
Intel says parallel software is more important for many-core CPUs like "Larrabee"

Multi-core processors have been in the consumer market for several years now. However, despite having access to CPUs with two, three, four, and more cores, there are still relatively few applications available that can take advantage of multiple cores. Intel is hoping to change that and is urging software developers to think parallel.

Intel's director and chief evangelist for software development products, James Reinders, talked about thinking parallel in a keynote speech he delivered recently at the SD West conference. Reinders said, "One of the phrases I've used in some talks is, it's time for us as software developers to really figure out how to think parallel." He also said that developers who don't think parallel will see their career options limited.

Reinders gave the attendees eight rules for thinking parallel from a paper he published in 2007, reports Computerworld. The eight rules are: think parallel; program using abstraction; program tasks, not threads; design with the option of turning off concurrency; avoid locks when possible; use tools and libraries designed to help with concurrency; use scalable memory; and design to scale through increased workloads.

He says that after half a decade of shipping multi-core CPUs, Intel is still struggling with how to use the available cores. The chipmaker is under increasing pressure from NVIDIA, which is leveraging its network of developers to program parallel applications to run on its family of GPUs. NVIDIA and Intel are embroiled in a battle to determine whether the GPU or the CPU will be the heart of future computer systems.

Programming for processors with 16 or 32 cores takes a different approach, according to Reinders. He said, "It's very important to make sure, if at all possible, that your program can run in a single thread with concurrency off. You shouldn't design your program so it has to have parallelism. It makes it much more difficult to debug."

In the speech, Reinders also talked about Intel Parallel Studio, a toolkit for developing parallel applications in C/C++ that is currently in beta release. Reinders added, "The idea here [with] this project was to add parallelism support to [Microsoft's] Visual Studio in a big way."

Intel says that it plans to offer the parallel development kit to Linux programmers this year or early next year. The CPU Reinders is referring to when he says "many-core" is the Larrabee processor. Intel provided some details on Larrabee in August of 2008.

One of the key features of Larrabee is that it will be the heart of a line of discrete graphics cards, a market in which Intel has not participated. Larrabee is said to contain ten or more cores inside the discrete package. If Larrabee arrives in the form Intel described last year, it will compete directly against NVIDIA and ATI in the discrete graphics market.

NVIDIA is also rumored to be eyeing an entry into the x86 market. Larrabee will be programmable in C/C++, just as NVIDIA's GPUs are via the firm's CUDA architecture.

Comments

Long time coming, hurryup.
By Uncle on 3/11/2009 3:05:16 PM , Rating: 2
I like it when some truth gets quoted about software not being ready for 3 or 4 cores. It seems like he is begging for the developers to get on the turnip wagon, or the industry can't justify the quads; because if they don't, consumers will start to see the light and not put aside perfectly good equipment just to keep shareholders happy. Now for some self-affirmation: it justifies my sitting back and not getting caught up in hyper marketing. Looks like my overclocked dual-core 3GHz Opteron on a DFI Expert has lots of life left in it. I've only changed out my video card once in the last three years, and I'm ready for my next video card, which will go into this system. Anyone else who has done their homework over the last few years and saved a few bucks, pounds, or yen, a little self-affirmation helps.

RE: Long time coming, hurryup.
By TomZ on 3/11/2009 3:30:53 PM , Rating: 2
Not sure about that. I just replaced a reasonably fast dual-core machine with a (quad-core) Core i7, and I think it was worth the cost of upgrade. The i7 machine is at the same time very fast and very quiet compared to the previous one. I noticed a huge speed increase especially for video transcoding, i.e., burn to DVD processing.

RE: Long time coming, hurryup.
By Spectator on 3/12/2009 4:17:30 AM , Rating: 2
I too did i7 for DVD stuff, but the best I've seen is 4 threads, not even maxed out.

Never understood it, really. There's a keyframe every 3 min, right? Why not break up the encode into 20-30 3-minute chunks and send all 20-30 threads off for processing independently?

That would be a good example of a parallel task done properly?

By SublimeSimplicity on 3/12/2009 10:36:56 AM , Rating: 2
I think you're right to an extent; I do believe Intel is begging, but I think it's actually for a competitive advantage.

Right now they're the only Hyper-Threading game in town (and probably will be for the foreseeable future). With memory latencies only increasing as we move from DDR2, to DDR3, to DDR93 and beyond, Hyper-Threading will become a much more important competitive advantage, but only if applications take a parallel-task approach to development.

RE: Long time coming, hurryup.
By Spectator on 3/12/2009 11:47:21 AM , Rating: 2
Funny that. Only have to swap out your vid card, you say.

If I was like you and thought about longer-term changes (sht, I'm just a dumb short-sighted user. :P)

You could say that the i7 is a server-design CPU that just happens to own the desktop market.

Next we could look at the QPI on the i7, which was designed for CPU/CPU comms.

Then we could also consider your graphics-changing logic.

What if Intel considered this years ago and decided on, say, "Larrabee"? Then you can also buy Intel silicon for graphics AND it works best linked to QPI rather than some global pci-x bus (AMD/NVIDIA).

Yes, you would have to wait until i7/i5 is dominant, and also go through all the hassle of making new QPI slots on motherboards (but sht, pci-x will be old in a year or two, yes?)

So yes, you may be right. But also it's fun to think ahead in a wider scope.

DISCLAIMER: "Info was implied in the above statement"

"If a man really wants to make a million dollars, the best way would be to start his own religion." -- Scientology founder L. Ron. Hubbard
Related Articles
Intel Talks Details on Larrabee
August 4, 2008, 12:46 PM

Copyright 2016 DailyTech LLC.