

41 comment(s) - last by kelmon on Sep 8 at 3:34 AM

Another expansion card that promises to offer more gaming realism

Israeli company AIseek promises to offer a hardware artificial intelligence accelerator card. The card aims to offload AI tasks from the CPU using an in-house designed processor dubbed the Intia. The Intia processor offers hardware-accelerated pathfinding, sensory simulation and terrain analysis, features typically handled by the CPU that AIseek claims consume quite a bit of CPU power. AIseek also claims the Intia processor can process low-level AI tasks 100 to 200 times faster than the system CPU can.
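AIseek never published details of its algorithms, but the pathfinding it describes is the kind of grid or navmesh search games run constantly for every unit. As a rough illustration of the workload such a chip would offload, here is a minimal A* sketch in Python; the grid representation, unit step costs and Manhattan heuristic are illustrative assumptions, not AIseek's actual implementation:

```python
import heapq

def astar(grid, start, goal):
    """A* search over a 2D grid; grid[y][x] == 1 marks an obstacle.

    Cells are (x, y) tuples. Returns the list of cells from start to
    goal (inclusive), or None if the goal is unreachable.
    """
    def h(a, b):  # Manhattan-distance heuristic (admissible on a 4-connected grid)
        return abs(a[0] - b[0]) + abs(a[1] - b[1])

    # Heap entries: (f = g + h, g, cell, parent)
    open_heap = [(h(start, goal), 0, start, None)]
    came_from = {}            # cell -> parent, filled in pop order
    g_cost = {start: 0}       # best known cost-so-far per cell
    while open_heap:
        _, g, cell, parent = heapq.heappop(open_heap)
        if cell in came_from:  # already expanded via a cheaper route
            continue
        came_from[cell] = parent
        if cell == goal:       # reconstruct the path by walking parents back
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        x, y = cell
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= ny < len(grid) and 0 <= nx < len(grid[0]) and grid[ny][nx] == 0:
                ng = g + 1  # uniform step cost
                if ng < g_cost.get((nx, ny), float("inf")):
                    g_cost[(nx, ny)] = ng
                    heapq.heappush(open_heap, (ng + h((nx, ny), goal), ng, (nx, ny), cell))
    return None
```

Running one such search is cheap; the CPU cost AIseek points at comes from running thousands of them per frame for large unit counts, which is why the work parallelizes well onto dedicated hardware.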

Games that will benefit from hardware-accelerated AI include RTS, RPG and FPS titles, according to AIseek. The company's FAQ gives a few examples of how these genres would benefit:

With accelerated AI, RTS worlds can be large, complex and highly dynamic, with thousands of AI-controlled units moving and behaving intelligently at all times. In RPG games, cities and landscapes can come alive with thousands of fully-simulated characters. With full simulation, all characters remain alive and intelligent both on-screen and off-screen (i.e., no loss of AI level-of-detail), resulting in much more believable, life-like worlds. For Action games and Shooters, accelerated AI means no more dumb enemies.

It is unknown when AIseek's hardware AI accelerator board will arrive at retail or gain developer support, though it most likely won't be for a while. With manufacturers advertising the ultimate gaming experience as requiring multi-GPU rendering technologies such as ATI's CrossFire or NVIDIA's SLI, an Ageia PhysX accelerator, a Creative Labs X-Fi Fatal1ty and Bigfoot Networks' KillerNIC, there seem to be more expansion cards needed than slots available.



Comments

A Card for Everything
By Sunday Ironfoot on 9/6/2006 12:51:15 PM , Rating: 2
So now we have cards for graphics, sound, physics, networking, and now AI. Does the CPU actually do anything anymore?




RE: A Card for Everything
By petermjacks on 9/6/2006 1:01:15 PM , Rating: 3
Nothing. Your $1000 Intel Core 2 Extreme X6800 is used for word processing.


RE: A Card for Everything
By Monkie on 9/6/2006 1:35:02 PM , Rating: 3
"Scratches head" and considers making a word processing CPU


RE: A Card for Everything
By SirPsyko on 9/6/2006 1:54:38 PM , Rating: 1
I think I'll go for the "PoPU"... The Porno Processing Unit.

It'll vary between really loud fans and silent operation.
Every time you use it you run the risk of getting a life-threatening virus.

That's money right there. ;)


RE: A Card for Everything
By Lazarus Dark on 9/6/2006 3:01:49 PM , Rating: 2
quote:
"Scratches head" and considers making a word processing CPU

too late... I just patented it! Ha!


RE: A Card for Everything
By jtesoro on 9/7/2006 9:51:18 PM , Rating: 2
Keep quiet! Patent it, lie low and wait for him to make a lot of money, then sue him for all he's worth! :)


RE: A Card for Everything
By LVHQ on 9/6/2006 2:04:28 PM , Rating: 2
Soon, our expensive CPUs are gonna be nothing more than a bona fide crossbar switch for all the other xPU's. Kinda makes you wonder if the Cell processor is a little ahead of its time.


interesting..but
By kattanna on 9/6/2006 11:55:25 AM , Rating: 2
interesting...but I think they are gonna have to join with someone else and get their chip added as a feature on another type of add-in card.

maybe Ageia or sound cards





RE: interesting..but
By clementlim on 9/6/2006 1:04:26 PM , Rating: 2
I will only have 2 cards in my expansion slots related to gaming: a graphics card (GeForce 6800 GT atm) and a sound card (Sound Blaster X-Fi atm). If games require more than that, I'm not playing them. AI? Well, I find human challengers much more intriguing than computers.


RE: interesting..but
By therealnickdanger on 9/6/2006 1:32:04 PM , Rating: 2
I have yet to see A.I. that can accurately replicate the random stupidity of humans. No matter how often game developers talk up their A.I. or allow it to cheat, it never even comes close to human opponents. Oh wait, that's right, it doesn't come close because we don't have a bandwidth-starved, $300, 32-bit PCI A.I. card yet!!!


RE: interesting..but
By Regs on 9/6/2006 1:54:31 PM , Rating: 2
I agree with a lot of you. With all these cards I'd imagine it would be hard for every single game developer to write a game for every one. It should be plug-and-play and not plug-and-wait.


RE: interesting..but
By captchaos2 on 9/7/2006 10:11:43 AM , Rating: 2
This is getting out of control. What's next? The force feedback chip? The porn processor? The flux capacitor card?

I think the motherboard manufacturers should just license the tech and include it on the mobo so I can have one or two PCI slots available.


RE: interesting..but
By linkgoron on 9/7/2006 12:14:19 PM , Rating: 2
How much for the porn processor?


RE: interesting..but
By kelmon on 9/8/2006 3:34:30 AM , Rating: 2
I entirely agree. As much as I like the idea of realistic physics and AI in games, additional dedicated processors just complicate the design of the computer, add additional hardware requirements (maybe) and, more importantly, are a shocking waste unless all you do is play games and all games support these new chips. Personally, I'd much rather see such technology employed using the additional cores being added to CPUs, since these can also be used in other computing tasks and therefore represent a more efficient use of existing resources. Sure, it won't be as fast as a dedicated processor, but it makes the technology more accessible and it won't be such a waste of money.


By Scorpion on 9/6/2006 12:34:09 PM , Rating: 3
...but I see it as the wrong move at the wrong time. What we would all like to get away from is all of the many, many cards in our PCs taking up space and generating more heat, in favor of one that consolidates all of these "similar" pieces of equipment. They are all very similar, just specific "flavors," if you will, of the same fundamental component.

We have GPUs that are more complex than our CPUs. Then along came Physics offloading. The "computational offloading add-on" is now reaching a crowding point in the market. I feel the industry will be heading into a "condensing" phase by migrating these "framework specific" logic units in with the CPUs.

I could be completely wrong, but I just don't see this as necessary, or at best it will be short-lived, given what I see as the emerging trend in the industry.




By petermjacks on 9/6/2006 12:39:47 PM , Rating: 2
Well said. With dual core all the rage these days, why not use one core for physics/ai and leave the other for typical processing? There, you just saved $500 and have two open pci slots.


By Format1 on 9/6/2006 12:42:32 PM , Rating: 2
I see this as the direction AMD is going... I believe in the future we will see motherboards with like 4 sockets... you can fill all four with processors of different types and totally customise a system to your needs... All of the sockets would have direct access to each other and memory... I think this is a great direction to go in.


By msva124 on 9/6/2006 6:49:29 PM , Rating: 2
quote:
What we would all like to get away from is all of the many many cards in our PCs taking up space and generating more heat, for one that consolidates all of these "similar" pieces of equipment into one


I believe it's called a CPU.


all-in-one accelerators
By tygrus on 9/6/2006 6:30:44 PM , Rating: 3
Combine AI & physics accelerators on one card.
Could also add the network interface.

Could products like ClearSpeed (FPU/ALU arrays) or Xilinx (FPGAs) be used in a more general way to accelerate a variety of tasks? Physics, AI, cryptography (en/de), data de/compression, video & audio processing, science (e.g. F@H, Fluent, HPTC).

If GPUs can do physics and GPGPU, then do they want to make the GPU the only type of add-on card you need?

I'd rather one for graphics and one for everything else.

OT: Just about every time I go to post a new message the comment system returns an error message and I have to remember to copy my text (to re-post later) before clicking the button or I'll lose it :(




RE: all-in-one accelerators
By ET on 9/7/2006 4:44:15 AM , Rating: 2
quote:
OT: Just about every time I go to post a new message the comment system returns an error message and I have to remember to copy my text (to re-post later) before clicking the button or I'll lose it :(

A fellow sufferer. The comment system login/posting problems have been annoying me for so long.


RE: all-in-one accelerators
By jtesoro on 9/7/2006 10:11:07 PM , Rating: 2
This is extremely annoying to me too. It typically takes me a couple of tries or so to get my comments in. The error also happens to me just browsing around the site. I'd be more understanding if we got some explanation why this is happening, but I haven't seen one yet. If there was one provided before, I'd appreciate it getting posted again.


Correct me if I'm wrong...
By petermjacks on 9/6/2006 12:17:04 PM , Rating: 2
...but won't games have to be written specifically to take advantage of such hardware? If that is the case, old games will see no difference, and at the same time, I can't see big game manufacturers (e.g. Valve) writing their software to take advantage of a start-up company's offerings. Yes/no?

This is a ship I'm not jumping on.




RE: Correct me if I'm wrong...
By mendocinosummit on 9/6/2006 12:22:36 PM , Rating: 2
Maybe the same scenario as the PPU


RE: Correct me if I'm wrong...
By smitty3268 on 9/6/2006 12:27:33 PM , Rating: 2
It is - they would need to release an API which would allow games to use the APU, and would fall back on the CPU if it wasn't there. Exactly the same situation as the PPU.
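The fallback pattern the comment describes is straightforward to sketch: the game calls a middleware API, and the library picks the accelerator if one is present or a CPU implementation otherwise. All names below are invented for illustration; AIseek never published SDK details, so this shows only the general shape of the pattern:

```python
# Hypothetical sketch of an AI middleware API with CPU fallback.
# Nothing here reflects a real AIseek SDK; the names are made up.

class CpuPathfinder:
    """Software fallback used when no accelerator card is detected."""
    def find_path(self, start, goal):
        return [start, goal]  # stand-in for a real CPU-side search

class HardwarePathfinder(CpuPathfinder):
    """Would issue commands to the accelerator instead of computing locally."""
    pass

def detect_accelerator():
    # A real SDK would probe the PCI bus for the card here;
    # this sketch always reports that no card is installed.
    return None

def make_pathfinder():
    """Return the hardware backend if a card is present, else the CPU one."""
    device = detect_accelerator()
    return HardwarePathfinder() if device else CpuPathfinder()

finder = make_pathfinder()  # game code never needs to know which one it got
```

Because both backends expose the same interface, a title written against the API runs unchanged with or without the card, exactly as games with and without a PhysX PPU did.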


Much better chance for these guys!
By raisinbrainMMM on 9/6/2006 12:28:12 PM , Rating: 2
These guys have a much better chance than PhysX.

Before PhysX came along there were already several physics engines developed (Havok, etc.), so wide adoption of the PhysX API is being limited by the competition.

With AI there are no widely used AI engines at the moment. If these guys can simplify development of AI like Havok did for PhysX, they might see positive adoption of their technology!




RE: Much better chance for these guys!
By raisinbrainMMM on 9/6/2006 12:29:25 PM , Rating: 2
*Meant to say "like Havok did for physics," not PhysX


By Assimilator87 on 9/6/2006 3:27:09 PM , Rating: 2
You know that Ageia offers their physics engine for free if you incorporate PhysX acceleration, right? I think that's a much better deal than paying lots of money for a competing solution that doesn't offer anything unique.


I like this idea
By Kochab on 9/6/2006 2:23:07 PM , Rating: 2
One aspect of this AI enhancement will be better line-of-sight tracking. I've played games (Quake3 & UT2004) where the bot will track an enemy through walls or landscape. In other words this AI will actually make the bot less inhuman in its capabilities. Camouflaging oneself will actually work against human-like AI.

Also who says this has to be limited to games? It may be the "killer app" focus of this company but effective AI can be used for just about anything.




RE: I like this idea
By nunya on 9/7/2006 1:09:50 AM , Rating: 2
You don't need any external hardware for that. One of the things that amazed me about Farcry was being able to hide in the bush while NPCs walked by unaware. It's one of the great reasons that Farcry is giving Duke 3D a run for the money as my favorite fps of all time.


RE: I like this idea
By jtesoro on 9/7/2006 10:04:15 PM , Rating: 2
Actually, that kind of AI where you can hide in the bushes without being seen is also in Operation Flashpoint. Didn't like the victory conditions in that game though. In one mission, I got promoted to commander, stayed back and directed the squad to eliminate the attacking team without a single casualty. In spite of that I got chewed on by the general and the mission was considered a failure. I had to personally shoot a few enemies to get past that mission, and was congratulated even if half of my team was killed in the process! Gave up on the game after that.


Wow.
By Schadenfroh on 9/6/2006 12:01:40 PM , Rating: 3
Man, this will be great to go with that KillerNIC and that Ageia PhysX card!




RE: Wow.
By sdsdv10 on 9/6/2006 12:02:47 PM , Rating: 2
Yes, yes it would indeed!


Not ready yet
By smitty3268 on 9/6/2006 12:30:00 PM , Rating: 4
From http://arstechnica.com/news.ars/post/20060905-7665...

quote:
If you're hoping that the NPCs in your favorite title will start acting smarter anytime soon, you'll be disappointed. Right now, it's unclear whether Intia is anything more than a name associated with a web site designed to attract venture capital funding. There is nothing indicating when Intia will be shipping, when AIseek's SDK will be available to developers, or what type of hardware will be necessary to use it.




generic sockets
By CU on 9/6/2006 12:48:13 PM , Rating: 2
I think we need some really generic sockets, where we can put in CPUs, GPUs, AIPUs, PPUs, etc. You could put them in any combination you want to customize your computer to your needs. They would also share the same system RAM. I think this would be a great plan, but very hard if not impossible to get everyone to agree on one type of socket/interface. Another option is for someone to just make a gaming card, an Xbox on a card. It would have the GPU, AIPU, and PPU all on one card. Either way, I expected an AI card as soon as I heard about the physics card.




RE: generic sockets
By SonicIce on 9/6/2006 1:38:34 PM , Rating: 2
isn't that what PCI is for? :P


Difference?
By Trisped on 9/6/2006 3:48:09 PM , Rating: 3
And what is the difference between this and what the physics cards and GPU do?

Currently I have a CPU which can do pretty much anything, though not necessarily fast, and I have a GPU which can do lots of simple math. So what is this AI PU supposed to do? It sounds like it is supposed to do the same thing a GPU and PPU do: lots of simple math at 600 MHz or less.

If anything, all we need is a 2 GPU set up with the second GPU being used for graphics overflow, Physics, AI, and anything else that is stupidly threadable.




whats next?
By Mitchy on 9/6/2006 12:12:38 PM , Rating: 2
The Intia proc is a block of cheese and the board is a cracker.

For heaven's sake, what's next, a 3D accelerator's 3D accelerator? OH WAIT, that's SLI, sorry. Like, seriously, who wants to waste a PCI slot on an AI chip that'll probably cost $300?

Some people like their computers dumb ok??? lol




Oh boy
By Griswold on 9/6/2006 1:48:06 PM , Rating: 2
I smell another dud.




Only my Opinion
By Xajel on 9/6/2006 2:58:58 PM , Rating: 2
Just a crap thing for games & gamers...




AI performance. What about the code?
By Hare on 9/6/2006 5:58:51 PM , Rating: 2
Ok, maybe the card will have a lot more processing power than CPUs. The problem is, who will program the AI? I doubt the problem with artificial intelligence is that we currently lack processing power. I believe the issue is more about the difficulty of writing advanced AI.


















Copyright 2014 DailyTech LLC.