
ATI X1900XT Board Layout

Retail X1900XT Board And Cooler
R580 has been the talk of the town for months as what the Radeon X1800 should have been. We got an early sample of the card and put it through its paces.

ATI’s Radeon X1800 launch came and went in October with very little fanfare.  The R520 ASIC was hampered by delays and redesigns, and the final product was a far cry from the GeForce 7800GTX (G70) killer it was meant to be.  Ultimately, what killed R520 was the lack of commitment to produce it in quantity, which in turn made the chip difficult to obtain and even harder to buy at a competitive price.  NVIDIA made the same mistake with its GeForce 7800GTX 512 as well, but this is a look at ATI’s newest, not a history lesson.

Even though the official launch of the Radeon X1900 is not until next week, one of our writers in Taiwan was able to overnight a card back to our labs.  There are two standalone versions of the Radeon X1900: the Radeon X1900XT and the Radeon X1900XTX.  The X1900XTX is essentially the same card as the X1900XT, but with slightly higher clock speeds.  One tier-one AIB partner told us ATI switched from the “Platinum Edition” to the “XTX” naming due to the low-quantity stigma associated with the Platinum Edition brand.  Finally, a Radeon X1900 CF (CrossFire) card is also expected at launch.  This card has the same core clocks as the X1900XT, but replaces some of the outputs with a VHDCI connector for CrossFire support.  ATI’s low-end cards support CrossFire directly over the PCIe bus, but X1900 will unfortunately require the bulky external cable.  The Radeon X1900XT and Radeon X1900XTX come with twin DVI outputs, and a dongle included with the retail card can convert the DVI signal to VGA.

Radeon X1900, or R580, is in many ways what R520 should have been before all the die respins.  R580 uses the same memory controller found on R520, with the 512-bit internal ring bus.  Externally, the chip can address 256 bits at a time.  Our Radeon X1900XT came with 512MB of GDDR3 running at an effective 1.45GHz, while the X1900XTX ships with a default memory clock of 1.55GHz.  As anticipated, the R580 core features 16 pixel pipelines with 48 pixel shader processors.  R580 is produced on a 90nm process.  Unfortunately, all X1900 series cards are double-width; the additional width is needed to properly cool the GPU.
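As a sanity check on those memory figures, peak bandwidth follows directly from the 256-bit external bus and the effective data rate. A minimal sketch (the function name and GB/s rounding are ours, not ATI's):

```python
def peak_bandwidth_gbs(bus_width_bits: int, effective_rate_ghz: float) -> float:
    """Peak memory bandwidth in GB/s: bus width in bytes times the
    effective transfer rate in billions of transfers per second."""
    return bus_width_bits / 8 * effective_rate_ghz

# X1900XT: 256-bit bus at an effective 1.45GHz -> 46.4 GB/s
print(f"X1900XT:  {peak_bandwidth_gbs(256, 1.45):.1f} GB/s")
# X1900XTX: same bus at an effective 1.55GHz -> 49.6 GB/s
print(f"X1900XTX: {peak_bandwidth_gbs(256, 1.55):.1f} GB/s")
```

The XTX's 1.55GHz memory buys roughly 3 GB/s of extra peak bandwidth over the XT on the same 256-bit interface.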

ATI has an internal presentation showing the distinctions between the new cards:

Product Name              | X1900 CrossFire       | X1900XTX 512MB  | X1900XT 512MB
Core Speed                | –                     | –               | –
Pixel Shader Processors   | 48                    | 48              | 48
Memory Speed              | –                     | 1.55GHz         | 1.45GHz
Memory Size               | 512MB GDDR3           | 512MB GDDR3     | 512MB GDDR3
Memory Interface          | 256-bit               | 256-bit         | 256-bit
TV Output                 | –                     | S-Video         | S-Video
Display Outputs           | DVI-I x 1, VHDCI x 1  | DVI-I x 2       | DVI-I x 2
VIVO (Video In/Video Out) | –                     | –               | –
HDTV Support              | –                     | –               | –
Shader Model              | –                     | –               | –
AVIVO Support             | –                     | –               | –
CrossFire Support         | Master Card           | CrossFire Ready | CrossFire Ready
H.264 Support             | –                     | –               | –
We ran our X1900XT against our eVGA GeForce 7800GTX 256MB in a few popular game timedemos.  Both cards were benchmarked on an Opteron 165 workstation with 2GB of PC-3200 memory.  We used the latest NVIDIA driver, but since the Catalyst 6.1 driver did not detect the Radeon X1900, we used a beta ATI driver from late December.  Our tests are nowhere near extensive, but they should give you a good idea of what to expect during ATI’s launch next week.  All settings are default unless noted otherwise.

Game Test            | GeForce 7800GTX 256MB | Radeon X1900XT 512MB
FEAR 1600x1200 No AA | 48 FPS                | 61 FPS
FEAR 1920x1200 No AA | 41 FPS                | 55 FPS
FEAR 1600x1200 4xAA  | 25 FPS                | 45 FPS
FEAR 1920x1200 4xAA  | 21 FPS                | 37 FPS
COD2 1600x1200 No AA | 31 FPS                | 39 FPS
COD2 1920x1200 No AA | 28 FPS                | 35 FPS
COD2 1600x1200 4xAA  | 25 FPS                | 35 FPS
COD2 1920x1200 4xAA  | 22 FPS                | 29 FPS
BF2 1600x1200 No AA  | 61 FPS                | 63 FPS
BF2 1920x1200 No AA  | 60 FPS                | 60 FPS
BF2 1600x1200 4xAA   | 53 FPS                | 57 FPS
BF2 1920x1200 4xAA   | 46 FPS                | 51 FPS
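Read as relative speedups rather than raw frame rates, the results show where R580's extra shader hardware pays off. A quick sketch over the numbers above:

```python
# (test, 7800GTX fps, X1900XT fps), taken from the results above
results = [
    ("FEAR 1600x1200 No AA", 48, 61),
    ("FEAR 1920x1200 No AA", 41, 55),
    ("FEAR 1600x1200 4xAA", 25, 45),
    ("FEAR 1920x1200 4xAA", 21, 37),
    ("COD2 1600x1200 No AA", 31, 39),
    ("COD2 1920x1200 No AA", 28, 35),
    ("COD2 1600x1200 4xAA", 25, 35),
    ("COD2 1920x1200 4xAA", 22, 29),
    ("BF2 1600x1200 No AA", 61, 63),
    ("BF2 1920x1200 No AA", 60, 60),
    ("BF2 1600x1200 4xAA", 53, 57),
    ("BF2 1920x1200 4xAA", 46, 51),
]

for name, gtx, xt in results:
    speedup = (xt - gtx) / gtx * 100  # percent advantage for the X1900XT
    print(f"{name:24s} {speedup:+5.1f}%")
```

The shader-heavy F.E.A.R. runs with 4xAA show the biggest gains (up to 80%), while BF2 barely moves, suggesting it is limited somewhere other than the shader array.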

ATI has told its partners that the X1900 launch will be an actual product launch, meaning we should see inventory in stores on the 24th.  This may be true for ATI-branded inventory, but AIB partners say it will be several weeks before they have reliable shipments ready for merchants.

Comments

By skeeter123 on 1/19/2006 5:04:55 PM , Rating: 2
why didn't they compare this to the 512mb gtx?

RE: 256?
By KristopherKubicki on 1/19/2006 5:07:10 PM , Rating: 3
Because I don't have one.


RE: 256?
By shabby on 1/19/2006 7:24:10 PM , Rating: 2
Couldnt you borrow one from Anand? :)
Could you post some x1800xt scores to see how they differ?

RE: 256?
By maevinj on 1/19/2006 5:30:51 PM , Rating: 2
yeah that is a little misleading. I was thinking wow ati finally created a card to beat the 7800gtx which they did but im sure the 256mb memory advantage also helps. Hopefully they'll do a hard launch on this product instead of a soft launch

RE: 256?
By BillyBatson on 1/20/2006 3:59:39 AM , Rating: 2
Most benchmarks showed little to no advantage with 512vs256 so i wouldn't worry too much about it being unfair.
For these cards to be popular they are going to have to be competitive with their pricing.
Does this mean the x1600 is the low end card? (6600/7300) and the 1800 is the midrange? (6800gs/7800gt)

RE: 256?
By Fluppeteer on 1/20/2006 7:32:49 AM , Rating: 2
Most benchmarks don't take advantage of the extra memory.
However, the advantage of the 512MB 7800GTX comes less
from it having twice the amount of memory as from the
memory being clocked half as fast again in comparison
with the 256MB 7800GTX (probably more significant than
the minor core speed difference).

The R580 cards look to have RAM clocked faster than the
comparison G70 card. At high resolution, it's not too
surprising there's a difference, regardless of how fast
the core is (and I'm sure the R580 has a very fast core).
They may well be faster than the 512MB 7800GTX cards too,
but I'd expect it to be much closer than with the 256MB
parts. It'll be interesting to see the two officially
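The clock argument above is easy to put in numbers. A sketch, assuming effective memory rates of roughly 1200MHz for the 256MB 7800GTX and 1700MHz for the 512MB card (those reference clocks are our assumption, not from the article):

```python
def bandwidth_gbs(bus_bits: int, effective_mhz: int) -> float:
    """Peak bandwidth in GB/s from bus width and effective memory clock."""
    return bus_bits / 8 * effective_mhz / 1000

gtx_256 = bandwidth_gbs(256, 1200)  # 38.4 GB/s
gtx_512 = bandwidth_gbs(256, 1700)  # 54.4 GB/s
x1900xt = bandwidth_gbs(256, 1450)  # 46.4 GB/s
print(gtx_256, gtx_512, x1900xt)
```

On those assumptions the 512MB GTX's memory really is "half as fast again" (1700/1200 ≈ 1.42), giving it more raw bandwidth than the X1900XT even though R580's RAM outruns the 256MB part.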

RE: 256?
By CKDragon on 1/21/2006 12:48:20 AM , Rating: 1
Whoa, Captain Marvel. How's the new life in the rock?

RE: 256?
By BillyBatson on 1/24/2006 3:40:14 AM , Rating: 2
the rock!?

By defaultluser on 1/20/2006 12:48:58 AM , Rating: 2
When you consider the 7800 GTX has the exact same number of ROPs, fewer fragment pipes, and a much lower clock speed, this performance borders on pathetic. If adding such a huge number of fragment pipes were so important, it should boost the card's performance much higher than this.

But everyone who has been paying attention knew this wouldn't be all that spectacular. More than one fragment pipe per ROP can be excellent (witness the 6600 GT with 2:1, the 6800 GS and 7800 GTX with 1.5:1), but ATI's approach is just wasteful.

When you consider how much this design mimics four X1600 cores, you start to understand why it disappoints. The X1600 never gets to use all its fragment pipelines (hell, the 6600 rarely gets to use all its fragment pipelines effectively, and it has four fewer!), and the limitation of only four texture units means multitexture effects are held back.

Thus, we have 3 times as many fragment pipes, but with the same number of ROPs and texture units, the design is hindered on multitexture and will probably show little AA performance improvement. At least Nvidia has the smarts to keep the number of extra fragment pipes reasonable, and to provide more texture units than ROPs for good multitexture performance. Even G71 is expected to follow this mantra.

By Ronin on 1/20/2006 1:07:54 AM , Rating: 1
I have one word for you.


By defaultluser on 1/20/2006 1:12:56 AM , Rating: 2
Nice troll. What do you do for an encore, actually say something intelligent? Perhaps add something insightful to the discussion?

By theprodigalrebel on 1/20/2006 1:24:13 AM , Rating: 3
Wow, you consider 37fps in F.E.A.R at 1920x1200 [4xAA] for a single card to be pathetic...BTW, this card is the X1900, not the X2800...people really oughta stop with the "the nv 8800 will kick its ass". Of course it will...considering that there is atleast 6 months before the next generation arrives...

By Clauzii on 1/20/2006 4:29:46 AM , Rating: 2

By feraltoad on 1/20/2006 5:15:29 AM , Rating: 2
6 months! I was hoping Nv's G71 would arrive end of Jan or Feb! Did they change that? I thought they were waiting for ATI to rollout their card to trump them w/it, since the 7800gt was a no brainer purchase for a vid card. Are you talking about the G71? or the G81 or wutever coming out in 6 mths? If so where did u hear that? All my lurking and googling sez end of Jan. 1st of feb.

By Clauzii on 1/21/2006 2:29:44 AM , Rating: 2
G71 Is a G70 revision, isn´t it? G80=Next generation.

By ColossusX on 1/20/2006 12:57:44 PM , Rating: 2
well, this card is beating a card that has been out for nearly a year, so its not quite fair either, now is it.

Disappointing Performance
By Assimilator87 on 1/19/2006 10:18:36 PM , Rating: 2
It looks to me like the X1900XT is only slightly faster than the 7800 GTX. In F.E.A.R. @ 1600x1200, the X1900XT is only 27% faster. Comparing it to a 7800 GTX 512, the gap will grow smaller, and when the next generation (8800 GTX?) arrives I think the X1900XT will lose the performance crown, so long as nVidia can actually clock the G70 architecture to the 700MHz+ that's been speculated. I personally don't see how they can get the clocks that high considering the 6800 Ultra was clocked at a measly 400MHz. As for the higher-clocked X1900XTX, if it truly is just another name for the Platinum Edition then I'm guessing availability will be very limited, as we've seen in the past, so I don't think it'd be fair to make comparisons with it just yet. Any word on a specific G71 launch date?

I eagerly await the in depth review because the Opteron is making both the 7800 GTX and X1900XT look awful slow for next gen cards.

RE: Disappointing Performance
By Heatlesssun on 1/19/2006 11:54:54 PM , Rating: 2

I don't know what you're talking about. Run these cards in a dual config at high resolutions and most CPUs are working pretty hard.

This generation of nVidia and ATI cards is on the verge of Pixar-looking games. A little ways to go, but 3DMark06 is looking great! It should only be a year or two before games look like this.

RE: Disappointing Performance
By Sharky974 on 1/20/2006 12:02:14 AM , Rating: 2
These are Beta drivers and whatnot too.

In one bench it was nearly twice as fast as the GTX (45 FPS vs 25 FPS).

Also the clocks on that EVGA GTX are either 450 or 490 core. So it's already like halfway to 512 GTX status almost.

But mainly: Beta drivers!

RE: Disappointing Performance
By Sharky974 on 1/20/2006 12:03:45 AM , Rating: 2
Oh yeah, and it is "only" the XT.

The XTX should get another few percent bump.

RE: Disappointing Performance
By defaultluser on 1/20/2006 12:59:20 AM , Rating: 2

the card will be released on MONDAY. If the drivers weren't completely ready for release, they wouldn't have announced the release date to the entire industry.

RE: Disappointing Performance
By Snuffalufagus on 1/27/2006 5:35:06 AM , Rating: 2
WTF - ONLY 27%

Yeah, that sucks.

No Stoppin' here for nvidia!!
By klingon on 1/20/2006 3:14:21 PM , Rating: 1
Well....nvidia is chargin' up for the upcomin' card.....the GEFORCE 7900GTX. The card has a 32-bit pixel processor with a core of 700Mhz......jus check out this page to kno more details abt the card...

RE: No Stoppin' here for nvidia!!
By Clauzii on 1/21/2006 2:39:52 AM , Rating: 2
32-bit pixelprocessor? Are nVidia DOWNgrading picturequality? :))

I think U mean 32 Pixel Pipelines. ;)

RE: No Stoppin' here for nvidia!!
By Clauzii on 1/21/2006 2:46:38 AM , Rating: 2

RE: No Stoppin' here for nvidia!!
By Noobie9876 on 1/21/2006 5:56:49 AM , Rating: 2
You yourself said:
7800GTX: 24 PS/TMU Units, 16 ROPs
X1900XT: 48 PS, 16 TMU/ROPs

Many sites claim that:
7900GTX: 32 PS, 32 TMU, 16 ROPs.
This has led many sites to wonder who will be faster, as nVidia goes for a 2:2:1 ratio whereas ATI is doing 1:3:1.

By Noobie9876 on 1/21/2006 5:59:26 AM , Rating: 2
Arrrr, in this notation ATI is doing 3:1:1 and not 1:3:1. The sites I saw stick to TMU, PS, ROPS order.....

RE: No Stoppin' here for nvidia!!
By Grafxguy on 1/21/2006 6:14:05 PM , Rating: 2
My source in Taiwan says NV isn't hitting the 700+ clocks in volume - the best they're getting for a volume part is high 400s, which puts 7900 in the performance realm of the GTX512MB. And besides, there is no volume on 700MHz memory available. BTW, volume production on that is now mid-April, and NV has scrapped their CeBIT launch.

By gersson on 1/19/2006 4:29:31 PM , Rating: 2
Wish this was out earlier cos I really wanted one. Now I'll wait for nvidia's turn (BTW, "envidia" means jealousy in Spanish)

By Mr Perfect on 1/19/2006 4:45:58 PM , Rating: 2
That's definitely a promising architecture, but is there any word on an X1900XL that can start some competition with the ~$300 7800 GT? I'd have to guess that most of us don't have $500 for an XT-flavor card.

By Clauzii on 1/20/2006 4:33:40 AM , Rating: 2
I could imagine a X1900(XL) with i.e. 24 or 32 shaders instead of 48. It would still be a nice card, I think.

By MrSmurf on 1/22/2006 9:39:35 AM , Rating: 2
"BTW, "envidia" means jealousy in Spanish"

No it doesn't. It's not even a Spanish word. "celos" means jealousy.

By adiposity on 1/23/2006 2:28:01 PM , Rating: 2
It is a spanish word.

Definition in spanish:

Translation to english (envy):

It has the same root as "envy," although "jealousy" is a perfectly acceptable translation of "envidia."

Translation of "jealousy" into spanish:

Note translation "2". "Celos" is definition "1," so kudos to you.

I'm fluent in spanish, and "envidia" is a very common word in spanish-speaking circles, so I'm guessing you either looked this up and don't speak spanish, or are an early spanish student.


By NullSubroutine on 1/19/2006 5:29:30 PM , Rating: 2
Isnt it true these are going to be rather cheaper than the 512mb gtx?

RE: price
By KristopherKubicki on 1/19/2006 5:30:15 PM , Rating: 2
AIB partners did not have price information at the time I published this.


RE: price
By NullSubroutine on 1/19/2006 5:34:22 PM , Rating: 2
Why thank you for a such a promtive informative response, I did not expect

RE: price
By Ard on 1/19/2006 8:17:52 PM , Rating: 2
No. MSRP will be $649, same as the GTX 512.

RE: price
By MrSmurf on 1/22/2006 9:43:49 AM , Rating: 2
That may be so, but where are the 7800GTX 512MB cards? The places I've seen them list them as "on order" at $100 over MSRP, which is where the X1900XTX will be at first, but hopefully ATI can release enough of them that the price drops below MSRP.

Nice numbers!
By Clauzii on 1/20/2006 4:25:21 AM , Rating: 2
7800GTX: 24 PS/TMU Units, 16 ROPs
X1900XT: 48 PS, 16 TMU/ROPs

The 7800GTX can shade and texturize 24 pixels per clock cycle and output them to the framebuffer 16 pixels at a time.
The X1900XT can shade 48 and texturize 16 pixels per clock cycle and output them to the framebuffer 16 pixels at a time.

All this boils down to the X1900XT being able to put a LOT of Candy on the screen at a time.

I like!!
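The per-clock figures in the comment above scale with core clock. A toy calculation (the 430MHz and 625MHz reference clocks are assumptions on our part, not figures from the article):

```python
def rate_gpixels(units_per_clock: int, core_mhz: int) -> float:
    """Peak pixels per second through a set of units, in Gpixels/s."""
    return units_per_clock * core_mhz / 1000

# Shading throughput: 24 shader units vs 48
gtx_shade = rate_gpixels(24, 430)   # ~10.3 Gpixels/s
xt_shade  = rate_gpixels(48, 625)   # 30.0 Gpixels/s
# Both chips write out through 16 ROPs
gtx_out = rate_gpixels(16, 430)     # ~6.9 Gpixels/s
xt_out  = rate_gpixels(16, 625)     # 10.0 Gpixels/s
print(gtx_shade, xt_shade, gtx_out, xt_out)
```

Shading throughput nearly triples while framebuffer output rises only with the clock, which is exactly the "lots of candy per written pixel" trade-off described above.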

RE: Nice numbers!
By Assimilator87 on 1/20/2006 8:44:18 AM , Rating: 2
What's the point of creating lots of pixels per clock then limiting the output by the number of ROP units? It makes no sense to me. The output will only be limited further if G71 has the same number of ROP units, but more pixel pipelines.

RE: Nice numbers!
By hcforde on 1/20/2006 10:38:26 AM , Rating: 2
This is just the graphics side; the next big thing will be physics processors like Ageia's PhysX card. Also, the R580 is programmable: they have encoded video 10 times faster using this chip rather than the CPU. Yes, 10 times, not 10%. This chip is more than just the sum of its parts.


RE: Nice numbers!
By Clauzii on 1/21/2006 2:32:03 AM , Rating: 2
There is a difference in calculating physical pixels and pixels with a lot of eyecandy on.

RE: Nice numbers!
By Thalyn on 1/23/2006 1:30:14 AM , Rating: 2
Consider how long it takes to execute a shader program. It's not going to be one cycle per shader, so having an equal number of SMUs and ROps is going to limit the actual pixel output anyway.

Taking it a step further, most (if not all) shader programs are going to take longer than 3 cycles to complete - so even a 3:1 ratio of SMUs to ROps will be limited by the number of shaders (remembering shaders can run in parallel, so three can be running simultaneously for one ROp path). Even a 10:1 ratio would still leave the performance hindered by the number of SMUs.

Multi-texturing is dead - get over it. Very few, if any, new-release titles make heavy use of MT over shaders anymore, so the benefit just won't be there if you increase the number of TMUs or ROps, especially at the cost of SMUs. Since not spending money (and real estate) on TMUs can then mean more SMUs, anyone designing a graphics card these days would be daft to make that kind of trade-off.

Think back to the high-clock 4x2 architecture of the FX5800, then compare it to the 8x1 design of the 9700. That's about how long TMUs have been out of style for.


[i]SMU = Shader Map Unit[/i]
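Thalyn's ratio argument reduces to a min() of shader completion rate and ROP write rate. A toy model under the comment's own assumptions (the function name and the cycle counts are ours):

```python
def pixels_out_per_clock(shader_units: int, rops: int,
                         cycles_per_shader: float) -> float:
    """Idealized pixels completed per clock: the shader array finishes
    shader_units / cycles_per_shader pixels per cycle, and the ROPs can
    write at most `rops` of them to the framebuffer."""
    return min(shader_units / cycles_per_shader, rops)

# With 3-cycle shader programs, a 48-shader/16-ROP design keeps its ROPs fed...
assert pixels_out_per_clock(48, 16, 3) == 16
# ...a 24-shader/16-ROP design is shader-limited at 8 pixels/clock...
assert pixels_out_per_clock(24, 16, 3) == 8
# ...and with 10-cycle shaders even 48 shaders become the bottleneck.
assert pixels_out_per_clock(48, 16, 10) == 4.8
```

The longer the average shader program, the higher the shader:ROP ratio has to be before the ROPs become the limiting factor, which is the case the comment is making.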

By spwrozek on 1/19/2006 4:48:02 PM , Rating: 2
I am an ATI fan but what I have to wonder is why they are only running 16 pixel pipelines? Nvidia has 24 pipes on their high end cards. When are we going to see that from ATI? This card sure does look sweet though.

RE: pipelines
By Wahsapa on 1/19/2006 7:07:25 PM , Rating: 2
I remember reading somewhere the X1900 has some 350+ million transistors... I'm guessing ATI doesn't do 24 pipelines because of yield issues.

RE: pipelines
By KristopherKubicki on 1/19/2006 7:14:54 PM , Rating: 2
I dont think they could physically fit more pipes on the die with all those transistors.


RE: pipelines
By Thalyn on 1/19/2006 11:11:56 PM , Rating: 2
G70 is still a 16-pixel card. In the same way that R580 can process 48 shader ops simultaneously, G70 is capable of processing 24 - hence why some people believe it to be a 24-pipe card. Still only 16 ROps, though.

The only exception to this is when processing non-Z pixel data, where G70 is actually capable of processing 32 ROps; just like the NV40 before it.

By kilkennycat on 1/20/2006 2:39:28 PM , Rating: 2
... in their desperate attempts to overtake nVidia at the high end of the graphics-card market.

I'll bet that any purchasers of X1800XT cards will be furious if the X1900XT product does really ship on time. Having a very expensive video card for only a couple of months and then rendered instantly obsolete by the same manufacturer introducing a significantly more capable successor near the same price-point does not engender warm and cozy feelings.

Hopefully ATi and their partners will offer full-credit to X1800XT customers towards an exchange for a X1900-series card --- and pigs may fly too.....

ATi's customer relations with its high-end customers are at an all-time low after the persistent missed commitments over the past 9 months. The release of the X1900 series so soon after the X1800 should add fuel to the fire.

Presumably the X1900XT GPU fits directly on an X1800XT raw ECB; otherwise there are also going to be some very angry "board partners".

Meanwhile nVidia is about to release the 7300GS to take a bite out of ATi's low-end graphics-card market. Since this is a 90nm chip and the 7800GTX512 has now been formally announced by nVidia to be "limited-shipment", what's the betting that the 90nm true successor to the G70 GPU is waiting in the wings ? I suspect that nVidia is waiting to see whether the X1900XT/XTX is a REAL launch or again just another paper one before deciding on the launch date for the 7800GTX successor.

By timmiser on 1/22/2006 1:21:42 AM , Rating: 2
You must be new to the video card reality?

By Griswold on 1/24/2006 8:26:42 AM , Rating: 2
Maybe they sell clues at ebay. Go check it out.

48 pixel shaders = mega eye candy
By chaos fractal on 1/21/2006 2:47:37 PM , Rating: 2
Rather than looking at pure fps, maybe we should think about eye candy. Hopefully this will mean the card will be able to layer on a lot more AA/AF and other IQ stuff without suffering as much of a performance hit. I'm not that bothered if a card can play at 100fps when I can't see a big difference between that and 40fps. However, I do notice if the image looks better, so that's all I'm interested in.

RE: 48 pixel shaders = mega eye candy
By lamestlamer on 1/21/2006 3:53:37 PM , Rating: 2
Yeah, I think we really need to see some 3DMark06 numbers to get a grasp of how well this card can handle pretty pictures, as its design seems very capable of doing. As seen in the BF2 numbers, sparse pixel shaders make this card perform on the level of the 7800, as is to be expected given the 7800's 1.5x TMUs. I think we are really starting to see some specialization: Nvidia for massive resolutions, ATI for more eye candy. Anyone interested in Oblivion or Unreal 3 should really be interested in the X1900. 48 high-speed PS units are just amazing for PS-intensive rendering, but of course its 16 TMUs are going to be worse than 24 or 32 in texture-intensive rendering. The trend in game development is a steady increase in polys and textures, but a significant increase in shader ops. I think the X1900 will probably be able to handle most of the games thrown at it for at least 2 years.

By Grafxguy on 1/21/2006 6:10:26 PM , Rating: 2
I can't wait to see the real numbers, but I'll be using 05 and real-world games, thanks. 3DMark06 is a joke: an X1600 beating an X850PE? That doesn't reflect any real-world games I've ever heard of. And it runs a different rendering path for each of the X1800, X1900 and GF7800GTX, where different amounts of work are being done and different qualities of images are being rendered. Now that FM generates 80% of its revenue from corporate demos, it's clear that they've rushed a half-assed piece of crap to market.

Replacement for the x1800 XT?
By RobFDB on 1/19/2006 5:35:29 PM , Rating: 2
Has ATI stopped shipping R520 cores (for the X1800 XT) to its partners now, to be replaced fully by the R580? If so, are we going to see X1800 XT supply dry up in retail channels over the next few months?

By KristopherKubicki on 1/19/2006 5:42:22 PM , Rating: 3
Has ATI stopped shipping R520 cores (for the X1800 XT) to its partners now, to be replaced fully by the R580? If so, are we going to see X1800 XT supply dry up in retail channels over the next few months?

Yes and Yes.


maybe. . .
By miketmnt on 1/20/2006 2:32:05 PM , Rating: 2
Maybe you should have benched it against the 7800 512MB version, and also, where's Q4 or Doom 3 at least?

By Griswold on 1/24/2006 8:25:36 AM , Rating: 2
Maybe, as they already said, they didn't have a 512, which doesn't surprise me, as it's simply not available. Blame nvidia for not sending them a sample for one reason or another. Though, judging from some 512 numbers (with a different system, though), the 1900 would win a few benches and lose a few. Now, if the x1900 is actually available at launch, it's a winner, because I can't buy a 512 GTX even if I wanted to.

But I agree, D3 and/or Q4 numbers should have been provided.

By Dubb on 1/19/2006 7:32:23 PM , Rating: 2
any word on upcoming R520 or R580 fireGL cards...they're due for an update.

By JWalk on 1/20/2006 11:05:23 AM , Rating: 2
First, I have a question. Was the Opteron 165 overclocked?

Also, I did a little research. If you look at Anand's article on the 7800 GTX-512, you can get a few numbers for a comparison. These numbers won't be the definitive answer because the systems aren't the same, and we are dealing with beta drivers, but they are still interesting.

In FEAR at 1600 X 1200 with 4X AA, the GTX-512 scored 31 fps, while the X1900XT above scored 45 fps. Nice.

However, in Battlefield 2 at 1600 X 1200 with 4X AA, the GTX-512 scored 66.2 fps, and the X1900XT scored 57 fps. Unfortunately, the GTX-512 article didn't have any figures for COD2.

I get the impression that the X1900XT will win some and lose some against the GTX-512, and the X1900XTX will most likely win more than it loses to the GTX-512.

So, it will come down to price and availability. Well, at least until the 7900 cards come out. I just hope both companies have good yields and can produce enough cards to keep things competitive. :)

By djandreee on 1/21/2006 7:22:41 AM , Rating: 2
And what scores would 2 x X1900XTX 512MB get?
ATI has unleashed an extremely brutal beast with the X1900XTX 512MB.

By MonsterSound on 1/23/2006 10:41:41 AM , Rating: 2
btw envidia means envy in Spanish.

By bob661 on 1/19/2006 6:55:50 PM , Rating: 1
They keep coming out with these crazy video cards!!!

x800xl 4 lyfe!
By phaxmohdem on 1/19/06, Rating: 0