

MSI X38 Diamond  (Source: MSI)
Expect Intel's high-end chipset to show up next month

Intel officially set its performance embargo on its upcoming X38 Express chipset for September 23. Motherboards based on the X38 Express chipset should show up in retail in early September, according to motherboard vendors. The September 23 non-disclosure lift date only applies to reviews and performance numbers for the X38 Express chipset. The situation will be similar to the P35 Express chipset launch, where motherboards were available before its Computex 2007 launch announcement and NDA lift date.

The new chipset is a member of the Bearlake family, which saw its debut with the G33 and P35 Express variants last June. Intel’s X38 Express succeeds the 975X Express that launched alongside Intel’s Pentium D Presler processors. Although the 975X Express arrived in late 2005, it shared its basic design with Intel’s 945 and 955X Express chipset families. Intel decided not to refresh the 975X Express with a Broadwater variant and held out for Bearlake.

Intel’s X38 Express introduces PCIe 2.0 support to the LGA775 platform. PCIe 2.0 doubles the per-lane signaling rate of the existing PCIe standard to five gigatransfers per second, or GT/s, which works out to roughly 4 Gbit/s of usable bandwidth per lane once the 20% 8b/10b encoding overhead is accounted for. The chipset also supports dual full-speed PCIe x16 slots for ATI CrossFire multi-GPU technology. Intel guidance gives no indication of support for NVIDIA's SLI technology.
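For context, here is a quick back-of-the-envelope sketch (in Python, not from Intel's guidance) of what that encoding overhead means for usable bandwidth per lane and per x16 slot:

```python
# Back-of-the-envelope PCIe bandwidth math (illustrative figures, not benchmarks).
# PCIe 1.x signals at 2.5 GT/s per lane and PCIe 2.0 at 5 GT/s per lane;
# 8b/10b encoding spends 2 of every 10 bits on framing, so the usable data
# rate is 80% of the raw signaling rate.

def usable_gbit_per_lane(raw_gt_per_s: float) -> float:
    """Usable data rate per lane in Gbit/s after the 20% 8b/10b overhead."""
    return raw_gt_per_s * 0.8

for name, raw in [("PCIe 1.x", 2.5), ("PCIe 2.0", 5.0)]:
    per_lane = usable_gbit_per_lane(raw)
    per_x16_slot = per_lane * 16 / 8  # Gbit/s across 16 lanes -> GB/s
    print(f"{name}: {per_lane:.1f} Gbit/s per lane, ~{per_x16_slot:.0f} GB/s per x16 slot")

# PCIe 1.x: 2.0 Gbit/s per lane, ~4 GB/s per x16 slot
# PCIe 2.0: 4.0 Gbit/s per lane, ~8 GB/s per x16 slot
```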

Officially, the Intel X38 Express chipset only supports DDR3 memory. However, motherboard vendors disagree and intend to release X38 Express based motherboards with DDR2 memory support. Motherboard manufacturers such as DFI, Foxconn, Gigabyte, MSI and others had DDR2-compatible X38 Express motherboards on display at Computex 2007. These boards either supported both DDR2 and DDR3 or were dedicated DDR2 designs.

Expect motherboards based on the Intel X38 Express to pop up in retail next month. DailyTech estimates the cost of entry at around $200 for a no-frills board and around $300 for boards that include a kitchen sink in the package.


Comments

X38 will be special
By pauldovi on 8/15/2007 10:08:06 PM , Rating: 2
X38 is supposed to contain latency tweaks and other BIOS features never before seen. It should easily be the best overclocking chipset. It has an IHS!!!! I hear it uses a lot more power than the P35... that is a good thing. :)




RE: X38 will be special
By Alpha4 on 8/16/2007 2:56:54 AM , Rating: 2
What's an IHS?


RE: X38 will be special
By larson0699 on 8/16/2007 3:10:15 AM , Rating: 2
Integrated heat spreader, the metal plate covering a chip's core and the surrounding area. In processors, it was first seen on the Tualatin-core Pentium IIIs and Celerons, and then Pentium 4. Nowadays, the only processors lacking an IHS are mobiles--there were a few S754 (AMD) boards that supported the use of the mobile Turion line, and I recall heatsink mounting issues due to the absence of an IHS on those chips.

The fact that a northbridge now needs one of these is ridiculous, but if it alleviates the problems with overheating NB's... who knows...


RE: X38 will be special
By qdemn7 on 8/16/2007 6:57:11 AM , Rating: 2
Definitely looking forward to these boards. Simply because of the IHS on the NB. Makes it a lot easier to watercool the NB without having to worry about crushing the core.


RE: X38 will be special
By Alpha4 on 8/20/2007 1:04:10 PM , Rating: 2
Goddam! I appreciate the clarification. I was of the impression most current mobiles used heat pipes to draw heat away from CPU & MXM graphics accelerators. Does that not fall under the same classification?


RE: X38 will be special
By larson0699 on 8/16/2007 2:59:58 AM , Rating: 2
Well, I certainly don't doubt that an audience is already lined up for this thing...

Maybe it's just my tastes, but I prefer my computer to be one of the LEAST power-hungry appliances in my home; no sense in sucking 100W+ at idle. I understand that this is an all-out platform by Intel--which will be heartily received by those who pride the maximum OC--but it kind of negates the energy efficiency of the CPUs they bin. I read that X38 is bound to set records with the juice it needs. I just wonder where it ends.

But definitely guys, there's a whole story to be had of northbridges and their evolution over the years. This is proof that the processor is only as good as the chip feeding it I/O's. *pointing at VIA, SiS*

Little relief (watt-wise) with 680i, but SLI and way better SSD throughput (and finally competition from Intel)... My money's on that Shuttle ITX with G33--hey, at least there's hardware T&L out of the box and an x16 there for later (and yet quite the OC for such a size). (They also make a NV 7025/AM2 ITX barebones, no T&L but CnQ, price...)


RE: X38 will be special
By IntelUser2000 on 8/16/2007 11:11:46 PM , Rating: 2
quote:
My money's on that Shuttle ITX with G33--hey, at least there's hardware T&L out of the box and an x16 there for later (and yet quite the OC for such a size). (They also make a NV 7025/AM2 ITX barebones, no T&L but CnQ, price...)


G33 doesn't have hardware T&L; only the G965 and G35 chipsets do. And I think all the Nvidia IGPs do.


RE: X38 will be special
By larson0699 on 8/17/2007 1:46:16 AM , Rating: 2
!!!!

I was so about to give you the Wikipedia link, but thanks for calling me out there. The naming had me all confused. First, I thought G33 used GMA X3100, but in fact that's a mobile IGP--G33 actually has GMA 3100, which as you said does NOT have T&L.

(The G965 = GMA X3000 graphics = T&L.)

It's interesting how long this has been implemented in IGPs without my knowing. I had a Gateway with GeForce 6100 last year, and I remember that it wouldn't render lights or bumpmaps or anything fancy. I don't know if it was an issue with not having shaders.. but after a quick perusal of various IGP specs, it's obvious that all are terrible in games due to having one or two pipelines.

At least I'm still clear for my emulators!

@IntelUser2000 Thanks for clueing me in; I did more research.

Cheers.


Crossfire support still
By Polynikes on 8/15/2007 11:53:05 PM , Rating: 2
I wonder how much longer Intel will support their competitor's multi-GPU solution. I'm guessing they had Crossfire supported in this chipset before the AMD/ATI merger and didn't want to invest the money to remove it, or it would've required redoing the whole shebang.




RE: Crossfire support still
By larson0699 on 8/16/2007 4:08:54 AM , Rating: 3
Probably for a long time.

Intel doesn't have to worry about AMD taking any of their marketshare in chipsets. The same cannot be said for Nvidia. And, because Nvidia has their own multi-GPU solution, Intel has to have *some* ground. And I dread the day that Intel makes graphics cards for an SLI-type of their own.

It's irrelevant but noteworthy that many an AMD server board of today is littered with Intel core logic (mostly in the integrated networking).

Stuff like this is conducive to business. It will die off eventually, but not because of bitterness between rivals. You can bet that Intel will have their own fast lane (read: HTT) so *that* they don't have to license others' technology.


RE: Crossfire support still
By nrb on 8/16/2007 9:22:26 AM , Rating: 1
Intel aren't really "supporting Crossfire", they're simply supplying two video card slots. It's entirely up to the graphics card drivers whether to enable SLI/Crossfire between two video cards or not.

The fact that SLI will not be possible on X38 has nothing to do with Intel, it's because Nvidia has decided to explicitly prevent SLI in its drivers unless the drivers detect that they are running with an Nvidia-chipset motherboard.

The question you should therefore be asking is not "will Intel stop supporting Crossfire?", it's "will AMD alter its drivers so as to prevent Crossfire from working on Intel boards?" I wouldn't be at all surprised if that happened.


RE: Crossfire support still
By lumbergeek on 8/16/2007 12:12:31 PM , Rating: 2
They will not do that. Once they do that they would have to have a chipset solution themselves, and that means chipsets for Intel processors too in order to get/keep market share. AMD is quite happy to sell two video cards to someone who has an Intel chipset motherboard. Hell, they'd probably be happy to sell ANYTHING and EVERYTHING right now to raise some cash. I wonder what they want for their fab in Dresden?


RE: Crossfire support still
By Polynikes on 8/16/2007 12:43:18 PM , Rating: 3
Yeah, I guess they'd have to release another RD600. And we all saw how great THAT chipset was. (Compared to the hype, that is.)


RE: Crossfire support still
By nrb on 8/16/2007 12:53:59 PM , Rating: 2
quote:
Yeah, I guess they'd have to release another RD600. And we all saw how great THAT chipset was. (Compared to the hype, that is.)
The conspiracy theories say that RD600 (originally an ATI product) was once a far better solution, including x16/x16 Crossfire, but that after the ATI/AMD merger the new AMD management insisted that the chipset be deliberately crippled in order to discourage people from buying Intel CPUs for high-end gaming systems. This is supposedly the reason why it shipped several months after it was originally supposed to.


RE: Crossfire support still
By Polynikes on 8/16/2007 3:38:53 PM , Rating: 2
I don't know about any of that, but based on the hype I figured the RD600 would be making at least 800FSB and would be The Board To Get. Obviously many people were disappointed. There's a reason I don't jump on the latest and greatest. :P


Waste of Money
By gigahertz20 on 8/15/2007 8:33:56 PM , Rating: 1
Sounds like the new X38 mobo's will just be a waste of money, $200+ for what really? PCI Express 2.0...who cares, modern video cards don't even stress the PCI-E version 1.0 bandwidth.

Performance will no doubt be identical to the P35 motherboards, probably < 1% faster in benchmarks. I'm glad I bought the Gigabyte P35 DS3R for $130 and didn't wait for these waste of money X38 mobos.




RE: Waste of Money
By Anh Huynh on 8/15/2007 8:41:38 PM , Rating: 3
Yes, but the X38 Express supports dual PCIe 2.0 x16 slots. The P35 Express only supports one full x16 slot plus four lanes for the second slot. So for those who want to run CrossFire HD 2900 XTs, the X38 Express would be the way to go when building a new system.


RE: Waste of Money
By xsilver on 8/15/2007 8:46:07 PM , Rating: 1
The scary thing is if they do the same thing they did with AGP.

bring in a whole new set of PCIe 2.0-only cards and force the market to shift - leaving people with PCIe 1.x cards stuck with nothing to upgrade to.


RE: Waste of Money
By leexgx on 8/15/2007 9:03:12 PM , Rating: 2
PCI-E comes in versions 1.0, 2.0 and 3.0 (when that one comes around).

All of them use the same 16-lane slot, no problems.


RE: Waste of Money
By HaZaRd2K6 on 8/15/2007 9:05:40 PM , Rating: 2
PCIe 2.0 is backwards-compatible with PCIe 1.0. The slots are physically the same, the only things that change are the power delivery and the transfer speed. I still love my Nvidia chipsets, though, but it's nice to see Intel push the industry forward (even though I'm an AMD fan).


RE: Waste of Money
By IntelUser2000 on 8/15/2007 9:59:35 PM , Rating: 2
quote:
Sounds like the new X38 mobo's will just be a waste of money, $200+ for what really? PCI Express 2.0...who cares, modern video cards don't even stress the PCI-E version 1.0 bandwidth.

Performance will no doubt be identical to the P35 motherboards, probably < 1% faster in benchmarks. I'm glad I bought the Gigabyte P35 DS3R for $130 and didn't wait for these waste of money X38 mobos.


Don't discount the chipset yet. From what the mobo manufacturers indicate and from PCWatch, it's supposed to be different silicon from mainstream chipsets like the P35. They also say the die size is possibly ~180mm2.


sata??
By BillyBatson on 8/15/2007 10:58:52 PM , Rating: 2
Is it just me or do I only see 2 sata ports?




RE: sata??
By Devo2007 on 8/15/2007 11:35:29 PM , Rating: 2
I was about to comment on the same thing -- where are the SATA ports?


RE: sata??
By miahallen on 8/16/2007 12:08:05 AM , Rating: 2
In the picture, you can clearly see two more SATA port spaces (not occupied by physical connectors), right between the PATA port and fan header near the bottom. Those are good for 2 more SATA ports each, giving a total of six (which IIRC is what the X38 is supposed to support).

However, it would make absolutely no sense for MSI to build this board without the extra ports supported, so your guess is as good as mine.


RE: sata??
By larson0699 on 8/16/2007 3:20:08 AM , Rating: 2
Given that the chipset is brand new, *maybe* that motherboard is an engineering sample. Sometimes you'll see missing DIMM sockets or the pinouts to a x16...

...except here, you have two SATA and traces to six DIMM sockets (of two types, at that). Never a fan of hybrid boards. One or the other. And since you can't use both at the same time, *something* is always going to waste on that board.

DDR3 is very fresh (read: expensive, unrefined) in the market. If you're more budgeted for graphics than that extra ~50M/s unbuffered, then you know that RAM *won't* be the priciest part of your build.


DDR3 isn't primetime and SLI is dead
By FXi on 8/15/2007 11:20:34 PM , Rating: 2
DDR3 is so not ready for primetime, price to perf is worse than any memory since RDRAM.

SLI? It's dead as a Dodo, people just don't see it yet. It's hell to support, and keeping it alive is just life support for a failing chipset business. Neither SLI nor Nvidia chipsets are really of any market significance, and Nvidia knows it.




By larson0699 on 8/16/2007 3:39:21 AM , Rating: 2
quote:
Neither SLI nor Nvidia chipsets are really of any market significance, and Nvidia knows it.
What's that supposed to mean?

I've long stuck with ATI, but even I can attest that Nvidia's products are solid (and released on time). Some gamers are all over SLI, and 'nuff said about their place in chipsets. Forget that ATI is AMD; Nvidia owns that too.

SLI memory, mad OC headroom (no matter the CPU), and a slew of Nvidia-specific features... All favor them. Maybe they won't reinvent the wheel for a while, but they're certainly nowhere near insignificant. Hell, when MS gave them up, they just showed up in every PS3 since.

It might be farfetched, but considering who they're up against, Nvidia's next market is in CPUs, where they can be competitive with all of Intel, all of (new) AMD. I wish.


By therealnickdanger on 8/16/2007 12:58:19 PM , Rating: 3
What? DDR3 is awesome! Sure it's expensive now, but it won't stay that way for long. DDR2 was questionable compared to DDR when it was first introduced and now it's the cat's meow, the bees knees, the cream of the crop. All DDR3 has to do is drop in price - it's already 8% faster on average than DDR2, which some people will find worth the price. DDR3 will probably make a faster penetration into the mainstream than DDR2 did if they continue pushing the 2GHz barrier and lowering timings even further.

SLI dead? I know a couple people that run SLI rigs and would argue with you. NVIDIA chipsets aren't significant? Seriously, I want to know where you get your tech news now... NVIDIA chipsets are class-leading in most cases - depending on your use. They were the only choice back before Intel dropped their Core line on the world.


3 chip chipset?
By MonkeyPaw on 8/16/2007 7:42:02 AM , Rating: 2
I see 3 heat sinks on this board, 2 of them are where you'd normally find a solitary SB. Any idea what the 3rd one is for? Perhaps an external IDE controller? Certainly seems like a very complex solution if that is the case, meaning this board will never be cheap.




RE: 3 chip chipset?
By jebo on 8/16/2007 10:43:46 AM , Rating: 2
Some of ASUS' P35 boards have another chip near the slots to aid in handling the PCI-E requirements of Crossfire (circumventing the P35 limitation). Somehow, the ASUS chip turns the x16/x4 configuration to x8/x8. Perhaps Intel is using a similar method to achieve x16/x16. Just a guess.


RE: 3 chip chipset?
By Anh Huynh on 8/16/2007 1:18:51 PM , Rating: 3
It's probably an integrated X-Fi. None of the other X38 Express boards have the third heatsink.


:|
By sirius4k on 8/16/2007 1:25:42 AM , Rating: 2
I was like yay... full 2x16 PCI-E 2.0 slots.. then went to WTF??? with CrossFire only.
Not for me :P

BTW.. What's with the 6 slots of memory? Does that board support DDR2 and DDR3?
---
I wish nVidia would come up with something new... with support of DDR2 AND DDR3 and PCI-E 2.0.

And for those who whine about new technology... STOP READING tech articles :)
Coz there's always something new that you won't like ;)




RE: :|
By larson0699 on 8/16/2007 3:57:44 AM , Rating: 2
You shouldn't dismiss a nice chipset like this on the sole premise of it not having SLI. If you're that into Nvidia, use one really powerful card.

Click the thumbnail of the motherboard. You can clearly see the difference in the position of notches in those DIMM sockets. YES, this board supports either type.

Nvidia WILL release something newer and better; they're not dumb. They won't wait for Intel to snatch their customers. I hate to say it (for fear of the "troll" bomb), but that's what AMD is for, especially after taking ATI. As of late they have overpromised and seriously underdelivered, which I think is just a scare tactic to wait for the next best thing from them (when they finally "get it right"). I don't see another K8-size revolution from them anytime soon.

As for whiners.. It's futile to try convincing someone not to be him/herself. Everyone needs his/her outlet, and for some that's here. You're the bigger person for NOT whining/flaming/trolling/FUD/etc. *because of* those who do. And by the laws of nature, there is always one to be had (along with the "yang", or those of us who balance them out). I think pride has something to do with how people spend their time here and elsewhere online, because I know I wouldn't say or do something without reasonable backing, pride being a part of that. And oh well that some are proud to complain. They give their two cents and then blip, out of the loop ("No X38 for me"). Don't hurt me none.
</ideals_preaching>


RE: :|
By Etern205 on 8/16/2007 11:31:01 AM , Rating: 2
Yes, that board supports both DDR2 and DDR3. If you look at the notches closely, you can see they're placed in different positions: 2 slots for DDR3 and 4 slots for DDR2.


gasp
By medavid16 on 8/15/2007 8:25:26 PM , Rating: 1
No SLI??? =( I thought they finalized everything, and Intel bought SLI license etc etc




RE: gasp
By Anh Huynh on 8/15/2007 8:40:15 PM , Rating: 2
There's no mention of SLI support from any of the vendors. The same rumors popped up for the 975X Express as well. Nothing ever came through.


RE: gasp
By Lightning III on 8/15/2007 11:33:11 PM , Rating: 1
notebooks only


RE: gasp
By nrb on 8/16/2007 5:41:01 AM , Rating: 2
Sadly it looks as though the "SLI on X38" thing is just a mirage. There was what looked like good evidence for it at one point - some motherboard manufacturers actually advertised their X38 boards as supporting SLI at this year's Computex.

But sadly it seems this was a misunderstanding.

http://www.tweaktown.com/news/7978/index.html

There is, of course, absolutely no technical reason why X38 cannot support SLI; Nvidia has simply decided to screw Intel-chipset customers by disabling SLI support in its drivers unless you're using an Nvidia chipset motherboard. Nvidia could easily choose to switch SLI on for X38 boards in any new driver release, but they won't because they want to sell more Nvidia chipsets. (I'm constantly puzzled as to why this isn't a violation of competition laws).
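The restriction nrb describes is purely a driver-side policy check, not a hardware limit. A hypothetical sketch of what such a chipset whitelist amounts to (illustrative only; the function and the exact check are made up, not NVIDIA's actual driver code):

```python
# Hypothetical illustration of a chipset whitelist gating SLI in a driver.
# The chipset names are real products, but this check is a made-up sketch
# of the behavior described above, not actual NVIDIA driver logic.

NVIDIA_SLI_CHIPSETS = {"nForce4 SLI", "nForce 590 SLI", "nForce 680i SLI"}

def sli_allowed(host_chipset: str) -> bool:
    """Enable SLI only when the host chipset appears on the vendor whitelist."""
    return host_chipset in NVIDIA_SLI_CHIPSETS

print(sli_allowed("nForce 680i SLI"))    # True
print(sli_allowed("Intel X38 Express"))  # False: a policy decision, not a technical barrier
```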


meh ...
By xxsk8er101xx on 8/15/2007 10:56:49 PM , Rating: 1
Look I like Intel products. I like AMD products.

This chipset sucks. It's nothing new or fantastic. Pci express 2.0 ... yippy, show me pci express cards besides video cards? i can probably count them all on my hand. No SLI support? why bother having 2-16x pci-e slots then?

Why should i care about pci-e version 2 when there are barely any pci-e cards available for version 1?

DDR3? why not stick with DDR2? it's the same performance.

It's ahead of its time i guess - but honestly is it worth the 300 dollar price tag? Maybe in 3 years this chipset will be in demand.




RE: meh ...
By Lightning III on 8/15/2007 11:37:38 PM , Rating: 2
Hey, it will drive down the price of the 975X and that's a good thing.


RE: meh ...
By JasonMick (blog) on 8/16/2007 8:32:13 AM , Rating: 2
quote:
DDR3? why not stick with DDR2? it's the same performance.


One of the new 1800 MHz DDR3 modules blows away DDR2 performance, if you have the money to invest in it. Soon even higher-clocked models will push performance further still.

I think it is pretty arrogant to state that it is an insignificant improvement.

If you don't believe me, check out the benchmarks:
http://anandtech.com/memory/showdoc.aspx?i=3053

Remember, this is enthusiast stuff. If you are paying 300 dollars for a motherboard, you are probably an enthusiast and care about OC'ing and framerates, which DDR3 will greatly improve. Of course you need to pair it with a powerful processor and video card, but why would you not, if you were spending so much???
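For a rough sense of the theoretical bandwidth gap behind those benchmarks (editor's arithmetic, assuming a standard 64-bit module; real-world gains are smaller, as the linked AnandTech numbers show):

```python
# Theoretical peak bandwidth of a single 64-bit DDR module:
# data rate in MT/s * 8 bytes per transfer. Peak figures only;
# effective throughput depends on timings and the memory controller.

def peak_bandwidth_gb_s(data_rate_mt_s: int) -> float:
    """Peak bandwidth of one 64-bit-wide module in GB/s."""
    return data_rate_mt_s * 8 / 1000  # MB/s -> GB/s

for name, rate in [("DDR2-800", 800), ("DDR2-1066", 1066), ("DDR3-1800", 1800)]:
    print(f"{name}: {peak_bandwidth_gb_s(rate):.1f} GB/s peak per module")

# DDR2-800:  6.4 GB/s
# DDR2-1066: 8.5 GB/s
# DDR3-1800: 14.4 GB/s
```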


Kitchen sink!!!
By ccbr01 on 8/15/2007 10:01:04 PM , Rating: 4
It better include one... I didn't get one when I bought my Asus P5W DH last summer. :(




Dual PCI-E Good For Me
By jeromekwok on 8/16/2007 7:09:26 AM , Rating: 2
Sorry, I don't use SLI or XFire. The second PCI-E slot looks good for a high-performance RAID5 SATA controller. I have too many recorded TV programs and junk on my hard disk, and I'm looking for a flexible but economical way to expand redundant storage a few times a year.




"Folks that want porn can buy an Android phone." -- Steve Jobs

Related Articles













botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki