



The Army has decided to upgrade all of its computers, like those shown here (at the NCO Academy's Warrior Leaders Course) to Windows Vista. It says the adoption will increase its security and improve standardization. It also plans to upgrade from Office 2003 to Office 2007. As many soldiers have never used Vista or Office '07, it will be providing special training to bring them up to speed.  (Source: U.S. Army)
Army will upgrade all its computers to Vista by December

Critics who bill Microsoft's Windows Vista a commercial failure, pointing to its failure to surpass Windows XP in sales and its inability to capitalize on the netbook market, should perhaps reserve judgment a bit longer.  Just as Windows 7 hype reaches full swing in preparation for an October release, the U.S. Army announced that, like many large organizations, it will wait on upgrading to Windows 7.  However, unlike some, it is planning a major upgrade -- to Windows Vista.

The U.S. Army currently has 744,000 desktop computers, most of which run Windows XP.  Currently only 13 percent of the computers have been upgraded to Windows Vista, according to Dr. Army Harding, director of Enterprise Information Technology Services.

The Army announced in a press release that it will upgrade all of the remaining systems to Windows Vista by December 31.  The upgrade was mandated by a Fragmentary Order published Nov. 22, 2008.

In addition to Windows Vista, the Army's version of Microsoft Office will also be upgraded.  As with Windows, the Army is forgoing the upcoming new version -- Office 2010 -- in favor of an upgrade to Office 2007.  Currently about half of the Army's computers run Office 2003 and half run Office 2007.

The upgrade will affect both classified and unclassified networks.  Only standalone weapons systems (such as those used by nuclear depots) will remain unchanged.  Dr. Harding states, "It's for all desktop computers on the SIPR and NIPRNET."

Army officials cite the need to bolster Internet security and standardize their information systems as key factors in selecting a Windows Vista upgrade.  Likewise, they believe that an upgrade to Office 2007 will bring better document security and easier interfacing with other programs, despite the steeper learning curve associated with the suite (which is partially due to the new interface, according to reviewers).

Sharon Reed, chief of IT at the Soldier Support Institute, says the Army will provide resources to help soldiers learn the ropes of Windows Vista.  She states, "During this process, we are offering several in-house training sessions, helpful quick-tip handouts and free Army online training."

The U.S. Army's rollout will perhaps be the largest single deployment of Windows Vista in the U.S.  Most large corporations keep quiet about how many Windows Vista systems versus Windows XP systems they've deployed.  However, past surveys and reports indicate that most major businesses have declined to fully adopt Windows Vista.  Likewise, U.S. public schools and other large government organizations have, at best, only partially adopted Vista.


Comments



RE: Missing the point
By descendency on 5/23/2009 5:39:11 PM , Rating: -1
Give me one reason, other than the amount of memory you can allocate, that 64-bit is important. (I'm not going to be waiting, because the other added "benefits" of the 64-bit architecture are minimal at best...)

Considering that, and the fact that the average home user could get away with a minimalistic OS that uses under 4 GB of RAM (because email, Twitter, etc. don't consume gigabytes of RAM...), the expansion to 64-bit is a lot less important than some techies (most of whom don't know what they are talking about) would like you to believe.

SSD (omg those expensive things... yes) support would be far more important to the average user than 64-bit support. The $450 SSD in my computer has improved boot time more than any other device I own. (My computer is getting towards needing an upgrade... but being a college student working a university [low-paying] job, I have to pick and choose...)

How many true 64-bit processors are on the market now? Lots, right? Wrong. Itanium is the ONLY true 64-bit processor. The rest are 80x86-64 processors (or "extended 32-bit" processors). Just as an aside, AMD claims to own that instruction set... I wonder how that will play out in court when/if Intel tries to claim AMD is breaking the rules with the x86 instruction set... That's really way beyond the point.

If MS was purely aiming for the consumer market, they would be aiming for things like SSD integration, DirectX 11 (which will be huge in video playback quality and processing), and the like instead of things that enterprise users would need (like 64-bit).

I've always been slow to adopt OSs because most of them are buggy from day 1 (XP hasn't crashed on me in over 1.5 years... no viruses or headaches either). I will probably upgrade to Vista soon (or just wait 3 more years for Win 7 SP2).

There is no need for the average consumer to use Vista or any other OS, except that MS wants to sell copies of its OS to them, so they force you to buy it for things like DX10 (which isn't even required yet...)

Eh, that's probably a lot of useless information.


RE: Missing the point
By Master Kenobi (blog) on 5/23/2009 5:52:37 PM , Rating: 2
Many consumer PCs bought at Best Buy and the like are now 64-bit. Get with the program.


RE: Missing the point
By descendency on 5/23/09, Rating: -1
RE: Missing the point
By foolsgambit11 on 5/23/2009 6:56:41 PM , Rating: 5
You've got your terms mixed up. 'x64' is a shorthand (or possibly a non-AMD-specific reference) for x86-64. IA-64 is the Itanium instruction set. They are not compatible, but x86-64 isn't somehow less than 64 bit because of that. It is a 64-bit instruction set with the ability to run older programs designed for previous x86 instruction sets, potentially all the way back to the 8086, I guess. The Pentium's (and 386's) instruction set, for instance, wasn't less than 32 bit just because it was built to be compatible with the old 8086 and 286 instruction set.

But you're right that most computation done on consumer computers doesn't require 64 bits. It's rare you'll be dealing with a number greater than 4.3 billion on a home computer, other than the previously mentioned case of allocating system resources. When we're talking about users' computations, the advantage of natively-computing 64-bit numbers is rarely needed.
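
(To make that 4.3 billion ceiling concrete, a minimal C sketch with illustrative values only: a 32-bit unsigned value wraps at exactly that point, while a 64-bit one keeps counting.)

#include <stdio.h>
#include <stdint.h>
#include <inttypes.h>

int main(void)
{
    /* 4,294,967,295 is the largest value a 32-bit unsigned integer can hold;
     * adding 1 wraps around to 0, while a 64-bit integer keeps counting. */
    uint32_t a = 4294967295u;
    uint64_t b = 4294967295u;
    printf("32-bit: %" PRIu32 " + 1 = %" PRIu32 "\n", a, (uint32_t)(a + 1u));
    printf("64-bit: %" PRIu64 " + 1 = %" PRIu64 "\n", b, b + 1);
    return 0;
}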

However, I think you're wrong that over-engineering computer resources is a problem. Having access to vast computing resources may encourage sloppy coding, but it's not what's keeping people from taking advantage of a 10x+ increase in computing speed. That speed-up is there whether we have 1 GB or 16 GB of RAM. What the additional resources do allow for is sloppy coding. It sounds bad, but it lowers the bar to entry for people who want to develop programs. Anything that allows more people to compete in a marketplace should be good for the consumer overall.


RE: Missing the point
By sinful on 5/23/2009 6:58:36 PM , Rating: 2
quote:
Let me make this perfectly clear, there are two instruction sets that are "common" in CPUs right now (one is FAR more common), x86 and x64. There is ONE architecture implements x64 instruction set, the Itanium (well and Itanium 2). There are boatloads of CPUs (even ones sold at best buy) that support extended x86 (AKA x86-64).


Itanium is IA64. It has nothing to do with x64.


RE: Missing the point
By zebrax2 on 5/23/2009 7:16:03 PM , Rating: 2
What's not needed now doesn't mean it's not needed tomorrow.
As 64-bit (x86-64) slowly takes chunks of the market, programmers/developers and hardware vendors will slowly start giving it focus. And BTW, it does give some performance improvement when the program is coded properly.
Vista 32 vs. Vista 64:
http://www.extremetech.com/article2/0,2845,2280811...


RE: Missing the point
By croc on 5/23/2009 7:52:51 PM , Rating: 2
"Let me make this perfectly clear, there are two instruction sets that are "common" in CPUs right now (one is FAR more common), x86 and x64. There is ONE architecture implements x64 instruction set, the Itanium (well and Itanium 2). There are boatloads of CPUs (even ones sold at best buy) that support extended x86 (AKA x86-64)."

There are several RISC-based CPUs on the market, not one. Every major 'big iron' manufacturer has made/is making RISC-based CPUs: IBM's PPC, Sun's SPARC systems, HP uses a variation of the Alpha (not the bastardized Itanium that Intel seems to have abandoned...), Fujitsu has their own CPUs, etc. All are RISC-based platforms, and all have been since DEC proved the advantages with the original Alpha.

x86 CPUs are all CISC-based, and most modern x86-based processors use a variation of AMD's AMD64 extended instruction set.

Please stop passing on misinformation.


RE: Missing the point
By Pryde on 5/23/2009 8:56:16 PM , Rating: 4
The terms x86-64 and x64 are often used to refer to x86-64 processors from any company. There is no such thing as x64.

Intel Itanium (formerly IA-64) architecture is not compatible with x86.

AMD has since renamed x86-64 to AMD64. AMD licensed its x86-64 design to Intel, where it is marketed under the name Intel 64. You can be sure that if Intel lost its AMD64 license, AMD would lose its x86 license.


RE: Missing the point
By rninneman on 5/25/2009 2:40:08 AM , Rating: 3
You are very confused/misinformed about 64 bit computing. x64 is the marketing name Microsoft gave to their software that runs on AMD's x86-64 now known as AMD64 and Intel's EM64T now known as Intel 64. Most developers have adopted the same marketing name. The Itanium series CPUs run IA64 software which is a completely different ISA than x64. Itanium cannot run x64 code, only IA64.

While the x86 architecture is technically CISC, the advent of SSE has morphed the ISA into a hybrid of sorts. Too much to go into in this post.


RE: Missing the point
By TA152H on 5/25/2009 3:46:14 AM , Rating: 5
You are worse than he is.

Do any of you people posting have any idea what you're talking about? It's like reading nothing but blowhards that have no clue, but like to hear themselves post. Shut up if you don't know what you're talking about!

To clear up all the misinformation posted by people who obviously are not in the computer industry, we'll start with x86 and the Itanium.

In fact, the original Itanium DID run x86 code in hardware. Intel decided to remove it and use software emulation.

The Alpha is no longer being developed, but was 64-bits from when it was conceived. It is also a very overrated, but still decent instruction set. People who have never used an Alpha, and know essentially nothing about it like to post how much better the instruction set was than everything else. In fact, it was not. They made extremely expensive processors, and spent enormous amounts of money on hand-coding a lot of the logic, making the implementation effective in some cases. In reality, it was always swapping places with IBM's POWER, so was never clearly a superior product. Well, for long anyway. They leapfrogged each other. It was a horrible thing to work with though, and got hotter than Hell. It's still made, mostly for OpenVMS, but HP is not doing further development on it.

Intel has certainly not discontinued Itanium, or even deprecated it. They continue to gain market share every year with the Itanium, and HP has many operating systems that use the Itanium, and is very committed to it. They did just delay Tukwila again though, which is an ongoing problem, since their current processors are still based on 90 nm technology. But it is still being developed, and still gaining market share.

Most of all, you people have to stop talking about 64-bits and x86-64 like they are the same thing!!!!!!!!!!! 64-bits is not very important, but x86-64 is not just a move to 64-bits. Were it so, the stupid posts about numbers exceeding 32-bit values would actually not be stupid, but sadly, they are. x86-64 got rid of a lot of baggage, like MMX, the x87 instruction set, memory segmentation (which was still around in the 386 instruction set, but since the largest segment was 32 bits, it was transparent if desired, and no one used it), etc... It also added things, like eight more registers, which can have an important impact on performance, especially with L1 caches getting slower (4 clock cycles on the Nehalem). There are disadvantages to 64-bit addressing as well, and it's why the 8088 had memory segmentation in the first place (most people think it was to protect applications from each other, but it was not, and was never used that way). On the 8088 (or 8086), you'd just specify a 16-bit address, and based on the value in the appropriate segment register, you'd generate a real 20-bit address (Intel used some weird shifting of the segment register, and then added it to the offset address). So, you'd save memory by not having to specify the 20-bit address. You'd have to update your segment registers, of course, when you needed something outside of the 64K segment you were in, but memory at that time was very small, and it was considered a worthwhile trade-off.
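
(To make the segment arithmetic concrete, a minimal C sketch of the 8086 calculation described above, with arbitrary example values: the 16-bit segment is shifted left 4 bits and added to the 16-bit offset to form a 20-bit physical address.)

#include <stdio.h>
#include <stdint.h>
#include <inttypes.h>

/* 8086 real-mode address calculation: segment shifted left 4 bits,
 * then added to the 16-bit offset, yields a 20-bit physical address. */
static uint32_t real_mode_address(uint16_t segment, uint16_t offset)
{
    return ((uint32_t)segment << 4) + offset;
}

int main(void)
{
    uint16_t segment = 0xB800;  /* arbitrary example: classic text-mode video segment */
    uint16_t offset  = 0x0010;
    printf("%04X:%04X -> physical 0x%05" PRIX32 "\n",
           segment, offset, real_mode_address(segment, offset));
    return 0;
}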

64-bit addresses also consume more memory and lower code density. Of course, we have so much memory now, it does not really matter. Well, except it lowers performance, because we have caches now, and lower code density means you hit a slower cache, or main memory, more often. And caches cannot simply be increased in size to compensate for this ugly side-effect, since the larger a cache is, the slower it is, all other things being equal, so you have to add more wait states (for example, the Conroe's L2 cache was one clock cycle faster than the Penryn's because of the latter's additional size. The Nehalem's is five clock cycles faster than the Penryn's; that's why they made it only 256K - it's also faster because it's not shared between cores anymore).

So, if there were no changes besides going to 64-bit, you'd generally expect x86-64 to be slower due to lower code density - unless you were using more memory, or somehow using very large numbers. For most applications, it would be slower though. That's why you see uneven performance. The enhancements, like 16 registers can improve performance, but the lower code density can lower it. Whichever is more relevant within that workload will dictate whether you see an increase or decrease in performance.

Oh, and DEC did not invent the RISC concept, or even come close to it. Why do you pass this misinformation on, when you clearly have no idea what you're talking about! It's infuriating when idiots do this, and then someone repeats it because they don't realize you're an idiot.

CDC was the first to create a true RISC processor, in a machine called the 6600. It used a 10 MHz processor attached to 10 barrel processors. It could not even load or store data; all of that was done by the barrel processors that kept it fed.

IBM also released the RT PC long before the Alpha, which, of course, was a RISC processor. Berkeley might also get offended by you saying Alpha proved RISC's superiority. But, I'll let you do that research on your own, assuming you actually dislike passing on misinformation. There's no evidence of that though.


RE: Missing the point
By amanojaku on 5/23/2009 9:18:10 PM , Rating: 4
quote:
x86 is "32 bit". x86-64 is 32 bit instruction set with ability to address 64 bits (2^64) of RAM.
Sigh...

When people refer to 64-bit CPUs they are referring to the word size, i.e. the size of the CPU registers (data and/or instruction) and the amount of data processed at once. 64-bit data registers yield the following native ranges:

Unsigned Integers - 0 to 4,294,967,295
Signed Integers - -2147483648 to 2147483647
Floating Point - See http://en.wikipedia.org/wiki/IEEE_754
Memory addresses - 0 to 16 exbibytes (colloquially, and incorrectly, referred to as exabytes)
Bus component transfer - 0 to 64 bits transferred between bus components (PCIe slots, CPU sockets, RAM slots, etc...)

There are exceptions, however, generally based on practicality. 16 exbibytes of RAM is inconceivable today due to cost and complexity of manufacture. I challenge you to find any company that CAN manufacture that much RAM globally in a year, let alone for one system. CPU manufacturers reduce memory address space to reflect the limits of available memory (pebibytes) in order to shrink CPU die size. Why bother producing memory addressing for memory that won't exist for a few years, if not decades? As higher RAM densities are produced the CPU address space will increase accordingly.

AMD, IBM, Intel, SUN, VIA, and other companies produce native 64-bit CPUs with the addressing hobbled, and in some cases the bus width is limited to 32-bit. All other features are 64-bit. The x86-64 instruction set architecture includes 32 and 64-bit registers, appropriately activated when the OS chooses an operating mode.
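
(A quick way to see which word size a given build targets is to print the basic type widths; a minimal C sketch, where the output depends on the compiler and target, e.g. long is typically 4 bytes on 64-bit Windows but 8 bytes on 64-bit Linux.)

#include <stdio.h>
#include <limits.h>

int main(void)
{
    /* Pointer and integer widths differ between 32-bit and 64-bit builds:
     * on a typical x86-64 build a pointer is 8 bytes, on 32-bit x86 it is 4. */
    printf("sizeof(void *)  = %zu bytes\n", sizeof(void *));
    printf("sizeof(size_t)  = %zu bytes\n", sizeof(size_t));
    printf("sizeof(long)    = %zu bytes\n", sizeof(long));
    printf("bits per byte   = %d\n", CHAR_BIT);
    return 0;
}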


RE: Missing the point
By foolsgambit11 on 5/24/2009 3:55:22 PM , Rating: 5
I think you've got the ranges for 32-bit, not 64-bit, at least for the signed and unsigned integers. 64-bit is about 0 to 18 quintillion or so.


RE: Missing the point
By amanojaku on 5/24/2009 5:41:24 PM , Rating: 2
You are 100% correct. Thanks for pointing that out!

Unsigned integer range: 0 to 18,446,744,073,709,551,615
Signed integer range: -9,223,372,036,854,775,808 to 9,223,372,036,854,775,807
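
(The same figures can be reproduced from the standard limit macros; a minimal C sketch:)

#include <stdio.h>
#include <stdint.h>
#include <inttypes.h>

int main(void)
{
    /* The corrected 64-bit ranges above, straight from <stdint.h>. */
    printf("UINT64_MAX = %" PRIu64 "\n", UINT64_MAX);  /* 18,446,744,073,709,551,615 */
    printf("INT64_MIN  = %" PRId64 "\n", INT64_MIN);   /* -9,223,372,036,854,775,808 */
    printf("INT64_MAX  = %" PRId64 "\n", INT64_MAX);   /*  9,223,372,036,854,775,807 */
    return 0;
}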


RE: Missing the point
By RamarC on 5/23/2009 6:10:40 PM , Rating: 5
most of your argument is no different than the arguments against windows 98 and windows xp. to the casual user, any changes "under the hood" are invisible so they focus on the UI.

but i dare you to try to run modern apps on xp with only 512mb of ram which was typical when xp debuted. i also dare you to attach a 1tb sata drive to xp-rtm. a service pack had to address hard drive size limitations. rather than continue to patch xp, vista/win7 have been rearchitected to handle the increased demands of today's software and exploit the increased functionality of today's hardware.

and as for "true 64bit" you sound like one of the techies who don't know what they're talking about. itanimum has no 32bit support but that doesn't make it the only "true" 64bit processor. intel made a conscious decision not to PROVIDE 32bit support, but it certainly could have. in simple terms, the only true restriction of a processor is that it cannot run software that is "larger" than its memory access. so a 64bit cpu could run 32bit and 16bit code. but a cpu with 16bit memory access will NEVER be able to run 32bit code.


RE: Missing the point
By descendency on 5/23/09, Rating: -1
RE: Missing the point
By foolsgambit11 on 5/23/2009 7:03:18 PM , Rating: 3
Nope. Itanium runs the IA-64 instruction set. It's totally different from the x86 instruction set line, which includes x86-64 (commonly referred to as x64). IA-64 was designed to address certain inadequacies in the x86 instruction set that made it less than optimal for certain operations. Those changes made it incompatible with the old x86 instruction set. You're right about many of the details, but not about the names.

I think it makes much more sense for business users to be behind consumers. Certain businesses which perform especially computationally-intensive operations would be an exception, but for the most part, business use is about reliability more than performance. Reliability is less of an issue in the consumer space.


RE: Missing the point
By fsardis on 5/23/2009 7:10:24 PM , Rating: 5
Are you insane or smoking funny stuff today?

The enterprise market should be on the latest OS and the consumers one generation behind? And you expect to be taken seriously?
Simple scenarios to consider:
1- Hey boss, we got a 10GB database holding our financial data but because the drivers for the new OS on the server are not quite ironed out we had a crash and now it is corrupt. We have to restore from backup now....
2- Awww man, this stupid latest OS crashed and I lost my pr0n collection...

Now tell me which one would be more catastrophic. Care to explain to me why the enterprise would want such a high risk? You are basically asking the people who run mission-critical systems for their company, and perhaps for a great number of people on the planet, to be the beta testers, while your average Joe will only lose a few GB of pr0n. Good thinking there; please never apply for network design jobs.

As for your 64bit rant, I will leave it at that and only say that a few years back the same was claimed by "experts" such as yourself for 16bit and 32bit. I mean 640KB of RAM should be enough for everyone yes? And nobody can ever fill 20GB of space, can they? And while we are at it, let's all go back to DOS and let the enterprises use features such as the Aero Glass and Widgets because they increase the productivity so greatly in the work environment.

And yes, I have written code and yes, I am working with hardware and software daily and yes, I design networks.


RE: Missing the point
By foolsgambit11 on 5/23/2009 7:12:48 PM , Rating: 2
Exactly. It's an 'if you build it, they will come' kind of scenario. Give people the resources, and they'll find a good use for it. Yes, I mean 'good', as in, useful, just like the uses found for the expanded capabilities of 32-bit.


RE: Missing the point
By CSMR on 5/24/2009 9:41:08 PM , Rating: 2
Agree with the conclusion but a business upgrading client OSes isn't going to have data loss from any bugs; we're not talking about servers here?


RE: Missing the point
By DeGhost on 5/24/2009 4:45:49 AM , Rating: 1
i do
already running out of space
and i have 3 TB of storage
and what does 64bit instruction have to do with drive size?
enterprises ran into the storage size problem long ago, and there is an extension to make a bigger address table; i forgot what it is called
today's computers are written with millions of lines of code; object-oriented programming and high-level languages add overhead. if you want lean, mean code you could try coding in assembly - i challenge you to write something with a friendly gui for an "average" joe.


RE: Missing the point
By DeGhost on 5/24/2009 4:33:21 AM , Rating: 3
Misleading
“Considering that fact and the fact that the average home user could get away with a minimalistic OS that uses under 4 gb of RAM (because email, twitter, etc doesn't consume gigabytes of RAM...), the expansion to 64-bit is a lot less important than some techies (most of whom don't know what they are talking about) would like you to believe.”

By your usage the “average” home user could use 512 MB of RAM and an 800 MHz P3 with 20 GB of hard disk space. That might be you?

“SSD (omg those expensive things... yes) support would be far more important to the average user than 64-bit support. The 450$ SSD in my computer has improved boot time more than any other device I own. (my computer is getting towards needing an update... but being a college student working a university [low paying] job, I have to pick and choose...)”

From all the benchmarks and uses for SSDs I've read over the years, SSDs don't seem to be that great to boot with; their real strength is near-nonexistent seek time for out-of-order data, like a database. It might be better to RAID your drives for faster boot. And for a budgeting “student”, I don't see why you need an SSD; a RAID solution is way cheaper and has comparable speed.

“How many true 64-bit processors are on the market now? Lots right? Wrong. Itanium is the ONLY true 64-bit processor. The rest are 80x86-64 processors (or "extended 32-bit" processors). Just as an aside, AMD claims to own that instruction set... I wonder how that will play out in court when/if intel tries to claim AMD is breaking the rules with the x86 instruction set... That's really way beyond the point..”

This makes no sense to me. Actually, x86 is owned by Intel, and AMD made the extended instruction set for 64-bit, which was so widely popular because of backward compatibility that Intel adopted it. Pulling from my head (might be wrong): all current mainstream processors, except the Atom, have AMD's x64 instruction set.

“If MS was purely aiming for the consumer market, they would be aiming for things like SSD integration, DirectX 11 (which will be huge in video playback quality and processing), and the like instead of things that enterprise users would need (like 64-bit).”

That again makes no sense to me. What are you playing back that requires an SSD (which you seem to think is the end-all solution)? If you're doing raw video editing, RAID would work better because of throughput vs. price (you can probably RAID 10 drives for the price of an SSD). And DX11 in this scenario makes no sense. DirectX is an API for computer-generated graphics. And video processing, I would think, means encoding, which is more CPU-bound than anything else.

“I've always been slow to adopt OSs because most of them are buggy from day 1 (XP hasn't crashed on me in over 1.5 years... no viruses or headaches either). I will probably upgrade to Vista soon (or just wait 3 more years for Win 7 SP2).”
Personal preferences, do whatever you want.

“There is no need for the average consumer to use Vista or any other OS except MS wants to sell copies of it's OS to them so they force you to buy it for things like DX10 (which isn't even required yet...)”
Not true, Vista is actually a lot more user friendly than XP. Your “average” consumer is computer illiterate; they just want something that works. Since they are average, I don't think they will install a new thing every week, and a popup that asks “are you sure” is not that annoying to them. Vista is way more secure than XP when you click every link that pops up. And no one is forcing you to buy DX10, just like no one is making you use a computer. But DX10 is an evolutionary upgrade.
“Eh, that's probably a lot of useless information.”

Well, yes, it was, because all it seems to be is a rant on Microsoft and getting everyone to buy an SSD.


RE: Missing the point
By Veerappan on 5/28/2009 10:14:58 AM , Rating: 2
“If MS was purely aiming for the consumer market, they would be aiming for things like SSD integration, DirectX 11 (which will be huge in video playback quality and processing), and the like instead of things that enterprise users would need (like 64-bit).”

I think he was attempting to imply that the GPU Computing features of DX11 (comparable to OpenCL) would allow developers to write generic DX-based programs to do decoding/re-encoding of video streams on the GPU instead of on the CPU, thereby speeding up the processing.

Other than that, yeah, this guy is seriously misinformed.


"We don't know how to make a $500 computer that's not a piece of junk." -- Apple CEO Steve Jobs














botimage
Copyright 2014 DailyTech LLC. - RSS Feed | Advertise | About Us | Ethics | FAQ | Terms, Conditions & Privacy Information | Kristopher Kubicki