
The Army has decided to upgrade all of its computers, like those shown here (at the NCO Academy's Warrior Leaders Course) to Windows Vista. It says the adoption will increase its security and improve standardization. It also plans to upgrade from Office 2003 to Office 2007. As many soldiers have never used Vista or Office '07, it will be providing special training to bring them up to speed.  (Source: U.S. Army)
Army will upgrade all its computers to Vista by December

For those critics who bill Microsoft's Windows Vista a commercial failure for failing to surpass Windows XP in sales and for its inability to capitalize on the netbook market, perhaps they should reserve judgment a bit longer.  Just as Windows 7 hype is reaching full swing in preparation for an October release, the U.S. Army announced that, like many large organizations, it will wait on upgrading to Windows 7.  However, unlike some, it is planning a major upgrade -- to Windows Vista.

The U.S. Army currently has 744,000 desktop computers, most of which run Windows XP.  Currently only 13 percent of the computers have been upgraded to Windows Vista, according to Dr. Army Harding, director of Enterprise Information Technology Services.

The Army announced in a press release that it will upgrade all of the remaining systems to Windows Vista by December 31.  The upgrade was mandated by a Fragmentary Order published Nov. 22, 2008.

In addition to Windows Vista, the Army's version of Microsoft's Office will also be upgraded.  As with Windows, the Army is forgoing the upcoming new version -- Office 2010 -- in favor of an upgrade to Office 2007.  Currently about half of the Army's computers run Office 2003 and half run Office 2007.

The upgrade will affect both classified and unclassified networks.  Only standalone weapons systems (such as those used by nuclear depots) will remain unchanged.  Dr. Harding states, "It's for all desktop computers on the SIPR and NIPRNET."

Army officials cite the need to bolster Internet security and standardize its information systems as key factors in selecting a Windows Vista upgrade.  Likewise, they believe that an upgrade to Office 2007 will bring better document security and easier interfacing with other programs, despite the steeper learning curve associated with the suite (which is partially due to the new interface, according to reviewers).

Sharon Reed, chief of IT at the Soldier Support Institute, says the Army will provide resources to help soldiers learn the ropes of Windows Vista.  She states, "During this process, we are offering several in-house training sessions, helpful quick-tip handouts and free Army online training."

The U.S. Army's rollout will perhaps be the largest deployment of Windows Vista in the U.S.  Most large corporations keep quiet about how many Windows Vista versus Windows XP systems they've deployed.  However, past surveys and reports indicate that most major businesses have declined to fully adopt Windows Vista.  Likewise, U.S. public schools and other large government organizations have, at best, only partially adopted Vista.

Comments

RE: Missing the point
By croc on 5/23/2009 7:52:51 PM , Rating: 2
"Let me make this perfectly clear, there are two instruction sets that are "common" in CPUs right now (one is FAR more common), x86 and x64. There is ONE architecture implements x64 instruction set, the Itanium (well and Itanium 2). There are boatloads of CPUs (even ones sold at best buy) that support extended x86 (AKA x86-64)."

There are several RISC-based CPUs on the market, not one. Every major 'big iron' manufacturer has made or is making RISC-based CPUs. IBM's PPC, Sun's SPARC systems, HP uses a variation of the Alpha (not the bastardized Itanium that Intel seems to have abandoned...), Fujitsu has its own CPUs, etc. All are RISC-based platforms, and all have been since DEC proved the advantages with the original Alpha.

x86 CPUs are all CISC-based, and most modern x86 processors use a variation of AMD's AMD64 extended instruction set.

Please stop passing on misinformation.

RE: Missing the point
By Pryde on 5/23/2009 8:56:16 PM , Rating: 4
The terms x86-64 and x64 are often used to refer to x86-64 processors from any company. There is no such thing as x64 as a formal architecture name.

Intel Itanium (formerly IA-64) architecture is not compatible with x86.

AMD has since renamed x86-64 to AMD64. AMD licensed its x86-64 design to Intel, where it is marketed under the name Intel 64. You can be sure that if Intel lost its AMD64 license, AMD would lose its x86 license.

RE: Missing the point
By rninneman on 5/25/2009 2:40:08 AM , Rating: 3
You are very confused/misinformed about 64-bit computing. x64 is the marketing name Microsoft gave to their software that runs on AMD's x86-64 (now known as AMD64) and Intel's EM64T (now known as Intel 64). Most developers have adopted the same marketing name. The Itanium series CPUs run IA-64 software, which is a completely different ISA from x64. Itanium cannot run x64 code, only IA-64.
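One informal way to see this naming tangle in practice: the same 64-bit x86 ISA is reported under different strings by different operating systems. The sketch below is illustrative only; the set of alias strings is an assumption, not an exhaustive list.

```python
import platform

def describe_arch():
    # The OS-reported machine string: typically "x86_64" on Linux/macOS
    # and "AMD64" on Windows -- the same ISA under different names.
    machine = platform.machine()
    # Assumed common aliases for the x86-64 / AMD64 / Intel 64 ISA.
    x86_64_names = {"x86_64", "AMD64", "amd64", "x64"}
    if machine in x86_64_names:
        return f"{machine}: an x86-64 (AMD64 / Intel 64) processor"
    # Other ISAs report their own names, e.g. Itanium systems report "ia64".
    return f"{machine}: not x86-64"

print(describe_arch())
```

On any one machine this prints a single line, but running it on Linux and Windows side by side shows two names for one instruction set.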

While the x86 architecture is technically CISC, the advent of SSE has morphed the ISA into a hybrid of sorts. Too much to go into in this post.

RE: Missing the point
By TA152H on 5/25/2009 3:46:14 AM , Rating: 5
You are worse than he is.

Do any of you people posting have any idea what you're talking about? It's like reading nothing but blowhards that have no clue, but like to hear themselves post. Shut up if you don't know what you're talking about!

To clear up all the misinformation posted by people who obviously are not in the computer industry, we'll start with x86 and the Itanium.

In fact, the original Itanium DID run x86 code in hardware. Intel decided to remove it and use software emulation.

The Alpha is no longer being developed, but was 64-bit from its conception. It is also an overrated, though still decent, instruction set. People who have never used an Alpha, and know essentially nothing about it, like to post how much better its instruction set was than everything else. In fact, it was not. DEC made extremely expensive processors and spent enormous amounts of money hand-coding a lot of the logic, making the implementation effective in some cases. In reality, it was always swapping places with IBM's POWER, so it was never clearly a superior product; well, not for long, anyway. They leapfrogged each other. It was a horrible thing to work with, though, and got hotter than Hell. It's still made, mostly for OpenVMS, but HP is not doing further development on it.

Intel has certainly not discontinued the Itanium, or even deprecated it. It continues to gain market share every year, and HP has many operating systems that run on the Itanium and is very committed to it. They did just delay Tukwila again, though, which is an ongoing problem, since their current processors are still based on 90 nm technology. But it is still being developed, and still gaining market share.

Most of all, you people have to stop talking about 64-bits and x86-64 like they are the same thing! 64-bits is not very important by itself, but x86-64 is not just a move to 64-bits. Were it so, the stupid posts about numbers exceeding 32-bit values would actually not be stupid, but sadly, they are.

x86-64 got rid of a lot of baggage, like MMX, the x87 instruction set, and memory segmentation (which was still around in the 386 instruction set, but since the largest segment was 32 bits, it was transparent if desired, and no one used it). It also added things, like eight more general-purpose registers, which can have an important impact on performance, especially with L1 caches getting slower (4 clock cycles on the Nehalem).

There are disadvantages to 64-bit addressing as well, and it's why the 8088 had memory segmentation in the first place (most people think it was to protect applications from each other, but it was not, and was never used that way). On the 8088 (or 8086), you'd just specify a 16-bit offset, and based on the value in the appropriate segment register, you'd generate a real 20-bit address (Intel shifted the segment register left four bits, then added it to the offset). So you'd save memory by not having to specify the full 20-bit address. You'd have to update your segment registers, of course, when you needed something outside of the 64K segment you were in, but memory at that time was very small, and it was considered a worthwhile trade-off.
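The 8086/8088 address arithmetic described above is simple enough to sketch directly: shift the 16-bit segment left four bits, add the 16-bit offset, and keep 20 bits. A minimal model, assuming the classic 1 MB wraparound (no A20 line):

```python
def real_mode_address(segment: int, offset: int) -> int:
    # Physical address = (segment << 4) + offset, truncated to 20 bits.
    # Masking with 0xFFFFF models the 1 MB wraparound of the 8086/8088.
    return ((segment << 4) + offset) & 0xFFFFF

# The reset vector at FFFF:0000 maps to physical 0xFFFF0.
assert real_mode_address(0xFFFF, 0x0000) == 0xFFFF0

# Many different segment:offset pairs alias the same physical address.
assert real_mode_address(0x1000, 0x0000) == real_mode_address(0x0FFF, 0x0010)
```

The aliasing in the last line is a direct consequence of the overlapping 16-byte "paragraph" granularity of segments, which is what made the scheme a memory-saving trade-off rather than a protection mechanism.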

64-bit addresses also consume more memory and lower code density. Of course, we have so much memory now that it does not really matter. Well, except that it lowers performance, because we have caches now, and lower code density means you hit a slower cache, or main memory, more often. And caches cannot simply be increased in size to compensate for this ugly side effect, since the larger a cache is, the slower it is, all other things being equal, so you have to add more wait states. (For example, the Conroe's L2 cache was one clock cycle faster than the Penryn's because of the size difference. The Nehalem's L2 is five clock cycles faster than the Penryn's; that's why they made it only 256K. It's also faster because it's not shared between cores anymore.)
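One concrete, back-of-the-envelope way to see the density cost: pointers double from 4 to 8 bytes on a 64-bit build, so pointer-heavy data structures occupy more of a fixed-size cache. The node layout below is a deliberately simplified assumption (one 32-bit payload plus one pointer, ignoring alignment padding):

```python
import ctypes

# Pointer width on the host ABI: 4 bytes on a 32-bit build,
# 8 bytes on a 64-bit build.
ptr_size = ctypes.sizeof(ctypes.c_void_p)

# Hypothetical linked-list node: one 32-bit int plus a next pointer.
# (Alignment padding is ignored to keep the arithmetic obvious.)
node_bytes = 4 + ptr_size

# How many such nodes fit in a 32 KB L1 data cache.
nodes_per_32kb_cache = (32 * 1024) // node_bytes
print(f"{ptr_size}-byte pointers -> {nodes_per_32kb_cache} nodes per 32 KB")
```

With 4-byte pointers roughly 4096 nodes fit; with 8-byte pointers only about 2730 do, which is the cache-pressure effect the comment is describing.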

So, if there were no changes besides going to 64-bit, you'd generally expect x86-64 to be slower due to lower code density - unless you were using more memory, or somehow using very large numbers. For most applications, it would be slower though. That's why you see uneven performance. The enhancements, like 16 registers can improve performance, but the lower code density can lower it. Whichever is more relevant within that workload will dictate whether you see an increase or decrease in performance.

Oh, and DEC did not invent the RISC concept, or even come close to it. Why do you pass this misinformation on, when you clearly have no idea what you're talking about! It's infuriating when idiots do this, and then someone repeats it because they don't realize you're an idiot.

CDC was the first to create a true RISC processor, in a machine called the 6600. It used a 10 MHz processor attached to 10 barrel processors. It could not even load or store data; all of that was done by the barrel processors that kept it fed.

IBM also released the RT PC long before the Alpha, which, of course, was a RISC processor. Berkeley might also get offended by you saying Alpha proved RISC's superiority. But, I'll let you do that research on your own, assuming you actually dislike passing on misinformation. There's no evidence of that though.

"I mean, if you wanna break down someone's door, why don't you start with AT&T, for God sakes? They make your amazing phone unusable as a phone!" -- Jon Stewart on Apple and the iPhone
