
The truth comes out about User Account Control

Microsoft's Windows Vista operating system has been lambasted ever since it launched for consumers in January 2007. Diehard Windows users balked at the steep system requirements, sometimes sluggish performance, inadequate driver support, and varying product SKUs at multiple price points.

One feature that has caused quite a bit of controversy with consumers is the User Account Control (UAC) included in Windows Vista. UAC prompts nag users during simple operations such as opening Device Manager, emptying the Recycle Bin, or installing/uninstalling an application.

David Cross, a product manager responsible for designing UAC, gave the real reason for UAC at the RSA 2008 conference in San Francisco yesterday. "The reason we put UAC into the platform was to annoy users. I'm serious," remarked Cross.

Cross added that Microsoft's unorthodox method to stop users from wreaking havoc with their systems and to stop software makers from making applications that delved too far into the Windows subsystem was a necessary move.

"We needed to change the ecosystem, and we needed a heavy hammer to do it," Cross added. He went on to say that although UAC may be seen as an annoyance by some, its lasting implications are far more beneficial to Vista users. "Most users, on a daily basis, actually have zero UAC prompts."

Many would say users have zero UAC prompts on a daily basis because they have already disabled UAC -- not so, says Microsoft. According to Cross, 88% of Vista users have UAC enabled, and 66% of Windows sessions encounter no UAC prompts.



Sluggish Performance
By UppityMatt on 4/11/2008 11:10:25 AM , Rating: 4
This is just my thoughts here, so don't go crazy down-rating me or anything. Does anyone else get the impression that the more our hardware advances, the less programmers worry about actually shrinking code and making it more effective? I have taken several programming classes in college, and the professors would always mark us down if we used a double instead of an int when we were supposed to. It almost seems like a lot of programmers who do it for a living just get the job done and don't worry about optimizing. Vista seems to fit this bill: I know they crammed a lot into it, but I think they could have easily worked more on optimizing.

RE: Sluggish Performance
By Chris Peredun on 4/11/2008 11:25:25 AM , Rating: 5
Does anyone else get the impression that the more our hardware advances, the less programmers worry about actually shrinking code and making it more effective?

Yes, especially in the learning environment. XNA isn't helping that matter much; from my strolling around the XNA developers' forum, far too many people seem content to use horrifically inefficient methods and let the PC/360 grind its way through them.

Though I would love to see their heads explode if you asked them to code something for an ARM platform without floating-point; say, the Nintendo DS.

RE: Sluggish Performance
By Mitch101 on 4/11/2008 3:32:53 PM , Rating: 2
Have any of you played Transport Tycoon back when it was a DOS-based game? Chris Sawyer should have taken a shot at making that game into an OS. The multitasking in it was excellent, with many windows open tracking various items. It took a lot to slow that game down, and that was in the 66MHz CPU days. Plus, the game never crashed. I think the original fit on a floppy, too.

That game had me amazed at how well it was programmed.

It did in the end have a bug: he must not have used a wide enough type for the cash amount, because you could exceed the cash limit late in the game. It didn't crash, but your money would flip from positive to negative billions. Just one bit! Still awesome programming at its best.
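That flip from positive billions to negative billions is classic 32-bit two's-complement overflow. A minimal sketch in Python, where the `wrap32` helper is hypothetical and just simulates what a fixed-width signed integer does in hardware:

```python
def wrap32(n):
    """Simulate 32-bit two's-complement wraparound (hypothetical helper)."""
    n &= 0xFFFFFFFF                      # keep only the low 32 bits
    return n - 0x100000000 if n >= 0x80000000 else n

cash = 2_147_483_647                     # INT32_MAX: the most a signed 32-bit int holds
cash = wrap32(cash + 1)                  # earn one more dollar...
print(cash)                              # -2147483648: positive flips to negative billions
```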

Most code is a mess today. Calling the same objects multiple times instead of once just kills me. Why didn't he get all the elements the first time he connected to the database? Why so many calls? Grrr!

RE: Sluggish Performance
By smitty3268 on 4/11/2008 10:30:40 PM , Rating: 2
No need to use a double, just a long. Or at least an unsigned int which would have doubled the limit and caused the overflow to go back to a more reasonable 0 rather than -2 billion.
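The unsigned behavior can be sketched the same way (hypothetical `wrap_u32` helper simulating a fixed-width unsigned int): overflow wraps back to 0 instead of going negative.

```python
def wrap_u32(n):
    """Simulate unsigned 32-bit wraparound (hypothetical helper)."""
    return n & 0xFFFFFFFF

cash = wrap_u32(4_294_967_295 + 1)   # UINT32_MAX + 1 wraps around...
print(cash)                          # 0, not -2 billion
```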

Was just reading this blog which was also talking about repeating the same code over and over again.

RE: Sluggish Performance
By BladeVenom on 4/11/2008 7:29:14 PM , Rating: 5
What Intel giveth, Microsoft taketh away.

RE: Sluggish Performance
By leidegre on 4/12/2008 5:55:23 AM , Rating: 4
I'll have to disagree; there are greatly talented people out there, but the bar is not as high as it used to be.

Yet I can't just ignore the fact that people seem to waste a lot of computational power. As a developer myself, I'm very picky about software. I expect optimal performance, and if that's not the case, I go elsewhere.

The worst lie of them all is that we have so much computational power that it doesn't matter anymore. That's just outright stupid.

RE: Sluggish Performance
By Funksultan on 4/14/2008 7:45:13 AM , Rating: 4
Well, there is a ripple effect that becomes important here...

The purpose of marking you down is so that you truly understand the difference between int and long, not because it is going to affect the performance of your immediate program, but because it's important for you to know. An easy analogy is slang: yes, we can all understand it, but if we use it to the point where we forget the use/spelling of the proper words, we lose something.

In the same way... yes, my quad-core is gonna chew up any code I write for it, pretty much no matter how sloppy I am, so I suppose I don't have to worry about long vs. int. I might go years not caring, and eventually forget about ints altogether.

What happens when I'm an OS programmer? Or what happens when I've trained two generations of junior programmers and nobody uses ints anymore? Now, instead of a few occurrences, the problem starts to magnify.

A coder who knows how things work vs. one who doesn't is like the difference between an architect, and someone nailing boards together. (yes, a Costanza moment)

RE: Sluggish Performance
By darkpaw on 4/11/2008 11:31:57 AM , Rating: 5
I would have to agree with you there. People who learned in a lower-level language where they had to manually manage all of their memory allocations (C, assembly) tend to use the smallest types they need. People who learned something like Visual Basic in school tend to just pick the biggest they can so they don't have to worry about size.

I did a lot of embedded/C-type work in my undergrad and learned good memory management practices then. It really seems the focus in current development for general-purpose systems is just to get it coded as fast as possible; memory footprint doesn't matter.

I like Vista and don't mind the memory requirements at all; it's not like 2GB is expensive. I do worry about where this trend is going, though.

RE: Sluggish Performance
By murphyslabrat on 4/11/2008 12:31:55 PM , Rating: 2
I like Vista and don't mind the memory requirements at all; it's not like 2GB is expensive. I do worry about where this trend is going, though.

You have Vista to thank for that, you know. ^_^

You can get a 2GB SO-DIMM now for $35 at the egg.

RE: Sluggish Performance
By ImSpartacus on 4/11/2008 3:41:52 PM , Rating: 3
2x2GB is actually cheaper per GB than 2x1GB (it wasn't always that way). I say bargain-bin A-Data 2x2GB for $50 after a $10ish rebate.

Who needs to pay $200 for 2GB of DDR3 when $50 can get you similar performance and double the capacity?

It wasn't always this way, but Vista is certainly becoming manageable at current prices.

RE: Sluggish Performance
By mindless1 on 4/11/2008 9:32:40 PM , Rating: 2
I think you are missing the point, which is not memory cost but that at any given memory bandwidth, it still takes longer to shuffle that code around. We're effectively cancelling out performance gains and user productivity. The truth is, people can't do the most common things on a PC any faster than they could 10 years ago (assuming certain things have been abandoned, like floppy discs and dialup internet access), while in any other activity the user would certainly be faster after years of practice and with equipment potentially more than 10X as fast.

RE: Sluggish Performance
By goku on 4/12/2008 5:16:23 AM , Rating: 2
I couldn't agree more.

RE: Sluggish Performance
By boogle on 4/11/2008 11:46:59 AM , Rating: 5
Using physically smaller datatypes is rarely a performance optimisation; it's a memory optimisation. For example, if the CPU reads in 32-bit chunks (like all 32-bit Intel CPUs), then using a 16-bit short will not improve performance whatsoever. Storing 2 shorts in a single int would give a nice boost, though (reading in two variables at once). Also, floats (I know, I know, not a double) are just as fast as ints; it's the conversion process to/from int that is slow.
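The "two shorts in a single int" trick is plain bit-packing. A minimal sketch with hypothetical helpers, Python standing in for the C-style code the thread has in mind:

```python
def pack_shorts(a, b):
    """Pack two unsigned 16-bit values into one 32-bit word (hypothetical helper)."""
    assert 0 <= a < 1 << 16 and 0 <= b < 1 << 16
    return (a << 16) | b          # a in the high half, b in the low half

def unpack_shorts(word):
    """Split a 32-bit word back into its two 16-bit halves."""
    return (word >> 16) & 0xFFFF, word & 0xFFFF

packed = pack_shorts(1234, 5678)  # one 32-bit value carries both shorts
print(unpack_shorts(packed))      # (1234, 5678)
```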

The performance issues aren't trivial like this; they arise from the various utilities and libraries in use. For example, rather than storing the result of a library call, it is called multiple times from within the very same method. Or using a generic object instead of a specific datatype (which in .NET/Java results in LOADS of boxing/unboxing), and so on.
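That repeated-call pattern can be sketched like this (all names hypothetical; `time.sleep` stands in for a slow library or database call):

```python
import time

def expensive_lookup():
    """Stand-in for a costly library or database call (hypothetical)."""
    time.sleep(0.01)
    return 42

# Wasteful: re-invokes the slow call on every iteration of the same method.
def slow_sum(n):
    return sum(expensive_lookup() for _ in range(n))

# Better: call once, store the result, and reuse it.
def fast_sum(n):
    value = expensive_lookup()    # one call, one wait
    return value * n

assert slow_sum(5) == fast_sum(5)   # same answer, a fraction of the waiting
```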

The answer is to realise the cost of every method call and program accordingly. .NET especially allows you to either write code almost as fast as C++ or write code that takes an eternity. This is what separates a good programmer from a bad one; but given the current shortage of developers, bad programmers are all over the place. Want easy money? Graduate and program .NET.

But either way, genuine optimisation is an extremely complex subject. Deadlines are short. To say 'optimise' is significantly easier said than done. If we went back to 'real' programming, Windows NT would be a pipe dream and Windows 3 would only just be finishing development.

RE: Sluggish Performance
By Locutus465 on 4/11/2008 11:56:42 AM , Rating: 3
Actually, quite the opposite... Using data types smaller (or wider) than the bit width of the processor will in many cases result in a performance hit, believe it or not.

RE: Sluggish Performance
By boogle on 4/11/2008 12:00:04 PM , Rating: 2
Actually, quite the opposite... Using data types smaller (or wider) than the bit width of the processor will in many cases result in a performance hit, believe it or not.

No no, I completely agree. However, you can get great speed improvements by putting lots of values into a single large stream (i.e., using MMX/SSE) and working on them all at once.

RE: Sluggish Performance
By Locutus465 on 4/11/2008 12:35:50 PM , Rating: 4
That takes a lot of optimisation (fortunately, compilers do a lot of the heavy lifting these days for normal folk), and honestly, for an OS I'm not so sure you would even see a difference. Additionally, you only get a gain if you're performing the same operation on all data values. In terms of Windows' "bloated" state, I wouldn't look to carelessness as the reason their OSs are getting heavier. It's probably more a desire to increase the number of built-in features (and to run a good portion of them by default).

Just think, vista has:

Automated defragmentation
Pretty 3D UI
Media Center (Home Premium+)
Tablet PC functions
Extended networking features
and a host of other features you either had to get third-party or buy a specialized version of Windows to get.

RE: Sluggish Performance
By KristopherKubicki on 4/11/2008 12:00:09 PM , Rating: 2
He's right.

RE: Sluggish Performance
By Locutus465 on 4/11/2008 11:53:24 AM , Rating: 3
The answer for general commercial software, particularly with web-based systems, is yes... One would hope that Microsoft is designing its systems more efficiently, though (even while packing in thousands of new features)... I'm guessing the quality of Microsoft's code is good, since the last time Windows OS code was leaked into the open it was met with very good reviews (Windows 2000).

RE: Sluggish Performance
By PAPutzback on 4/11/2008 11:59:02 AM , Rating: 2
I think you are comparing different types of coders. I imagine on a new release there is quite a bit of code that wasn't optimized in order to get the product out the door, with the plan for the service packs to supply the optimizations.

And a lot of the programmers who do it for a living, like myself, have so many pending requests that you have to spit out code fast to keep up or find a different living.

A lot of the time your managers might not have coded a day in their life, so they have no clue what goes into a program. I have had reports take longer to program, to get what the user is expecting, than a large SSIS package.

And then you have MS spitting out a new platform every six months, implemented only halfway. They gave us VS 2008, and it doesn't work with Reporting Services, so now you have to maintain two VS installs or run a VMC.

RE: Sluggish Performance
By Locutus465 on 4/11/2008 1:53:22 PM , Rating: 2
How's the stability of 2008? I've been hesitant to even consider giving it a try, considering how flaky 2005 is. VS.NET was my all-time favorite environment in the 2002/2003 days; starting with 2005, I'd have to say it still integrates all the tools better than anything else I've used, but daaaaaaaaaamn does it sacrifice reliability.

RE: Sluggish Performance
By darkpaw on 4/11/2008 2:50:54 PM , Rating: 2
I've been using 2008 since a bit before launch. I'm really not all that happy with the stability at all. It loves to lock up, especially when using one of the new features, the split-screen code/web view.

If you haven't migrated your code and don't plan on using LINQ, I'd probably stay with 2005. LINQ looks really interesting, but I haven't actually gotten a chance to use it in any work yet.

RE: Sluggish Performance
By jvillaro on 4/11/2008 3:56:03 PM , Rating: 2
Actually, I've had a really good experience with it. I've been using it for a while now, and even though I'm currently running it on a 1.4GHz Celeron with 768MB of RAM, the only problem I've had is the expected slowness of that configuration, especially because right now I'm working with WPF.

LINQ is not the only reason to change; if you intend to use WCF, WPF, and/or WF, it's a necessity.

You will have to be careful when migrating projects with web services or WCF services created in VS2005, so back up just in case, and if possible start out with both versions installed.

RE: Sluggish Performance
By darkpaw on 4/11/2008 4:02:22 PM , Rating: 2
Just out of curiosity (and completely off topic), do you primarily do web or forms development?

I do ASP.NET work for one project I'm on, and I've run into the lockups quite a bit when switching between source/view/split on both my primary dev systems. I'm running fairly high-end systems (Q6600 desktop and T9300 laptop, both 4GB) and have the issue with both. It does get really annoying after a while.

RE: Sluggish Performance
By jvillaro on 4/11/2008 4:35:49 PM , Rating: 2
Well, that could be the difference in our experiences with VS2008. I primarily do WinForms, WPF forms, services, and such. I haven't done anything ASP.NET in VS2008; a partner of mine is starting something and doing some tests but hasn't commented yet. Maybe if we start a full project with ASP.NET we'll hit the problems you're talking about.
In WPF the split views are heavy, but again, I think it's the machine.
It's not perfect, I must admit, but it has worked for me. I remember the same partner saying a couple of times that VS2008 has locked up on him (although not while using ASP), but VS2005 has done it too, so I can't say it's better or worse.

PS: I'm using Windows XP right now, if it helps to know. Next week I'll upgrade to a T5550 laptop with 4GB and Vista (all running in 64-bit), so maybe I'll have an update on all this.

RE: Sluggish Performance
By Locutus465 on 4/11/2008 4:28:16 PM , Rating: 2
Hmm, well, my company primarily develops a web app (and a few supporting Windows apps), so I guess we'll have to be careful with this... Primarily, I would like to see us switch to .NET 3.0 so we can take advantage of some OOXML tools in the new framework, but we're doing a huge UI overhaul, so it's not the time to be changing out frameworks. We'll see what happens; right now the big '05 annoyance is the compiler just getting hung up mid-compile. I still haven't figured out what's up, but it feels like a race condition of some sort.

RE: Sluggish Performance
By Chadder007 on 4/11/08, Rating: 0
RE: Sluggish Performance
By fic2 on 4/11/2008 4:08:18 PM , Rating: 2
Well, they were pushed to downgrade the graphics card requirements so that Intel's crappy embedded graphics would work.

RE: Sluggish Performance
By Captain Orgazmo on 4/11/2008 1:23:09 PM , Rating: 1
Funny, I made the same comment on another post and got down-rated, and some guy responded saying that if I want fast performance from a computer, I should boot DOS on a Core 2, and he got rated up to 5.

As I said in the aforementioned comment, for most users, common tasks such as word processing, databases, and spreadsheets have, if anything, slowed down in the last few years, especially with Vista. I'm no Linux geek or IT pro, but I know there are fully functional Linux builds that reside in 1/5th or less the memory of Vista.

RE: Sluggish Performance
By pauldovi on 4/11/2008 1:41:08 PM , Rating: 5
That would be all well and good if those Linux distros offered the same functionality as Vista. Vista doesn't use that much memory just because it is bloated; it uses the memory to cache the most common programs so they launch faster. Why let the excess memory sit around and do nothing?

By the way... I have seen a spyware-laden, slow Vista PC boot 5 times faster than a Linux kernel.

RE: Sluggish Performance
By sprockkets on 4/11/08, Rating: 0
RE: Sluggish Performance
By Locutus465 on 4/11/2008 4:29:41 PM , Rating: 2
You do if you move up to 4GB memory and start seeing load times for your frequently used apps drop to astonishingly low times.

RE: Sluggish Performance
By goku on 4/12/2008 5:20:53 AM , Rating: 2
Blah blah blah. The good ol' "it's SuperFetch at work; that's why it consumes 4X the memory" excuse at work once again. If you disable SuperFetch, you'll see that Vista uses, at a minimum, 4X the memory of Windows XP, and XP isn't exactly "lean".

RE: Sluggish Performance
By swizeus on 4/11/2008 1:26:35 PM , Rating: 2
Well, speaking as someone who just knows how to read Task Manager and the status readouts on my laptop: programs these days stress the hard drive too much with swapping, and they take more memory than it looks like they should. To make things worse, these programs don't clean up their mess, so Windows has to reorganize itself when another program wants to run...

RE: Sluggish Performance
By Screwballl on 4/11/2008 1:27:51 PM , Rating: 2
Hopefully they step away from the Vista path and go a different route, as it appears they MIGHT be doing with Windows 7/Vienna.

A minimalistic variation of the Windows kernel, known as MinWin, is being developed for use in Windows 7. The MinWin development efforts are aimed towards componentizing the Windows kernel and reducing the dependencies with a view to carving out the minimal set of components required to build a self-contained kernel as well as reducing the disk footprint and memory usage. MinWin takes up about 25 MB on disk and has a working set (memory usage) of 40 MB. It lacks a graphical user interface and is interfaced using a full-screen command line interface. It includes the I/O and networking subsystems.

RE: Sluggish Performance
By goku on 4/12/2008 5:23:24 AM , Rating: 2
Hardly minimalistic; 40MB is a lot of memory to use when you don't even have a GUI.

RE: Sluggish Performance
By pauldovi on 4/11/2008 1:34:37 PM , Rating: 2
In those software classes they teach you that it is far cheaper to double your computing power than to double your software efficiency.

RE: Sluggish Performance
By TheOneStorm on 4/11/2008 2:05:47 PM , Rating: 2
Not to downplay Yahoo, but this entails YUI. We use it at my work, and sure, it gets the work done quickly and allows many of our developers who aren't good at JS to make something fancy. It just bogs the browser down to no end and gets us very bad feedback on our website. I hear it every day from fellow co-workers and consumers.

Luckily for me (yes, I'd rather people be concerned about speed), browsers are still not as "advanced" at using all of the CPU to run JavaScript (unlike .NET). Web developers who are good with memory management and code performance actually have very slick, fast web sites.

RE: Sluggish Performance
By AlexWade on 4/11/2008 2:08:28 PM , Rating: 1
I think poor performance has more to do with the compiler than anything else. A good compiler can make a ton of difference. But with CPUs getting more powerful and memory getting cheaper, programmers might not be using good compilers.

RE: Sluggish Performance
By Locutus465 on 4/11/2008 2:26:27 PM , Rating: 2
As Microsoft uses Visual Studio .NET (which should be obvious), I would say this is not a case of Vista being compiled with a compiler that outputs inefficient binaries.

RE: Sluggish Performance
By Ammohunt on 4/11/2008 3:37:58 PM , Rating: 1
I would have agreed with you before I set up and installed Windows Server 2008. It has a 512MB memory footprint on a base build. Of course, Server doesn't have Aero and all the other eye candy Vista has enabled (1.3GB footprint on my Vista Ultimate install). While I am sure Microsoft could tighten the code a lot, I feel they have produced the best version of Windows yet with the Vista code base.

RE: Sluggish Performance
By wetwareinterface on 4/11/2008 3:42:52 PM , Rating: 3
There's an old IT joke. It goes like this:

Two programmers, one from the Linux community and one from the Microsoft campus, are at a security convention. The Linux guy turns to the Microsoft guy and asks what they use in-house for speed optimization. The Microsoft guy responds, "Speed optimization? It's Intel's job to make the code run faster." Then the Microsoft employee asks the Linux dev what tools there are on the Linux platform to ensure they don't break the tree. The Linux dev responds, "We use green processors like Transmeta, why?" If you don't know what a code tree is, the joke isn't funny, so here it is: a code tree is the combined output of the different dev teams, and if you introduce something that causes a fault, either in your own code or somewhere else, it's called breaking the tree.

And this is the problem in software. There are two distinct types of coders: one type has no development schedule to adhere to and puts in whatever features they want to include (the small independent dev, or the government or other in-house corporate IT developer). The other type is on a schedule and has a list of features they need to integrate with and add to their own code (the big software house).

One side has the design goal of more efficient code and gets to work against a set system that doesn't change.
The other has the design goal of interoperability and feature compatibility, and of hiding features that would break the code module they are working on.

Neither side is right or wrong; it's just how it is. Making code more efficient isn't the primary goal of an OS developer; exposing and hiding features that will make or break the code base is.

RE: Sluggish Performance
By fic2 on 4/11/2008 4:02:14 PM , Rating: 2
Yeah, several years ago I worked on an embedded system that had a Windows GUI interface. The idiot Windows programmer "designed" the interface that I was supposed to send info through. It was so big it forced them to use a larger heap. I spent six months trying to talk them into using a much simpler and smaller interface (only send the data you asked for, instead of everything every time). During my discussions with the GUI people, one of my arguments was that it was slow. Their response: buy a faster computer. I told them they would never make it in the embedded world, where if you want a new CPU it will cost you at least $250M in respinning the board.

RE: Sluggish Performance
By Some1ne on 4/11/2008 7:28:56 PM , Rating: 2
I have taken several programming classes in college and the professors would always mark us down if we used a double instead of an INT when we were suppose to.

As well they should, and not just because you weren't using the most size-efficient approach. Floating-point values are imprecise and should really only be used when your code requires floating-point arithmetic. In all other cases an int/long is better.

If, however, you also get marked down for using a double where a float would have sufficed, or for using a long where an int would have sufficed, then you are getting marked down for space-efficiency reasons.
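The imprecision is easy to demonstrate: an IEEE 754 single-precision float has only 24 bits of significand, so above 16,777,216 it can no longer represent every integer. A sketch round-tripping values through 32-bit precision with Python's `struct` module (the `to_float32` helper is hypothetical):

```python
import struct

def to_float32(x):
    """Round a Python float to 32-bit single precision (hypothetical helper)."""
    return struct.unpack('f', struct.pack('f', x))[0]

# Past 2**24, adjacent integers collapse onto the same float value.
print(to_float32(16_777_216.0) == to_float32(16_777_217.0))  # True: one unit silently lost
print(0.1 + 0.2 == 0.3)                                      # False even in 64-bit doubles
```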

And also, yes, I believe your general point is correct. Programmers generally do not waste a whole lot of time optimizing the space-complexity of their code, now that machines with gigabytes of RAM have become common. Optimizing for time-complexity is still fairly common however, and generally speaking doing so entails some sort of tradeoff in space-complexity. You could, for example, process all the data in a file by reading only one byte of data into memory at any given time (assuming that there are no interdependencies between the bytes of whatever thing it is you are processing), but it would be much more efficient, in terms of execution time, to allocate a buffer of several KB's or so and periodically read the file data into that. It would use more than 1000 times as much memory, but it also wouldn't take all day to execute.
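The buffered-versus-bytewise tradeoff described above can be sketched with an in-memory stream standing in for the file (the zero-counting task is arbitrary; both functions are hypothetical):

```python
import io

data = bytes(range(256)) * 64   # 16 KB of sample data, 64 zero bytes in total

# Byte-at-a-time: one read call per byte processed, minimal memory.
def count_zeros_bytewise(stream):
    count = 0
    while (b := stream.read(1)):
        count += (b[0] == 0)
    return count

# Buffered: read a few KB per call, then scan the chunk in memory.
def count_zeros_buffered(stream, bufsize=4096):
    count = 0
    while (chunk := stream.read(bufsize)):
        count += chunk.count(0)
    return count

# Same answer; the buffered version issues ~4000x fewer read calls.
assert count_zeros_bytewise(io.BytesIO(data)) == count_zeros_buffered(io.BytesIO(data))
```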

RE: Sluggish Performance
By RedStar on 4/12/2008 5:01:33 AM , Rating: 2
I remember when CD-ROMs were about to come out. At the time, there were a few huge games topping 14 megs!

Some people worried that the advent of the CD-ROM would cause game sizes to explode.

Well, guess what... they did! =P

RE: Sluggish Performance
By theeq on 4/12/2008 4:16:37 PM , Rating: 2
Don't suppose you saw Will Wright's keynote speech a few years back? He was talking about the procedural code in Spore (then still news), and about how coders are relying more on stored data than on algorithms and space savers.
Just imagine if the old mindset of "make it as small and efficient as possible" still reigned. How much faster would things be going, I wonder (if it would even be noticeable)?

RE: Sluggish Performance
By otispunkmeyer on 4/12/2008 8:10:26 PM , Rating: 2
I know exactly what you mean.

I have been writing a small program for my engineering project. It's nothing fancy, and nothing even the most meager of computers couldn't handle, so it almost doesn't matter how bad the coding is, but...

I found myself doing exactly as you described. In the interests of getting it done ASAP (and not having too much of a clue what I was doing), I just used doubles for everything, my code was all over the shop, I had loads of functions to call, etc., etc.

Now, I'm not a software guy, I'm a mech eng guy, so programming is certainly not my forte, but even I know that what I did was terrible.

I thought: I have a 2.4GHz Core 2 Duo, 4 gigs of RAM, and a very, very lean XP install. It doesn't matter if the code is crummy... brute force will see me through.

RE: Sluggish Performance
By The0ne on 4/13/2008 2:14:51 AM , Rating: 2
Yes, a lot of programmers slack off from making code more efficient and effective, but it's really not their fault. I've been around for 15 years now, and from my experience the programming is as it is due to demands placed on programmers by their boss or company. It is always crunch time. Quality, one of my areas of expertise, is no different: you weigh the risk and go with what you think is the most sensible. I have friends working at MS, and they want to write better code, but the schedule does not allow it. Just a small view from my side.

RE: Sluggish Performance
By Mike Acker on 4/13/2008 8:25:16 AM , Rating: 2
The concept that "people are more important than computers" was promoted by IBM from the '60s on. Of course, they wanted you to trade your Model 50 for a Model 65, but after all, they were mainly a marketing organization.

The attitude remains pervasive today: if my program runs slowly, the problem is YOUR computer, certainly not my program.

How do you change the mindset of a culture? You take advantage of their foolishness and beat them in sales.

RE: Sluggish Performance
By lco45 on 4/14/2008 4:12:13 AM , Rating: 2
As a programmer myself I see a lot of poorly written code, but I think people can be too fussy about making code perfect.

When I see a ROM for Donkey Kong that has a whole game written in 3000 bytes, I have to take my hat off to the guy who thought that out, but what most people don't realise is that code is not supposed to be a masterpiece every time.

When you're given 3 weeks to write a program, and you get it done in 3 weeks and it's fast enough for the users, you have done a good thing. It doesn't matter if that code is 10 times slower than it needs to be, so long as it is fit for purpose and you can get on with your next job.

The other thing to remember is that there's a helluva lot of code to be written out there, and there aren't that many really good programmers. For every programmer who knows how to allocate memory and use integers rather than doubles for their short loops, there are 10 programmers who are barely stopping their brains from exploding as they gaze at the requirements doc...

SO IN SUMMARY! The human body isn't perfect, but it's good enough to crank out a couple of children and raise them before it gives out, and crappy code is usually good enough to do the trick, and will be replaced in a couple of years anyway...

RE: Sluggish Performance
By Major HooHaa on 4/17/2008 11:47:51 AM , Rating: 2
I have not taken the Vista plunge yet; I am still using Windows XP on my single-core 2.6GHz Athlon 64. This runs games such as Team Fortress 2 just fine.

I take the view that XP has had time to mature, while Vista is the brand-new thing and is a bit of a mess and a resource hog.

My brother has upgraded to Vista, though, with a quad-core processor and 4GB of RAM. With the change in the way sound works between XP and Vista, he eventually gave up on his Creative X-Fi sound card. He sold the Creative card and bought something else from another manufacturer.

I get the feeling that programmers have it easy in some ways today, with all that memory and processing power at their disposal. I wonder what would happen if a modern programmer tried to code a game for the old Atari VCS/2600 games console?


