ExtremeTech has a newly published article called Speed Up Windows Vista, which promises “tweaks that can help you turn up the throttle on your new operating system.” Of course, the usual Digg mob has descended on it.
My summary? Pure, unadulterated crap. Half the advice is painfully obvious, the other half is downright dangerous, like the ridiculous advice to “experiment with services [and] [s]treamline the system by shutting down as many services as you can.” Uh, that’s a really bad idea.
Even the commenters at ExtremeTech noticed that there weren’t any, you know, benchmarks or test results to actually substantiate what any of this stuff does.
And ExtremeTech, like so many Ziff-Davis sites[*], insists on chopping articles like this one into a dozen pieces so you have to click-click-click-click to read it. (Here’s the single-page, ad-free, printer-friendly link so you can scan this load of rubbish without driving yourself crazy.) [Oops: not surprisingly, Ziff-Davis doesn’t allow direct links to their printer-friendly page. So if you must waste your time, go to the main page and click the Print button (in light gray type, under the Options heading beneath the post itself). Thanks to Ian Easson for the heads-up.]
Once upon a time, ExtremeTech published some interesting stuff. These days, it’s just plain junk.
Your “printer friendly” link isn’t. It instead links to the multi-part article.
Good post, Ed, and good advice. I use Vista and like it, but I do find it slow for many things (on my admittedly older PC).
I’m interested in your thoughts on this one though:
http://www.crn.com/software/198702242
Is the performance delta really that bad? If so, in your opinion, why is Vista seemingly so incredibly resource-intensive? Is there at least a good-news story beyond 1GB of RAM? Because otherwise I’m scratching my head trying to figure out why MSFT can’t do in 1GB what others (OS X, Linux) – including XP – can do in 256-512MB. It also seems to be a huge inhibitor to installed-base upgrade adoption – not to mention to leveraging the OS in other devices (a la Apple TV). Again, maybe – hopefully – there’s some good long-term story? It doesn’t seem plausible that MSFT’s engineers are just that much worse. Anyway, I’m interested in hearing your thoughts if you get the chance, or a pointer to some other post where you went through it.
The only performance information anyone really needs is in Windows Vista Inside Out. That, and the right hardware. I would not run Vista on a computer that is not already capable of running the premium versions of Vista.
Loyd Case is still a knowledgeable and informative writer, even if the rest of ExtremeTech has gone downhill a ways.
I had already disabled many of the “extras” the article mentions during setup. The only game I kept, for example, was chess, and I disabled all tablet functions. I don’t think that stuff creates a performance hit, but if I never need or use it, why load it? On the other hand, shutting off Windows Error Reporting is shooting yourself in the foot, since that reporting directly contributes to making Windows better for everyone.
None of these can substitute for faster hardware and more memory, though.
I just built a new Core 2 Duo system, and Vista eats every bit of hardware I throw at it. I wrote on my blog about the minimum hardware you’ll want in a new system (not the minimum you need, but the minimum you’ll want). While Vista ran fine on my old machine, Explorer really seems to slow down on older systems. Generally, the same rules apply: get the most you can afford, but save half the money by not buying the ‘ultimate’ of anything. You really don’t want a 1.0 rating on the Windows Experience Index, imo.
The more research you do on tweaking, the more you find that 99% of all tweaks are either useless or, worse, actually reduce performance. It is ironic how much faster a system is after I “un-tweak” it.
Disabling services has only one effect on performance: improved boot times. And you had damn well better be sure of what those services do before you disable them.
As with anything else, on a clean install the only thing that is really going to help is better hardware. That’s something people just don’t want to accept. Don’t get me wrong: some legitimate optimizations can make certain things more responsive or “feel” faster, but usually at the expense of some other feature or visual effect.
The further along Windows gets, the more self-tuning it becomes, and the more contraindicated micro-tweaking seems to be. I’ve come almost completely 180 degrees on this in the last several years; I used to think that the only way to get Windows to run well was to tweak it to death. The way Windows has been engineered has changed so much since then, though, that I suspect people may end up doing more harm than good by trying to manage everything.
A friend of mine put it this way: it’s a little like that magic gasoline additive that for some reason is always sold by someone else. Why not just add it to the gasoline anyway? If you get an answer that involves magic or a conspiracy theory, that tells you everything you need to know about what’s really going on.
Oh, and one other thing: their SATA disk “hack” doesn’t even work on some systems — mine, for instance. The Policies tab on my disks is nothing like what they have there — the two checkboxes in question don’t appear.
I have to disagree with you on the disabling of services, Ed.
On a higher-end machine, it’s unnecessary, true.
I’d like to make the case that it is beneficial when RAM is at a premium. Services take up memory, and if you don’t have a lot of it to go around, disabling truly unnecessary ones can be good. You just have to know what you’re doing and keep a record of any changes you make.
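If you do go that route, measure before you disable. Here’s a rough sketch of how you might check what a service actually costs, using Python with the third-party psutil package (the service name here is just an example, and note that most services share an svchost.exe host process, so the figure reported is for the whole host, not the single service):

    import psutil  # third-party: pip install psutil (service APIs are Windows-only)

    SERVICE_NAME = "TabletInputService"  # example name; substitute your own candidate

    svc = psutil.win_service_get(SERVICE_NAME)
    print(svc.display_name(), "-", svc.status())

    pid = svc.pid()  # None if the service is not running
    if pid:
        host = psutil.Process(pid)
        # Working set of the hosting process; a shared svchost.exe
        # overstates the cost of any one service running inside it.
        print(f"Host process working set: {host.memory_info().rss / 1024:.0f} KB")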
Optimizer, disabling services does nothing for performance except improving boot times. Fact:
http://forums.anandtech.com/messageview.aspx?catid=34&threadid=1678445&enterthread=y&arctab=y
Optimizer, I agree with you in theory, but go back and read the portion I quoted from this article, in which the author recommends “experimenting with services” and “shutting down as many as you can.” That’s wrong and stupid advice. You will spend more time fussing and tweaking and fixing broken stuff than you will save in a year.
And which services do you think are “truly unnecessary”? The example that everyone likes to focus on is Tablet PC Input Services. Care to hazard a guess on how much memory you will “recover” by disabling it? Here’s the answer: 224K. That is roughly four hundredths of one percent of the memory on a system with the minimum 512MB required by Vista. Even if you disable ten such services, you still haven’t recovered even one-half of one percent of your system RAM.
Like I said, snake oil.
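For anyone who wants to check that arithmetic, here it is spelled out in a trivial Python sketch (the 224K working-set figure is the one quoted above):

    service_kb = 224        # Tablet PC Input Service working set, per above
    ram_kb = 512 * 1024     # Vista's 512MB minimum

    one = service_kb / ram_kb * 100
    print(f"One service:  {one:.3f}% of RAM")       # ~0.043%
    print(f"Ten services: {one * 10:.3f}% of RAM")  # ~0.427%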
“You will spend more time fussing and tweaking and fixing broken stuff than you will save in a year.”
This is the same argument I use against installing a third-party defrag program on a desktop machine: you waste more time moving stuff around than you gain back speeding things up. (A server is another story, but many workstations that have enough free space on disk will probably not benefit from micromanaged defrag.)
Optimizer, it is a myth that something taking up RAM must be hurting performance. Windows efficiently pages out what is unnecessary. The only legitimate performance reason to disable certain services is reduced boot time, and on high-end machines, disabling a handful of the default services will probably not even be noticeable. A second or so here or there. Granted, I personally can see no reason to leave a service running that I know for sure I will never use; I just don’t have any illusions of a performance improvement from disabling it.
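You can see the distinction for yourself: a process’s virtual size is usually far larger than the portion actually resident in RAM. A small Python sketch with the third-party psutil package (numbers will vary by machine):

    import psutil

    me = psutil.Process()  # inspect this script's own process as a demo
    mem = me.memory_info()
    # rss = resident working set (actually occupying RAM)
    # vms = total virtual size (much of it paged out or never touched)
    print(f"Working set:  {mem.rss / 1024:.0f} KB")
    print(f"Virtual size: {mem.vms / 1024:.0f} KB")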
Serdar, then why does Vista come with an automatic defrag program? Just for kicks? That is another myth:
http://www.diskeeper.com/diskeeper/myths/hard-drive-wear.asp
Myth No. 4: You can wear out your hard drive if you defragment too often.
Not true. The truth is, your drive is going to work much harder if you never defrag at all! It is a common misconception that defragmentation is stressful to disk drives. In reality, fragmentation results in many more disk accesses.
Here is an example: If you have a file that is fragmented into 50 pieces, and you access it twice a day for a week, that’s a total of 700 disk accesses (50 x 2 x 7). Defragmenting the file may cost 100 disk accesses (50 reads + 50 writes), but thereafter only one disk access will be required to use the file. That’s 14 disk accesses over the course of a week (2 x 7), plus 100 for the defragmentation process = 114 total. 700 accesses for the fragmented computer versus 114 for the defragmented computer – the benefits are obvious.
If the defragmenter does its work during system idle time, you are wasting no time at all.
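The Diskeeper example is easy to sanity-check; here are the same numbers in a quick Python sketch:

    fragments, reads_per_day, days = 50, 2, 7

    fragmented = fragments * reads_per_day * days      # 50 x 2 x 7 = 700 accesses
    defrag_cost = fragments * 2                        # 50 reads + 50 writes = 100
    defragmented = defrag_cost + reads_per_day * days  # 100 + 14 = 114 accesses

    print(fragmented, defragmented)  # 700 114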
Andrew: I’m not against defragging at all; in fact, Vista doing this stuff in the background on a schedule is a huge plus. What I was specifically inveighing against was micromanaging defrag: optimizing file or directory placement, that kind of thing. I’ve tried such stuff and have never noticed much discernible benefit, all other things being equal.