This guest post is by Igor Leyko, Microsoft MVP – Windows System & Performance, and is reprinted here with his permission.
There are many sites that suggest tweaks to change Windows behavior or settings, some of which even disable services or functionality. It is interesting that almost none of these sites demonstrate any real benefit from tweaking Windows. You may see assurances such as "The system seems to be much faster" or promises of a 10 to 20 percent performance gain. I’ve even seen a promise to make a Windows system up to 50 times faster!
However, it is difficult to find any measurable results. Few sites present actual metrics, and when they do, the numbers primarily report decreased boot times. (I ignore the fake tip about switching the Windows Hardware Abstraction Layer (HAL) to the 486C-compatible version during installation.) While you may decrease the boot time, there is no strong dependency between boot time and overall performance; the boot process differs significantly from normal operation. Moreover, decreasing boot time may actually increase application start times.
Can you have metrics that show the real effects of any tweaking? In theory, yes. But in practice, no.
During the Windows 98 era, I conducted a study on the influence of the ConservativeSwapfileUsage registry key on overall system performance and analyzed the results with statistical methods. I found I had to conduct hundreds of tests to get accurate and trustworthy performance results. Do you think someone will spend a week or two accumulating test results to determine the effects that can be achieved by tweaking Windows? I think the answer is no.
Windows is a complex operating system with background processes that may affect performance test results. Results from repeated tests may therefore differ by 2 to 3 percent, or more, while the effect of tweaking is, on average, less than 2 to 3 percent. After only one or two tests, it is hard to say whether a difference in results is real or just random deviation.
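A quick simulation makes the measurement problem concrete. This sketch is my own illustration, not from the article: the 2 percent gain and 3 percent run-to-run noise are hypothetical figures chosen to match the estimates above.

```python
import random
import statistics

random.seed(42)

def fraction_hidden(base_mean, tweak_mean, noise_sd, n, trials=5000):
    """Fraction of experiments in which a genuinely faster tweak
    fails to look faster, given n benchmark runs per configuration."""
    hidden = 0
    for _ in range(trials):
        base = statistics.mean(random.gauss(base_mean, noise_sd) for _ in range(n))
        tweak = statistics.mean(random.gauss(tweak_mean, noise_sd) for _ in range(n))
        if tweak >= base:  # the real gain is drowned out by noise
            hidden += 1
    return hidden / trials

# Hypothetical benchmark: 100 s baseline, a tweak that truly saves 2%,
# and 3% run-to-run noise.
p_few = fraction_hidden(100.0, 98.0, noise_sd=3.0, n=2)
p_many = fraction_hidden(100.0, 98.0, noise_sd=3.0, n=100)
print(f"With 2 runs each, the real gain is hidden in about {p_few:.0%} of experiments")
print(f"With 100 runs each, it is hidden in about {p_many:.0%} of experiments")
```

With only a couple of runs, roughly a quarter of experiments show the tweaked system as equal or slower; with a hundred runs per configuration, the effect emerges reliably. This is why one-off before-and-after tests prove nothing.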
So, what about claims such as those mentioned above: "The system seems to be much faster." Are these claims false? No, the system really may seem faster after tweaking. However, it may not actually be any faster. We tend to perceive things we want to believe regardless of their actual existence.
Have you seen how wine tasters work? They taste numbered rather than named wines. The blind method makes the comparison valid. However, when you know what you are comparing, there is no way to avoid subconscious reactions. Unfortunately, this is why useless programs may sell very well. Just believe the computer must be faster and most likely you’ll perceive that it is.
Ten years ago I wrote a program to improve Windows performance on Pentium I computers with certain hardware. The program didn’t work with any other hardware, but its description was very convincing—so much so that several e-mails I received from users demonstrated the power of suggestion. Users wrote feedback such as: "The log file says ‘not installed’; however I am SURE it is, because things seem A LOT faster."
Now let’s turn to another side of "tweaking theory." Do you believe developers are missing opportunities to enhance Windows performance, or that they don’t consider tweaks? I don’t. The Windows performance team inside Microsoft runs a variety of performance benchmarks/workloads on a wide spectrum of machines.
At first, the results of this struggle for performance may not seem very impressive, but a 5 percent difference in performance is a large one. And a couple of simple, easy tweaks may result in performance increases as high as 10 percent.
You may not feel that 5 to 10 percent performance increases can really make a difference, but they actually can. The operating system's primary task is to run user applications: the fewer resources the operating system uses, the faster applications run. Suppose Windows version n normally uses 10 percent of the computer's resources, leaving an application the remaining 90 percent. Now suppose Windows version n+1 is twice as efficient (a great achievement) and uses only 5 percent. In that case, the application will run 95/90 = 1.056 times faster. This example shows that even a significant improvement in the operating system yields only a small gain in overall performance. And this is the fundamental problem for tweaking to improve performance: tweaking just can’t give the results you want.
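The arithmetic above generalizes into a one-line formula. This sketch is my own illustration, not from the article; the second case, a one-point reduction in OS overhead, is a hypothetical stand-in for what a typical tweak could plausibly achieve at best.

```python
def app_speedup(os_share_before, os_share_after):
    """Factor by which an application speeds up when the OS's share of
    resources shrinks, assuming the app gets everything the OS frees."""
    return (1 - os_share_after) / (1 - os_share_before)

# The article's example: halving OS overhead from 10% to 5% (heroic).
print(round(app_speedup(0.10, 0.05), 3))  # 0.95 / 0.90 -> 1.056

# A hypothetical best-case tweak: shaving the OS share from 10% to 9%.
print(round(app_speedup(0.10, 0.09), 3))  # -> 1.011, barely measurable
```

Even halving the OS's overhead buys the application less than 6 percent; a realistic tweak buys about 1 percent, which is smaller than normal run-to-run noise.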
I asked Michael Fortin, Distinguished Engineer for Windows, if the Windows team studies published tweaks and tips. Fortin said, "I asked a bunch of [team] people and was a little surprised. What I found is that most people do things, but collectively it all started to also look like buzz about nothing. With one exception: uninstall stuff you aren’t using."
So, I conclude that almost all Windows tweaks are fairly useless when it comes to speeding up your computer. To achieve significant results, you’ll need to buy a new computer or upgrade your existing system; at the very least, you’ll need to uninstall some rarely used programs.
Finally, I want to give you some tips on considering a few commonly suggested "optimizations":
- If you see a tip to set the SecondLevelDataCache registry key, keep in mind that it has not been used since Windows 2000 SP1.
- If you see a claim that the DisablePagingExecutive key may increase performance, it is false. It actually may decrease overall performance; however, because it decreases response time, the system seems faster.
- Setting the processor or core numbers in Msconfig cannot speed up booting because all cores are used by default.
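For reference, here is what a typical tweak-site .reg file for the first two settings looks like, shown only so you can recognize (and skip) it. The registry path is the standard one; the specific values are the ones tweak sites commonly recommend, not values you should set.

```reg
Windows Registry Editor Version 5.00

; Both values live under the Memory Management key.
; SecondLevelDataCache has been ignored since Windows 2000 SP1, so this
; line is a no-op. DisablePagingExecutive keeps kernel-mode code out of
; the page file, which may reduce overall throughput -- neither change
; speeds up the system.
[HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management]
"SecondLevelDataCache"=dword:00000200
"DisablePagingExecutive"=dword:00000001
```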
Igor Leyko has worked with computers since 1974. He is a Windows System & Performance MVP. This is an abstract of an article that will be published in Russian at www.iXBT.com.
13 thoughts on “Windows tweaking and optimization: myths and reality”
My assumption is that Windows is set up with a series of trade-offs, performance being one of them and drive space usage another.
I have a habit of fixing the page file size because of a tip I read for Win95. The assumption then was that Windows had to shrink the page file due to disk space constraints. I don’t care about disk space, so I want to optimize for performance instead. I doubt that my tweak has any effect on today’s systems, but it’s part of the reliability package I put on all my managed computers. My systems are reliable now, and I don’t want to change anything I’m doing.
I suppose you could call that superstition, in a sense.
The best tweak available is to get rid of crapware.
Great article. The best “tweak” is to understand how the OS actually works, which in turn leads you to appreciate the supreme value of very fast CPUs, state of the art network and disk components, and gargantuan amounts of RAM working in conjunction with a 64-bit OS. If you have all that, the OS then takes care of maximizing the effective and efficient use of memory and CPU cycles.
I’ve been saying this for years, really. After a certain point, Windows became self-tuning enough that tweaking and second-guessing only did more harm than good.
Don’t install anything you don’t really need, and remove anything you don’t find yourself using. About the only “tweak” I’ve found that had any noticeable impact is putting the swap file on a different physical drive than the OS, but even that I’m not too sure about anymore, in these days of 64-bitness and 8GB desktop systems.
“Uninstall stuff you aren’t using.” How many times I’ve said that to countless numbers of people. I get so many questions about that statement too. It all depends on the program. Over the years I have grown to prefer programs that really don’t “install” any more than just copying files and creating shortcuts in the Start Menu. Those types of programs have the smallest impact on your system by just being there. There are a number of programs today that, after installation, I do a “clean-up” on to clear all the un-needed attributes they put on a system — like update routines that start up automatically (sans a few that do need regular updates, like Java).
Often I get asked “Should I not have so many games?” Games fortunately tend to be the “copy” type installs. Rarely do they place additional tasks on the system. There is an exception in that some on-line games install stuff like PunkBuster, an anti-cheat system, which taxes the computer continually. Or real-time updates, which usually can be disabled anyway. But for the most part, games usually don’t have that great an impact on day-to-day use.
I’ve found that many of the “tweaking” and speed-up programs are the worst for adding tasks that slow a system down. GO FIGURE! Irony at its best. Others that really irritate me are PDF programs and other so-called Office tools. And of course hardware drivers can be a real tax on your system. So many printers, cameras and MP3 players today come with a plethora of un-needed software. It’s almost more of an un-install just to install an all-in-one printer driver set! I also hate to see an ISP include a CD with their cable or DSL modem. This software is just NOT needed! If the ISP is not simply a plug-in-and-it-works type ISP, find a new source if you can. Many of these CDs install stuff that cannot be uninstalled — at least not without considerable effort.
Ed, thanks for posting a great article!
“Tweaking” is retarded. As you noted, if boosting Windows performance by 10% was really possible by simply changing a registry setting, why wouldn’t it be set that way by default?
The fact is, many of these tweaking suggestions are outright irresponsible. As a developer, I can tell you it’s pretty maddening to track down the cause of an error that manifests itself months after a clever user decided to disable the Network Location Awareness service that wasn’t impacting the system in the first place.
But the thing that really drives me nuts is the geniuses that recommended disabling Superfetch because “it consumes nearly all your system memory right after boot!” 1) Unused memory is wasted memory. Period. 2) Superfetch is designed to give it back as soon as other applications need it. and 3) By not prefetching these binaries, you’re GUARANTEEING slower application startup times.
Go to hell you stupid tweakers. And while you’re there see if you can f–k with the thermostat to make it more efficient.
Oh I may as well add a useful tip after my rant… Go get SysInternals Autoruns.exe and find out just how many third-party startup processes have wormed their way into your system. These are the things you want to worry about. Leave the operating system stuff alone.
“The Windows performance team inside Microsoft runs a variety of performance benchmarks/workloads on a wide spectrum of machines.”
Curious why there is never any publication from this group, or any availability of the benchmarks themselves. (Maybe BootVis?) Sure would be nice to see methodology, results and conclusions.
“So, I conclude that almost all Windows tweaks are fairly useless when it comes to speeding up your computer.”
Actually, your results argue that while your criticisms of methodology and interpretation are correct, your conclusion above is no more supported by the “facts” than the claims of the performance-enhancement supporters — it is just that they are the ones required to provide the proof. It would be better to conclude that there simply is no demonstrated, testable basis to support the claims of the tweakers, and reason to believe that what little evidence is offered is highly inadequate.
The Windows Performance team has been blogging for years. Some very detailed posts on all sorts of aspects of Windows performance:
This was a very timely article for me. I agree with all its conclusions, and the one about uninstalling stuff you don’t need is true – but I find that more of a vulnerability fix than anything else (the useful “rule of resource ratios” given in the article shows why; I’ll be using that in the future). Uninstalling stuff you don’t use isn’t even much of a disk space saver, especially as hard drives have gotten so much larger and Windows hasn’t gotten all that much bigger.

I still often run into (and sometimes am swayed by, if not for long) arguments that cleaning the registry (for the gain of a few bytes – thanks for that article, Mr. Bott) or regularly scheduled hard drive defragmentation is worthwhile (how much user time and power has been wasted on that over the years, I wonder? I remember staring at the GUI of the Windows 95 defragger in fascination…things seemed much simpler back then). These seem like Good Ideas, instead of the time wasters they often become.

I find that the only Windows adjustments worth making are those for which I can answer Yes to this question: “Will this save me more time (or at least more aggravation) than it takes to find and change it?” That is how we use Windows anyway – unless the holy bug of “tweaking” manages to bite us and sets us off wasting a whole day’s productivity trying to outsmart Windows.
Recently, I installed XP yet again – its age and vulnerability are really starting to catch up to it – and looked up some “tweak” sites just to remind myself of any useful settings that I would like to change or investigate. Here’s one that I think nicely details the sort of quandary that the Windows developers (and endusers) regularly face. The specific facts are less relevant than the process and conclusions, which I think apply broadly to the long-standing question of “When to Defragment?” and many other issues between user involvement and inconvenience vs. elegance and usability that have cropped up over the years.
In XP’s registry, HKEY_CURRENT_USER\Control Panel\Desktop contains a value, “MenuShowDelay,” that controls the delay before a menu is shown (this is separate from any delay incurred when reading the contents of the menu from disk, or if Windows hiccups – I don’t know exactly how that muddies the picture further, so we’ll assume the menu is in memory; the setting won’t magically erase any disk access delays). It even affects navigating your Bookmarks menu in Firefox. In most Windows XP installations, the setting is 400 milliseconds, or 4/10 of a second. For a long-time first-person shooter player, at first glance that seems unforgivably slow. Instead of changing the value to “100” for 1/10 of a second, as the “tweak” sites usually recommend, I changed it to 0. (I’ve done that before – hope springs eternal…)
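For anyone curious what the change looks like on disk: MenuShowDelay is stored as a string, in milliseconds, and this is the standard .reg form of the default (the path and value name are real; “400” is the XP default described above, and “0” is the instant-but-twitchy setting discussed below).

```reg
Windows Registry Editor Version 5.00

; MenuShowDelay is a string value, in milliseconds. "400" is the XP
; default; "0" removes the delay entirely, with the side effects
; described in the surrounding discussion.
[HKEY_CURRENT_USER\Control Panel\Desktop]
"MenuShowDelay"="400"
```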
That particular change breaks menu navigation. Instead of being able to drag the mouse in the shortest line (“as the crow flies” if you were looking at a map) over to the desired item in the newly opened menu, the moment the cursor hovers over another menu item from the first menu, the new menu will close up. Either you have to be extremely quick (to beat your mouse cursor updating…not too easy with a 4000dpi gaming mouse!), or drag it down the narrow width of the first menu item to the next, or always position your cursor to the very right of the menu so as not to lose your current focus. What’s the benefit?
The “fix” is only for those times when you accidentally hover over the wrong menu item and want to select another. In that situation (which isn’t too uncommon), waiting 4/10 of a second is pretty extreme. But a too-low setting isn’t productive when you navigate nested menus more than two deep (the biggest likely examples are very complicated Bookmarks menus – which I don’t have – or the Start Menu, which is unavoidably nested). Most users simply see the word “delay” and go “great, I can fix that!” Well, that’s misleading; that “delay” also controls how long a previous menu stays alive when you move away from the item that triggered it, by accident or on purpose! It’s not up to Microsoft to explain every nuance of the Registry in offline tooltips or in the menu setting (that’s what TechNet is for, if I’m not mistaken) – introducing more bloat for tweakers to complain about – but wouldn’t you think most leading “Tweakers” (capital T) who compile lists of such information could do a better job (especially as some of them find ways to get paid to do it)? And there, of course, the casual “tweaker” who looks up such information online is wholly at the whim of the limited knowledge and testing of the “Tweaker” who posts this information in the first place.
That’s an awful lot of writing to cover just one setting. I’m also reminded that there ought to be a similar discussion to be had over the amount of time it takes a Tooltip to appear…another simmering debate I’m sure.
There are of course solutions to be dreamed up – perhaps advanced (very!) heuristics to try to “guess” the user’s next step (which wouldn’t work anyway; it’s called “mind reading,” and if you could do that you wouldn’t need a mouse or any computer prediction), or perhaps a requirement that the user click each menu item to bring up the next (which hasn’t been used for the obvious reason that it introduces new steps and more drudgery). This was, as I said, an example where it’s pretty reasonable to conclude that you could gain some time. It’s surprising, but a lot of folks have this simple view of computers (and other machines) as being as simple as a wheel that only turns in one direction, achieving one definite end, rather than a device that needs to work correctly in multiple situations, when multiple ends sometimes explicitly conflict with each other. What good would a wheel that only turns in one direction be on the highway?
Ultimately, for most situations, the setting chosen by the Windows team is the right one, or at least a nicely balanced compromise. There have been some “tweaks” (“adjustments” might be a more correct term, not to mention more respectful of the Windows teams’ work over the years) that I routinely use, such as locking the paging file into one large size and one location, changing XP to Classic view, turning off the Last Accessed disk update write, turning off all the visual fluff (but keeping some useful settings – such as viewing the contents of folders while dragging), and a few other instances. My comprehension of even that short list isn’t 100% (the swap or paging file is complicated and has been a source of infamously bad tweaks; locking the size of the paging file to the maximum Windows normally suggests probably isn’t a good idea with some particularly resource-hungry applications, but it has worked for me so far, and I dislike the way Windows manages to fragment that swap file in normal usage – this is the sort of irrational thinking that typifies most “tweaker” thinking, when the appearance of coolness is more important than actually getting any work done, I freely admit), but close enough. Ultimately, however, this is all knowledge that I have gained only through years of working with (and occasionally breaking) Windows, to the point where I’m pretty confident about what I need and what I (most importantly) am annoyed by or don’t need. And I could have saved myself much time by having been better informed from the start.
A final thought: When I first used Vista, I reflexively turned off Aero and defaulted to the classic Windows 95 (or is that Windows 3.11?) way of doing things. (Hey, thinking of UAC as something useful and not the shady aerospace concern from DOOM was a big change!) Only much later, when first using Windows 7, did I discover that parts of the new menu setup were actually quite useful and to my liking – I was getting things done faster than with XP on more powerful hardware. But then there are other Microsoft-made “tweaks” that I don’t much care for, like accessing Sleep mode through a menu hover, instead of the old XP way via a separate window. Some things never change…but the lesson here is to only mess with stuff when you know what it does and you have tested out all the other combinations.
I have a simple question, which I suspect won’t have a simple answer:

Are there any tweaks that actually improve Windows performance?
First off, Edwin, I use a defrag program (either 3rd party or built-in from the XP SP3 OS) on some occasions to reorganize files on my hard drive for faster access [hey, I use an HP Pavilion computer that was manufactured in late 2000; it used to have WinME but recently got WinXP]. A disk defragmenter doesn’t always improve performance; it helps only in some cases. I know from personal experience. From time to time, I have to clean out a few gigabytes’ worth of un-needed files on my PC before I run the defrag tool. You just need to know when to actually use the defragmenter program.
As for your simple question, Paul, there are SOME tweaks out there on the web that MAY improve Windows performance. Just don’t use ALL of the tweaks mentioned out there. Some are good in certain hardware configurations, and others are unnecessary.
Remember that the so-called “tweaks” are NOT for everyone.
Windows 7 has vastly improved the user experience. Boot-up times are quick. Start-up times are better.
The remaining performance issues are program launches. They are still slow at times, and opening files also remains a problem.
The most annoying performance problem is the missing cursor. What happened to my cursor? It might take a few seconds to find it after I launch something, or it does a resource hogging task like play a video.
Comments are closed.