Thursday, November 15, 2007

What Intel Giveth, Microsoft Taketh Away

“What Intel giveth, Microsoft taketh away.” Such has been the conventional wisdom surrounding the Windows/Intel (“Wintel”) duopoly since the early days of Windows 95. In practical terms, it means that performance advancements on the hardware side are quickly consumed by the ever-increasing complexity of the Windows/Office code base. Case in point: Microsoft Office 2007, which, when deployed on Windows Vista, consumes over 12x as much memory and nearly 3x as much processing power as the version that graced PCs just seven short years ago (Office 2000).

But despite years of real-world experience with both sides of the duopoly, few organizations have taken the time to directly quantify what my colleagues and I at Intel used to call “The Great Moore’s Law Compensator (TGMLC).” In fact, the hard numbers above represent what is perhaps the first ever attempt to accurately measure the evolution of the Windows/Office platform in terms of real-world hardware system requirements and resource consumption.

Over the next several sections I hope to further quantify the impact of TGMLC and to track its effects across four distinct generations of Microsoft’s desktop computing software stack. To accomplish my goal I’ll be employing a cross-version test script – OfficeBench – and executing it against different combinations of Windows and Office: Windows 2000 + Office 2000; Windows XP (SP1) + Office XP; Windows XP (SP2) + Office 2003; and Windows Vista + Office 2007. Tests will first be conducted in a controlled virtual machine environment under VMware and then repeated on different generations of Intel desktop and mobile hardware to assess each stack’s impact on hardware from the corresponding era.

Click Image to View Our Interactive Results Table

About OfficeBench: The OfficeBench test script is a version-independent benchmark tool that uses OLE automation to drive Microsoft Word, Excel, PowerPoint and Internet Explorer through a series of common business productivity tasks. These include assembling and formatting a compound document and supporting workbooks and presentation materials, as well as data-gathering through simulated browsing of a web-based research database. OfficeBench is available for free download as part of the DMS Clarity Studio testing framework.
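The OLE-automation pattern an OfficeBench-style script relies on is easy to sketch. The snippet below is an illustration, not OfficeBench itself: the method names are hypothetical stand-ins for Word's COM interface (on Windows, the real object would come from `win32com.client.Dispatch("Word.Application")` via the third-party pywin32 package), and a small recorder class stands in for the COM object so the control flow can be followed and run anywhere.

```python
# Sketch of the OLE-automation pattern an OfficeBench-style script uses.
# The method names below are hypothetical stand-ins for Word's COM
# interface; on Windows the real object would come from
# win32com.client.Dispatch("Word.Application") (third-party pywin32).

def drive_word(word):
    """Drive a Word-like automation object through a tiny scripted task."""
    word.Visible = True
    doc = word.Documents.Add()    # roughly File > New
    doc.Content = "TGMLC benchmark paragraph."
    doc.SaveAs("bench.doc")       # a benchmark would put timers around such calls
    doc.Close()
    word.Quit()

class _Recorder:
    """Stand-in for a COM object: records every property set and method call."""
    def __init__(self, log, path=""):
        self.__dict__.update(_log=log, _path=path)
    def __getattr__(self, name):
        return _Recorder(self._log, f"{self._path}.{name}".lstrip("."))
    def __setattr__(self, name, value):
        self._log.append(f"set {name}")
    def __call__(self, *args):
        self._log.append(f"call {self._path}")
        return _Recorder(self._log)

log = []
drive_word(_Recorder(log))
print(log)
```

A benchmark simply brackets sequences of calls like these with timers; because the OLE automation interface is exposed across Office versions, the same script can drive each generation of the suite.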

The Stone Age

Back in 1999, when I was working as an advisor to Intel’s Desktop Architecture Labs (DAL), I remember how thrilled we all were to get our hands on Windows 2000 and Office 2000. Finally, a version of the Windows/Office stack that could leverage all of the desktop horsepower we were building into the next-generation Pentium 4 platform. I remember it was also the first time I had a fully scriptable version of the Office suite to work with (previous versions had supported OLE automation only in Word and Excel). Shortly thereafter, the first version of OfficeBench was born and I began my odyssey of chronicling TGMLC through the years.

First off, let me characterize the state of the art at the time. The Pentium 4 CPU was about to be unveiled, and the standard configuration in our test labs was a single-CPU system with 128MB of RDRAM and an IDE hard disk. While a joke by today’s standards, this was considered a true power-user configuration suitable for heavy number-crunching or even lightweight engineering workstation applications. It was also only marginally faster than the previous-generation Pentium III, a fact that Intel tried hard to hide by cranking up the CPU clock to 1.5GHz and turning its competition with rival AMD into a drag race. It’s a decision that would come back to haunt them well into the next century.

Sadly, I didn’t have access to an original Pentium 4 system for this article. My engineering test bed was long ago scrapped for parts, and I doubt that many of these old i840 chipset-based boxes are still in use outside of the third world. However, we can at least evaluate the software stack itself. Through the magic of virtualization we can conclude that, even with only 128MB of RAM, a Windows 2000-based configuration had plenty of room to perform. During OfficeBench testing, the entire suite consumed only 9MB of RAM, while the overall OS footprint never exceeded 50% of the available memory. Clearly this was a lean, mean version of Windows/Office and it chewed through the test script a full 17% faster than its nearest competitor, Windows XP (SP1) + Office XP.

The Bronze Age

The introduction of Windows XP in 2001 marked the first mainstream (i.e. not just for business users) version of Windows to incorporate the Windows “NT” kernel. In addition to better Plug & Play support and other refinements, XP sported a revamped user interface with true-color icons and lots of shiny, beveled effects. Not wanting to look out of style, and also smelling another up-sell opportunity, the Office group rushed out Microsoft Office XP (a.k.a. “Office 10”), which was nothing more than a slightly tweaked version of Office 2000 with some UI updates.

Hardware had evolved a bit in the two years since the Windows 2000 launch. For starters, Intel had all but abandoned its ill-fated partnership with RAMBUS. New Intel designs featured the more widely supported DDR-SDRAM, while CPU frequencies were edging above 2GHz. Intel also upped the L2 cache size of the Pentium 4 core from 256KB to 512KB (i.e. the “Northwood” redesign) in an attempt to keep the chip’s stall-prone 20-stage integer pipeline filled. Default RAM configurations were now routinely in the 256MB range while disk drives sported ATA-100 interfaces.

Windows XP, especially in the pre-Service Pack 2 timeframe, wasn’t all that much more resource-intensive than Windows 2000. It wasn’t until later, as Microsoft piled on the security fixes and users started running anti-virus and anti-spyware tools by default, that XP began to put on significant “weight.” Also, the relatively modest nature of the changes from Office 2000 to Office XP translated into only a minimal increase in system requirements. For example, the overall working set size for the entire suite during OfficeBench testing under VMware was only 1MB higher than Office 2000’s, while CPU utilization actually went down 1% across the three applications (Word, Excel and PowerPoint). This did not, however, translate into equivalent performance. As I noted before, Office XP on Windows XP took 17% longer than Office 2000 on Windows 2000 to complete the same OfficeBench test script.

I was fortunate enough to be able to dig up a representative system of that era: a 2GHz Pentium 4 system with 256MB of RAM and integrated Intel Extreme graphics (another blunder by the chip maker). Running the combination of Windows XP (SP1) and Office XP on bare iron allowed me to evaluate additional metrics, including the overall stress level being placed on the CPU. By sampling the Processor Queue Length (by running the DMS Clarity Tracker Agent in parallel with Clarity Studio and OfficeBench), I was able to determine that this legacy box was only moderately stressed by the workload. With an average Queue Length of 3 ready threads, the CPU was busy but still not buried under the computing load. In other words, given the workload at hand, the hardware seemed capable of executing it while remaining responsive to the end-user (a trend I saw more of as testing progressed).
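The sampling approach described above is simple to illustrate. The sketch below is not the DMS Clarity Tracker Agent, just a generic periodic sampler: it polls a metric-reading callable at a fixed interval and reports the mean, which is essentially how an average Processor Queue Length figure is derived. The fake metric source is a stand-in; a real agent would read the Windows "Processor Queue Length" performance counter.

```python
import time
from statistics import mean

def sample_metric(read_metric, interval_s=0.01, samples=5):
    """Poll read_metric() at a fixed interval and return the average.
    A perf-counter agent works the same way, just over minutes, not ms."""
    readings = []
    for _ in range(samples):
        readings.append(read_metric())
        time.sleep(interval_s)
    return mean(readings)

# Stand-in metric source; a real agent would read the Windows
# "System\\Processor Queue Length" performance counter here.
fake_queue_lengths = iter([2, 4, 3, 2, 4])
avg = sample_metric(lambda: next(fake_queue_lengths))
print(avg)
```

The instantaneous queue length bounces around from sample to sample; it is the sustained average (here, 3 ready threads) that indicates whether the CPU is keeping up with the workload.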

The Industrial Revolution

Office 2003 arrived during a time of real upheaval at Microsoft. The company’s next major Windows release, code named “Longhorn,” was behind schedule and the development team was being sidetracked by a string of security breaches in the Windows XP code base. The resulting fix, Windows XP Service Pack 2, was more of a re-launch than a mere update. Whole sections of the OS core were either replaced or rewritten, and new technologies – like Windows Defender and a revamped firewall – added layers of code to a rapidly bloating platform.

Into this mess walked Office 2003, which, among other things, tried to bridge the gap between Windows and the web through support for XML and the ability to store documents as HTML files. Unlike Office XP, Office 2003 was not a minor upgrade but a major overhaul of the suite. And the result was, not surprisingly, more bloating of the Windows/Office footprint. Overall memory consumption went up modestly to 13MB during OfficeBench testing while CPU utilization remained constant vs. previous builds, this despite the fact that the suite was spinning an extra 4 execution threads (overall thread count was up by 15).

Where the bloat took its toll, however, was in raw application throughput. Completion times under VMware increased another 8% vs. Office XP, putting the Windows XP (SP2) + Office 2003 combination a full 25% off the pace of the original Windows 2000/Office 2000 numbers from 3 years earlier. In other words, with all else being equal – hardware, environment, configuration – Microsoft’s desktop computing stack was losing roughly 8% throughput per year (a simple average of that cumulative 25% slowdown) due to increased code-path complexity and other delays.
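For the record, here is the arithmetic behind that per-year figure, using the 25% cumulative slowdown over three years from the text. Averaging simply gives about 8% per year; compounding, whether on completion time or on throughput (tasks per unit time), gives slightly less:

```python
# Annualizing the 25% completion-time increase over 3 years (figures from
# the text). Throughput is tasks per unit time, i.e. 1 / completion time.
slowdown, years = 1.25, 3   # OfficeBench completion time vs. the 2000 baseline

simple_avg = (slowdown - 1) / years                   # simple average per year
time_growth = slowdown ** (1 / years) - 1             # compounded time increase
throughput_loss = 1 - (1 / slowdown) ** (1 / years)   # compounded throughput loss

print(f"simple {simple_avg:.1%}, time +{time_growth:.1%}, "
      f"throughput -{throughput_loss:.1%} per year")
```

Either way of slicing it, the stack was shedding on the order of 7-8% of its throughput annually with the hardware held constant.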

Of course, all else was not equal. Windows XP (SP2) and Office 2003 were born into a world of 3GHz CPUs, 1GB RAM, SATA disks and Simultaneous Multithreading (a.k.a. Hyper-Threading). This added hardware muscle served to offset the growing complexity of Windows/Office, allowing a newer system running the heavier stack to achieve OfficeBench times slightly better (~5%) than a legacy Pentium 4 system running its era’s less demanding code path (TGMLC in action once again).

Welcome to the 21st Century

Given the extended delay of Windows Vista and its accompanying Office release, Microsoft Office System 2007, I was understandably concerned about the level of bloat that might have slipped into the code base. After all, Microsoft was promising the world with Vista, and early betas of Office showed a radically updated interface (the Office “Ribbon”) as well as a new, open file format and other nods to the anti-establishment types. Little did I know that Microsoft would eventually trump even my worst predictions: Not only is Vista + Office the most bloated desktop software stack ever to emerge from Redmond, its system requirements are so out of proportion with recent hardware trends that only the latest and greatest from Intel or AMD can support its epically porcine girth.

Let’s start with the memory footprint. The average combined working set for Word, Excel and PowerPoint when running the OfficeBench test script is 109MB. By contrast, Office 2000 consumed a paltry 9MB, which translates into a 12x increase in memory consumption (roughly 43% compounded annually since 2000). To be fair, previous builds of Office benefited from a peculiar behavior common to all pre-Office 12 versions: when minimized to the taskbar, each Office application would release much of its non-critical working-set memory. This resulted in a much smaller memory footprint, as measured by the Windows performance counters (which are employed by the aforementioned DMS Clarity Tracker Agent).
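A quick check of that memory-growth figure, using the 9MB and 109MB working-set numbers from the text: the overall ratio is about 12x, which works out to roughly 43% per year when compounded over the seven years (a simple linear average would put the per-year figure far higher):

```python
# Annualizing the Office memory-footprint growth reported in the article:
# 9MB (Office 2000, year 2000) to 109MB (Office 2007, year 2007).
office2000_mb, office2007_mb, years = 9, 109, 7

ratio = office2007_mb / office2000_mb   # overall growth factor, ~12.1x
cagr = ratio ** (1 / years) - 1         # compound annual growth rate

print(f"{ratio:.1f}x overall, {cagr:.0%} per year compounded")
```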

Microsoft has discontinued this practice with Office 2007, resulting in much higher average working set results. However, even factoring in this behavioral change, the working set for Office 2007 is truly massive. Combined with an average boot-time image of over 500MB for just the base Windows Vista code base, it seems clear that any system configuration that specifies less than 1GB of RAM is a non-starter with this version. And none of the above explains the significantly higher CPU demands of Office 2007, which are nearly double (73% vs. 39%) those of Office 2003. Likewise, the number of execution threads spawned by Office 2007 (32) is up, as is the total thread count for the entire software stack (615 vs. 370 – again, almost double the previous version).

Clearly, this latest generation of the Windows/Office desktop stack was designed with the next generation of hardware in mind. And in keeping with the TGMLC pattern, today’s latest and greatest hardware is indeed up to the challenge. Dual (or even quad) cores, combined with 4MB or more of L2 cache, have helped to sop up the nearly 2x greater thread count, while 2GB standard RAM configurations are mitigating the nearly 1GB memory footprint of Vista + Office 2007.

The net result is that, surprise, Vista + Office 2007 on state-of-the-art hardware delivers throughput that comes close to, but still trails (~22% slower), the previous generation of Windows XP + Office 2003 on the previous state-of-the-art hardware. In other words, the hardware gets faster, the code base gets fatter and the user experience, as measured in terms of application response times and overall execution throughput, remains roughly constant. The Great Moore’s Law Compensator is vindicated.


As I stated in the beginning, the conventional wisdom regarding PC evolution could be summed up in this way: “What Intel giveth, Microsoft taketh away.” The testing I conducted here shows that the wisdom continues to hold true right up through the current generation of Windows Vista + Office 2007. What’s shocking, however, is the way that the IT community as a whole has grown to accept the status quo. There is a sense of inevitability attached to the concept of the “Wintel duopoly,” a feeling that the upgrade treadmill has become a part of the industry’s DNA. Forces that challenge the status quo – Linux, Google, OS X – are seen as working against the very fabric of the computing landscape.

But as recent events have shown us, the house that “Wintel” built exists largely because of a fragile balance between hardware evolution and software complexity. When that balance gets out of whack – as was the case when Vista became delayed, leaving Intel with a hard sell for many of its newer offerings – the house can quickly destabilize. And when that happens, it will be up to one of the aforementioned outside forces to seize the initiative, topple the “Wintel” structure and perhaps change the very nature of desktop computing as we know it.


Unknown said...

The term "Wintel" shouldn't be used and the term itself demonstrates lack of knowledge. Windows is not the only PC operating system, and Intel is not the only PC processor brand.

Anonymous said...

Lack of knowledge? Sure, other x86 operating systems exist, but one cannot ignore that we are under a massive Intel/Microsoft monopoly, and that this same monopoly dictates the direction of PC hardware as a whole.

Research Staff said...


When someone says "Wintel," everyone else in the room knows exactly what they're talking about. It's accepted industry nomenclature for describing the dominant CPU vendor and OS platform, nothing more.


Anonymous said...

I'd suggest updating your results slide. The first set of charts uses different scales. At first glance I thought that the 512MB config ran faster than the 1GB config due to the scale change (which is not present on the other graphs).

Anonymous said...

I would like to know how Windows does with other office suites like OpenOffice. It would be interesting to see how much of this is Office and how much is Windows.

Anonymous said...

As everybody in the IT industry should know by now, the proper term is GNU/Wintel.

- Peder

Anonymous said...

I didn't really see in the article an explanation of how the 32/64 bit problem was addressed. It seems that the article had good points, but may have left out a little bit of the details. Doesn't the fact that using 64 bit words instead of 32 bit words automatically make most things use twice the memory? To store/address an integer it now takes one 64 bit word rather than one 32 bit word. When we jump to Vista, I'm assuming that you used the 64 bit version. Shouldn't we expect a bit of a performance hit when taking that into account and comparing it to 32 bit platforms?

Research Staff said...

All testing was conducted using the 32-bit version of Windows Vista. Apologies for any confusion. We probably should have been more explicit about which version was employed.


Anonymous said...

When you say "version independent" you're ignoring something. The implementations of the OLE automation system have changed significantly over the years, and frankly more recent versions do a lot more error checking of OLE parameters and such which surely slows the automation performance.

One of the reasons Office was so fast was that it sacrificed security for speed. Now that Microsoft is giving security the lead, it's not surprising that various things that take automated input are slower and use more CPU. It's DOING more.

I think you need to find a more objective test. For example, how does Office 2007 run on XP, how does Office 2003 run on Vista?

What's more, as resources expand, so do the optimizations. When Windows 95 came out, one of its primary goals was to run as fast as Windows 3.1 on 8MB of memory. By various tests, it achieved that. However, if you added another 8MB of memory (for a total of 16MB) it was significantly faster than Windows 3.1 (even with 16MB of memory). Sure, it used more resources, but this made it significantly faster.

Merely using more resources only compares the fact that one version is doing more than the other. It doesn't compare the potentials of both versions.

Anonymous said...

The article was concerned with Office bloat. It was very informative, but it would have been easier to show the bloat if each version of Office was tested under one version of Windows: 2000 on XP, 2003 on XP, 2007 on XP.

Unknown said...

While I suspect that the premise you are working on is true and the results are likely close to reality, the use of VMware (or any other VM platform) for the testing largely invalidates the entire process. This is mainly due to the mangling of timers in the virtual sessions, which throws off any time measurement made in those environments. Finally, the virtualization of the host CPU does not allow visibility of processes run on the host OS, so if a task should kick off on the host (or another guest if any are running), you will start to lose time slices in the guest without being able to measure it.

VMs are good for functionality testing, and can even be good in some limited cases for load testing, but measuring performance in this manner from within the VM itself is hopeless.

Anonymous said...

Just wondering if it possible for you to do a comparison between your current setups and a ubuntu/nix / openoffice mix running the same tasks?

I would be very interested in the results.

Anonymous said...

What about the idea that new versions of Office actually *do* more useful stuff? e.g. the additional horse-power required comes with benefits?

Anonymous said...

I'm not sure this has anything to do with MS/Intel. This is a classic problem in software development. When someone (or a team) sits down to create new or update existing software, it is unlikely that "speed" would be the killer feature they add.

Also related are cases where development starts with the mindset that the hardware will "catch up". This invariably negates Moore's Law.

Tyler Montgomery said...

I thought the article was good at justifying "The Great Moore's Law Compensator" - the point was not just to show the bloat in Office over the years, but to show that even as hardware has improved dramatically, Office and Windows have destroyed any gains.

Everyone at my office was excited to see Vista and Office '07 when we bought a new Vista machine. The analyst using the new machine (she has an average of 20 spreadsheets open at a time) was quick to voice her displeasure, as it was no faster than her old eMachines PC running Office 2000 with 128MB of RAM.

I've moved to running Linux and OpenOffice and really have no need to switch back to MS.

In the face of Walmart offering a $200 Linux Desktop with the free OpenOffice included...the Wintel monopoly could be in danger.

Anonymous said...

I personally would like to see the old software run on the new hardware and vice versa as a real comparison.
I would also like to see a comparison of common tasks (spell checking, formatting, etc.) to see if any of the new Office stuff has any real value other than giving micro$oft more upgrade money for free.

Mackenzie said...

"What about the idea that new versions of Office actually *do* more useful stuff? e.g. the additional horse-power required comes with benefits?"

What do you do with Office that you couldn't do with Office 97? In my experience, most people don't even know how to set paragraph indentation (they hit |tab| instead), let alone use anything advanced. Don't mind me though. I don't use office suites these days. I use LaTeX. I still have OpenOffice installed for the inconsiderate nitwits that insist on attaching everything as Word documents instead of using something standard, like PostScript or PDF.

"Just wondering if it possible for you to do a comparison between your current setups and a ubuntu/nix / openoffice mix running the same tasks?"

I have an old Gateway with a Pentium II and 192MB of RAM. Obviously, it had a memory upgrade. I intend to take it up to a full 384MB and max it out some time in the future (I bought the memory, but I lost it in my room). It's even older than where this article starts off :) I've run Windows 98, Windows Me, Windows XP, Ubuntu 7.04, and Debian 4.0 on it.

Fastest to slowest (not stats, just memories):
1. Windows 98
2. Debian 4.0 with Enlightenment17, Ubuntu 7.04 with Enlightenment16, and Windows Me
3. Ubuntu 7.04 with GNOME
4. Windows XP

Anonymous said...

Hi, good article...

I keep wondering why all that bloat is justified; the improvements (or new stuff) in Vista + Office 2007 are minimal. Office XP was a great successor to Office 2000, and Office 2003 a very nice upgrade for Office XP... however.

In my opinion, all you get is a bloated and slow office suite that doesn't do much more than 2003 used to do.

So yes, Office 2007 is huge, bloated and slow... but does it do anything that justifies that?

Anonymous said...

Lack of knowledge???????

This guy actually worked for Intel; I'm sure he has heard of Linux and AMD. But at the end of the day it's all still Wintel for the vast majority, just as it was in the Windows/Office 2000 days.

Enough said

Anonymous said...

Think of how the world would be if every generation of new software offered more features, refined existing features, got more efficient and more secure, and ran faster than the previous version of that same software.

That would be every computer user's dream - but it would be every hardware manufacturer’s nightmare. We would only need to upgrade our software and never have to upgrade our hardware. Such a scenario could never be allowed to happen as it would negate the need to buy any new hardware unless our existing hardware actually failed.

My point is that the entire hardware industry is absolutely dependent on Microsoft to bloat the code with each new generation - under the guise of additional features and increased security - in order to perpetuate the whole hardware upgrade system.

Even those of us who feel that our present hardware and software is sufficient for any future needs cannot rest secure in that hope because every month Microsoft, incrementally yet systematically, bloats our existing installed code with new “security patches” until we reach the point that we submit and buy new hardware.

Unknown said...

Fine article, and it raises many points to mind.

Firstly, as a developer who is regularly called on to add to existing systems' feature sets, I have never seen a job where they required me to remove working features. So my extrapolation would be that evolving software projects rarely, if ever, reduce features and therefore footprint. I have worked on optimising existing products, but even then it is working against a cost/performance curve.

Plus, MS do not dictate the progress or speeds of processors. That is driven by very different factors.

And the previous post about security was bang on. Security, interoperability (OLE/COM/COM+/.NET), GUI, hardware support, media capabilities: all of these have received and required a major overhaul.

It does look from your article that your time at Intel was very much a product of the old days of Intel/MS rivalry. Pretty much the only way you could come up with the conclusion that MS are just trying to steal processor time and resources from either the user or Intel.

Anonymous said...

Nice article, thanks.

I have to agree with Tyler above on the Linux thing. I'm in the UK, so there's no Walmart, but local to me Tesco are doing a £180 linux box with 512MB RAM, which isn't bad at all. And I'm in love with my Ubuntu laptop from Dell. It's got 2GB of RAM and I'm currently only at 10% RAM usage (+15% cache) with Firefox and 20 tabs open.

How any Operating System could eat up power like Vista does is insane. I'd love to see you do some Linux benchmarks :)

I have Damn Small Linux running off a live CD in a 64MB RAM/Pentium II box and the GUI is more responsive than that of any Vista machine that I've come across yet. For some astounding results you should try that distro too.


Mackenzie said...

"Even those of us who feel that our present hardware and software is sufficient for any future needs cannot rest secure in that hope because every month Microsoft, incrementally yet systematically, bloats our existing installed code with new “security patches” until we reach the point that we submit and buy new hardware."

What Microsoft does has no bearing whatsoever on my hardware purchases beyond the fact that if they make a deal with a hardware manufacturer that results in the hardware manufacturer giving Linux users the finger, I will not buy from that manufacturer.

Anonymous said...

"My point is that the entire hardware industry is absolutely dependent on Microsoft to bloat the code with each new generation - under the guise of additional features and increased security - in order to perpetuate the whole hardware upgrade system."

While possibly true, that is an unsubstantiated conspiracy theory. If you have insider knowledge, please share.

Anonymous said...

I've recently realised that one of the reasons extra functionality is never removed is that no one ever takes the time to check whether anyone is actually using it. If they did, they'd find out 99% of their customers are suffering for the benefit of 1%.

I want maximum performance and productivity, not a bunch of features I never use (otherwise known as potential security holes). This article has perfectly highlighted that software vendors' add-only, never-take-away mindset is a bad one, my own company being no exception!

Anonymous said...

This test features no control. At no point is either the operating system or the office suite held constant. Had you used a control and tested multiple versions of Office in XP SP2, you would find that Office 2007 actually has been cleaned up significantly over 2003 and actually has a lower base memory requirement. The reason that Office 2007 takes longer to accomplish tasks is because Office 2007 uses XML based documents, which allows for added flexibility. If you consider Office bloatware to be extraordinary, I suggest you perform a similar test on OpenOffice.

Given that all the office suites use compiled object code, even the most bloated are many times smaller and lighter than Open Office which uses Java bytecode and a virtual machine.

Yet people hail OpenOffice as the greatest thing since the coming of Jesus? I don't understand.

Mackenzie said...

"Office 2007 uses XML based documents"

Only in the sense that Microsoft thinks XML is a buzzword. XML involves plain text. Microsoft is using XML wrappers around the binary blobs in their format, and they're only doing it to make O2k7 work with 64bit too. Their attempts at getting OOXML to be a standard are utter idiocy. For that to be made a standard would mean standardizing breaking the XML standard.

Anonymous said...

To an earlier commenter: More features don't have to equate to more resource usage. Features could be "made active" when they are used and stay dormant otherwise.

XP with its multitude of running services (Help service?) is an example of how it should not be done.

As for the post, sadly, it's so true. I hope ReactOS will eventually bring, hopefully not in the distant future, a Windows-compatible open-source alternative. Then we might even have Damn Small Windows that is 99% XP compatible but runs just fine on Pentium 2s with 96MB of RAM.

Anonymous said...

I enjoyed your article very much. I heartily agree that what Intel giveth, Microsoft taketh away. For years I have looked forward to getting my hands on the latest giga-whatever expecting sock-knocking performance, and nearly every time I'm disappointed.

I mean really, how does it take some abstract benchmark to tell me that 2GHz is better than 1, when I'm still waiting the same time for the system to boot, or a window to redraw, or an application to launch?

3D games seem to be the only app where you can clearly see where the latest in Intel/AMD technology delivers. You know the feeling, it's like Yeah, Now that looks Good! Couldn't do THAT on my Pentium 2, 233.

But that P2 ran Win2K and Office 2K reasonably well. When I moved it all to a P3/933, the experience turned sweet!

For fun, I loaded Win98 on to the P3. Bang! THAT system FLEW! A taste of what could be! If I hadn't gotten used to the better stuff Win2K had to offer, I would have stuck with 98 just for the speed.

OTOH, WinXP was a dog on my P3. Wouldn't touch it. I skipped the P4 entirely and built an AMD X2 4400, and now, XP shines.

Why in the world would I want to mess that up with Vista? Who would want a sports car with a trunk filled with concrete?

Long story short, we should stretch back our memories and recall what we were able to do comfortably with 64 Megs of RAM (ok, 128) and a Pentium 2 (hey! Quake ran ok on it). That in mind, it's not unreasonable to expect a Core Duo with a Gig of RAM to give a performance like a Ferrari dusting a Model T.

In other words, we should not give in to this FUD about security necessarily eating up all that extra power, new features eating up all that extra power, or any other thing (that is not immediately apparent and beneficial) why desktop applications don't scream, popping onscreen the instant after launch, crunching through tasks so fast we never consider getting up for coffee.

Oh, I know the arguments: faster lighter code doesn't sell new PC's or justify an OS upgrade. Not to mention that it's too expen$ive to hire programmers to go through existing code and optimize and remove bloat. Faster, lighter code doesn't make $$$... Blah blah blah.

Only because consumers allow it to be that way. Everyone complained that Macs under PowerPC could never quite impress. Apple (finally) heard and went to Intel.

Now, Dell and others offer new PC's with XP instead of Vista because consumers (us) demand it.

Rock on. Death to Vista ;-)

Anonymous said...

I remember balking at an old MSDN TV episode about 5 years ago; you wanna know how much they care about performance? Here you go, straight from the horse's mouth:

ROBERT HESS: So maybe the best thing of optimization is actually talking your manager into buying you a faster laptop, right?

GREGOR NORISKIN: You joke, but that actually is a recommendation that I give people, that adding hardware is not a crime. Obviously if you haven't considered performance throughout the lifetime of your application implementation design, and you don't meet your performance requirements, you can always just buy more hardware, right?

ROBERT HESS: That works for me. Okay, thanks. Well thanks for joining me.
I'm sure my audience kind of learned an awful lot about optimization, some of the techniques available for it, and several tips and tricks and thoughts about the process.

Full show transcript here:

- Si

Unknown said...

For things to change, a line has to be drawn somewhere, meaning that at some point we should choose the combination of software that works well enough for us and stick to it. This will vary from user to user, but we may see it happening even now. My particular line in the sand is XP SP2 + Office 2003. There is no feature in Vista or Office 2007 that will make me sacrifice the performance and responsiveness I get from my particular PC. Given the widespread resistance to Vista, this may be the case for lots of people. XP SP2 IS the most stable and secure OS Microsoft has released to date, so if we ALL stick with it we may have a chance to break the bloatware cycle. I happen to sell laptops for a living and can influence the buying decisions of at least my customers. My advice for them is: stick with XP. If Vista comes preinstalled, uninstall it, buy an XP license, and consider it an upgrade because of the performance and usability issues. In fact, I have performed this service for many customers by now, and will go out of my way to hunt for drivers when they are hard to find.

Anonymous said...

This was great to read. It codified my long term beliefs with reasonable quantifiable numbers. Perfect, of course not, but great for comparison.

I've got an old PII 266 with 64MB of ram that is great for most needs. It boots faster than my XP machine which has more cores, ram, drives, monitors. But it is at last becoming too slow for some (basic) tasks.

Having given up MS I've started moving much of my MS infrastructure over to Linux. I have no GUI and only the applications and services I actually want to run. My 5 year old servers seem almost new again.

When MS shows off their new "tiny" kernel that takes 33MB and has only enough internals to act as a very lightweight web/file server, it's clear that they are out of touch with their customers' needs, and potentially reality as well.

Personally, to the good folks at Intel and AMD: keep it up. Regardless of the OS I use, you have not made a platform yet that I haven't maxed out, overclocked and maxed out again. I don't need MS to fill my CPU cycles; I can do that myself.

Thanks for a great article.

Anonymous said...

I used to boot Win95 in 5 seconds on a K6-233, then 20ish seconds for Win98 on a P2-300. Next upgrade was Athlon 1.3G, a minute for Win2k, then almost 3 minutes by the time I gave it up 5 years later (updates but also other software slowed it down a lot). Now I run XP on an E6600 (C2D 2.4G) and it takes just shy of a minute to be done scratching...
Won't an Uber-geek write DX10 for Linux and save the world ?

Anonymous said...

It's unfortunate how much bias goes into all of this, with no comparable benchmarks for other software configurations.

Everyone loves to bash Microsoft, it's become the new "thing" in IT.

I would recommend we stop judging total RAM usage as a bad thing. Sure, it's possible to use less RAM, but that just means one needs to read data off the hard drive more.

RAM is there for a reason. If it's not being utilized, then it is doing nothing for you.

At least with Vista the entirety of your RAM is put to good use.

symbolset said...

Thank you for demonstrating so well this truth many of us have known for a long time. Word processing was a solved problem in 1984. By 1987 spreadsheets had all the functions a normal person would ever use. Databases took a little longer, but by 1990 that was sorted. An infant could have been born that day and by now would be almost of age to vote and we've seen no real improvement in productivity since.
As Tyler Montgomery said, "In the face of Walmart offering a $200 Linux Desktop with the free OpenOffice included...the Wintel monopoly could be in danger." I couldn't agree more. I'm typing this on one of those very machines. The office package that comes with it works fine, loads fast, and reads and writes MS Office formats just fine. The machine boots fast, consumes little power, is RoHS certified and works well. It's also not as susceptible to the malware that plagues Wintel PCs -- Storm Worm is not supported.
Is now the time people will finally choose something else? Maybe, maybe not. For more than a decade I've watched the Wintel Duopoly marketing triumph. For now the only hope that I see for change is that Microsoft in its incessant hunger for growth will reach over into the hardware side and create a rift between the two that cannot be repaired.

Anonymous said...

Excellent blog; it confirms what I have felt since Win2K. "Innovation" at Micro$oft means something entirely different to the average user. I have often wondered just who within M$ actually uses their own software (Office); it has only gotten worse since Office 2000. M$ is severely conflicted by competing priorities, and we, as end users, suffer for their lack of vision. (Can you say FUD?)

Darrell said...

I've begun a facebook group called Computing Minimalists to resist Wirth's law that "software speed is decreasing more quickly than hardware speed is increasing". Among the software I recommend is Floppy Office, OffbyOne browser, i.Scribe for e-mail, Miranda IM, Sumatra PDF, Irfanview, mtPaint and Silentnight Micro CD burner 5, all for Windows/ReactOS. Obviously there are other operating systems such as Damn Small Linux/MeanPup and KolibriOS.

Anonymous said...

Hi all

Anonymous said...

Personally, I think the real reason for most of the bloat and slowness of modern commercial software is the need to justify a sales decision. Most people wouldn't buy a new copy of Windows if its only new features were better security and fewer bugs.

I suspect that this is one area where Open/Free software will have a leg up over commercial software.

JG said...

About Office 2007 and XML

Some claim the MS OpenXML format is not true XML and is somehow "XML wrappers around binary blobs." This is completely disingenuous and patently untrue in terms of textual document content. Embedded content such as images, sounds, etc. is stored in its native binary format, clearly the most efficient way. Obviously these people have not looked at an actual OpenXML document and are taking religious kooks such as Rob Weir[d] at face value.

So why does this XML matter? Well, the simple fact is that plain-text encoded formats (like XML) WILL ALWAYS be slower than a comparable binary format, due to the extra step of parsing/encoding. The previous binary Office formats are little more than on-disk images of the document as it sits open in memory, and are obviously quick to marshal. If OfficeBench does any amount of saving/loading of XML-encoded documents, then it will be noticeably slower. Since Office 2007 can be set to use the older Office 2003 formats by default, it would be interesting to see a comparison of runs using those.
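The parsing-overhead claim is easy to sanity-check in miniature. Below is an illustrative sketch only, not a reproduction of the actual Office file formats: the same thousand integers are stored once as XML and once as a packed binary record, and reading each back is timed.

```python
import struct
import timeit
import xml.etree.ElementTree as ET

values = list(range(1000))

# Text route: build an XML document holding the values, then parse it back.
root = ET.Element("cells")
for v in values:
    ET.SubElement(root, "c").text = str(v)
xml_bytes = ET.tostring(root)

# Binary route: pack the same values as raw little-endian 32-bit ints.
bin_bytes = struct.pack("<%di" % len(values), *values)

def load_xml():
    return [int(c.text) for c in ET.fromstring(xml_bytes)]

def load_bin():
    return list(struct.unpack("<%di" % len(values), bin_bytes))

assert load_xml() == load_bin() == values

t_xml = timeit.timeit(load_xml, number=200)
t_bin = timeit.timeit(load_bin, number=200)
print("XML parse: %.4fs  binary unpack: %.4fs" % (t_xml, t_bin))
```

On a typical machine the binary unpack finishes an order of magnitude or more ahead of the XML parse, which is the commenter's point in miniature, though real document formats add many other costs (compression, zip containers) on both sides.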

It is interesting to note a valid concern/argument in comparisons of the specific and generic XML syntax of ODF and OpenXML. The OpenXML syntax is more "terse" and, while it is an equally valid XML syntax, sacrifices some schema flexibility for gains in parsing speed.

My meta observation is that I don't see an end to this tug-o-war between features and speed. I wonder how my nLite'ed and tweaked XP installs would stack up against windows 2000? Thank all that is good for utilities like that!

Anonymous said...

Perhaps you should consider what Bob Colwell said in a Stanford speech in 2003 (you can find the link to the talk on the Wikipedia page on him) about increasing hardware capability. He said (approximately) that it isn't about who can make the leanest application, but who can do something useful with the increasing capability. And if you can't do something useful with it, someone else can.

Users are not really checking the sizes of every program, and refusing to run ones that are not (by some random criterion) sufficiently efficient. But they will definitely notice if someone can put useful new features in their programs.

However, to improve the usefulness of a feature sometimes requires massive increases in complexity. For example in AI, you can make seemingly nice things happen with pretty little resources. But to do it 'right' takes exponentially more resources, and you will probably not see significant advantages before you are above some (pretty crazy) threshold.

Unknown said...

"By contrast, Office 2000 consumed a paltry 9MB, which translates into a 12x increase in memory consumption (i.e. 170% per year since 2000)."

Has someone got their math wrong, or is it just me?
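For what it's worth, the two figures only line up under a naive reading: dividing the 12x ratio by seven years gives roughly 171% of the baseline per year, while the compound annual growth rate that actually produces 12x over 7 years is only about 43%. A quick check (plain arithmetic, not taken from the article):

```python
ratio = 12.0   # Office 2007 vs. Office 2000 memory footprint, per the article
years = 7      # Office 2000 (2000) through Office 2007 (2007)

# The article's apparent arithmetic: the total ratio spread evenly across the years.
linear_per_year = ratio / years * 100

# The compound annual growth rate that actually yields 12x after 7 years.
compound_per_year = (ratio ** (1.0 / years) - 1) * 100

print("naive linear: %.0f%% per year" % linear_per_year)    # ~171%
print("compound:     %.0f%% per year" % compound_per_year)  # ~43%
```

So "170% per year" only works as a simple average of the total multiple, not as an annual growth rate.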

Anonymous said...

JG yaks on about nonsense trying to defend MS's OOXML.

You really want to know what OOXML is about? It's a mechanism used to help maintain the need for Microsoft Office. Controlling a document standard allows MS Office to survive and prosper indefinitely. Office is one of the three main software licensing cash cows for Microsoft.

It's a known fact that there is no way to view a doc file without MS Office with 100% accuracy. MS talks about interoperability and openness, but in reality it's a facade. They really have no intention to "interoperate" with anyone unless it's under THEIR terms. The meaning of "openness" is defined by THEIR definition.

Why do you think they went with their own solution (OOXML) instead of taking part in OASIS and adopting ODF? It's part of their "embrace, extend, extinguish" policy, with a touch of market spinning.

Go look up "Halloween documents". These are leaked Microsoft memos that detail how they analyse and counter opensource.

They know opensource is successful because of standardised formats and protocols. By "extending these protocols and developing new protocols", they can maintain their dominance.

They know they cannot compete with a threat that hurts their fundamental business model. So they attack by other means.

You have to be either:
(1) A moron
(2) Invested in Microsoft
(3) A fanboi
...to be defending MS's OOXML.

Think about it this way. When they wanted to get OOXML fast tracked for ISO, did you not notice that countries were bribed by MS into voting "Yes" OR suddenly joined into the voting process?

They knew the ISO process has many weaknesses, and they went and exploited them with the power and resources they have.

Now if you step back for a minute, think about why MS needs to pay third parties to support their products! The fact is, they produce garbage, and they have been doing it for the last 20 years! It's only now that more people are catching on to MS's business practices. It's all thanks to the Internet!

Anonymous said...

"Some claim the MS OpenXML [sic] format is not true XML and is somehow "XML wrappers around the binary blobs." This is completely disingenuous and patently untrue in terms of textual document content."


One reason that OOXML is not valid XML is that it uses bitmasks. These are not part of XML, and they inhibit extensibility and the use of standard XML tools.

Furthermore, OOXML allows the embedding of binary blobs; see Ecma-376 section 11.3.1, "Alternative Format Import Part".

In fact, read that whole document to see just how much of a joke OOXML really is.

Anonymous said...

Interesting article. The conclusion I have drawn from this is that it is a business model developed by the Wintel duopoly: without this model, the hardware and software developers would not survive, and hence, supposedly, neither would technological progress.

Anonymous said...

Excellent, excellent article. I have just one tiny correction: OOXML is not an open format. It is binary cruft in an XML wrapper, and patent-encumbered; don't let the Microsoft market-droids tell you anything different!

Microsoft have admitted their standard isn't open by trying to force it through ISO. If they had nothing to worry about they wouldn't have attempted all that bribery and corruption.

Some related reading:

Anonymous said...

I totally agree with the article. I'm no computer expert, but the trend of "more power, so fatter code" is ridiculous.
About Office: for me, the specific numbers are not important. The bottom line is: why can't I have a simple program to do simple tasks quickly? I think most people would be surprised to realize that all their typical documents could be typed up in WordPad and no one would be the wiser. WordPad! 4MB, negligible CPU time, and RTF: what could be better?
About Windows: I never could understand what the operating system was *doing* with all that memory. The job of the OS is to serve as a platform for other programs to run on, not to take up more system resources than several other programs combined! I mean, really, does it need to precache *everything* that you *might* use? In the meantime, it's slowing down everything that you *are* using. So if I pick a random feature it loads 10% faster, but everything else runs 20% slower? Not helpful. And all those threads? Between the OS and the antivirus/security suites these days, it's 30-40 threads! That's just lazy programming. Is my computer really doing 40 things when it's not doing anything? That just doesn't make sense.
More about "Internet Security": The average home user needs a handful of ports open, and the rest of them stealthed. They need to virus check any downloads and not visit dubious websites. That's it. No constant background scanning, no monitoring for 'suspicious code', no 'execute disable bit' technology. Maybe keep the pop-up blockers, because those things are just annoying.

The *only* reason that I don't just flat-out dump Micro$oft is because I use software that isn't currently supported by Linux.

Anonymous said...

I felt the article lost credibility when I read the specs used; Windows XP RTM, in my opinion, was certainly noticeably more bloated than Windows 2000, mainly because of themes. When I first used Windows 2000 I had 256MB of RAM, but I upgraded to 512MB before I stopped using it. I moved to Windows XP with 512MB and have upgraded my RAM twice over the years; I now have 2GB. By the time Vista becomes my mainstream OS I will probably have at least 3GB, possibly 4, and will move to 64-bit.

Anonymous said...

It just fits with our disposable attitude to everything in this world. If we DEMANDED that software get MORE efficient (not less), so a PC "seemed" useful for a few years longer, people wouldn't be on the upgrade path. I can't see it being in Wintel's interest for that to happen. Heck, I still play with my 8-bit Atari and it seems fast to me for some things. The HP laptop I'm working on now, I guarantee, will fall apart inside 3 years. Where has the idea of quality gone?

Alemao said...

As a software developer (Windows only), I'm very happy that every version of anything MS does is slower than its predecessor.
That's where my company can start selling our own software, properly built, always tested on a P2-300 with 128MB before being sent to any customer.
(And to whoever started the discussion about the Wintel term: can't you see that's not the point of the article?)

Anonymous said...

This is the complete opposite of what Apple computers are doing: the same hardware goes faster with a new OS. Have you tried running Vista on a P3? Yeah, right!

Anonymous said...

Talk of CPU utilization, threads and memory hogging is great, but for me it comes down to one thing: user experience.

I work in IT, and if my users have a slower experience, without vastly increased feature set or enhancements, they won't be happy. Thus, we won't be considering Vista, until it is absolutely necessary.

symbolset hit the nail on the head when he said word processing was perfected in the 80s. I suspect my users are like most in the world, and use Word for basic word processing with an occasional picture inserted when it's an "important" document. They don't need Office 2007, voice recognition, and umpteen other things. They just want a fast, user-friendly experience, which Vista coupled with Office 2007 doesn't give them.

Anonymous said...

"Won't an Uber-geek write DX10 for Linux and save the world ?"
Another lucrative option would be game devs starting to use OpenGL more (here's to id Tech 5?).

"Everyone loves to bash Microsoft"
I don't hate Microsoft, I hate big fat slow awkward software. (I also hate it when tooltips frequently pop BELOW the taskbar. But I guess tooltips-above-taskbar is a Vista exclusive.)

"it's possible to use less ram, but that just means one needs to read data off of the hard drive more."
Good software saves memory by using efficient data structures and good overall design, not by constantly swapping.

Anonymous said...

Puppy Linux on Core 2 Extreme, SMP kernel, and init 3, ROCKS!!

Enough said!

Unknown said...

Honestly, the hardware industry can force the issue via the new "green" paradigm. We are seeing Intel do some of this already: multi-core CPUs using less power but running faster, solid state drives, etc.

If businesses are given a choice between hardware that uses 30% less power and more bloatware, they will go for the savings; simple economics. Then the software will follow, because businesses will not upgrade if the total cost of ownership includes not only the cost of the new software and hardware, but continual power cost over time.

As for the consumer, PC gaming is continuing to fade as consoles become more powerful than PCs. To be honest, outside of gaming, most laptops and desktops today can handle consumer multimedia needs. If you're like me and have 3-5 PCs in the house, lower power demands will make you happy as well.

Unknown said...

In response to Gary, I have to say that you appear quite fortunate and have a very optimistic view of the IT Industry as a whole.

Regarding the 'Green' uptake you could check out this interview about the issue.

It poses some good counter-points, despite being on the Inq...

Random Stuff said...

Isn't the whole point to develop software that uses all of my computer's processing power?

I mean, sure I could install Win3.1 on my laptop and have it run like the wind, but some of us actually do like things that look nicer and work better.

Besides, if you really want an absurdly fast but graphically depressing text editor, I'm pretty sure WordPad is still in there somewhere... by all means, feel free to use it.

Unknown said...

Random Stuff,

I don't think advancing software is the issue; it's how it is advancing. Reusing old mainframe technology (shadow files and the like) is not advancing technology.

Some of the original ideas Microsoft had for Vista were cool: get rid of the registry and use XML files to handle configuration. Not only does that make maintenance far easier and applications load faster, it simplifies backups. Other than the operating system, you could literally have copied an application's folder to another drive and it would have been backed up; simply copy it back to the same path on the C drive after the OS is installed and you are done. It would also have been more secure. Now that's innovation. What they are coming out with now is more of the same, but slower and bigger.
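For the curious, the registry-less approach the comment describes looks roughly like the sketch below. Everything here (the file name, the element and attribute names) is hypothetical, purely to illustrate per-application XML configuration, not any actual Vista design:

```python
import xml.etree.ElementTree as ET

# Write a per-application settings file (hypothetical schema).
settings = ET.Element("settings", app="ExampleApp")
ET.SubElement(settings, "option", name="theme").text = "dark"
ET.SubElement(settings, "option", name="autosave").text = "true"
ET.ElementTree(settings).write("exampleapp.xml", encoding="utf-8",
                               xml_declaration=True)

# Read it back. No central registry is involved, so backing up the
# application is just copying its folder, settings file included.
config = {o.get("name"): o.text
          for o in ET.parse("exampleapp.xml").iter("option")}
print(config)  # {'theme': 'dark', 'autosave': 'true'}
```

This is essentially the same idea as the per-application dotfiles that the next comment mentions, just with an XML schema instead of ad hoc plain text.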

Mackenzie said...

Flat-file plain-text configuration files aren't a Brand New Microsoft Innovation; that's how Linux configuration has always been done. I have a bunch of hidden directories in my home directory, one for each application, that for the most part contain plain-text configuration files. The Enlightenment window manager's files aren't plain text, though, and I think that's really stupid. Fluxbox wins there; they have the *nicest* menu configuration file you could ever imagine.

Unknown said...

While I agree in premise with the text-file comments (Windows 3.1 had them), the XML formatting is new for Windows at least. Having worked in Unix for 12 years, I too am familiar with such technology in Linux. At the very least, flat text files are more advanced than VMS/mainframe tech.

My main point in all of this is that bigger is not always better, and newer isn't necessarily new; cf. Unix, DOS, Linux, Xerox, Macintosh, Windows. You can only open and close a file, and interface with hardware, so many ways. The next step in the evolution of the OS should be toward efficiency. Maybe that's Linux, though I'd argue it too has grown in girth.

Hopefully at the end of the day someone will realize the ideal system is fast, safe, and easy to use. So far nobody has been able to do all three together.

Anonymous said...

Well, an ex-Vista machine running XP and Office 2003 is pretty fast. Just to throw that in.

AMD Turion X2 Tl-58

Unknown said...

My Vista machine running FreeDOS is blazing, so I'm not sure the XP-to-Vista speed comparison means anything.

Now, saying that XP runs all my applications, stores my data reliably, and is faster than Vista: that is a meaningful comparison, and I'm sure this is how you mean it. The generic "XP is faster" comments get to me, though, since it's really about the applications and how well they run on our machines.

Anonymous said...

My computer doesn't meet the system requirements for Vista, and XP does everything I need, so I don't see any reason to ever buy it. Heck, my father still uses a computer with Windows 95 - it boots up fast and he uses it for internet access without any problem.

As for word processing, I tried using the free trial version of Word 2007 for a month, but was glad to get back to 2002. It was slow, sometimes didn't fit well on the screen, and all the menus had changed. Even a simple word count took it a long time.

Word was about as fast on my old 386, and had the same number of features I actually needed - not much progress since then...As for Wordpad, it is sufficient for many purposes, but doesn't have columns, which I often need to use.

Anonymous said...

This is really interesting--good work. I was surprised that different RAM sizes made so little difference though, and I think it might have been because of the test methodology. Limited RAM makes a real computer slower because of paging, because the physical disk is at least thousands of times slower than RAM. However, if the host had significantly more RAM than the VM does, when paging hits the .vmdk files, they will be heavily cached by the host OS, and the slowdown should be much smaller. I suspect this is the reason there is virtually no difference between the 512MB and 1 GB Vista+2007 configuration performance (and likely the others too).

Anonymous said...

I wasn't going to 'break the ice', but since linux has been mentioned already, I thought I'd throw in my 2 cents' worth.

Puppy is indeed fast and lightweight, and the LiveCD even contains applications to suit almost all "normal" users. But what's even better (or worse, for Vista) is that there is a version of Puppy Linux made to look and act similar to Vista, which has Compiz effects (better than Aero by a million miles) and runs in RAM after loading from a 128MB CD.

In other words, more functionality does not have to cost 15 gigs of HDD and two or three gigs of RAM. It's just not necessary. Puppy runs fine on very old hardware. Compiz will blow you away (if you like eye candy and added functionality) as long as you have a decent video card, and the video card requirements aren't anywhere near as hoggish as Vista's.

Linux is so cool. You can make it as lean or as bloated as YOU want. You can make it look like any version of Windows, or Mac OS, or like nothing ever seen before. Or you can find a version that is already configured just the way you need it, most likely.

Bloat is just not necessary for an OS to work for you.