Friday, February 19, 2010

(Editorial): Rebutting Ars Technica

Updated: Thanks to an industrious reader, we finally have outside confirmation that our understanding of this issue was dead on. Yes, Virginia, this means we were right and you were wrong. So before you go posting yet another idiotic comment parroting yet another clueless blogger from yet another obscure corner of the Internet, please read our editorial titled “What took you so long?”

The remainder of this post has been retracted due to it being superseded by the above linked follow-up post.

44 comments:

mickro71 said...

When I read your blog, I have to come to the conclusion that you are 14 with a great understanding of the English language. Not so much with computers, but we already know this. It's laughable that you would post a rebuttal when you are still wrong and expect anyone to think differently. Your ideas are outdated, and whether or not you choose to keep them isn't anyone's business but your own, but business is bad. It has to be. Why continue to sell the idea that wooden wheels are still the way to go when even the horse drives a car?

fabarati said...

Isn't that a 13-year-old book, one that relates to NT4? Meaning, it came out before Win 2000, Win XP, Win Vista, and Win 7. That's quite a few kernel revisions ago.

Meaning that it doesn't actually have information on memory caching technology like SuperFetch, introduced in Windows Vista.

Meaning it's pretty much useless.

FYI, the *nix-based OSes (Linux, Unix, BSD, OS X) all have similar memory caching.

Now, go back to trolling the Internet and calling most people (including me, I'm guessing) liars, slanderers, and people who know nothing.

Kuruk said...

Windows 7 has taken the path that free memory is idle and wasted.

So it uses free memory to cache and preload often-used apps in case they may be used. Now if they are needed, they're already in memory and start faster.

If the system needs more memory, the cached RAM can be discarded and reused as needed.

On the whole it is a much better use of resources than sitting idle.

PaulHill said...

Rather than just slamming you, I suggest you read the following blog post:

http://blogs.technet.com/clinth/archive/2009/09/03/the-case-of-the-enormous-page-file.aspx

It'll probably illuminate what exactly is meant by committed bytes, free memory, reserved memory etc. etc.

William said...

I seriously cannot believe you are quoting an NT4 book regarding Windows 7 memory usage.

Well my Babbage analytic machine instruction manual claims your NT4 machine is using far too much memory!

Get on that!

Ken said...

The graph... adds up to more than 100% total. 88% of the time is spent at more than 85% of RAM used, 50% of the time at 90%, and 15% of the time at 95%.

His computer is using more than 85% of physical RAM 150% of the time. Amazing. Peter needs to buy more RAM.

Brad W. said...

What's truly shocking is that you won't acquiesce to the fact that your metrics are wrong for Windows 7.

Using Mr. Bright's PC in your rebuttal proves that the emperor has no clothes because it's easy to see (using the screenshots on ArsTechnica) that the reason the memory is used is as a result of SuperFetch.

Any wailing and gnashing of teeth to the contrary only continues to remove the already limited credibility you have.

Charles King said...

Looks like this post is inspired by the 'when you're in a hole, keep digging' philosophy.

Here, let me rebut your rebuttal: You're wrong. Period.
/rolleyes

Your 'sophisticated metrics analysis' completely fails to take into account Windows 7's memory management system. This has been pointed out to you several times now, and all you're doing is making a bigger fool of yourself.

luxifer said...

Well... do you actually, really think that your "methodology" is in any way meaningful?

1st: in your "explanation" of the methodology used, you don't give any hard facts or formulas - you're hand-waving around some terms that are meant to make it sound technical and meaningful.

2nd: by explaining your methodology again, you in fact reveal how wrong your approach to measurement is... As far as I understand your blurry explanations of the process, you are measuring nothing but RAM access counts rather than how much RAM is free or used in any way...

let me draw a picture to illustrate this:

If RAM were a desk, the data in it were all the things on the desk, and swap were a shelf on the other side of the room, your reasoning would be as follows:

- It's better to have free space on the desk.
- If you're switching too quickly between objects on your desk and have some things on the shelf, your desk must be full, causing you to go to the shelf more often (which is slow).

Windows Vista and 7 actually do the following:

- They give you an assistant who learns which files and folders you use regularly and fetches those for you when you enter the office, putting them on your desk just in case, so you have them handy.
- This assistant even updates that pile of stuff depending on what you're currently working on.
- If you need the space taken up by that pile, you tell your assistant and he instantly sweeps stuff off the table, starting with the stuff you're least likely to use right now.
- With this mechanism, you can safely consider the space used by that pile to be "free space."

This way there's a lot of traffic going on around your desk, but you'll never run into space issues, provided the size of your desk was reasonable before you got that neat assistant.
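
The assistant metaphor maps neatly onto a few lines of code. This is a toy sketch of the idea only - invented names, not a model of actual Windows internals:

```python
# Toy model of the "assistant" (a SuperFetch-style cache).
class ToyMemory:
    def __init__(self, total):
        self.total = total
        self.in_use = 0   # pages actively owned by processes
        self.cached = 0   # prefetched pages; reclaimable on demand

    def prefetch(self, pages):
        # The assistant fills otherwise-idle space with likely-useful data.
        self.cached += min(pages, self.total - self.in_use - self.cached)

    def allocate(self, pages):
        free = self.total - self.in_use - self.cached
        if pages > free:
            # Instantly sweep cached pages aside to satisfy the request.
            self.cached -= min(self.cached, pages - free)
        if pages > self.total - self.in_use:
            raise MemoryError("truly out of memory")
        self.in_use += pages

    def effectively_free(self):
        # Cached pages count as free, since they can be discarded at will.
        return self.total - self.in_use

mem = ToyMemory(total=100)
mem.prefetch(90)               # a naive counter would report 90% "used"
print(mem.effectively_free())  # 100: the cache is fully reclaimable
mem.allocate(95)               # succeeds; cache is swept aside as needed
print(mem.cached)              # 5
```

The point of the sketch: a counter that lumps `cached` in with `in_use` will report a nearly full desk, even though 95 pages were just handed out without any trouble.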

Henrik said...

@Ken:
The graph with the percentages is logically:

88% of time over 85% of memory
50% of time over 90% of memory

is very possible. From this you can deduce that
38% of time between 85% and 90% mem.
50% of time over 90% of memory

Now, what the graph *claims* about memory usage is an entirely different matter. :)
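
Henrik's deduction is just bucket arithmetic on cumulative thresholds; a quick sketch using the figures quoted above:

```python
# Cumulative buckets from the graph, as Henrik reads them (percent of time):
time_over_85 = 88  # time spent above 85% memory usage
time_over_90 = 50  # time spent above 90% memory usage

# Cumulative thresholds nest inside each other, so they
# need not sum to 100% - that was the flaw in Ken's addition.
time_between_85_and_90 = time_over_85 - time_over_90
print(time_between_85_and_90)  # 38
```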

DrPizza said...

The original book from Helen Custer doesn't address SuperFetch, because it didn't exist back then.

Committed bytes is not the performance counter that matters. Available bytes is. Why? Because available bytes is the amount of memory that's available to applications instantaneously.

David Moisan said...

I'm waiting for Russinovich to weigh in. :) And wondering why a 10-year-old book on NT is being quoted.

A question we should all ask: Why is this important, again? What remediation is possible for a (gasp) 95% memory utilization?

Why should we trust your agent app?

You claim to evaluate machines in the financial industry? Why? What do you recommend to your clients?

My Windows 7 machine is running fine on a four-year old motherboard with 4G RAM. What problem am I having again?

Evil_Merlin said...

Once again proving you have no right to even bother posting what you consider technical information: you've already been proven wrong. Be a man. Shut your cakehole, get a better education on Windows 7, and move on.

avee0 said...

Now let's assume for now that you're entirely right and your measurements are correct. In that case, all you've managed to prove is that Windows 7 users are more likely to push their machines to the limit. Even if the measurements are correct, all we have is a statistical correlation, not a cause.

And frankly, I think it's very reasonable to assume that current Windows 7 users have different usage patterns as well as newer versions of certain software. For all we know, you are comparing a grandma playing solitaire on her XP machine with a heavy gamer trying to get the most out of his games. Or perhaps you are actually measuring the difference between Office versions (wouldn't you agree that Windows 7 users are more likely to run Office 2007, where XP users might use Office XP or 2003?).

So even if the numbers are correct, you're probably comparing apples and oranges, and certainly jumping to conclusions.

Brad said...

Nerd fight! My money is on Ars...

quux said...

Does anyone else find it alarming that, having found themselves in a fight with Peter Bright, exo decided to reveal statistics from Mr. Bright's personal computer?

I looked around exo's websites for anything resembling a privacy policy. This is something to keep in mind, if you've been considering installing their performance agent to any of your systems!

quux said...

Quoted from the article:

"It is common knowledge that this counter, more than any other, provides the most accurate picture of physical memory use under Windows. And it is also common knowledge that, as the value for this counter approaches the amount of physical RAM in the system, Windows will begin paging to disk more frequently."

I suspect you meant paging from disk there, but let that ride.

Exo Research, does your agent measure pagefile reads to resolve hard faults (Memory:Page Reads/Sec or Memory:Pages Input/Sec)?

If so, it should be easy to show that your premise is true. If you're not able to show a direct correlation between committed bytes and pagefile reads to resolve hard faults, you may be in a bit of a pickle here!

Andrew said...

"Sometimes, ignorance is bliss. Other times, it simply serves to embarrass you."

So true, only it seems you've got it wrong when it comes to who is ignorant.

However complicated and amazing your metrics may be, they're irrelevant when it comes to Windows 7 memory management.

justlookinginMD said...

http://technet.microsoft.com/en-us/sysinternals/default.aspx

Sysinternals. Read it. Live it.

Mark Russinovich's blog. Read it. Live it.
http://technet.microsoft.com/en-us/sysinternals/bb963890.aspx

Mark's books:

http://www.amazon.com/Windows%C2%AE-Internals-Including-Windows-PRO-Developer/dp/0735625301/ref=sr_1_1?ie=UTF8&s=books&qid=1266599141&sr=1-1

You are so wrong my friend....

justlookinginMD said...

I'm sorry for the readers who do not know who Mark Russinovich is.

He is Dr. Mark Russinovich, PhD, a Technical Fellow at Microsoft and part of the core Windows OS team. Before that he owned a very successful company called Sysinternals that Microsoft bought a couple of years ago... one of the smartest computer guys in America, not just in Windows.

His Pushing the Limits blog series is a must-read for all techs.

http://blogs.technet.com/markrussinovich/archive/2008/07/21/3092070.aspx

ChrisCicc said...

Microsoft should sue you guys for libel and put you out of business, because to continue to beat this drum after you've been so obviously shown to be wrong is not right...

Jamie said...

Your claims about the "Committed Bytes" counter aren't even wrong.

Suppose I VirtualAlloc a 200 megabyte region of committed memory. Hmm, suppose I have 40 processes each do that. The "Committed Bytes" counter will increase by 8000 megabytes (assuming I have a "Commit Limit" that supports these allocations).

What will the paging I/O activity show as a result? What will be the effect on system performance?

Not a thing!

(Again, assuming I have a "Commit Limit" that supports these allocations AND those needed by everything else I'm running... but if I don't, those VirtualAllocs will simply fail; there will not be a slowdown.)

Furthermore, "Committed Bytes" shows *only* this sort of process-private allocation. It completely ignores another type of virtual memory use: allocations directly mapped to sections of files.

There is actually no counter that precisely shows the total allocation of virtual address space, both private and mapped. The closest thing is probably the Process\Virtual Bytes counter for the _Total instance. Due to shared mappings, this value will read high.

But that doesn't matter. It is not at all uncommon for Committed Bytes to *vastly* exceed installed RAM all by itself.

This does not in and of itself indicate a performance hit, because these counters tell you nothing about how often those virtual addresses are actually being accessed.

The whole point of virtual memory, paging, etc., is that most programs spend most of their time accessing a small fraction of their virtual address space... both code and data. It's perfectly fine to leave the rest out on disk. Why do you think you need to have RAM to hold things you're not accessing?

About the only thing that can really tell you "you need more RAM" are the paging I/O rates... and not just to the pagefile, either. (Uh, you ARE aware that the pagefile is very much not the only file involved in paging I/O?)

Before trying to defend yourself again, you need to read and understand the *current* version of Windows Internals; the "Inside Windows NT" book by Custer does not *even* describe NT4 - it was published before NT 3.1 was released (I got a copy at the '92 DDC) and is very much lacking in detail.

You might also try the experiment - write some code to VirtualAlloc a LOT of stuff and then see what the performance hit is.

If you have read it but disagree with the above analysis, you have failed the "understand" part; you should take one of the many "Windows Internals" seminars that are available.

Please, do SOMEthing of the sort before you embarrass yourselves further.
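
Jamie's thought experiment can be illustrated without even touching the real Win32 API. The toy model below (invented names, not Windows internals) shows why raising the commit charge alone generates no paging I/O - faults only happen when pages are actually touched:

```python
# Toy illustration: committing address space is bookkeeping, not I/O.
class ToyVM:
    def __init__(self):
        self.committed_bytes = 0
        self.resident_pages = set()
        self.hard_faults = 0

    def virtual_alloc(self, n_bytes):
        # A commit-style allocation only raises the commit charge;
        # it reads nothing from disk.
        self.committed_bytes += n_bytes

    def touch(self, page):
        # Only the first access to a non-resident page costs a hard fault.
        if page not in self.resident_pages:
            self.hard_faults += 1
            self.resident_pages.add(page)

vm = ToyVM()
for _ in range(40):                  # Jamie's 40 processes...
    vm.virtual_alloc(200 * 2**20)    # ...each committing 200 MB

print(vm.committed_bytes // 2**20)   # 8000 (MB) of commit charge
print(vm.hard_faults)                # 0: nothing touched, so no paging I/O
```

Which is exactly Jamie's claim: "Committed Bytes" can balloon by 8000 MB while the paging I/O counters show nothing at all.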

Nathan Evans said...

This entire blog (not the comments) is just frustrating to read. The level of ignorance displayed by the author is shocking.

Basing your knowledge on a book about NT4 is fine. But the key is in the word "basing".

It seems to me the author has just started down the path of learning how operating systems work (hence recommending baseline material).

The fact is, though, that the author was informed about SuperFetch numerous times in the original blog post's comments. He decided to ignore these. This is a stupid and ignorant mistake; a simple Google search for the word would have educated him about his mistake(s) within a few minutes.

Everyone has to start somewhere, I guess. But creating a blog site when, clearly, you're nowhere near ready to become an "influential member of the field" is ill-advised.

Give it another try in 10 years, hot shot.

Elby the Beserk said...

Well, I've only been running it a day, but I am getting alerts when there is no perceivable problem at all. Memory is fine, network is fine - yet I am being told I have a problem.

Uh?

Richard said...

"Rebutting Ars Technica"... but p0wned instead.

Good fail, though not quite epic.

How about a nice review of the USER and GDI stacks too?

Hey Idiot said...

I cannot conceive of a better example of begging the question...

Mr. Bright is wrong because we say he is wrong.

The sad thing is, vendors like this have the ear of reporters like Gregg Keizer at Computerworld, and drivel like this gets widely read by pointy-haired bosses the world over. Thus requiring IT pros to waste precious time refuting this kind of nonsense...

Kettle said...

"None of the counters in question are in any way related to system caching and/or Superfetch"

That's precisely the problem with your metrics: they fail to take SuperFetch into account. Your numbers aren't strictly incorrect, but they are woefully incomplete for the purposes of assessing performance.

The system allocates any memory that would be completely idle in XP to SuperFetch, which is why it shows up in your counters as being used. But the system can snatch RAM away from SuperFetch without any notice, so for the purposes of performance, it might as well be free! Better, actually, because this busy-work for the memory actually improves performance: it's doing something potentially useful while waiting to be assigned to another process.

Try this metaphor:
Let's say I complete a contract and get paid $5,100 in cash; then I spend $2000 of that on bills and other necessities, deposit $3,000 into my bank account, and keep the remaining $100 in my wallet.

Your counters are merely measuring the total $5,100 I originally received against the $100 still in my wallet. It's an accurate but almost completely useless measurement, and you misinterpret it (because in this wacky hypothetical reality, bank accounts are a relatively new invention, and you've never needed to consider them before), so you're going around telling people that I've spent all but the $100 in my wallet. Where we're at now is that, no matter how many times you're told that I have more money in a bank account, you're just pointing to my wallet and insisting that you're right.

I can still spend the money at any time using my debit card, and in many ways it's better than having it in my wallet, because it's gathering interest and can be used for electronic transactions. Despite this, you seem to be fixated solely on how many dollar-bill paper cranes I can fold with what I have on me right now, and the rest of us are left scratching our heads, wondering how you got a job as an accountant.
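
Kettle's figures do balance; a quick sketch of the analogy's arithmetic:

```python
# The wallet/bank analogy in numbers (Kettle's figures):
earned = 5100   # total received (installed RAM, in the analogy)
bills = 2000    # genuinely spent (in-use memory)
banked = 3000   # in the bank (standby/SuperFetch cache: instantly spendable)
wallet = earned - bills - banked

print(wallet)           # 100: what a naive "free" counter reports
print(wallet + banked)  # 3100: what is actually available to spend
```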

GhettoDuk said...

Belligerent stupidity. I think it's time you realized that you peaked working for Geek Squad.

pejeno said...

I guess OSNews is also wrong, as well as Ars?

http://www.osnews.com/story/22896/Windows_7_Memory_Usage_FUD_Explained

Ignorance is bliss indeed.

I suggest you start a new company, so your new users won't stumble upon these facts when they Google your company.

Bluee said...

YOU MUST BE OUT OF YOUR MIND

Do NOT PASS GO
DO NOT COLLECT $200

BURN YOUR WINDOWS NT BOOKS AND STEP INTO 2010 BECAUSE YOU DON'T KNOW WHAT YOU ARE TALKING ABOUT!!!!

Hans said...

"Anyone who believes otherwise needs to pick up a copy of Inside Windows NT (the original from Helen Custer is a classic) and start reading."

Are you guys for real??? Are you basing memory usage on how we used computers in 1992?

It's about time for you guys to buy a new book or two. Microsoft has released quite a few updated Windows versions since 1992, and I am quite sure the kernel and memory management technology has been refined a lot during those 18 years.

Timothy said...

"Committed Bytes is just one of several weighted contributing factors we consider when calculating the Peak Memory Pressure Index for a given PC – another point Mr. Bright would have understood had he bothered to ask."

Alright then, I'll bite. What are all the factors that are taken into consideration when determining your Peak Memory Pressure Index? A nice detailed blog posting of exactly how you calculate the index could prove all your naysayers wrong.

I mean you're pretty sure you're right and everyone else is wrong. Prove it.

Siki said...

I would describe just why you're wrong, but it seems people with a greater understanding of the field have done a better job.

Perhaps you should learn from my example.

Matthew said...

I think the problem is that you're still trying to argue that the memory is full, when that's not what Ars is rebutting.

Nobody is arguing that the memory is not full. You can use a sophisticated metrics analysis template to find that out, or you can just bring up task manager.

Either way, what you're going to find out is that yes indeedy, the memory is full.

What the other side is arguing is that automatically flagging full memory as a bad thing is wrong, because with Vista and 7 full memory is actually a good thing: the memory is getting used instead of sitting there wasting space.

That's because it's full of Superfetch and disk cache data that can be thrown away if the memory is needed for something else, but is already available if it turns out to be what you need.

Now, whether that's true or not, I'll let everyone else debate.

But to keep arguing that you're correct because the memory is full isn't going to get us anywhere, because we already know the memory is full. We can see that by just opening Task Manager. The question is, what is it full of?

What the other side is saying is that's a good thing and not a bad thing because of how 7 treats system memory as a cache.

Whether that's true or not, I don't really even care, but that's actually the point Ars is trying to make.

BlakeyRat said...

"Armed with this information, he no doubt would have come to a far different conclusion, especially in light of the fact that none of the counters in question are in any way related to system caching and/or Superfetch."

Wow. You explained in your post exactly why your numbers are wrong, and yet you still won't admit that your numbers are wrong.

Windows Vista and Windows 7 have SuperFetch. If you're not accounting for that, SuperFetch memory (which is read-only cache memory) is going to be counted as "in-use" instead of "free".

If none of your counters are looking at SuperFetch numbers, your results are wrong. Period. That's all she wrote.

praveen said...

FYI http://en.wikipedia.org/wiki/Windows_Vista_I/O_technologies#SuperFetch

SirBruce said...

Okay, I'm not trying to pick a side here. I was all ready to believe that their program was wrong because of SuperFetch. However, if they are actually using the Committed Bytes counter, then perhaps they are right.

On my Windows 7 box, I have 4096 MB of memory. Resource Monitor's Memory tab shows 1025 MB Hardware Reserved (graphics card), 1333 MB In Use, 30 MB Modified, 1686 MB Standby, and 22 MB Free. The "Standby" portion is primarily what people are talking about with SuperFetch.

Performance Monitor indeed says 25% Committed Bytes In Use, with 1724 Available MBytes. So it does not look like Standby (SuperFetch) is being counted as Committed. However, it's interesting that Committed is not counting Hardware Reserved, so while it may look like I'm only using 25% of my memory, I'm actually using closer to 50%.
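
SirBruce's categories account for all of his installed RAM, and the "closer to 50%" figure follows from them; a quick check using the numbers reported above:

```python
# Resource Monitor categories in MB, as reported by SirBruce:
hardware_reserved = 1025
in_use = 1333
modified = 30
standby = 1686  # largely SuperFetch cache; reclaimable on demand
free = 22

total = hardware_reserved + in_use + modified + standby + free
print(total)  # 4096: the five categories cover all installed RAM

# Memory genuinely unavailable to new allocations:
unavailable = hardware_reserved + in_use
print(round(100 * unavailable / total))  # 58 (%), not the 25% Committed shows
```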

maarten said...

superfetch is supercrap

Research Staff said...

@SirBruce,

And to think, it only took three days for somebody to bother checking Committed Bytes in Perfmon to see whether it's counting cache and/or SuperFetch allocations (hint: it's not).

Congratulations! You win the "booby prize" as perhaps the only other person in the blogosphere who has a clue (besides us, of course)!

Well done!

dmartin said...

The bottom line to me is that Microsoft has already created a metric to show how your system is doing on memory, whether memory is getting to be a problem or not.

That metric is available memory, which you can easily see in Task Manager, or Performance Monitor. Mark Russinovich has several great articles up at the sysinternals site on Technet, in which he explains in great detail why you should be looking at available memory.

If someone claims to have a better metric than available memory, they need to provide justification why their metric is better, which I have not yet seen exo do.

I understand that if you're in the business of selling performance monitoring software, you want something other than available memory, which you can monitor for free. And perhaps your metric is actually better than available memory. But if it is, you have to provide some evidence of that and show why Mark is wrong.

Not being technically qualified to make such a judgment myself, I must rely on the experts, and to me that's Mark Russinovich, unless someone else proves differently.

Research Staff said...

@dmartin,

I believe we've already demonstrated that "In Use" memory parallels the behavior of Committed Bytes in many ways. And since the opposite of "In Use" memory is "Available Memory," you've essentially reinforced our conclusion.

Please see our latest post on this subject for more details...

Edgar said...

Russinovich is waiting to respond until he has succeeded in removing the ear-to-ear grin from his face, and looking at the desperate attempts to make your theory fly, this could take a while. Get your schedules out; you will be invited to attend some extra classes, since you seem to have missed them after XP launched.

xxdesmus said...

God you guys are idiots. I really hate stupid people who feel the need to talk. You clearly have no idea what you're talking about so please just stop embarrassing yourselves.

Save your last minuscule shred of dignity before it's too late.

Tabin said...

I work at a Support Desk for a high-end trading systems company. We run into performance-analysis questions all the time, ranging from "Why is my computer slow?" to "Can you explain the paging behavior, and any slowness that can result from such-and-such an app?"

Our programs live and die on system performance, so we've done a fair amount of performance analysis on PCs for a variety of reasons. Looking at XP, XP64, Vista, and Win 7, I can unequivocally say the reasoning put forth in this article is greatly flawed.

From what I've seen in real-world testing, Windows 7's memory management is superior to XP's, hands down. Windows 7 does not have the memory problems the author of this blog purports it to have. It just annoys me when these "performance software" companies publish BS results to try to make themselves look good.

Most of our clients quickly realize that some intelligent use of a PC renders these applications totally unnecessary.