Friday, February 26, 2010

(Editorial): Picking Apart Intel’s Latest Windows 7 Migration Delay

I read with some amusement the recent account by an Intel IT engineer of how the company has been forced to repeatedly delay its migration away from Windows XP over concerns about, among other things, Internet Explorer 6.0 add-on compatibility and support for applications that still use 16-bit code in places where, quite frankly, they shouldn’t.

It’s the latest in a long line of “public displays of procrastination” that helped fire the imaginations of the very publications I once contributed to. In fact, this is exactly the sort of excuse-baiting exercise that led to the creation of the controversial Save XP Campaign at InfoWorld.

In case you’ve been living under a rock for the past few years, Save XP was a program that Executive Editor Galen Gruman dreamed up on his own and then forever associated with me by launching it, without my consent, from within my InfoWorld blog (it literally just appeared there one day – like magic).

In other words, it’s “shock jock” bait heaven, and the kind of story I might have seized upon for the Enterprise Desktop. But here, at the exo.blog, I’m free to give my true and honest opinion of this kind of corporate soul baring. And in Intel’s case, I call “BS” on many of their excuses for delaying migration – both to Vista, which they skipped altogether, and Windows 7, which they’re just now getting around to addressing in earnest.

For starters, there’s the IE 6.0 issue. Intel claims to have hung on to IE 6.0 for this long because of a need to support certain important internal applications as well as legacy add-ons, including, apparently, some older version of the Java runtime. But Intel has been ignoring an important potential solution to this quandary: App-V.

The App-V runtime was purpose-built to address just this sort of scenario. It isolates file system and Registry changes made by application installers, allowing you to run multiple versions of a program, like Internet Explorer, side-by-side on the same PC.

It would be trivial to create an App-V sequenced package that encapsulated IE 6.0 (plus all of the required add-ons) and then roll it out as a short-term fix. They would then be free either to upgrade their OS installed base or, barring that, at least to update the version of Internet Explorer to somewhere north of the Paleozoic.

And what about their 16-bit applications? Intel claims that they need to maintain a wide variety of legacy operating systems, ostensibly for testing and verification purposes (this is never made clear in the original blog post). However, why this should affect their mainstream desktop computing stack, and its transition to 64 bits, is hard to fathom.

Note: Do they really want us to believe that some parts of Intel still rely heavily on 16-bit Windows or DOS code from the pre-XP era? For line-of-business functions that affect a significant portion of their user base? Really? Because, otherwise, this line of reasoning just doesn’t hold water – even when you factor in the occasional testing and/or legacy validation requirements.

The Intel engineer-author hinted that their solution to this problem will involve some sort of integrated VM solution, like Virtual Windows XP Mode (or more likely, MED-V). However, what struck me most after reading this posting is how the potential mitigation of these issues has virtually nothing to do with any specific Windows 7 capability or advantage. Vista had similar issues, and the same proposed “fixes” (App-V, MED-V, et al) could apply equally to either version.

In fact, this whole Intel blog entry smells like so much ass covering from a company that very publicly trashed Vista by skipping the upgrade cycle altogether. That controversial move, which was widely reported at the time, helped fuel a public perception backlash that cost Microsoft millions of dollars in potential revenue.

Now Intel is trying to make amends by claiming that everything’s just peachy under Windows 7, when in reality the very same compatibility hurdles – from IE to 16-bit code and even UAC – remain. Frankly, the author could have saved himself a lot of time and effort by skipping the play-by-play recap and saying something more to the point, like:

“Hey Microsoft customer base: We screwed up by dissing Vista, and it cost our very best buddies wads of cash. Please disregard everything we said before about compatibility hurdles and migration issues and go buy lots and lots of Windows 7 licenses. Because we really do like this one. Honest! It’s better than Vista. Trust us!”

Of course, Windows 7 is better than Vista – just not in the ways that Intel is alluding to in this semi-confessional blog post. But at least they’re finally making the long overdue move away from Windows XP. And for that, I applaud them.

Because, at the end of the day, they’re still just a bunch of hardware guys. And as any hardcore software person will attest, when it comes to figuring out what to do with all those CPU cores and gigahertz, the hardcore hardware guys really don’t have a clue.

RCK




Thursday, February 25, 2010

(Editorial): App-V Takes Virtualization Mainstream

Those who have followed me on InfoWorld and elsewhere know that I’m a big fan of application virtualization. The idea of bottling up all of the messy deposits from a typical Windows application installation into an easy-to-deploy, self-contained package has always seemed like a good idea to me. And during my extensive testing of various “appvirt” solutions, I’ve developed some strong opinions about which approaches work best for various deployment scenarios.

For example, in a tightly-managed, generally homogeneous Windows environment – with Active Directory at the core of every network – Microsoft’s own App-V solution has often seemed like the best option. However, in less locked-down environments, where portability and flexibility are the primary concerns, stand-alone (i.e. no client agent required) solutions, like VMware ThinApp or XenoCode, have always been at the top of my recommendation list.

I summarized my findings in a white paper that I published through the exo.blog in early 2009. This report, the development of which was funded by VMware, shows how tricky it can be to determine which virtualization platforms provide the best performance across a range of use cases. You can grab a copy of the white paper here.

Now, with the release of App-V 4.6, Microsoft has raised the bar a bit for its competitors. For starters, the new version allows you to sequence (i.e. capture the output from and virtualize the installation of) 64-bit Windows applications. This is significant in that Microsoft’s upcoming Office 2010 will be available in a 64-bit format, and shops that use App-V will no doubt want to be able to virtualize it as they do the 32-bit version of Office 2007 now. However, the more important new feature is the capability to deploy virtualized applications to clients running the 64-bit versions of Vista and Windows 7.

Previous versions of App-V were incompatible with 64-bit Windows due to their lack of an x64-compatible kernel-mode agent. This is one of the reasons why I’ve traditionally recommended VMware ThinApp for customers with a significant installed base of 64-bit clients. However, while ThinApp-encoded applications will indeed run on 64-bit Windows, the virtualization engine itself is 32-bit only. You can’t encode a 64-bit application with ThinApp, and 32-bit encoded applications are treated like any other Win32 application running atop the WOW (Win32 on Win64) compatibility layer.
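For readers who want to see that WOW64 layer for themselves, here’s a minimal C sketch – not tied to ThinApp or App-V in any way – that simply asks Windows whether the current process is a 32-bit program running on a 64-bit OS. A 32-bit ThinApp-packaged application would answer “yes” on an x64 client.

    /* Minimal illustration: detect whether this (32-bit) process is running
     * under the WOW64 compatibility layer on 64-bit Windows. */
    #include <windows.h>
    #include <stdio.h>

    int main(void)
    {
        BOOL isWow64 = FALSE;

        /* IsWow64Process sets isWow64 to TRUE only for a 32-bit process
         * running on a 64-bit edition of Windows. */
        if (IsWow64Process(GetCurrentProcess(), &isWow64))
            printf("Running under WOW64: %s\n", isWow64 ? "yes" : "no");
        else
            printf("IsWow64Process failed (error %lu)\n", GetLastError());

        return 0;
    }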

With both native 64-bit application support and the capability to be deployed on 64-bit Windows editions, App-V has pulled ahead of the competition and established Microsoft as the technology leader for this category. I’ll be revisiting my original test results in the coming days as I see what, if any, improvements Microsoft has made in the performance and overall runtime footprint of their solution. Stay tuned.

RCK


Figure 1 – The Latest WCPI Index Values


(Trends): Windows 7 Drives RAM Size Surge

The latest data from the exo.repository shows Windows 7 driving a measurable surge in average RAM configurations across the nearly 24,000 registered xpnet.com contributors. According to repository snapshots taken in the weeks following the Windows 7 launch, the average RAM configuration for PCs running Microsoft’s newest OS has increased from 3.15GB on November 30th, 2009, to 3.76GB on February 25th, 2010 – a surge of roughly 19%.


Figure 1 – Average RAM Sizes – 11/30/2009

By contrast, average RAM sizes for PCs running Microsoft’s Windows Vista and XP have remained flat at 2.7GB and 1.7GB, respectively.

Figure 2 – Average RAM Sizes – 2/25/2010

The lack of movement on these legacy OS platforms reflects the rapid influx of Windows 7 PCs into the exo.repository. An analysis of the most recent 1,000 exo.performance.network registrants shows a phenomenal uptake in Windows 7 adoption, with 62% of newly registered PCs running Microsoft’s latest version vs. 28% running Windows XP and a meager 8% still running the much-maligned Windows Vista.

Figure 3 – OS Adoption Rates – Last 1000 Registrants

Bottom Line: Windows 7’s influence is increasingly being felt across the exo.repository, with nearly two out of every three newly registered systems running Microsoft’s latest and greatest. And along with this uptick in Windows 7 adoption comes an increase in the average RAM configuration for PCs participating in the exo.performance.network, and by extension, a significant cross-section of the general Windows system population. This is good news for software developers who have been waiting for average RAM configurations to increase before adding new, potentially memory-intensive features and capabilities to their application designs.

Note: The above statistics were generated from the over 230 million process records collected from the nearly 24,000 registered, active xpnet.com users. If you’d like more information about the exo.performance.network, including how to reproduce the above chart object(s) on your own site or blog, please visit www.xpnet.com.


Wednesday, February 24, 2010

(Editorial) Confessions of an Internet “Shock Jock”

Note: Please see my follow-up post which sheds more light on what InfoWorld knew and when, including an email thread that proves the publication’s complicity in the Craig Barth ruse.


Public falls from grace. We all love to watch them unfold. Whether it’s a golfer with libido issues, or some blowhard blogger getting his comeuppance, we just can’t get enough of it. The sordid details. The backroom double-dealings. The questionable motives.

I, of course, fall into the latter category. I am Randall C. Kennedy, former internet “shock jock” blogger for InfoWorld and current holder of the title “Most Reviled Person on the Internet, 2010 Edition.” In the past 72 hours, I’ve been humiliated, chastised and kicked to the curb by virtually every one of my contemporaries. My personal and professional credibility is shot, and my part-time career as an IT journalist is over for good. Can the urinal cake with my face on it be far behind?

Still, like every good tabloid story, the villain wants his day in the sun – a chance to tell his side so that the record is truly complete. And while the future may see my name relegated to the role of punch line for a crude party joke, it wasn’t always this way. I once had a name I could be proud of, one that was associated with highly successful projects at some of the biggest firms in IT and finance. That it could all come crumbling down so quickly should serve as a cautionary tale for anyone in a similar position. So here, without further ado, is my story.


I’ve been a professional in the IT industry for over 25 years. I got my start in the mid-1980s pulling wire and installing servers for a Novell Gold reseller in Southeastern Massachusetts. It was there that I cut my teeth on technologies like NetWare, LAN Manager and SCO UNIX. And after 5 years of often grueling work in and around the Bay State, I emerged with a strong appreciation for the difficulties faced by those working in the IT trenches.

My next stop was also my first real gig as an IT journalist. The year was 1993. Windows Sources magazine was about to launch as a new Ziff-Davis publication, and Editor-in-Chief Gus Venditto was looking for talent that could write authoritatively about Windows-related issues. I was brought on as a Contributing Editor – along with John C. Dvorak and others – and carved out a niche covering Windows data communications, among other topics.

IBM Comes Calling

In 1995, after two years of writing for Windows Sources, PC Computing and some extensive work at ZD Labs, I was approached by IBM about doing some consulting work for their Personal Software Products (PSP) division. Jay Sottolano was an acquaintance from the trade show circuit, and he was looking for someone to help write positioning papers and other collateral in support of their OS/2 marketing efforts. Knowing this would signal the end of my career as an IT journalist (back then, the industry frowned on such conflicts of interest – now writers just “disclose” them), I took the leap anyway, forming my first corporation – Competitive Systems Analysis, Inc. – with my new wife as my business partner.

Together, we spent the next year travelling the world on IBM’s behalf, giving stump speeches to the PSP and PSM (Personal Software Marketing) faithful and providing competitive marketing advice to the company’s Software Solutions Group (SWS). Along the way, I got the chance to brief a number of high-level IBM luminaries, including CEO Lou Gerstner and CFO Jerry York, and I was also fortunate enough to work with some exceptional executive talent, including Alan Fudge. It was a heady time for a young professional barely out of school, and I did my best to make the most of every minute.

Settling Down for a Spell

But, eventually the rigors of nonstop travel and feast/famine contract cycles – plus the arrival of my first child – prompted me to seek out a life with greater stability. So I did what most consultants do at this juncture in their lives: I bought a house in the Bay Area (Danville – finest town in the USA, IMHO) and got a real job. Specifically, I took a position as a Senior Industry Analyst with Giga Information Group (now part of Forrester).

It was circa 1997, and I spent the next year working closely with some top-notch IT analysts, like Richard Fichera, as well as with more than a few egotistical blowhards (I’m talking to you, Rob Enderle). I also got to spend some quality time working with Gideon Gartner, the legendary founder of the Gartner Group. For whatever reason, Gideon took a liking to me, and I was able to learn a great deal about the inner workings of the IT research industry under his sage tutelage.

But the truth is, I was bored at Giga. The type of content we were asked to produce – dry, color-free analysis of IT trending minutiae – was tedious to write. I missed the chance to get my hands “dirty” working with technology directly, especially Windows NT, which was my first true love. And it was during this time that the pseudonym “Craig Barth” emerged for the first time, as a pen name I used while moonlighting for Windows NT Magazine as their News/Analysis Editor.

Note: Contrary to popular opinion, I am – and always have been – a huge Windows NT fan. I was one of the first journalists to jump on the Windows NT bandwagon, even going so far as to write a book promoting it, “Migrating to Windows NT” (that’s the title – from Brady Books – look it up), in 1993. In fact, by the time I left my first gig at Windows Sources, I was writing the “Windows NT” column for them on a monthly basis. And, of course, the potential synergy with Windows NT Magazine was a no-brainer.

Intel Comes Knocking

So when Intel Corporation came knocking in mid-1998 with an offer to work with them as a performance engineering consultant to their Business Desktop Marketing (BDM) group, I once again took a leap of faith. I resigned from Giga and resurrected CSA (which had lain dormant during this time), then spent the next two-and-a-half years designing and testing new benchmarking scripts to put Intel’s high-end hardware through its paces.

During this time I worked on numerous projects involving chip launches (the Pentium III, Mobile Pentium II/III, the Pentium 4), networking gear (desktop GbE), and multiprocessor systems (in conjunction with Dell Computer, another client of mine). An overarching theme throughout this engagement was the concept of “Constant Computing” – i.e. the idea that PCs are never really idle, especially under more complex OSes like Windows NT/2000 – and I produced over a dozen white papers for Intel cataloging my findings and conclusions.

Note: My primary contact throughout this time period was Tom Harper, a maverick technical marketing manager who reported directly to Pat Gelsinger. Pat was also aware of my work and even used one of the test cases I developed as a Constant Computing demo piece for his keynote speech at the 2000 ISMC (International Sales and Marketing Conference). They later licensed the test case code from me for a tidy sum.

Eventually, my services were farmed out to Intel’s Desktop Architecture Labs (DAL), where I continued to refine my methodologies and also began formulating the idea for my first stand-alone test tool: Benchmark Studio.

The Wall Street Connection

Note: The following section has been heavily modified at the request of our client. It is a violation of our contract to identify them publicly, and we are hereby honoring that request.

They say that all good things come to an end, and when the dot-com crash hit Silicon Valley I found myself out of my primary consulting gig and looking for a new challenge. Working for Intel had left me flush with cash (all told, they poured nearly three quarters of a million dollars into my small, two-person consultancy), so I had time on my hands to work on my test tool ideas. Eventually, I released the first version of Benchmark Studio as a low-cost, commercial test suite. And, lucky for me, it caught the eye of the lead tech in the PC Engineering group at a large financial services firm in NY.

That foot in the door turned into the biggest success story of my career. But back then, circa 2001, I was just happy to have them as a customer, period. And eventually, after cultivating a strong support presence and generally proving myself as a reliable technical resource, my contact recommended me as the best person to help one of the company’s largest divisions develop a new performance monitoring framework for their high-end Windows-based trading workstations.

Needless to say, I jumped at the chance, and over the next three years I developed and refined what ultimately became a commercial performance monitoring product known as Clarity Suite. And as we moved from pilot project, to limited production deployment and finally a business unit-wide site license in 2006, I was rewarded with a steady stream of consulting contracts culminating in the aforementioned licensing deal.

And while I’m not at liberty to discuss the value of these business transactions, suffice it to say that they far exceeded my total compensation from Intel. Add to this a smaller-scale deployment at CSFB (Credit Suisse First Boston, which was what they were still called in 2001) and a pioneering study of workstation scalability conducted at Kent State University (under the direction of Hewlett-Packard and Intel Corporation – the white paper is still available), and I was quite busy during the first half of the last decade. Again, heady times for a now older and more seasoned IT veteran.

Devil Mountain Software Emerges

It was during this timeframe that I decided a new corporate presence was required to help differentiate my consulting past as Competitive Systems Analysis, Inc., from my long-term goal of productizing Clarity Suite and bringing it to market. So I once again collaborated with my wife and longtime business partner, and together we created Devil Mountain Software, Inc. – with me as the public face of the company and her as the silent partner working behind the scenes to manage the business.

Eventually, we brought in other partners to invest in the venture, but we kept the management team limited to just ourselves. And when a site license came through from one of our biggest clients (essentially guaranteeing we wouldn’t have to work again for the rest of our lives), I started thinking about alternative ways to leverage what was now DMS Clarity Suite – options and scenarios that existed outside of the traditional commercial resale channel.

One idea that I had always wanted to explore was taking DMS Clarity Suite online – essentially providing the same kinds of monitoring and analysis functionality we were delivering to our commercial clients on their in-house servers, but in a more limited, less feature-complete format. The goal would be to create a community of users around a set of free tools and services, and then to mine the data they uploaded in order to gain insight into trends and developments affecting the broader Windows community.

Thus, the exo.performance.network was born. But not before I made a fateful detour back into the world of IT journalism – a wrong turn I would eventually regret in ways I could never have imagined.

Early InfoWorld Involvement

Throughout the early part of the recent decade, I kept a toe in the water of my old haunt, IT journalism. It started quite small. From time to time I would collaborate with contacts at the InfoWorld Test Center – then still a real, physical lab space in Silicon Valley. I got to know some of the lead contributors, like PJ Connolly, quite well, and we’d get together at the lab sometimes to run benchmark tests using the aforementioned tools I developed for Intel, etc.

Eventually, the lab was shuttered, and InfoWorld started drastically downsizing its operation. It was during this time that I struck up a collaborative relationship with Doug Dineley, who to this day remains a class act and the one person I had the most respect for at that publication. But back then, it was all about product reviews and testing. Doug would present me with a list of possible story angles and I would pick and choose based on what struck my fancy at any particular time. Eventually, I became one of his more regular contributors, and he remained a good friend and close confidant right up to the bitter end of my involvement with the publication.

But even when I started splitting time between my day job supporting my commercial clients’ deployments (now pretty much just a software maintenance role), my hobby building xpnet.com and these occasional freelance reviewer gigs, the relationship with InfoWorld remained casual. It wasn’t until 2007 that things got serious. And that year, more than any other, will go down as one of the worst I can remember.

Wooed by the Dark Side

Late 2007 is a time period pivotal to this story because it signaled a series of beginnings. It was when I first started thinking about blogging for InfoWorld. And it was also when I first approached the publication about partnering with DMS on the promotion of an online service, one built around the still evolving precursor to what would ultimately become the exo.performance.network.

And at first, neither venture went very well. Newly promoted Editor in Chief Eric Knorr, whom I had never met and had barely heard of prior to his ascension, was resistant to the idea. He didn’t think it would fit with their still undefined editorial focus (InfoWorld had only recently decided to drop print and go online only). Meanwhile, the blog became tedious to maintain, especially since I wasn’t being paid for the work.

But eventually, things changed. Eric settled in as Editor in Chief, and a new Executive Editor, Galen Gruman, emerged to forever change my life. For starters, Galen took a liking to the xpnet.com idea. He began championing the idea internally, working with me to refine the messaging and coordinate with the various sales and marketing groups to achieve buy-in. At the same time, Galen took it upon himself to become the primary editor of my now paid blogging gig. He helped me to identify which topic areas were having the most impact – and thus started me on my descent into internet “shock jock” hell.

You see, what Galen and I discovered was that the topics that were most effective in drawing readers were also those that skirted the edges of both legitimacy and taste. For example, if I wrote an entry detailing some deeply held belief about a particular IT vendor or technology, nobody paid any attention. However, if I simply vented about something that was bugging me – a mysterious crash in Vista or some piece of VDI “marchitecture” coming out of VMware – the attention level shot through the roof.

Eventually, I found myself enjoying the buzz that my “angry missives” would generate. Little did I realize how quickly such a model could deteriorate or how much it could damage me, personally, once it fell apart.

A Slippery Slope

As the missives kept coming, and the traffic numbers kept climbing, Galen and I – along with Eric Knorr – worked to evolve the persona of “Randall C. Kennedy.” I was now to be the lightning rod of the publication, the guy who puts the most provocative spin possible on every story with the intention of aggravating as many zealots as possible. The net result was gobs of page views – I was the single biggest draw, site-wide, for all of 2009 – and also a great deal of scorn from my contemporaries.

Ironically, it was the growing disapproval of my peers in the industry that first gave me pause. I realized that I was now regularly espousing opinions and viewpoints that had almost nothing to do with what I truly believed. Rather, they were simply extensions of the RCK persona. I became the “Microsoft basher” when, at heart, I held the company in the highest regard. I became the “Vista basher” and the “Windows 7 basher” when, in truth, I used both every day and found them to be excellent products (yes, even Vista). The whole persona had taken on a life of its own, and I was terrified that it would ultimately spin out of control.

Which of course it did, in the most spectacular fashion, and just in time to nearly destroy the one project that I truly cared about and believed in: The exo.performance.network.

As my blogger star rose at IDG, it became easier for me to obtain the kinds of concessions from management that would help me promote my new pet project. Galen Gruman, who had worked with me from the start to develop internal momentum for the project, was now actively helping me to integrate it with InfoWorld’s web presence. While I provided the back-end data collection and analysis engine, Galen crafted the InfoWorld side of the equation, including the various registration pages, widget wrappers and javascript code that helped to glue the whole solution together.

In fact, Windows Sentinel (the co-branding nomenclature that Galen came up with for our collaboration) would never have happened if my colleague Mr. Gruman hadn’t pushed it through the various layers of IDG bureaucracy. The man was a bulldog, and working together we managed to launch Windows Sentinel in April of 2008 to little fanfare and even a few snickers.

Note: It’s important that the public understand the nature of the contractual relationship between DMS and IDG. The arrangement was strictly one of cross-promotion – DMS would provide the service and InfoWorld would promote it to its readers. And while both parties would share in the registration data and collected metrics, at no time did any money change hands.

This was strictly a marriage of convenience, and the only side to ever see even a dime of revenue (from advertisements and sponsorships associated with the registration pages and related collateral) was IDG. So my detractors can put away their evil conspiracy theories of greed and avarice – they simply do not apply here. I gave everything to make Windows Sentinel a reality, and got virtually nothing in return.

The Return of Craig Barth

But back to the story. From the beginning, Sentinel had a credibility problem. Though it was being promoted as an independent service and research entity, it still had my name attached to it. In fact, InfoWorld made a point of identifying the solution as the product of a collaboration between the publication and its Contributing Editor, Randall C. Kennedy, the founder of Devil Mountain Software, Inc.

They even plastered as much across the registration page. It doesn’t take a genius to tell you that having the industry’s most notorious internet “shock jock” as your only front man was not a formula for success. So I took drastic action. I created a fictitious spokesperson by resurrecting my pen name from days gone by, Craig Barth, and assigning him the title of Chief Technical Officer for Devil Mountain Software, Inc.

It all started fairly innocently. I would receive an email inquiry from some media person asking about a piece of research I had published through my official exo.blog, and I would reply – not as Randall C. Kennedy, the “shock jock” that nobody took seriously anymore – but as Craig Barth, the ever helpful and deeply knowledgeable CTO of DMS.

Over time, these interactions became more frequent, and I began to enjoy my newfound anonymity. No longer fearful that my hard research would be rejected out of hand, I became bolder, even going so far as utilizing my alter ego when fielding phone calls from the likes of Gregg Keizer and others.

And all the while, I wondered to myself why nobody was making the connection. How could a legitimate service that was so publicly launched by one of the most reviled personas in the IT media sustain such a ruse? Couldn’t they see the absurdity of it? Devil Mountain Software, Inc., was the company that Randall C. Kennedy formed. There was no attempt to hide this fact.

Of course, someone did see through it all. And it was with fear and trepidation that I fielded a phone call from my close colleague and co-architect, Galen Gruman. He had seen one of Gregg Keizer’s earliest mentions of me – a report on benchmark results that showed Vista SP1 failing to provide a promised performance boost – and he wanted to know who the hell this Craig Barth guy was.

After all, if anyone knew me well, it was Galen. He was there when I first pitched the idea for Windows Sentinel. He was there as I wrestled with how to separate the hard data from the “shock” persona. And since he was intimately familiar with DMS and its management team of 1+, it was only a matter of time before he made that call.

To be fair, Galen was not pleased when I confessed my actions. He felt that I was pushing the ethical boundaries by misleading the public in this way. However, for whatever reason – personal loyalty, a desire to maintain the status quo with the “shock jock” persona – he agreed to keep it to himself.

Frankly, I’d figured he’d expose me on the spot. But instead, he turned a blind eye, even as I referenced my own research in my blog – data that had been promoted by Craig Barth and consumed by countless other media outlets ignorant of the ruse at play. Somehow, Galen managed to hold his tongue for over a year, even though secretly he must have wondered when it would all come crashing down.

Note: While I can’t say unequivocally when InfoWorld Editor in Chief Eric Knorr was made wise to the ruse, I’m pretty sure it was well before the whole mess spilled over. Eric was as intimately familiar with the nature of DMS as Galen, and for him to claim to have been oblivious to the situation – when the persona of Craig Barth from DMS had been plastered all over the Internet for a year or more – is hard to swallow. And despite Galen’s apparent loyalty to me, I still can’t see him keeping this from his immediate superior, if for no other reason than he would need a way to cover his own ass during the inevitable implosion.

Crash and Burn

And implode it did. After publishing a particularly alarming set of findings – which I still stand behind while continuing to evaluate new data – the internet became engulfed in controversy. As the furor grew, and as more and more media outlets questioned just who this Craig Barth fellow really was and what made DMS tick, the house of cards came crumbling down. The persona of Craig Barth was exposed as one Randall C. Kennedy, and the entire web of half-truths and misdirection was exposed as the ruse that it was.

Frankly, I was relieved it was over. Balancing the two worlds had become almost impossible, and I longed to escape from the “shock jock” persona that had been created for me so I could once again embrace my core beliefs. But what surprised me was the level of anger expressed towards me for what I saw as nothing more than a very poorly executed attempt to escape from the proverbial rock and hard place. Simply put, the level of vitriol expressed felt way out of proportion, and the claims of “egregious ethics violations” and “insufferable breach of trust” were simply over the top.

After all, it’s not as if I had trafficked in nuclear secrets or stolen someone’s credit card information. I merely tried to shield what was important to me from the fallout of the world that had been created for me. And in the end, I failed miserably. It was a dumb move, born of frustration at feeling painted into a corner of my own making. I should have just walked away earlier – it’s just a blog in the end – but I lingered too long on the edge of the razor, and eventually it cut the heart out of everything I had tried to accomplish.

Please note that I’m not looking for sympathy or even understanding. My goal here is simply to clear the air – to tell my side of the story and to hopefully clarify both my professional background and the nature of the very legitimate products and services I’ve developed.

At the end of the day, this whole affair is just a blip in the timeline of a career that spans two decades, one which saw me working with a bunch of amazing people at some of the most revered companies in the world. I’m proud of my many accomplishments, and I’m happy that I can finally close this chapter of my life.

It’s been one hell of a ride…

Epilogue

So, what next? For starters, neither the exo.performance.network nor Devil Mountain Software, Inc., is going anywhere anytime soon. I will continue to develop and expand what has become my true labor of love, but now with a renewed commitment to the integrity and authoritativeness of the data that makes the service so special.

What I will not be doing is venturing back into the field of IT journalism. Not because I couldn’t do so if I chose to – you’d be surprised at how many emails I’ve received offering to host my “shock jock” persona on a different site (some people will stoop to anything for a few page views) – but because I never want to compromise my integrity that way again.

At the end of the day, I really am Randall C. Kennedy – a passionate fan of all things Microsoft Windows-related. Thank you for taking the time to hear me out.

RCK




Tuesday, February 23, 2010

(Editorial): Clearing the Air On the SSL Issue

Much has been made recently of ZDNet’s so-called “exposé” (really a smear piece) about our software. And one of the most disturbing accusations is that we were exposing user data by having our tracker agent communicate over the unsecured TCP Port 80. As evidence, ZDNet blogger Jason Perlow offered up network traffic sniffer data showing our client communicating in the open and without encryption.

After an extensive investigation, we have concluded that the use of Port 80 by the ZDNet client was an isolated incident – the result of a registration script error on our end. Simply put, we were not prepared for the sudden disappearance of our co-branding partner, InfoWorld (ZDNet registered with us the same day that IDG pulled the plug), or for the loss of the normal registration process, which InfoWorld controlled. As a result, when ZDNet registered its test VM with us, it was inadvertently redirected to an old test version of the script, which input the incorrect values for the ASPScriptPort and ASPScriptProtocol fields in our console configuration table.

Note: Other clients that were connected at the time were indeed communicating over SSL/Port 443. In fact, if ZDNet bothered to check back now they’d find that the agent now defaults to this mode for all future connections.

Much has also been made about our SSL certificate, which expired in September. To this accusation we have no response other than to say that we screwed up. It was on our to-do list and we missed the reminder notices from our issuer. Fortunately, our client agent is configured to ignore certificate errors when dealing with our server – a precaution we built in when a malformed certificate tripped us up several years back. So while the certificate may have been invalid, our client agents were still capable of connecting to the server using an SSL-secured connection. Regardless, we will be renewing the certificate shortly.
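For readers wondering what “configured to ignore certificate errors” looks like in practice, here is a rough sketch of the usual mechanism in a WinHTTP-based Windows client. Whether our agent uses WinHTTP specifically is beside the point – treat this as illustrative only, not as our production code:

    /* Illustrative only: the standard WinHTTP way to tell an HTTPS request to
     * tolerate expired certificates, name mismatches and unknown CAs. Call on
     * the request handle before WinHttpSendRequest. */
    #include <windows.h>
    #include <winhttp.h>

    static void relax_certificate_checks(HINTERNET hRequest)
    {
        DWORD flags = SECURITY_FLAG_IGNORE_CERT_DATE_INVALID |  /* expired cert  */
                      SECURITY_FLAG_IGNORE_CERT_CN_INVALID   |  /* name mismatch */
                      SECURITY_FLAG_IGNORE_UNKNOWN_CA;           /* untrusted CA  */

        WinHttpSetOption(hRequest, WINHTTP_OPTION_SECURITY_FLAGS,
                         &flags, sizeof(flags));
    }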

Bottom Line: The DMS Clarity Metrics Tracker agent is an SSL-secured program that does not expose users’ data – when properly configured. Obviously, the events of this past weekend – specifically, the sudden removal of the Windows Sentinel code from InfoWorld’s web site (an act that was in violation of our hosting contract with IDG) – caused a few ruffles in what had heretofore been a fully secure service with an unblemished operating record spanning nearly four years.

But we don’t expect Mr. Perlow and his hired hit-cronies at ZDNet to pay any attention to the above disclosure or how it might explain their “alarming” experiences with our agent. They have their marching orders, and all signs point to Redmond as the real impetus behind the whole sordid affair.

Make no mistake: Online IT Journalism is dead, folks, the victim of greed, bias and an unquenchable thirst for page views…

RCK


(WCPI): Windows 7 Improves ... But Only a Little

The latest WCPI snapshot – taken at 12:00AM EDT on 2/23/2010 – shows the Peak Memory Pressure Index value for Windows 7 improving, but not by much. An influx of new Windows 7 systems in the past few weeks is serving to reshape the exo.repository’s demographics. And this, in turn, is being felt in the WCPI calculations, which are becoming increasingly influenced by Microsoft’s latest and greatest.

Figure 1 – Windows 7  WCPI Index Values (Interactive)

This is good news for IT shops concerned about Windows 7’s memory consumption habits. The trend has helped drop the percentage of monitored Windows 7 systems showing excessive memory use to just below 75% – an 11-point improvement vs. our previous analysis pass, which was based on data through January 2010. We’re hoping that, as even more Windows 7 systems come online, these numbers will normalize further and we’ll be able to offer an ever more accurate reading of the broader Windows 7 installed base.

Unfortunately, the news is not so great for Windows Vista users. Fully 79% of the Vista systems we monitored continue to show signs of high memory pressure and related paging activity. IT organizations should factor this into their decisions about when to migrate away from Microsoft’s much-maligned Vista and to the newer, less RAM-hungry Windows 7.

Note: The above data points were compiled from the hundreds of millions of system and process metrics records that have been uploaded to the exo.repository by participating member sites. You can keep tabs on all of our research findings by visiting our web site: www.xpnet.com. There you’ll find a wide selection of interactive chart objects and free monitoring tools that you can use to compare your own systems to the WCPI and similar independent research metrics published by the exo.performance.network.


Monday, February 22, 2010

(Trends): Windows 7 Gaining on Vista

A recent influx of new clients to the repository has shifted the demographic of our user base once again. Windows 7, which had been lagging in the low-to-mid single digits in terms of usage share, has jumped a whopping four percentage points in the past few weeks. Microsoft’s latest and greatest is poised to break into double-digit usage share territory in the next 30 days, with much of its gains coming at the expense of Windows Vista – though our XP population, too, has seen a modest dip in the past week or so.

Figure 1 – OS Usage Share Stats as of 2/22/2010

Needless to say, this is good news for Microsoft. In a week when the company saw shocking new research on physical memory consumption under Windows 7, the revelation that Windows 7 sales remain strong among tech-savvy users like ours must be a relief. Combined with our other recent finding regarding Windows 7 CPU efficiency, this truly is a nice way for Microsoft executives to start their weekends.

Note: The above statistics were generated from the over 13 billion process records collected from the nearly 24,000 registered, active xpnet.com users. If you’d like more information about the exo.performance.network, including how to reproduce the above chart object on your own site or blog, please visit www.xpnet.com.


(Editorial): When Microsoft Attacks ... Again

Raw nerves. You know you’ve hit one when the entity in question practically jumps through the roof to staunch the pain. In my case, the nerve belonged to Microsoft Corporation. And true to form, the company spent incalculable political capital – and cashed in more than a few favors – in order to orchestrate the most one-sided smear campaign in the history of IT journalism.

What has been said about me personally, or Devil Mountain Software as a company, is irrelevant – all de rigueur for the rabid tabloid crowd. Rather, what is disturbing is the timing of it all. The parties in question only loosed their dogs after this project, the exo.performance.network, hit a bit too close to home. It was our research into Windows 7 performance that prompted Microsoft to call in its chips.

And call them in it did, instructing its media cronies to silence me by dragging my name through the mud and casting doubt about what is by any measure a very successful professional history. And now, with xpnet.com, I’m doing more to expose the inner workings of the Windows community than ever before, putting myself on the line so that the truth about Windows market share composition, usage trends and real-world performance is known.

So, despite the hoopla surrounding the day’s developments, I will continue to flash my light at the dark underbelly of Microsoft’s cash cow. Because, in the end, this isn’t about any one personality or pseudonym. It’s about the data, and how it describes the IT world around us. And whether that data is positive for Microsoft or negative, I will never lose my resolve or allow myself to be cowed into silence by the powers that be.

In the meantime, I’d like to invite individuals and organizations that want to know the truth about Windows to register for a free exo.performance.network account. You’ll gain access to a range of useful tools while helping us continue to deliver valuable insight into Microsoft’s dominant OS platform. Together, we can make Windows a more open environment for everyone.

Randall C. Kennedy


Sunday, February 21, 2010

(Editorial): What Took You So Long?

It took three days, countless idiotic comments (some too obscene for us to approve), and more than a little patience, but finally somebody bothered to do what anybody with half a clue could have done all along. SirBruce, one of our new favorite readers, actually took the time to fire up the only performance monitoring tool that matters (ironically, called “Performance Monitor”), and start logging the Committed Bytes counter.

What he found was that, as we tried to explain in various other posts here, Committed Bytes does not count cache and/or SuperFetch-related memory allocations. Rather, it parallels (though does not exactly follow – it’s still a separate counter) the “In Use” value from the Resource Monitor utility that everyone in the blogosphere keeps parroting. All of which is important because, as SirBruce noted in his comment, such a result just might prove we were right all along.

You see, just as SirBruce so elegantly demonstrated with “Perfmon,” we, too, query the Committed Bytes counter directly – in the case of our Tracker Agent, via the Performance Data Helper (PDH) libraries. We see what “Perfmon” sees, and those data points show that Committed Bytes tracks a fairly close approximation of the real physical memory use in the system. This is true regardless of whether SuperFetch is enabled or disabled – or even exists – on the system being monitored. They are two completely different entities, and as Mr. Kipling put it, “never the twain shall meet.”
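For anyone who wants to see this without firing up “Perfmon,” here’s a minimal C sketch that reads the same \Memory\Committed Bytes counter via the PDH API. It’s illustrative only – it is not the Tracker Agent’s code – but it queries the identical counter path:

    /* Sample \Memory\Committed Bytes via the Performance Data Helper API.
     * Link with pdh.lib. */
    #include <windows.h>
    #include <pdh.h>
    #include <stdio.h>

    int main(void)
    {
        PDH_HQUERY query;
        PDH_HCOUNTER committed;
        PDH_FMT_COUNTERVALUE value;

        PdhOpenQuery(NULL, 0, &query);
        PdhAddEnglishCounterW(query, L"\\Memory\\Committed Bytes", 0, &committed);

        /* Committed Bytes is an instantaneous counter, so a single collection
         * is enough (rate counters would need two samples). */
        PdhCollectQueryData(query);
        PdhGetFormattedCounterValue(committed, PDH_FMT_LARGE, NULL, &value);

        printf("Committed Bytes: %lld\n", value.largeValue);

        PdhCloseQuery(query);
        return 0;
    }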

For example, on my own development system – a quad-core workstation with 8GB of RAM – the Committed Bytes counter in “Perfmon” typically hovers within a few hundred MB of what Resource Monitor is reporting as “In Use” memory. If I fire up a bunch of memory-intensive tasks (my personal favorite is VMware Workstation with a few 1-2GB VMs and the memory configuration set to keep everything in RAM), Committed Bytes will increase virtually in lock-step with the “In Use” value in RM.

Likewise, if I start closing VMs, Committed Bytes drops, again in virtual lock-step with “In Use.” And, if I really push the system, so that Committed Bytes/“In Use” memory is pegged at or near the 8GB mark (i.e. my PC’s total RAM size), the other critical “Perfmon” counters we record with our agent – Memory\Pages In/sec and Paging File\% Usage – start to climb rather quickly.

Which is why we factor all three of the above counters into our final Peak Memory Pressure Index calculations. Because when these three counters climb above the thresholds we’ve defined for the WCPI calculation process, it means that your PC really is running out of memory.
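To make that concrete, here’s a rough sketch of what a three-counter test of this kind can look like in code. The numeric cutoffs below are placeholders, not the actual WCPI thresholds (which we don’t publish), and the all-three-at-once rule is a simplification of the real calculation:

    /* Illustrative thresholding over the three counters named above. The
     * cutoff values are placeholders, not the production WCPI thresholds. */
    typedef struct {
        double committed_bytes;      /* \Memory\Committed Bytes */
        double pages_in_per_sec;     /* Memory\Pages In/sec     */
        double pagefile_pct_usage;   /* Paging File\% Usage     */
    } MemorySample;

    /* Flag a sample only when all three counters are over their thresholds. */
    static int shows_memory_pressure(const MemorySample *s, double total_ram_bytes)
    {
        return s->committed_bytes    > 0.95 * total_ram_bytes &&  /* placeholder */
               s->pages_in_per_sec   > 40.0                   &&  /* placeholder */
               s->pagefile_pct_usage > 10.0;                      /* placeholder */
    }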

Folks, this isn’t rocket science. Anyone with any real experience monitoring Windows performance in the real world – and no, playing with Task Manager or Resource Monitor in your mom’s basement doesn’t qualify – knows we’re right. And now one of our fine readers has done the honor of vindicating us.

Bravo, SirBruce, for not sticking your head in the sand and actually bothering to think before feeding into the frenzy of idiocy that has taken over the blogosphere on this issue.


(WCPI): Windows 7 More Efficient than Vista or XP

While Windows 7 may have weight issues, there’s no disputing it’s a very responsive OS. Microsoft has gone to tremendous lengths to ensure that the end-user experience with Windows 7 is a positive one. So it comes as no surprise that, upon reviewing processor-related system metrics data from our network of over 23,000 Windows IT sites, we found Windows 7 to be quite efficient “under the hood.”

An analysis of the latest WCPI numbers shows that Windows 7 systems are, on average, making more efficient use of the processing resources available to them. For example, on our Peak CPU Saturation calculation, Windows 7 systems scored an average index value of 90, vs. 93 for Windows XP. Meanwhile, Windows 7’s immediate predecessor, Windows Vista, turned in a disturbingly high index score of 158 (Snapshot Date: 2/20/2010).

Figure 1 – Current WCPI Index Values (Interactive)

Note: The Peak CPU Saturation Index is derived by comparing the System\Processor Queue Length value against a predefined threshold – in this case, a maximum of 2 waiting ready threads per physical CPU (e.g. a quad-core CPU would be flagged when the queue exceeded an average of 8 waiting threads over a 60-second sample period). This data is then further qualified by searching for sequential occurrence patterns within the raw data points in order to identify those peak utilization events that will most directly impact processor throughput.
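For readers who prefer to see the logic spelled out, here’s a simplified sketch of that flagging rule. The run length used to detect “sequential occurrence patterns” is an assumption on the sketch’s part, not the production value:

    /* Simplified Peak CPU Saturation flagging: each element of avg_queue_len is
     * the average \System\Processor Queue Length over one 60-second sample; a
     * "peak event" is a run of at least min_consecutive samples that exceed
     * 2 ready threads per physical CPU. */
    #include <stddef.h>

    static size_t count_saturation_events(const double *avg_queue_len,
                                          size_t n_samples,
                                          unsigned physical_cpus,
                                          size_t min_consecutive)
    {
        const double threshold = 2.0 * physical_cpus;  /* e.g. 8 for a quad-core */
        size_t events = 0, run = 0;

        for (size_t i = 0; i < n_samples; i++) {
            if (avg_queue_len[i] > threshold) {
                if (++run == min_consecutive)
                    events++;              /* count each qualifying run once */
            } else {
                run = 0;
            }
        }
        return events;
    }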

Needless to say, this is good news for Windows 7 adopters, and it also confirms our own internal research into multi-core scalability under the three widely deployed Windows desktop OSes. Clearly, Windows 7 is making good on Microsoft’s promise to better support key multi-core concepts, like NUMA node affinity.
