Friday, July 16, 2010

(Announcement) Pulling the Plug

It’s been nearly 4 years since we first began offering our free, hosted performance analysis services to the general public. During that time we’ve registered over 24,000 users and assembled the world’s most comprehensive repository of real-world Windows metrics data.

By almost every measure, the exo.performance.network has been a success. Which is why it pains us to announce that, as of July 31, 2010, we will be shuttering the service for good. All access to the repository site will be blocked at that time, and both the exo.widgets and exo.charts objects will cease to function. Likewise, OfficeBench 7 will become non-functional, since it relies on the repository for results calculation and storage.

Our reason for discontinuing the service is simple: It doesn’t make us any money. We originally launched the exo.performance.network for the purpose of compiling system and application metrics and then translating our findings into new and compelling research content. However, a viable market for said content never materialized, and all attempts to secure a sponsor for the site have failed.

Recently, we toyed with the idea of upgrading the site’s functionality and then charging a modest monthly fee for access to advanced features (see previous entry regarding DMS Clarity 10 and Windows Pulse). However, we quickly concluded that the effort required to add the necessary payment processing and subscription management capabilities was impossible to justify – especially when we had no guarantee that anyone would be willing to pay for such a service should we proceed with the conversion.

So, in the end, we decided the best thing to do would be to simply pull the plug. As of July 31st, we’ll be taking down our co-located servers and converting xpnet.com to a simple, static web site. Then, we’ll refocus our energies on servicing our existing corporate clients and improving the commercial version of the DMS Clarity Framework.

It’s not the ending we anticipated when we started out on this journey over 4 long years ago, and we’ll be sad to see those server LEDs go dark. We’d like to thank those who contributed to the repository and helped make the exo.performance.network a truly unique project. It’s been an interesting ride.

Note: Anyone interested in our commercial offerings can find out more by emailing us at info@xpnet.com.


Monday, May 31, 2010

(Editorial) Announcing DMS Clarity Suite 10

After two months of heads-down development, punctuated by several design breakthroughs and a very successful commercial beta cycle, we’re pleased to announce the release of DMS Clarity Suite 10.

This next generation of the DMS Clarity Framework provides us with a robust new platform on which to base a variety of exciting on-site and hosted performance monitoring and management offerings.

Figure 1 – DMS Clarity Suite 10

Highlights include:

  • A complete AJAX makeover for a more interactive and responsive site UI (i.e. no more postbacks).
  • A highly componentized architecture, with self-contained charting and analysis widgets that can be detached from the base frameset and configured as stand-alone monitors for system, process and network metrics.
  • Charts are now fully interactive, with each data point serving as a drill-down link to further refine the report parameters. Full chart timeline scrolling/panning support is also included.
  • Additional system and process metrics (Handle Count, GDI Objects) as well as better integration of Custom Counters, including extensive charting support.

Per our existing licensing model, we’re offering DMS Clarity Suite 10 both as a hosted solution – for shops that don’t wish to maintain their own performance monitoring/management framework – and as a traditional, customer-deployed solution for on-site scenarios.

We’ll also be offering a subset of the DMS Clarity 10 functionality through our exo.performance.network site. With over 24,000 users, the exo.performance.network is the world’s largest repository of real-world metrics data, collected from Windows PCs and servers around the globe.

The new DMS Clarity 10 functionality will be offered through our forthcoming Windows Pulse service, which will serve as a direct replacement for – and major capabilities upgrade to – our existing Clarity 9-based widgets and tools.

We’re also providing a comprehensive “dashboard” solution as part of Windows Pulse. Dubbed the “Pulse Pad,” this free-form, AJAX-based UI will allow users to configure and “dock” individual widgets to a persistent presentation “canvas” that will preserve both the widget configuration parameters and on-screen layout between sessions. Here’s a sneak peek:

Figure 2 – The Windows “Pulse Pad”

Users will be able to re-arrange and position (drag & drop) widgets at will, as well as “undock” them for use as stand-alone monitoring objects. Support for multiple pads, each with its own set of up to 10 discrete widgets, is also in the pipeline. We hope to have a public beta version available by the end of June, and to formally launch the service later this summer.
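To give a rough sense of how that layout persistence might work under the hood, here’s a minimal JavaScript sketch that saves each docked widget’s configuration and position and restores them on the next visit. The storage key, property names and widget shape below are purely illustrative assumptions, not the actual Pulse Pad API:

    // Hypothetical sketch: persisting a pad's widget layout between sessions.
    // The widget fields and storage key below are illustrative only.
    var PAD_STORAGE_KEY = 'pulsePad.layout';

    // Capture each docked widget's configuration and on-screen position.
    function saveLayout(widgets) {
      var layout = [];
      for (var i = 0; i < widgets.length; i++) {
        var w = widgets[i];
        layout.push({ id: w.id, type: w.type, x: w.x, y: w.y, config: w.config });
      }
      localStorage.setItem(PAD_STORAGE_KEY, JSON.stringify(layout));
    }

    // Restore the saved layout (if any) the next time the pad is opened.
    function loadLayout() {
      var saved = localStorage.getItem(PAD_STORAGE_KEY);
      return saved ? JSON.parse(saved) : [];
    }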

Note: Questions and press inquiries about DMS Clarity Suite 10 and the Windows Pulse services should be directed to our general information email address: info@xpnet.com.



Tuesday, April 27, 2010

(Editorial) Gizmodo Got What They Deserved

A comeuppance. That’s how I describe the recent Gawker-Gizmodo-iPhone theft debacle. What the organization in question did – paying cold, hard cash for what was ostensibly stolen property – was plainly criminal, and those behind the act are now being held accountable.

One would hope that such a well-publicized incident would serve to temper the blogosphere’s appetite for sensationalism. The spectre of illegal or immoral actions leading to very real consequences (including the potential for jail time) should be enough to give Gizmodo’s contemporaries pause. However, I fear the lesson has already been lost on a community that fashions itself as the “anti-media,” but which runs for cover behind so-called “shield” laws designed to protect the real journalists they so often mock.

And make no mistake: Bloggers are not journalists. Real journalists have ethics. They check their facts and follow well established rules of conduct: Don’t fabricate; don’t obfuscate; don’t steal. Most high-profile bloggers, by contrast, follow a looser, “shoot first and ask questions later” philosophy. It’s all about beating the other guy to the punch by being the first to break that big scoop.

Note that I speak from experience. As InfoWorld’s most successful blogger throughout 2008-2009, I spent much of my time trying to tap into the industry zeitgeist. And while my marching orders frequently came from above – “trash this, promote that” – it was left up to me to figure out how to best implement that editorial vision.

I chose the persona of “Randall C. Kennedy – Industry Curmudgeon,” but at no time did I ever fashion myself a true journalist. Rather, I was just some guy with a poison pen regurgitating supplied opinions on the latest hot topics – a cog in a new media machine whose sole purpose was to feed an insatiable appetite for page views.

But despite my well-documented self-loathing, it wasn’t until I was on the receiving end of a blogger-led new media assault that I realized just how far removed I was from the shores of professional journalism. In fact, as I watched ZDNet’s Larry Dignan and crew fabricate, obfuscate and steal my reputation away from me, I felt like I was staring into a mirror.

The tables had turned. The shoe was on the other foot. I had gone from victimizer to victim, and my eyes were finally opened to just how violent the blogosphere had become. Never mind that Mr. Dignan’s smear campaign has since been discredited (his subsequent retraction and acknowledgement that our Wall Street clients do in fact exist and continue to use our software to this day was most touching). The damage was done, and Google will see to it that his fabrications long outlive his credibility.

As will mine – and every other high-profile blogger who has abused their position to promote an agenda. We’re all guilty of “playing journalist” while thumbing our noses at the rules of the game. But the Gizmodo case signals a new low in the blogosphere’s storied history of unethical behavior.

Sensationalism, smear campaigns and now outright criminal activity. I’m glad I’m no longer a part of that community, and I hope the authorities seize this opportunity to teach the industry a lesson by throwing the book at those involved.

RCK



Monday, April 26, 2010

(Stats) Office 2010 Delivers a Performance Boost

In a stunning reversal of nearly twenty years of progressive performance erosion, the latest incarnation of Microsoft’s ubiquitous productivity suite, Office 2010, is actually faster than its immediate predecessor, Office 2007.

Testing with the cross-version OfficeBench 7 test script shows Office 2010 to be roughly 9% faster overall when running on an identically configured Windows 7 desktop environment. This surprising result constitutes the first time in the decade-long history of OfficeBench that a newer version of Microsoft Office outperformed the one it was designed to replace.

Figure 1 – OfficeBench 7 Results for Office 2010

Historically, new versions of Office have been slower than their predecessors thanks to the inclusion of additional features and a generally more complex code path. For example, moving from Office 2000/XP on Windows 2000 to Office 2003 on Windows XP showed a 15-20% performance decrease under OfficeBench, while moving from Office 2003 under Windows XP to Office 2007 on Windows Vista showed a whopping 40% or greater decline in overall OfficeBench script throughput.

Figure 2 – OfficeBench 7 Results for Office 2007

However, with the release of Windows 7, Microsoft has demonstrated a newfound ability to keep the “code bloat” demons in check, with the net result that Windows 7 performs on par with, and in some cases better than, Windows Vista.

Now, this same disciplined development model – a byproduct of veteran Office business unit manager and now Windows showrunner Steve Sinofsky’s “less is more” philosophy – is reaping rewards for the desktop applications side of the house, which can market Office 2010 as a performance upgrade in addition to promoting its myriad functional enhancements.

Of course, benchmark results like the ones quoted above are intrinsically relative. For example, though Office 2010 provides a performance edge over Office 2007 on Windows 7, the combination of the newer Windows and Office still delivers a test script completion time that is 15-20% slower than Office 2007 running on Windows XP (SP3).
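For readers wondering how percentages like these fall out of the raw numbers: OfficeBench reports a script completion time, where lower is faster, so the deltas are simple ratio arithmetic. Here’s a quick sketch using hypothetical timings, not our actual measured results:

    // Percentage change in OfficeBench completion time (lower time = faster).
    function percentFaster(baselineSeconds, candidateSeconds) {
      return ((baselineSeconds - candidateSeconds) / baselineSeconds) * 100;
    }

    // Hypothetical completion times, for illustration only (not measured results):
    var office2007onWin7 = 100; // seconds
    var office2010onWin7 = 91;  // seconds

    console.log(percentFaster(office2007onWin7, office2010onWin7).toFixed(1) + '% faster');
    // -> "9.0% faster"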

Note: You can conduct your own cross-version comparison test by downloading the OfficeBench 7 test script. It’s easy to use, works with any combination of Windows/Office, and is completely free. Grab your own copy today!



Thursday, April 8, 2010

(Editorial) Used iPads to Begin Flooding eBay

Walking and chewing gum. It’s a simple idea – you do one thing while at the same time doing another, with your brain shuffling between the two (and various unrelated autonomic functions) to keep the whole parade in step.

Modern computers are similarly adept at juggling concurrent tasks – we PC users call it “multitasking.” Yet for users of iPhone OS-based devices, including the new iPad, multitasking is a completely foreign concept. You can’t walk and chew gum with iOS. In fact, you can’t walk and do much of anything with an iPhone/Pod/Pad other than carry a tune (current iOS devices can play audio in the background, but that’s about it).

Why so many iPhone/Pod users are willing to put up with such a limited functional model has always been a mystery to me. Maybe it’s the fact that most people simply don’t expect too much from a “smart” device. After all, it’s not a real computer – it’s a phone (or media player). The fact that you can get online at all with such a device seems like a huge leap forward to many people, especially less sophisticated consumers with cash to burn.

Now we learn that the upcoming iOS 4 will likely improve on this situation, but only a little. Depending on the mood of Apple’s often arbitrary application approval process (Google Voice, anyone?), certain apps will be allowed to run in the background under iOS 4. So you may finally be able to walk and chew gum – and perhaps even carry that tune as well. But don’t expect to be able to walk and carry an umbrella (not approved), or to walk/sip a drink/read a paper or map (too many tasks at once).

Again, people don’t expect too much from a “smart” phone, so the continued lack of true multitasking will likely go unnoticed. But even the idiot iPhone-using masses know that PCs are supposed to multitask. Concepts like switching away from one running application to do something in another – all while the first application is still running – are now thoroughly ingrained in our collective consciousness.

It’s a level of functional convenience that we’ve grown to expect in any serious computing device. Which is why I predict a high degree of long-term customer dissatisfaction with Apple’s latest and greatest, culminating in a glut of used iPads hitting eBay (just in time for Christmas).

You see, customers are buying the iPad with the expectation that it will somehow replace all of their other computing devices. Such has been the media hysteria surrounding the product’s launch. However, in reality the iPad is nothing more than a glorified “companion” device – a limited function platform designed to complement a Mac or PC while roping Apple’s customers ever more tightly into the iTunes sphere of influence.

So, when these early adopters – especially those swayed by the media hype – begin to bump into these very real functional limitations, they’ll likely feel cheated. And as they slowly gravitate back to their familiar PC environments (including powerful new Windows 7-based tablet PCs that outclass the iPad in almost every way), they’ll begin dumping their now underutilized Apple devices into the online auction meat grinder.

Hence my prediction: By year’s end there will be a glut of used iPads flowing through eBay, Amazon, et al. So resist the urge to splurge now and count your savings later.

RCK



Friday, March 26, 2010

(Editorial) Why the Client Hypervisor is Doomed

Big surprise! Both VMware and Citrix have fallen behind schedule in delivering their “bare metal” hypervisors for client computing. Both had promised to deliver solutions by the end of 2009, but now VMware has reset that goal to the end of this year while Citrix has stopped talking about ship dates altogether.

So, what happened? In a word, hardware. Or more precisely, the ever-changing cornucopia of PC hardware devices and configurations. A “bare metal” hypervisor has to sit at the very bottom of the software stack, where it directly manages, and controls access to, the underlying hardware devices. And doing those two things requires hardware-specific control software – i.e. device drivers.

Developing a comprehensive library of device drivers is no easy task (just ask Microsoft). Even assuming that you can create enough generic or “pass through” type modules to allow the majority of common devices to function, there will still be the inevitable subset of components or peripherals that refuse to cooperate.

It would only take a handful of (highly publicized) customer run-ins with such finicky devices to give the “bare metal” client hypervisor a long-term compatibility black eye. Which is why these leading vendors continue to test – and wait.

But waiting (and testing) won’t solve the long-term problem of PC hardware churn. Unlike in the server space, where hardware evolves more slowly and where there are fewer basic configurations to support, the client PC space is in a constant state of flux. The never-ending performance arms race, coupled with a near constant stream of innovation at both the internal and external component level, has turned the PC platform into a moving target. Blink, and you’ve missed it.

What is needed is a layer of hardware-level device abstraction, with groups of discrete components functioning as a logical block and accessible through a relatively static interface model. Intel is doing its best to promote as much through its vPro and similar management initiatives. However, these sorts of solutions require significant buy-in from the very OEM partners who stand to lose by making client computing environments portable across hardware platforms. Why would HP want to make it easier for you to move your stuff over to Dell or Acer?

And then there’s the 800lb gorilla in the room next door. Microsoft, which stands to lose the most in a hardware-abstracted world, has been relatively silent on the issue. Ask them about “bare metal” hypervisors on the client and they’ll respond that they “already have one…it’s called Windows.”

In fact, much of what a “bare metal” hypervisor does is entirely redundant in a Windows client environment. It’s an abstraction (client hypervisor) of an abstraction (the Windows Hardware Abstraction Layer). Which makes me wonder why you would really want one in the first place.

After all, it’s not like the current generation Windows platform is really tied to the underlying hardware. Technologies like Plug & Play and improved hardware auto-detection/driver-reconfiguration have made the process of creating a portable, hardware-abstracted Windows client image relatively trivial. This was the whole point of developing WIM and other post-XP installation technologies: To make PC imaging easier.

So, if the primary goal of a “bare metal” client hypervisor is to further abstract the OS from the hardware (and I give zero credence to the “other” reason being bandied about: Running multiple OS VMs on a single PC), and if this task is already handled quite effectively by Windows and its well-established device driver ecosystem, then the only real reason to pursue such a strategy is if you’re trying to do an end-run around Microsoft’s desktop hegemony.

Which is exactly what VMware and Citrix (not to mention Microsoft’s fair-weather friends at Intel) are trying to accomplish. They want to remove the Windows kernel/HAL/driver model as the gatekeeper to the PC client world. As such, their actions represent a clear and present danger to the ongoing survival of Microsoft’s core desktop OS business.

And we all know what happens to companies that pose a threat to Microsoft’s bread-and-butter revenue stream. First they pan you. Then they copy you. And, finally, they bury you – typically with one of those infamous “free” solutions that seems to fit the bill but still somehow locks you into their world.

A “bare metal” hypervisor on the desktop? Without Microsoft’s direct support?

Good luck VMware and Citrix…you’re going to need it!

RCK



Tuesday, March 23, 2010

(Editorial) Web Developers: Time to Dump Firefox?

As a commercial web developer, I’m constantly on the lookout for new trends in browser adoption and usage. After all, there are only so many hours in a day, and investing time and energy supporting a faltering standard is both frustrating and inefficient. So it was with some hesitation that I approached our latest project: A complete overhaul of the user interface for our commercial metrics analysis portal site, DMS Clarity Suite 10.

I knew from the last go-around that getting our site to render consistently across the leading browser platforms (legacy IE 6/7 and Firefox) was a chore, one involving lots of dynamic tweaks and clever hacks. Now we were planning to expand this list to include several newcomers, including IE 8 (running in “standards compliant” mode) and Google’s Chrome. The thought of testing, tweaking and re-testing each and every page against four or more separate rendering models was enough to make me start breaking out in hives.

Worse still was the fact that, with our DMS Clarity 10 release, we weren’t just overhauling the UI. We were gutting the entire site to make way for a new, highly visual, componentized interaction model. Gone were the static page layouts of the past. In their place, a collection of discrete rendering widgets that would be assembled on the fly to create a fully customizable presentation. These widgets could be re-arranged, broken out into their own windows and re-attached to other parts of the site in order to better identify and expose the most critical data points. Here’s a screenshot of the net result:

Figure 1 – DMS Clarity 10 Portal Site (BETA)

Not surprisingly, the project ran behind schedule, with much of the delay attributable to figuring out how to get identical results across our various target platforms. Take, for example, calculating the window resize values for our slide-out widget configuration panel. Each browser had its own idea of how “big” or “small” a window would become when we executed the window.resizeTo() method. Too short, and you’d cut off the panel. Too long, and you’d end up with lots of ugly white space.

Our workaround was to read the browser make/version via JavaScript and then dynamically resize the underlying ASP.NET panel control prior to rendering the page – not a complicated task, but one that required a lot of trial and error to get the desired result. It definitely qualified as a “hack” solution in my book, though by all accounts it’s a fairly common one.
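In outline, the workaround looked something like the sketch below. The browser checks are the standard user-agent sniffing approach; the offset values and function names are placeholder assumptions, not our production numbers:

    // Illustrative sketch: sniff the browser, then pad the requested size with a
    // browser-specific fudge factor before calling window.resizeTo().
    function detectBrowser() {
      var ua = navigator.userAgent;
      if (ua.indexOf('MSIE') !== -1) return 'ie';
      if (ua.indexOf('Firefox') !== -1) return 'firefox';
      if (ua.indexOf('Chrome') !== -1) return 'chrome';
      return 'other';
    }

    // Each browser interprets resizeTo() a little differently (hypothetical values).
    var resizeOffsets = { ie: 20, firefox: 35, chrome: 10, other: 0 };

    function resizePanelWindow(width, height) {
      var offset = resizeOffsets[detectBrowser()];
      window.resizeTo(width + offset, height + offset);
    }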

Figure 2 – Clarity 10 Widget Rendering Consistently

Needless to say, we got to know a lot about the various quirks and rendering oddities associated with today’s web browsers. And by far the biggest PITA to work with – next to legacy IE 6/7 – was Firefox. HTML and CSS that would render consistently on IE 8 and Chrome would always require some hand-tuning for Firefox, while JavaScript code that ran flawlessly under the other browsers would often need at least some minor tweaking for Firefox to be happy.

In fact, it got so bad that we eventually had to expand our base template design to include three major potential rendering models: IE legacy, Firefox and “everybody else” (including Chrome and IE 8). And when even those assumptions proved to be inadequate (offset values that worked for one page would sometimes also work elsewhere, but not consistently), we seriously considered dumping Firefox support altogether.
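In practice, that three-bucket approach amounts to something like the following sketch, where the detected browser selects one of three rendering models and the matching stylesheet. The model names and stylesheet paths are purely illustrative:

    // Hypothetical sketch of the three-bucket template selection described above.
    function selectRenderingModel() {
      var ua = navigator.userAgent;
      if (ua.indexOf('MSIE 6') !== -1 || ua.indexOf('MSIE 7') !== -1) return 'ie-legacy';
      if (ua.indexOf('Firefox') !== -1) return 'firefox';
      return 'standard'; // IE 8 (standards mode), Chrome and everybody else
    }

    // Load the per-model stylesheet so the base template itself can stay the same.
    function applyRenderingModel() {
      var link = document.createElement('link');
      link.rel = 'stylesheet';
      link.href = '/css/clarity-' + selectRenderingModel() + '.css';
      document.getElementsByTagName('head')[0].appendChild(link);
    }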

With most of our commercial customers still using IE for in-house application access, dropping Firefox was a shortcut we could probably have gotten away with. However, in the end we decided to bite the bullet and hand-code the necessary markup and scripting corrections. After all, Firefox is still a major web presence, and we do plan to offer Clarity 10 as a hosted commercial solution later this year.

However, the situation was very much “touch-and-go” there for a while. Had we been under tighter time constraints, or if we had run into any real “showstopper” issues that compromised our design in some fundamental way, we likely would have given Firefox the boot.

Compounding matters is the perception, now shared by many of my contemporaries, that Firefox is in decline. Our own exo.repository numbers still show strong (50%) use among our tech-savvy contributor base. However, those same users are also increasingly turning to Google’s Chrome. Some 25% of systems monitored by the exo.performance.network report running Google’s nascent web browser.

Figure 3 – Latest exo.repository Browser Share Statistics

If this number climbs much higher, and if Firefox use takes the kind of nose-dive so many are now predicting, we may have to revisit our decision to continue supporting Mozilla’s browser. With the web gravitating towards the rapidly maturing WebKit, and with the latest versions of IE and Chrome converging towards a consistent rendering result, the writing may finally be on the wall:

Save yourself a headache or two and dump Firefox.

RCK
