Thursday, June 25, 2009

One of those technological breakthroughs that we just can't have soon enough

I’m a big fan of the ultra-portable laptop category. I’m not talking about one of those stripped-down, internet-access-focused netbooks. I’m talking about a fully functional, everyday laptop that just happens to be super portable. I use a MacBook Air, but it could be one of the phenomenally small Sony, Acer or Lenovo machines out there. Full-functionality computing (OK, so I don’t really need a DVD/CD drive with me every day) without the hassle of cables, heft or any other accoutrement to weigh me down. When I get home I plug in a couple of dongles and I have the extra display, hard drive, keyboard and mouse. (I was always a huge fan of the original PowerBook Duo. If you don’t know what that is, you have to check this out: http://www.youtube.com/watch?v=4dqLT0UBPx0 So innovative, but where is that solution today?!)

Now I’m afraid I’ve gone a bit off my original subject. No matter how portable the laptop has become, there is always that ONE cable you can’t escape: the power cord. I’ve gone to work in libraries, coffee shops and countless airports. You work for a few hours and all of a sudden realize that the drain on your battery will force you to start scanning the walls in search of a power plug. (Don’t even get me started about overseas travel and the limited number of converters you are forced to juggle between all your electronic devices.)

I’ve always thought: if we could make internet connectivity and peripheral connectivity wireless (I love my wifi-connected printer, not to mention Bluetooth keyboards and mice), then what is the problem with wireless power? Yes, I know that in order to keep us from getting fried or electrocuted there are some physics issues to work through! My dad always said that we just needed a better solar cell to more efficiently convert light into electricity, and then we could cover the laptop in solar cells... I’m not sure where that technology is headed, but for a long time I sensed that my desire to be truly cord-free was never going to be satisfied. Then the Palm Pre was introduced with a power-mat charging peripheral. Then an enterprising startup introduced a dongle to accomplish the same thing for the iPhone. Now, before getting too excited, I realize that the power requirements of a cell phone pale in comparison to a laptop’s, but I did a bit of online research and, lo and behold, there has been progress in making my dream a reality. Within a few days late last week not one but two articles appeared:

http://tech.yahoo.com/blogs/null/143945

http://web.mit.edu/newsoffice/2007/wireless-0607.html

Sure, one of those articles is focused on cell phones, but I figure the more companies that look at the problem, the sooner we’ll have an efficient and powerful solution... THEN we can really say that we are mobile and cord-free!
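
To put some rough numbers on why the laptop is the much harder problem, here’s a quick sketch (the figures are my own ballpark assumptions, not from either article):

    # Ballpark figures (my assumptions, circa 2009): a smartphone battery
    # stores ~5 Wh and the phone draws ~1 W in use; a laptop battery stores
    # ~50 Wh and the machine draws ~20 W under a typical workload.
    phone_wh, phone_watts = 5, 1
    laptop_wh, laptop_watts = 50, 20

    print(f"battery gap: {laptop_wh / phone_wh:.0f}x")           # ~10x the energy
    print(f"power draw gap: {laptop_watts / phone_watts:.0f}x")  # ~20x the draw
    # A charging mat sized for a Pre would have to move an order of magnitude
    # or more power to keep a laptop alive -- hence the physics issues.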

Monday, June 22, 2009

Moore's Law

As one of the most ubiquitous technological hypotheses, Moore’s Law -- which postulates that the number of transistors on a processor (and hence, roughly, its power) will double every eighteen months -- has been under pressure time and time again. Every time it appears that the available technology will not be able to continue to support the doubling of a microprocessor’s power, something comes along that proves it can be done yet again. New processes, new materials and new innovations continue to push the limits of what can be done on a smaller and smaller sliver of silicon. Even when it appeared that the market’s appetite for more gigahertz would slow processor innovation, dual- and quad-core processors accomplished the same feat, while virtualization technologies allowed the additional power to be used in new and unique (and ultimately more efficient) ways.

So why spend any cycles on a “law” that continues to provide new and different ways to integrate technology into everything we do and to create new efficiencies? Because it appears that science is again running hard into the hurdles of physics, and this time it may mean the ultimate end of Moore’s Law. As manufacturing processes continue to shrink toward 20 nanometers, it’s increasingly clear that traditional silicon materials will fail as a foundation for delivering electron-based bits. As a matter of fact, it becomes easier and easier to see that the existing processes will reach their natural conclusion as early as 2014. (See http://news.cnet.com/8301-13924_3-10265373-64.html.)
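
A quick back-of-the-envelope on what that 2014 endpoint implies, using the eighteen-month doubling figure above:

    # If transistor counts double every 18 months, how much headroom is left
    # between this post (2009) and the 2014 endpoint suggested in the article?
    months = (2014 - 2009) * 12
    doublings = months / 18
    print(f"{doublings:.1f} doublings -> about {2 ** doublings:.0f}x more transistors")
    # ~3.3 doublings: roughly one more order of magnitude before the wall.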

Of course there have been countless naysayers over the years. The wonderful thing about technology is that whenever you tell a group of engineers that something can’t be done, they automatically get to work proving you wrong. This has worked in Moore’s Law’s favor so many times that it’s difficult to count them all. Innovation, the hallmark of the technology industry, is a wonderful thing.

Don’t believe me? Check this out: http://www.mercurynews.com/topstories/ci_12596666

Ultimately this leads to one last question: Do we rename Silicon Valley? Somehow “Bismuth Valley” just doesn’t have the same ring to it. I guess we’ll all happily grow into it.

Wednesday, June 17, 2009

Don't read this if you're easily depressed!

One of my regular morning coffee stops on the Internet highway is The Register. If you’re in tech you’re probably already very familiar with the site; if you’re not, you absolutely need to bookmark it (http://www.theregister.co.uk/). Yes, it’s a UK-based news site, but it covers the tech industry like no other news organization I’ve seen. (Note that this isn’t a blog, so -- while there are occasional opinion articles, and I’m completely aware of the outspoken opinions sprinkled throughout the coverage -- I’m not comparing this site to the many tech blogs out there...) So, where was I? Oh, yeah, I’ve been seeing a bit of general coverage that might lead one to conclude that the recession is improving and a recovery may be at hand.

And then I saw this: “IDC: Server market to decline through 2010” by TP Morgan
http://www.channelregister.co.uk/2009/06/17/idc_server_forecast/

Quote:

“The server market could show a 29.6 per cent revenue decline to $10.6bn - which will be the largest decline in the history of systems and servers, worse than even the first quarter - given that the global economy is still down and the second quarter of last year was pre-meltdown and therefore not bad. And everyone knows even if the third quarter is flat and the fourth quarter is flat or even up a bit, those are easier compares. So there will be no joy if this comes to pass, but not much panic, either.”

Ouch. Sorry, but I don’t see much of anything to be pleased about in this announcement. I was expecting flat to -10% revenue in the server space. An almost 30% decline says nothing to me short of “wow, we’ll be screwed just a little bit longer...” It reminds me of Bob Uecker’s Harry Doyle in Major League describing an errant pitch as “... just a little bit outside” as it pelts the crowd, way out of reach of the catcher. I’m thinking “euphemism depression!” As someone currently navigating the employment market, I struggle to find the silver linings day in and day out. While I do fancy myself a master networker and I do have some irons in the fire, if things don’t turn around sooner rather than later I’m afraid for my friends. I’m afraid for Silicon Valley.

Sorry. I know I’m probably breaking a cardinal rule in blogging by ending the week on a down note.

Tuesday, June 16, 2009

Geography of Job Loss

Over the last few days I’ve caught a number of headlines about research on the recession and its impact on employment throughout the country. The research has rekindled an interest I developed many years ago at SGI: data visualization. Economic data, web data, crash-simulation data -- whatever quantitative measures you have, the big research and marketing challenge has always been how to quickly and easily identify trends and communicate ideas backed by overwhelming numbers. During my tenure at SGI there were a number of initiatives looking at how to turn the data visualization challenge into market opportunity. The engine behind what became E.piphany, among other technologies, came from SGI.

Over the last ten years the tools, access and programming models have changed so radically that solving this problem has become much easier. Adobe’s Flash animation technology alone has dramatically improved and simplified anyone’s ability to create animations and graphics, and it has changed the way we think about data-based research and trend analysis. In the last few days I’ve run across the following two visualized analyses of unemployment statistics:

http://tipstrategies.com/archive/geography-of-jobs/

and

http://www.slate.com/id/2216238

Same data, different metrics, very cool and understandable animations!

Now compare those charts, which map the data over time, to the following charts from the Brookings Institution:

http://www.brookings.edu/metro/MetroMonitor.aspx

Animation and time-lapse comparisons are so much more compelling and communicative than point-in-time analysis. With that said, I’m reminded that strategic marketing and executive-level folks have always been challenged with how to both identify and then communicate trends. My old friend Kevin Strohmeyer, product marketer extraordinaire, long had a graphic in his office illustrating Napoleon’s campaign into Russia. He and I talked about it at length. It remains one of the finest graphic illustrations of data I’ve ever seen:


Link: http://en.wikipedia.org/wiki/File:Minard.png

Now if it were only that easy to do with all the data!

More retrospection on Sun Microsystems

I just read about a speech that former Sun Microsystems Sales Chief Masood Jabbar gave at the Silicon Valley Historical Museum the other day. You can read it for yourself at: http://www.theregister.co.uk/2009/06/15/jabbar_sun_regret/

So many regrets, like many of us former Sun employees have, but there was something interesting that took me back. Given that the Silicon Valley Historical Museum is now located in the former SGI Building 20, Mr. Jabbar waxed nostalgic in recalling how focused Sun was on putting SGI out of business. Here we are in 2009 and both companies have disappeared or are about to (although it now appears that Rackable, which purchased the remaining SGI assets for $25M, is going to change its name to SGI). The funny thing about history is that Masood, Scott and the rest of the executive management at Sun may have been focusing on the wrong competitor. Now, I know that SGI was an early and very successful competitor and a likely target for Sun, but with 20/20 hindsight, keeping SGI viable and successful might have helped Sun far more than they realized at the time. The demise of the focused and successful Unix vendors has not only hurt IT; I think it will hurt consumers as well. Those who are familiar with operating systems will say that Unix will always be around and will always have a place in the technology market (and it remains at the core of Linux and Mac OS), but the death of focused Unix vendors will only weaken the case for Unix versus Windows. And, let’s face it, that is the real battle for hearts and minds...

As a long-time Solaris and IRIX user I chuckle when I read the numerous threads comparing and contrasting Windows vs. Mac OS. I laugh loudly once a quarter when I see IBM’s earnings statement and the amount of revenue they extract from their customers to maintain and integrate everything and anything in a datacenter (IBM is as far from technological focus as any company in our industry has ever been). The remaining vendors maintain a “we don’t care” attitude about recommending operating environments in a datacenter or (as we used to do it) by workload. Windows, to so many, is the same as any other operating system. While this logic makes perfect sense for SMB and many mid-market customers, any business above $500M in revenue should think long and hard about implementing a Windows-only solution in the datacenter. Of course VMware, recognizing this, makes bundles of cash supporting Linux and Solaris along with Windows in their hypervisor implementation.

The long and the short of this rant is that we seem to have lost a bit of the workload-to-OS implementation methodology of the late ’90s. This relegates the OS to a tactical (or licensing) role versus the strategic role it should play in a datacenter. (Remember the good old days of the “-ilities”? As in reliability, scalability, availability... Whatever happened to that marketing? Nowadays I’m not sure anyone understands that just because a vendor claims these capabilities doesn’t mean that their OS can provide them.)

Funniest thing is that I’m sure as we load and grow our datacenters once again (knock on wood that this recession ends sooner rather than later) these very same issues will start to crop up again. (But don’t get me started on how the Cloud Computing promise may simply move the problem from corporate IT out to the service provider...)

Monday, June 15, 2009

Counterpoint to giving it away for free

A couple of days ago I wrote in this blog about the challenges of giving anything away for free. Now comes the following summary of a recent speech by Wired editor-in-chief Chris Anderson:

http://www.techcrunch.com/2009/06/15/chris-andersons-counterintuitive-rules-for-charging-for-media-online/

I agree that something needs to happen with regard to the media/news business model. This challenge only highlights my previous entry on this subject: once you give something away, it’s virtually impossible to start charging for it later. With that said, I agree that determining what is unique is the key. What someone can’t get somewhere else has explicit value that can be charged for. This is what the current news organizations are challenged to figure out. I expect that the big networks -- CBS, NBC, ABC and FOX, along with the AP and a few of the larger, more prestigious newspapers such as the NY Times, the Washington Post and the Wall Street Journal -- will fight it out for national and international news prominence, while the local papers and news outlets will focus on better local coverage. The problem is that it will only take one of these outlets distributing for free to kill the others’ business models. And they can’t work together, as that would be illegal collusion. Hmm... this isn’t going to work. Thank god I’m not in the news business.

(Thanks to Dali Kilani for the link!)

Thursday, June 11, 2009

Giving it away for free

My first jobs out of college were in politics. I was hired onto two campaigns for two candidates seeking statewide office: first for Lt. Governor and second for US Senate. As I was responsible for Silicon Valley fundraising, I got a ton of experience asking people for money. Every week we’d open the donation envelopes and inevitably find a stack of donations of between 50 cents and $5. At the time I didn’t really understand -- nor had I read Hardball -- why we were wasting time depositing such small donations, as it usually cost more to deposit a check than the donation was worth. Fortunately I worked for a real professional campaign manager who taught me an interesting lesson about human behavior that I have never forgotten: regardless of the amount of money, you always graciously accept the donation. It took a lot of conscious thought and effort for the donor to write that check, put it into an envelope and, in many cases, even put a stamp on the envelope. Whatever their reasons for writing a small check -- they could be on food stamps, living in a shelter or on the street -- they felt it important to make the donation. The most important outcome of the donation was that it created a tangible and significant emotional attachment between the donor and the candidate. The donor feels invested in the candidate and will see the candidate’s success as equal to their own. They will tell their friends to vote, wear a button and go door-to-door for votes. In many cases there was an inverse correlation between the size of the donation and their fervor for the candidate.

There is an important lesson here for technology and other businesses: think twice before deciding to give something away for free. While there is value in a time-limited or capability-limited free product, if the customer doesn’t have to invest in the product, their fervor and enthusiasm for it will be minimal at best. If, however, they are invested -- even just minimally -- they can become strong advocates and even salespeople.

I know this isn’t always true, but I hope it serves as a cautionary reminder to think long and hard about making your product or service free. One truism remains: it’s a lot easier to make something free than it is to try to charge for it later!

Tuesday, June 9, 2009

This isn't a post about the iPhone



For those of you not in the tech business, you probably didn’t notice that yesterday was the first day of the annual Worldwide Developers Conference in San Francisco. Actually, many of you in the tech business may not have noticed either (and certainly there are many of you who don’t care).

For many people, any conversation about Apple is too much Apple. How is it that a company with just 10% market share in PCs and 30% market share in a SMALL segment of the cell phone business can get so much coverage? Sure, there is the coolness factor and the innovation thing, but come on! None of these justify the amount of media coverage and fanboyism that Apple gets over its competition.

I’m not going to talk about that today either.

What I am going to comment on is that, as successful as Apple has been over the last ten years, and as innovative as their products and vision have been for consumer electronics (all of which have inarguably contributed to a sky-high stock price, tremendous revenue growth and a growing stockpile of cash -- $30B at last count), they have completely sidestepped a much larger revenue opportunity: corporate IT. I also know that Apple simply doesn’t care that they are leaving billions on the table. This I know because I have been personally rebuffed over and over again in trying to sell my services, proposals for business and other overtures to build a business value proposition for the company and its products. I have also spoken to many others who have been similarly turned away.

Think about it: Apple already has a limited server line. Their OS is built on top of enterprise-ready Unix/BSD. With only some minor tweaking they could build a solid enterprise value proposition around innovation, quality and reliability. Yes, it wouldn’t be low cost, which is a major reason why Dell has been successful selling into the enterprise. However, with a focus on ease of use they could make immediate inroads into the SMB/mid-market space which, while very price sensitive, is just as overhead constrained; ease of use would relieve a strained IT staff (or compensate for a complete lack of IT, for that matter).

There is precedent for this move on Apple’s part. There are numerous system integrators and resellers working to make Apple enterprise-ready. There is no reason for Apple to build a direct sales force focused on the enterprise sale, either; there are many ready channel partners who can build the GTM and the focused solutions from the desktop through to the datacenter. Apple has added Exchange support to their applications and will extend this critical business interoperability to the OS itself with the forthcoming Snow Leopard release. Now all they need to do is hire a few people with B2B experience to focus a GTM (and take even a few points in the largest IT market).

If Apple only took a small part of this advice imagine what their stock price would do then!

As I’ve tried to tell Apple many times before... I’m available if they want to talk about this opportunity!

Monday, June 8, 2009

Palm Pre

Today is the opening day of the Apple Worldwide Developers Conference in San Francisco. This morning they announced new Macintosh laptops, a significant upgrade to the Mac OS and, what everyone was really waiting for, a software and hardware upgrade to the widely popular iPhone.

In honor of the iPhone announcement I thought I would take a few minutes to write about the other phone getting all the news coverage right now: the new Palm Pre. Most of the coverage leading up to the Pre’s launch last weekend was about how Ed Colligan (Palm’s CEO) lured Jon Rubinstein (the brains behind the iPod and iMac) out of retirement to take on his old company in the phone space. The coverage -- and there was a lot of it -- focused on this battle between the old leader and the new leader in the handheld/smartphone market segment.

Now, I should be fair and disclose that I was a huge Palm fan throughout the ’90s. I was an early Palm Pilot adopter and ended up purchasing three or four over the course of a decade to upgrade or simply replace an older unit. (I have a bad habit of dropping my phones and PDAs, usually with a bad result.) I invested in Palm apps and used my PDA for everything I could. When the Treo smartphone was introduced I was ecstatic. I owned my Treo 600 for years. With a few hardware exceptions -- that annoying buzzing sound aside -- I was a happy user. That is, until Palm abandoned me wholesale. They introduced Windows CE phones and no longer advanced the Palm OS. They were happy to take my money but were never happy to advance me along the product lifecycle.

With that said, the one thing that impressed me most about the iPhone -- no, not the interface or the multimedia or the applications -- was the fact that despite owning the first generation of the phone I could upgrade to the 2.0 software for free. Yes, the hardware doesn’t support all the bells and whistles, but Apple was going to continue to advance the technology for me. More shocking still, they were going to advance my software capabilities for FREE. I was happy to pay for it, but the upgrades (with dot revisions there have been four or five now) continue to come for free.

Today Apple announced the iPhone 3.0 software. Guess what? I’ll download it when it becomes available on June 17th, for free. I’ll flash my first-generation iPhone and get additional functionality. Guess what? I’ll be happy...

Palm will never again get my business. I can’t afford their model. The Pre may be cool, but if you’re interested in buying one you should ask for a commitment to future upgrades (paid or free) before you walk out the door! Palm is the number-five phone OS vendor, behind Apple, Nokia, BlackBerry and Android. Do you really want to take the risk that they’ll be in business long enough to provide the support you deserve for something as business-critical as your phone?!

Thursday, June 4, 2009

This just in: Intel is buying Wind River Systems

Intel is buying an embedded OS vendor. Hmmm... think they’re going to start inching even closer to the system vendors? There’s more money to be made in system integration than in hardware?! Where have we heard that before? Oh, there will be more to write on this subject!

Wednesday, June 3, 2009

Wake Up Facebook!

So my topic for the day was going to be commoditization. There are so many things that are increasingly moving to lowest-cost, genericized offerings. Then I saw this news article about advertising revenue:

http://www.btobonline.com/apps/pbcs.dll/article?AID=/20090602/FREE/906029995/1078/newsletter011

No surprise, really, considering how many options are now available to advertisers. I was going to focus today’s blog on how advertising-based business models are bound to fail and why new sources of revenue based on value-add are critical. I’ll save that for another day and focus on something related that I feel particularly emotional about today:

WAKE UP FACEBOOK!!!

I can’t say that loudly enough! They have become so blinded by an investor valuation that they’re missing the real opportunity for revenue, sustainability and, ultimately, long-term relevance. Facebook will go the way of, dare I say, MySpace -- I don’t know anyone on MySpace anymore -- or Orkut (remember them?) unless they focus on their value-add and build a business model that creates revenue from their service.

I know, I know: many will scoff at this opinion. But regardless of the success they’re having today at building an infrastructure for communities, some completely unknown service or technology will come along tomorrow, next month or next year that is cooler, hotter or just easier, and it will totally subvert their success, eliminating any perceived market valuation.

So what is the answer?! The funniest and most aggravating thing about the Facebook service is that there is a straightforward and powerful value proposition with distinct revenue-generating potential that has not been leveraged. The largest and wealthiest market in dire need of Facebook’s easy-to-use collaboration and knowledge-management solution is business -- the Fortune 1000. The problem is that Facebook is so caught up in a “cool” consumer cycle that they miss the business opportunity right in front of them.

Facebook should be able to use their existing service, relatively easily, as the basis for a corporate SaaS collaboration, knowledge-management and auditing service. I can’t speak for every corporate intranet, but certainly the ones I’ve used have been a nightmare aggregation of mixed systems providing limited functionality along with horrible search, auditing and collaboration tools that never quite get optimized. Worse yet, as we’ve all likely experienced, as soon as we get comfortable with one tool a new one is introduced with an entirely different interface and capabilities. Facebook is terrific at creating user-controlled groups and integrating IM, email, web services and digital media for sharing, controlling and collaborating. The technical challenges would be minimal (at least initially, as a SaaS offering only), with mid-market-sized business customers seeing immediate and significant value from an enterprise-type offering.

More importantly for Facebook, they could charge business customers a recurring subscription fee that is predictable and not insignificant -- and it’s not advertising-based. With their advanced feature set and brand recognition, Facebook could quickly gain market share and significant customer success.

The fact of the matter is that, with Facebook absent from this space, a number of Facebook-type services have popped up to serve this B2B need, including Grou.ps, TamTamy, Jive and Igloo. Facebook has the competitive advantage today, but it must move quickly to address this need before the market opportunity passes it by...

Tuesday, June 2, 2009

Desktop virtualization

As a follow-up to yesterday’s blog on the hypervisor revolution, I wanted to spend a few cycles talking about the desktop. For the last 30 years we have become accustomed to personally managing our desktop OS. From business to consumer and back to business desktops, our sense of ownership and entitlement has come at a very high cost: the cost of management, security, patching, corruption and conflicts, and ultimately the cost of the hardware upgrade (or replacement) necessary to support the next/latest version of the software we need to get our jobs done.

Now, the idea of running your desktop OS, or at least a subset of applications, from the server is nothing new. The first desktops were dumb terminals running mainframe sessions. The evolution to thin clients was somewhat revolutionary in that you had your own GUI-based desktop image. However, because the network wasn’t fast or efficient enough, thin-client solutions were forced to make compromises that in most cases negated the benefits of moving off a traditional PC in the first place. You still had an OS image on the desktop to manage, patch and secure. In addition, the customer often had a set of legacy peripherals they were forced to scrap because the thin client didn’t support them. Many solutions also had architectural issues -- single points of failure -- or unique management interfaces that IT had to learn and certify. Often it was more challenging than simply sticking with the tried and true. (I’m generalizing and, for simplicity, not even dealing with application virtualization as an option.) Many customers also moved to a bare-bones Windows virtualization solution from Microsoft called Terminal Services. This functions fine unless the user needs some basic things, like support for sound. ;)

Which brings us back to the hypervisor. The wonderful thing about virtualization is that you can run any OS (any x86-based OS, that is) on a hypervisor on a server and deliver it via the network. The benefit of this model -- beyond centralized imaging, management, patching, service and support -- is that you can optimize the network to the point that it minimizes the technological footprint of the client yet delivers almost full desktop capabilities to the end user. Thus, in the last two years, the introduction of the Zero Client. (I take some credit for that concept!) Yes, a Zero Client may have some small firmware or bootware, but for all intents and purposes we’re talking about a client-side box that provides ports for network and peripheral access. (A place to plug in your monitor, mouse and keyboard.) There are a number of Zero Clients on the market today, led in large part by Teradici’s OEM suppliers (such as ClearCube), Pano Logic and nComputing. Now, for the initiated: I don’t want to get into a discussion of whether Teradici or nComputing are indeed desktop virtualization plays at all, considering their unique architectures, or whether they (today) utilize a hypervisor back end -- in those two cases they don’t.

The background was important for my point: while this is revolutionary it ultimately doesn’t matter!

None of this matters for two reasons: commoditization and cheap alternatives.

  1. The three most significant components of desktop virtualization and its alternatives are connection brokers (software that authenticates and connects remote users), hypervisors (disaggregators of the OS) and protocols (optimized network connections); see the sketch after this list for what a broker boils down to. Connection brokers were commoditized two years ago. If you want a connection broker, you call any vendor in the space and they’ll give it to you for free. There are no companies specializing in connection brokers left: VMware, Microsoft, Citrix and all the smaller vendors provide the connection broker at no cost. The hypervisor is being commoditized as I write this blog. VMware tried to sell ESX for $999 and Microsoft countered with their hypervisor, Hyper-V, for $28. Now you can get either for free. Xen, the open-source hypervisor, has always been available for free. Sun Microsystems’ Xen derivative, xVM, is also available for free. The highest-potential hypervisor, KVM, will begin shipping as part of Red Hat any quarter now as well. Their business, like VMware’s and Microsoft’s, is built on management tools and not on the underlying technology. Lastly there is the protocol. This is where most vendors in the space are focused on creating added value and differentiation. However, with Microsoft working diligently to significantly upgrade RDP (provided for free) and VMware partnering with Teradici to distribute PCoIP later this year, I predict that soon the protocol will also be free. This leaves tertiary players such as Pano Logic, Sun Microsystems, Wyse and even Citrix, with its robust ICA, stuck between a rock and a hard place trying to get customers to pay for little additional value-add.
  2. Devices ultimately are unimportant because the value of any server-centric desktop solution should be to deliver an optimized experience to ANY device. Yes, there are benefits to a Zero Client, but the installed base of PCs, netbooks and, most importantly, cell phones is vast, and each has a very unique value that will never be eliminated by thin clients or zero clients. Zero Clients in particular have another challenge: since they have no state, they need a secure network connection, which eliminates their ability to connect from outside the intranet (at least easily and cheaply). This is yet another major obstacle.
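
To make the broker point concrete, here is a minimal sketch of what a connection broker boils down to (all names and the in-memory pool are hypothetical; real brokers from VMware, Citrix and the rest add pooling policies, session tickets, load balancing and much more):

    # Toy connection broker: authenticate a remote user, map them to an
    # available desktop VM, and hand back the endpoint the client should
    # open with its display protocol (RDP, ICA, PCoIP, ...).
    ADDRESSES = {"vm-042": "10.0.1.42", "vm-043": "10.0.1.43"}  # vm -> address
    FREE = set(ADDRESSES)   # desktops not yet handed out
    ASSIGNED = {}           # user -> vm, so reconnects reach the same desktop

    def broker_connect(user, password, authenticate, port=3389):
        """Return the (host, port) for the user's desktop session."""
        if not authenticate(user, password):  # e.g. an LDAP/Active Directory bind
            raise PermissionError("authentication failed")
        vm = ASSIGNED.get(user)
        if vm is None:
            if not FREE:
                raise RuntimeError("no desktops left in the pool")
            vm = FREE.pop()
            ASSIGNED[user] = vm
        return ADDRESSES[vm], port            # 3389 is RDP's default port

    # Toy usage: a stub authenticator that accepts one demo account.
    host, port = broker_connect("alice", "s3cret", lambda u, p: u == "alice")
    print(f"point your RDP client at {host}:{port}")

When the essential logic is this small, it’s easy to see why nobody can charge for it anymore: the value has moved up into the management tooling and down into the protocol.
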
The net of this discussion is that there is no doubt the desktop computing platform will change dramatically from the traditional PC most of us use today (and that includes many existing laptop users). Will it mostly look like a Thin or Zero Client? Not likely.

Another view of virtualization

I’ve spent the better part of the last five years looking at server and desktop virtualization. It was only a matter of time before I spent some cycles in this blog on the role, impact and future of virtualization in datacenter IT and, ultimately, on the desktop. I will likely keep coming back to this subject or one of its derivations (Cloud Computing, SaaS, etc.) over the coming months.

I thought I’d start with my view of the hypervisor’s impact on IT. The hypervisor is a disaggregation technology: it disaggregates hardware from software, x86 platforms from operating systems. For the Macintosh or Linux fans out there, the hypervisor is what easily brings Windows applications to your beloved platform. For IT, the hypervisor allows applications to continue to run as you migrate to new platforms, or lets you seamlessly add workloads to existing platforms.
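
One small, concrete illustration of that disaggregation (a Linux-only sketch; it relies on the standard “hypervisor present” CPUID bit that the kernel reports in /proc/cpuinfo, nothing vendor-specific): the guest OS itself can tell that it has been lifted off the bare metal.

    # Linux-only sketch: when an OS runs as a guest, the virtual CPU sets a
    # "hypervisor present" bit (CPUID leaf 1), which the kernel reports as a
    # "hypervisor" entry on the flags line of /proc/cpuinfo.
    def running_on_hypervisor(path="/proc/cpuinfo"):
        with open(path) as f:
            return any(line.startswith("flags") and "hypervisor" in line
                       for line in f)

    print("virtualized" if running_on_hypervisor() else "bare metal")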

My contention has long been that for the obvious uses -- consolidation, optimization, business continuity -- the hypervisor has real value, but only evolutionary value. Best practices don’t fundamentally change. Architectures evolve but aren’t revolutionized. Management architectures do change, sometimes even radically, but the tools of the trade don’t really do more than evolve.

During my tenure at VMware I was struck by the fact that there are two areas that are revolutionarily impacted by the hypervisor -- two areas that, at the time, VMware was only minimally invested in. These two areas -- virtual appliances and desktop virtualization -- have become, under new leadership, more heavily invested in at VMware, but they are still, I believe, entirely under-invested in by IT.

The first area, virtual appliances, seems to have been the least impactful over the last few years. There seem to be only a few companies trying to create a business around this concept. Virtual appliances -- the use of a hypervisor by a software developer to disaggregate OS decisions from their customers -- could potentially have a profound impact on software development models and dramatically change the call-to-arms between the warring .NET and J2EE camps. The ultimate impact of a virtual appliance was captured, albeit only briefly, by BEA (since acquired by Oracle). Since BEA had a fast derivation of an OS runtime (their JRockit JVM implementation), they wedded ESX with an optimized JRockit and their app server to create a “bare metal” implementation of their software stack for Intel-based systems. No need for Windows, Linux, Solaris or any other OS: simply install the stack with a single click and you have an optimized software solution installed and ready to run. BEA could pick and choose which OS-level components it wanted to optimize and deliver as part of the stack. While the utility is obvious for the customer, the ultimate savings -- no longer having to choose single or multiple development paths for different OS platforms -- were never realized by BEA, as they were acquired shortly thereafter.

There are some companies out there trying to build a software development business around this technology; rPath is one that comes to mind. It’s also clear that VMware has increasingly invested in software distribution for their partners using the hypervisor in this manner; they have built a software distribution community through their corporate site focused on exactly this. However, the long-term impact on the “traditional” developer model hasn’t yet materialized. I have faith that at some point it will. Perhaps it needs a bit more consumer exposure?

The other area, desktop virtualization, has seen heavy VMware investment over the last couple of years. It’s also an area I know quite a bit more about, having spent a year in thin clients with Sun Microsystems and the last couple of years with a desktop virtualization startup (with which I am no longer associated). I have also done quite a bit of writing about this space (and will obviously continue to) -- see my very first blog entry. Stay tuned for tomorrow’s update for more on the evolution of the desktop and the impact of the hypervisor on OS delivery.

Monday, June 1, 2009

The End of an Era - Part 2

So I had just finished my entry on Sun Microsystems when I realized that there was another very sad passing that hasn’t gotten the kind of press that Sun-Oracle has: SGI. Good old Silicon Graphics! Yes, I spent time at SGI as well... They too had some religion, albeit not at all with the same passion as Sun. SGI was willing to change and grow their solutions to fit customer demand. Unfortunately their leadership (post-McCracken) was interim in every way and didn’t seem to care much about the impact their short-term thinking would have on the long-term business. Just for clarification, I’m not speaking about the SGI leadership of 2000; I’m specifically speaking about the year of “Rocket Rick.” This would-be savior came from HP’s printer division, where he was hailed as a visionary leader who knew how to manage commoditized technologies and could take SGI’s graphics leadership to the next stage.

What he did instead was give away the farm.

SGI had already committed to a Windows path and was already working on an NT-based workstation. One small mistake there... the wonderful, industry-leading, 35M-transistor graphics engine designed specifically for the Visual Workstation was hardwired to the desktop’s motherboard. This meant that for six months the customer had top-of-the-line performance; after that they were stuck. There was no easy way to upgrade the components. Oops. A stumble -- not fatal, but certainly embarrassing. It was hard for many engineers to foresee the commoditization of graphics, but it was happening in real time.

No, that wasn’t fatal. What was significantly contributory, however, was SGI’s stewardship of the graphics API known as OpenGL. This was the cornerstone of SGI’s IP leadership. Yes, it was open to everyone, but it was so advanced that SGI had a hand in virtually every big-graphics and big-data solution on the planet. (Big data was critical too.) Rick Belluzzo, eager to please his future employer Microsoft, engaged in Project Fahrenheit. What this was supposed to be was a graphics interoperability project between OpenGL and Direct3D. What it ended up becoming was a way for Microsoft to successfully stall OpenGL development for a year or two while Microsoft enriched Direct3D to make up some of the gap in technology.

By itself this wasn’t enough to kill SGI, but couple it with the decisions to (1) spin off the technology that would ultimately become business intelligence visualization pioneer E.piphany and (2) adopt the Itanium processor as the successor to MIPS, and you have the recipe for disaster.

For me the funniest and most tragic moment in my career at SGI occurred on the day Rick Belluzzo was introduced to the employees: first his accidental reference to “us employees at HP...” (I’ll give him that mistake), and then his comment that if we didn’t do our jobs and execute we’d all be walking around with Sun Microsystems badges by year end. That was something! Most of us were asking what was so wrong with that?! In retrospect, Sun got sold for $7B and SGI got sold for $25M.

The End of an Era - Part 1

It’s amazing how many people look at my CV and immediately ask the very same question: “So what about Sun Microsystems?” At least we have the next step defined: Oracle. The only remaining question is, what’s next? First, as a former Sun employee, I too drank the Kool-Aid. Funny metaphor, in that it had the very same net result as the Jonestown cult... death. Ultimately Sun’s undoing is due in no small part to that Kool-Aid. Technology is not a religion, and it should never be treated as such.

As the former Director of Marketing for the Sun-Microsoft collaboration, and later as Director of Partner Operating System Marketing (my team and I were responsible for marketing all non-Solaris OS implementations on Sun’s hardware, including Windows, Red Hat, SUSE, Ubuntu and VMware), I can tell you from first-hand experience that Sun’s problems largely parallel the speed (or lack thereof) of deciding that selling the solutions customers ask for is what they should be doing. It took over two years for the company to formally OEM VMware’s hypervisor and solutions stack. The Sun channel partners who wanted to sell VMware virtualization on Sun’s Opteron-based server hardware usually sold an HP SKU -- yes, sell Sun hardware and HP gets a cut. That’s no way to run a business! Some may even find it shocking that there was actually a Partner Operating System marketing organization; that, by itself, was progress for Sun. You should know that I reported to the VP of Solaris Marketing. Not an organizational structure designed for success.

So, what happens now? Oracle can quickly become one of the handful of end-to-end IT solution providers -- along with IBM, HP, Microsoft/Dell and perhaps, at some point, Cisco (I’ll blog on that later). Or Oracle can decide to simply continue to focus on the software footprint and sell, close or spin off the server and storage hardware parts. If I were a betting man, I’d say that Oracle will look to use the hardware to create appliances from much of their software stack and attempt to optimize the remaining hardware for the database and applications stack. (Which, by the way, they will fail to accomplish, in the same way everyone else in the commoditized platform business has failed to accomplish it. That, after all, is what commoditized means: everyone can build -- or has access to -- the same thing.)

Sun Microsystems will ultimately go the same way as DEC... remembered fondly in alumni groups until those too die off.