Tuesday, August 25, 2009

Couple of really interesting links

Thought I’d go light on the opinion and heavy on the content. I ran across the following really cool links this afternoon.

Ever wonder how a car can turn a corner when one wheel must turn faster than the other while both are engine powered? Anyone can say it’s because of the differential, but how many of you really know how it works? Check out this old video for a really easy-to-understand breakdown of a differential. The video automatically starts 1:50 in...

http://www.youtube.com/watch?v=K4JhruinbWc#t=1m50s

Another cool find is a Flash animation from USA Today showing how the International Space Station has grown over the years:

http://i.usatoday.net/tech/graphics/iss_timeline/flash.htm

Enjoy!

Tuesday, August 18, 2009

Microsoft's Real Control Point

I was just reviewing a CNET article written in mid-July following Google’s Chrome OS announcement. Chrome OS, as you may recall, is Google’s new open source operating system. Having been down that road with Sun Microsystems and the Java Desktop System, I rolled my eyes. Having also had an outsider’s inside view of Microsoft over the years, both as a competitive analyst and later as the marketing lead of an interoperability alliance with Microsoft, I have seen much of the Redmond stack as threatening to a competitive landscape and the open source community. However, I’ve dramatically changed my tune on much of what I think matters.

Over the years we’ve heard of many technologies labeled as architectural control points: the operating system, the browser (oh how wrong we were there), the directory (think user authentication / password access), Exchange (control of email access) and lately the cell phone ecosystem. Each one of these technologies held our attention for a while until something else came along exposing its weakness as a control point. In many ways you could suggest that all of these technologies in aggregate controlled architectural flexibility and capability, but that argument falls apart because it remains true of any homogeneous system. The best news to come from Open Source is that there is a strong developer base focused on creating competitive alternatives. Some are better than the proprietary options, some are not. As the iPhone has proven, make something that people want to buy and their purchasing power alone is enough to move entire industries and markets. (Imagine the boring world of cell phones if Apple hadn’t pushed the capabilities and expectations for cell phones. It’s quite possible that RIM would still be selling B&W models of the popular Blackberry.)

One Microsoft control point, however, remains undisturbed and entirely taken for granted: the ubiquitous Office suite. Having suffered with, and been a proponent of, OpenOffice, I know that there are alternatives actively used by organizations of all sizes. Yet almost all of us use all or parts of Microsoft Office, whether at home or at work. It is my contention that, even as applications increasingly move to the web, Office productivity tools will remain mostly on premise while increasingly integrating cloud capabilities and services to make application access, integration and synthesis more capable.

The biggest problem no one has solved is how to integrate multiple web services and SaaS applications for reporting, analysis and business intelligence. I recently ran across a company called Pervasive that has a solution for cross-cloud service integration. It appears to be complicated and expensive as solutions go, but it clearly targets an end user need that will only be exacerbated as more of our lives migrate online. For consumers, think about how much of a pain it is to work across Facebook, Twitter, LinkedIn, MySpace, etc. Now think how difficult it would be if you had to gather, analyze and report on data from each service regularly. For the enterprise it’s even worse. Salesforce.com, the leading online CRM tool, provides average sales reporting capabilities. Imagine how difficult it would be for someone using SAP ERP, Eloqua marketing/lead management and Google Analytics to create a cross-service report for weekly review! As someone who did weekly reporting across Google, Marketbright, Salesforce.com, Netsuite and Intuit I can tell you that this isn’t just a pain, it’s extraordinarily time consuming as well. I managed this for a company of 50. Imagine doing this for a company of 500 or 5,000!
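To make that gathering step concrete, here is a minimal sketch of what stitching services together by hand looks like. Every endpoint, field name and token below is hypothetical -- each real service has its own auth scheme and schema, which is exactly the pain point:

```python
# Hypothetical sketch: pull one weekly metric from each SaaS service's
# REST API and merge them into a single CSV report. All URLs, fields
# and tokens are made up for illustration.
import csv
import requests

SERVICES = {
    # service name -> (hypothetical endpoint, metric field to extract)
    "crm":       ("https://api.example-crm.com/v1/reports/weekly",   "pipeline_total"),
    "marketing": ("https://api.example-mktg.com/v1/leads/summary",   "new_leads"),
    "webstats":  ("https://api.example-stats.com/v1/traffic/weekly", "unique_visitors"),
}

def fetch_metric(url, field, token):
    # one GET per service; in practice every vendor differs wildly
    resp = requests.get(url, headers={"Authorization": "Bearer " + token}, timeout=10)
    resp.raise_for_status()
    return resp.json()[field]

def build_weekly_report(tokens, out_path="weekly_report.csv"):
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["service", "metric", "value"])
        for name, (url, field) in SERVICES.items():
            writer.writerow([name, field, fetch_metric(url, field, tokens[name])])
```

And that’s the easy version: it assumes every service speaks clean JSON over REST, which is a generous assumption.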

Now to wrap this back to Office: Office remains a set of tools that we all know and we all have a local license to use. We all know how to use Excel and Powerpoint. We know how to create tables, worksheets and pie charts. Wouldn’t it be great to simply plug in web service content and have it integrate automatically into the spreadsheet we normally use for analysis and reporting? Wouldn’t it be great if that table were dynamic and could automatically refresh as the web service is updated? Wouldn’t it be even better if we could update sales data in Excel and have it automatically updated in Salesforce? How easy and powerful would Office become then? I wonder if anyone at Microsoft has thought about how powerful Office could become.
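For illustration only -- this is nothing Microsoft has announced, just a sketch of the idea using Python’s openpyxl package against a hypothetical CRM endpoint -- the two-way flow might look like this:

```python
# Sketch of a "dynamic worksheet": refresh a sheet from a web service,
# then push edited figures back. The endpoint and field names are
# hypothetical; openpyxl (a real package) handles the .xlsx side.
import requests
from openpyxl import load_workbook

API = "https://api.example-crm.com/v1/opportunities"  # hypothetical

def refresh_sheet(path, token):
    # pull current opportunities into the "Pipeline" worksheet
    wb = load_workbook(path)
    ws = wb["Pipeline"]
    auth = {"Authorization": "Bearer " + token}
    for i, opp in enumerate(requests.get(API, headers=auth, timeout=10).json(), start=2):
        ws.cell(row=i, column=1, value=opp["name"])    # row 1 holds headers
        ws.cell(row=i, column=2, value=opp["amount"])
    wb.save(path)

def push_edits(path, token):
    # write each locally edited amount back to the service
    ws = load_workbook(path)["Pipeline"]
    auth = {"Authorization": "Bearer " + token}
    for name, amount in ws.iter_rows(min_row=2, max_col=2, values_only=True):
        if name is not None:
            requests.put(API + "/" + name, json={"amount": amount},
                         headers=auth, timeout=10).raise_for_status()
```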

Tuesday, August 11, 2009

A Ghost in the machine

Having worked on some form of alternative desktop for many years -- from alternatives to Windows, to Linux-based desktops and applications, to thin clients and most recently desktop virtualization solutions -- I had something very different catch my eye the other day: G.ho.st, or the Globally Hosted Operating System.

So there are two unique and newsworthy angles to this story. I’ll start with the one that seems to have gotten the most press: the company is based in Israel but all the programming and code jockeys are Palestinians working in the West Bank. In other words, these two traditional enemies have teamed up to create a really cool software offering even though most of them have never been able to meet face-to-face. My guess is that they have purchased some very expensive video teleconferencing equipment and work virtually. I do hope that some very smart marketers at Plantronics or Cisco have provided them with free or seriously discounted hardware, because this angle to the company is not only unique but would make a tremendous success story. Hopefully the Israeli government will also notice that working together is a whole lot more economically productive than fighting -- or, in this case, limiting travel between the two regions. Considering neither the West Bank nor Israel is very large, I’m sure that the Palestinian contingent isn’t physically more than a few miles from the Israeli half of the company. It’s too bad they can’t physically connect.

With that said, the truly interesting part is that they have done so much virtually; maybe the lesson for the rest of the world is that you really don’t need the physical proximity we have come to take for granted. The G.ho.st solution -- and I’ve only done some cursory trials with the software -- provides a very unique and quite capable approach to virtualizing a desktop. They have virtualized the entire GUI, application and file base to a cloud and have optimized the delivery to run fairly well over an internet connection. In other words you can access your true desktop with a desktop experience (as opposed to a file or application experience) from a web browser. You get file-to-device synchronization (plug in your cell phone or MP3 player and it will sync content), access to applications and drag-and-drop file management right from the web interface. Oh, and by the way, the web interface looks an awful lot like a Linux GNOME desktop. (For those of you not familiar with GNOME just replace what I just said with “Windows desktop.”) There are many companies looking to control the desktop and its functionality by moving the OS and applications back to a server, but this is the first that actually delivers a full Windows replacement as a web service.

We hear so much hype around Cloud Computing -- and interestingly not much about its greatest challenges, such as analytics or cross-platform BI -- but I’ve not seen an alternative to the entire client delivered via Cloud Computing until now. Unfortunately you’ll always need some kind of hardware platform to deliver access, but I can now see a future where that platform becomes as vanilla and general purpose as the Netbook has promised but not yet delivered.

For more information check out: http://g.ho.st/

Thursday, August 6, 2009

How big is a Trillion anyway?

Making the news over the last few months is the term “trillion.” As in: the budget deficit has ballooned to $1.5 trillion, and the national debt has grown to $12 trillion.

The thing I’ve noticed in the tech industry is that we tend toward the micro and nano; there is so much value in making things smaller and cheaper... thank you Gordon Moore and, quite frankly, Sony for driving this point home.

I’m a huge fan of visualization technologies as a tool to better understand and communicate the esoteric. Thankfully some smart people have now given us new ways to visualize the enormously large “trillion.” Consider:

A million seconds is 12 days.
A billion seconds is 31 years.
A trillion seconds is 31,688 years.
A million minutes ago was 1 year, 329 days, 10 hours and 40 minutes ago.
A billion minutes ago was just after the time of Christ.
A million hours ago was in 1885.
A billion hours ago man had not yet walked on earth.
A million dollars ago was five (5) seconds ago at the U.S. Treasury.
A billion dollars ago was late yesterday afternoon at the U.S. Treasury.
A trillion dollars is so large a number that only politicians can use the term in conversation... probably because they seldom think about what they are really saying. I’ve read that mathematicians do not even use the term trillion!
Here is some perspective on TRILLION:
Trillion = 1,000,000,000,000.
The country has not existed for a trillion seconds.
Western civilization has not been around a trillion seconds.
One trillion seconds ago -- 31,688 years -- Neanderthals stalked the plains of Europe.
Thanks to Tysknews for this wonderful illustration. This was taken from: http://www.tysknews.com/Depts/Taxes/million.htm
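If you want to check those conversions yourself, the arithmetic is a few lines of Python (using 365.25-day years):

```python
# Sanity-checking the big-number conversions above (365.25-day years).
SEC_PER_DAY = 86_400
SEC_PER_YEAR = SEC_PER_DAY * 365.25

for label, n in [("million", 10**6), ("billion", 10**9), ("trillion", 10**12)]:
    days, years = n / SEC_PER_DAY, n / SEC_PER_YEAR
    print(f"a {label} seconds = {days:,.1f} days = {years:,.1f} years")

# a million seconds  =        11.6 days =      0.0 years
# a billion seconds  =    11,574.1 days =     31.7 years
# a trillion seconds = 11,574,074.1 days = 31,688.8 years
```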

Want to SEE a trillion dollars? Check out this video: http://www.youtube.com/watch?v=MmxU2IqhnjM&feature=related

Tuesday, August 4, 2009

Update on pico projectors

Last week I wrote about the evolution of the presentation projector and how cool it will be (one day) to see projectors as a native part of a cell phone. That day is far closer than even I thought. Nikon just introduced a camera with a built-in projector:

http://www.wired.com/gadgetlab/2009/08/nikon-crams-projector-into-compact-camera/

Very small step from there to the cell phone!

Friday, July 31, 2009

Bing vs. Google - There's another to consider

Yesterday Yahoo and Microsoft finally announced their search partnership. In case you haven’t kept up on how Google is changing the shape of its competition: Yahoo has decided to get out of the search business and agreed to outsource its search service to Microsoft. This comes at a great time for Microsoft, as they finally have what should be considered a viable search alternative in Bing. If you haven’t yet checked out Bing you should, just for the user experience. Bing provides an option, when you roll your cursor over the search results, to preview the content of a link without actually clicking on it. With that said, they’ve had some significant indexing issues and you won’t find exactly the same kind of results that you will in Google. Microsoft claims that they have indexed the most significant results, but good indexing takes time, so if you’re looking for something a bit more obscure you may be disappointed. (On a side note, I have been frustrated over the lack of indexing of my personal consulting site www.baer-consulting.net. While it showed up in Google the day after I went live, it still, two weeks later, has yet to show up on Bing.)

Following the ultimate implementation of Bing on Yahoo -- this deal should take a good six months to roll out -- consumers will have to choose between Google and Bing. The only remaining alternatives are a collection of third-tier players, including ask.com among others. (Remember Ask Jeeves?) But actually there is another alternative: WolframAlpha. Have you checked that engine out? If not, you absolutely should! (www.wolframalpha.com) For a subset of search needs WolframAlpha clearly exceeds anything offered by Google or Bing. Are you looking for the answer to a mathematics equation? A chemical formula, or historical information about a date? You need to check out WolframAlpha! Not only does this site approach search in a different way, its results for its focused types of searches are far superior to any rival’s.

It’s been so de rigueur to type your own name into Google to see how high you may rank in their search results -- come on, you know you’ve done it! With WolframAlpha, it’s way too much fun to type in your own birth date or hometown! Check it out. If you’re as geeky as me you’ll bookmark it immediately!

Monday, July 27, 2009

So how cool will this be?

Over ten years ago, while I was living in Los Gatos, I met a wonderful couple from North Carolina. They were our next door neighbors. Thinking out of the box, they came up with a great idea: “drive-in movie night.” Play a movie on their VCR and output it via an InFocus projector borrowed from work onto a large white sheet hung over the garage door. What a phenomenal idea and a great invitation for friends and family to pop the popcorn and watch movies with the neighborhood. I know that quite a few of you have amped this idea up with DVD or Blu-ray, surround sound and an even bigger bed sheet. The amazing part is how much more advanced the projectors have become.

A few years ago a friend of mine was looking into investing in a new home theater system and ran across a couple of HD projectors specifically designed to project in widescreen HD. Major benefits: small footprint and a scalable image from 26” to 110”! Big drawback: any ambient light washes out the image. In other words, if you’re watching during the day you had better have the best blackout curtains.

Funny thing is that ever since my first sales trip I have dreamed of better image quality coupled with more portability. Sure, projectors have shrunk to the very usable size of a large paperback book, but I’ve dreamed of an even smaller and more portable version. Well, my dream is almost a reality. Pico projectors -- projectors so small that they can be embedded in cell phones -- represent the ultimate in capability and form factor. After all, my cell phone is nothing more than a handheld computer. I can install applications, edit MS Office documents, view images and watch movies. Wouldn’t it be great if I could project my sales pitch or a movie onto a wall for all to see? And do that without plugging into a separate device? Soon you will be able to do all this!

Pico Projectors for mobile phones:

http://news.cnet.com/8301-13772_3-10284209-52.html

Wednesday, July 22, 2009

If the World Wide Web had been invented in 1980

Have you ever wondered where we would be if the World Wide Web had been invented in 1980? I know we’ll likely have a clear idea in another ten years or so, but it got me thinking: what Twitter and Facebook really mean to us won’t be clear for another ten years. In societal terms, I truly believe that what we’ve seen in Iran -- the rapid adoption of Twitter as a social tool for dissent -- will come to pass more broadly throughout the world. If the news reporting and communications business is truly going through dramatic change to incorporate a very different business model (which it is), and Twitter and new online collaboration tools become the norm, then the world as we know it today will be dramatically different in 2020. While a world like that portrayed in Minority Report won’t likely come to pass, there will certainly be some familiar elements: interactive advertising, video cameras streaming live from every street corner, personal preferences integrated into our daily interactions -- much of which we can already experience today.

Will this world be better? Worse? I believe the consensus has to be that it will be a very different world. A world with fewer secrets, one in which our children expect less privacy and modesty along with increasing amounts of technological integration. Why would they expect to interact with their video games through a Wii-like wand and then interact with their computers through a keyboard? As a matter of fact, there will likely be fewer computers and more special purpose devices with deeper and richer capabilities. Think of your cell phone as a communications tool that can pair with a different kind of typewriter and access your files universally (in the cloud). It may be a world where the idea of a computer makes future generations laugh... you mean you actually carried that thing around?!

While we contemplate what kind of world we’re building here in Silicon Valley, you may wonder what provoked this outburst in the first place. (Have you noticed that the day-to-day content of this blog has been driven by the news?) I ran across this wonderful graphic on the web:

What would Apple’s website have looked like in 1983 with the Lisa launch?

http://www.flickr.com/photos/davelawrence8/3663647101/sizes/o/

I love this kind of creativity. Heck, look what it provoked in me...

Thursday, July 16, 2009

How much Facebook is worth and why we need them to IPO

If you’ve kept up on this blog over the last few months you know that I’m really not a fan of advertising-based business models. It’s clear that very few companies can build truly robust and growing revenue streams from advertising. As more niche web sites, news aggregation sites, iPhone apps and other targeted advertising distribution mechanisms are created, the value of advertising drops to near zero: too many avenues competing for too few advertisers. Now it’s very important to note my subtle caveat: “very few companies” can build robust revenue from advertising. I think that today there are two companies that have proven they can and do build tremendous revenue engines from a relatively pure advertising model: Google and Facebook.

Google just announced a “poor” quarter (for them) with revenue up only 3% to $4.07B. While advertising is becoming commoditized, it’s good to know that the King of Online Advertising Distribution continues to take its percentage off the top. Facebook? Well, we don’t know for sure. We know they continue to grow their users / eyeballs, which means they continue to add value for advertisers eager to sell their wares to Facebook users. One thing we can track, however, is Facebook’s market valuation. Every time Facebook accepts outside investors we know how much money they’re getting and how much equity they give up. Thanks to News.com (CNET) we can quickly see how this valuation has changed over the years:

http://news.cnet.com/8301-13577_3-10286111-36.html

So, while it’s been as high as $15B (thanks to a Microsoft infusion of $240M in October of 2007), it’s now around $6.5B. That’s a market valuation for a company that -- as far as we know -- has never been profitable and has never publicly announced revenue! There have been rumors that Facebook may be considering a public offering as soon as 2010. Considering there is no expectation of any tech IPO at all in 2009, we can keep our fingers crossed that Facebook does indeed look to IPO in early 2010. Remembering the Netscape IPO in the mid-90’s as the beginning of the Dot Com hysteria, we can only hope that a wildly successful (as it will undoubtedly be) Facebook IPO will provide exactly the kind of market and investor enthusiasm we desperately need right now.
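The arithmetic behind inferring a valuation from a funding round is simple: if an investor pays P for a stake amounting to a fraction s of the company, the implied (post-money) valuation is P / s. A quick check against the widely reported figure that Microsoft’s $240M bought roughly a 1.6% stake:

```python
def implied_valuation(investment, stake_fraction):
    # post-money valuation implied by an equity purchase
    return investment / stake_fraction

# Microsoft, October 2007: $240M for a reported ~1.6% stake
print(implied_valuation(240e6, 0.016))  # 15000000000.0, i.e. $15B
```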

Tuesday, July 14, 2009

What's in my coffee? More on Great Visualization

A while ago I blogged (see Geography of Job Loss) about the power of data visualization, the art of presenting data in a compelling way. One of my other traditional online haunts is Dark Roasted Blend (www.darkroastedblend.com/), which continues to be my one-stop shop for what’s new and cool on the Internet.

In the spirit of presenting information in an interesting and informative manner, here is a link (courtesy of Dark Roasted Blend) for those of you who, like me, love coffee and have always wondered exactly what makes a mocha different from a cappuccino:

http://www.flickr.com/photos/two-eyes/1285147549/

Again, for all you marketing folks out there looking for inspiration on how to turn that bar or pie chart into something a bit more powerful and communicative, please pay attention! There is no reason that you can’t take that data and present it in widgets! See how much more powerful that is?!

I think the frustrating part for me is that while I know what I want, I don’t necessarily have the talent to create it. So I hire a professional to do it for me. After all, if you hire a professional you get professional results!

Thursday, July 9, 2009

Funny thing happened today

I decided to find a nice quiet place to sit and write today.

As a marketing consultant my optimal projects tend toward 90% writing, 5% listening and 5% talking. This ultimately means that I absolutely need to find a place to sit and be productive. In this case it was a coffee shop. (I’ve increasingly become a fan of the local public library, as it presents free internet access with virtually no distractions. And by no distractions I mean no inane conversations, no hot women and, frankly, no worry about running into someone I know.) However, regardless of how efficient I can be working in a public place -- often with my iPod and headphones to eliminate aural distractions -- I find a large limiter is the life of my laptop battery. Sometimes I can find a place to plug in but most times I can’t.

My best solution to the power problem is to simply turn off the wifi on my laptop. The resulting power savings will often translate into an additional three hours of battery life on a full charge. Yes, I spend a significant amount of time on my computer NOT connected to the Internet. Not distracted by the latest news, IMs, twitter feeds, etc. Just trying to write.
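If you want to measure the effect on your own machine, here’s a rough sketch using Python’s psutil package (the three-hour figure above is just my experience; sensors_battery() isn’t available on every platform):

```python
# Measure battery drain per hour: run once with wifi on, once with it off.
import time
import psutil

def drain_per_hour(minutes=10):
    start = psutil.sensors_battery().percent
    time.sleep(minutes * 60)
    end = psutil.sensors_battery().percent
    return (start - end) * (60.0 / minutes)  # percent of battery per hour

print(drain_per_hour(), "% per hour")
```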

As I already mentioned the problem with a coffee shop is ultimately running into someone I know. The first response after the niceties is ultimately: “OMG You’re not connected!!” It wasn’t really that long ago when being connected meant a noisy modem connection. (Please, I really didn’t want to go back farther than that.) Now it’s simply expected that if you have a computer, cell phone, netbook, etc. you are connected to the Internet. Actually I believe the entire utility of a netbook is that you WILL be connected to the Internet at all times.

Why do we have that expectation? Creating the content that makes the Web so much fun is all about doing something creative offline and sharing it online. Unfortunately there are so many distractions in everyday life that finding the time to create that content is impossible when email and IMs keep ringing in. So, I’m disconnected. Don’t make that sound like a disease!

Tuesday, July 7, 2009

The Genie is Free

Over the last few months I’ve had a number of conversations with folks about what constitutes a viable business model. I’ve even blogged a bit on the insanity of an advertising-based business model in the vein of Facebook. (What drives me the most crazy about Facebook, however, is that I strongly believe there is between $5B and $10B in B2B business that they have simply ignored, but that’s another story.) The interesting thing is that most of these conversations ultimately turn to Twitter.

I’ll state up front that I do think Twitter is inane. I really don’t need to know, nor care to know, what someone is doing or thinking on a regular basis. With that said, I use Twitter largely as an easy way to update my Facebook status and alert interested followers to new posts on this blog. I truly believe that Twitter is a fad technology that exists and thrives today because it can. I believe that, unlike Facebook for example, five years from now Twitter will be completely passé and our attentions will have refocused on whatever is hot in 2014. While I’m not sure what that will be, I know it won’t be Twitter.

Now, with all that said I have been surprised (pleasantly) by the way Twitter has empowered and energized the Iranian election revolt. It has provided a fantastic platform for communication, flash mobs, and news delivery to the outside world. So, again, the technology has proven it has value and can provide a strong platform for diverse and instantaneous communications. It still doesn’t have a revenue generating business model -- that is, outside of advertising.

This morning I ran across the following stinging obituary from the McClatchy Group on the passing of former Defense Secretary Robert McNamara:

http://www.mcclatchydc.com/homepage/story/71328.html

I have to admit that I followed Mr. McNamara’s passing with a sense of historical redemption. Prior to the power of the Internet to share, collaborate and inform, Mr. McNamara’s legacy had dramatically improved over the years since the Vietnam War. It also dawned on me how powerful the anti-war and anti-establishment forces were in the late 60’s and early 70’s. Imagine how much more powerful and forceful they could be today with Twitter, email, and cell phones! It’s only because of the power of online media and blogging that I am forcefully reminded that Robert McNamara was personally responsible for tens of thousands of deaths, and that while he died with an improved public perception he could not escape the power of the Internet to remind millions of his role in the Bay of Pigs, Vietnam and the ultimate role of the World Bank in financially undermining and colonizing developing countries throughout the world.

Twitter may not have a financially sound business model, but because of its power to easily and instantly communicate to millions it will always have a place in the world. The company may disappear one day, but the genie is out of the bottle and we’re so much better for it.

Monday, July 6, 2009

Changing our view of being disconnected

As I twittered last week I was shocked by a recent recruiter cell phone conversation:

“... yes, I’m returning your call from my car on the way up to the Sierras. My family and I are going camping and will be entirely out of touch for the next four days, so I wanted to be sure to return your call now before we are entirely incommunicado.”
“OK. Great. Can you send me a quick email so I can confirm with my partner why we called you in the first place?”
“Yes, no problem. I’ll send you a quick email from my phone.”

I figured he wanted my email address, the correct spelling of my name, my contact information... A few minutes later I get an email response with a very terse statement: “You didn’t include your resume.”

Are you serious? What part of this conversation was missing? I am in a car on my way up to the mountains. I will be completely without internet or cell phone access. Now I could make a statement about the recruiter’s ability to listen, but I think there was something even more fundamental missing in the communications: to the recruiter, “incommunicado” apparently didn’t include access to the Internet. Have we gotten to a point in our technological adoption where there is an unspoken expectation that no one would really want to go somewhere they would be forced to be completely unconnected?

I have become a bit hypersensitive to connected mass transit options, and I remember how excited the employees of a previous employer became when the fully Internet-connected shuttles were announced. I admit that I once joked to a co-worker, who also happened to be a virtual employee (one without an office, that is), that he should simply get on a San Francisco - Palo Alto shuttle bus in the morning and stay aboard as it runs back and forth between SF and PA. He had a cell phone and a laptop, and the shuttle had a wireless internet connection. The only reasons to get off were for lunch and a potty break...

So here we were, actually choosing to go somewhere entirely disconnected (sorry, AT&T, no bars where we were), and the expectation was that I would still be able to forward a resume on. Don’t laugh, but my alternative was to direct the recruiter to my LinkedIn profile. Unfortunately I didn’t get a call back. I guess that if I were serious I would not be trying to take a vacation.

Thursday, July 2, 2009

On vacation this week (through July 4th)

I promise to get back on track with two new entries a week starting after the Fourth of July holiday break! Thanks for your continuing interest...

Thursday, June 25, 2009

One of those technological breakthroughs that we just can't have soon enough

I’m a big fan of the ultra-portable laptop category. I’m not talking about one of those stripped down, internet-access focused Netbooks. I’m talking about a fully functional, everyday laptop that just happens to be super portable. I use a Macbook Air but it could be one of the phenomenally small Sony, Acer or Lenovo machines out there. Full functionality (ok, so I don’t really need a DVD/CD drive with me every day) without the hassle of cables, heft or any other accoutrement to weigh me down. When I get home I plug in a couple of dongles and I have the extra display, hard drive, keyboard and mouse. (I was always a huge fan of the original Powerbook Duo -- if you don’t know what that is you have to check this out: http://www.youtube.com/watch?v=4dqLT0UBPx0 So innovative, but where is that solution today?!)

Now I’m afraid I’ve gone a bit off my original subject. No matter how portable the laptop has become, there is always that ONE cable that you can’t escape: the power cord. I’ve gone to work in libraries, coffee shops and countless airports. Work for a few hours and all of a sudden you realize that the drain on your battery will force you to start scanning the walls in search of the ubiquitous power plug. (Don’t even get me started about overseas travel and the limited number of converters that you are forced to juggle between all your electronic devices.)

I’ve always thought that if we could make internet connectivity and peripheral connectivity wireless (I love my wifi-connected printer, not to mention bluetooth keyboards and mice), then what is the problem with wireless power? Yes, I know that in order to keep from getting fried or electrocuted there are some physics issues to work through! My dad always said that we just needed a better solar cell to more efficiently convert light into electricity, and then we could cover the laptop in solar cells... not sure where that technology is headed, but for a long time I was sensing that my desire to be truly cord free was never going to be satisfied. Then the Palm Pre was introduced with a power mat charging peripheral. Then an enterprising startup introduced a dongle to accomplish the same thing with an iPhone. Now, before getting too excited, I realize that the power requirements for a cell phone pale in comparison to a laptop, but I did a bit of online research and lo and behold there has been progress in making my dream a reality. Within a few days late last week not one but two articles appeared:

http://tech.yahoo.com/blogs/null/143945

http://web.mit.edu/newsoffice/2007/wireless-0607.html

Sure one of those articles is focused on cell phones but I figure the more companies look at the problem the sooner we’ll have an efficient and powerful solution... THEN we can really say that we are mobile and cord-free!

Monday, June 22, 2009

Moore's Law

Certainly one of the most ubiquitous technological hypotheses, Moore’s Law -- which postulates that the number of transistors on a chip, and by extension the power of a processor, doubles roughly every two years (popularly quoted as every eighteen months) -- has been under pressure time and time again. Every time it appears that the available technology will not be able to continue to support the doubling of a microprocessor’s power, something comes along that proves it can be done yet again. New processes, new materials or new innovations continue to push the limits of what can be done on a smaller and smaller sliver of silicon. Even when it appeared that the market’s appetite for more gigahertz would slow processor power innovations, dual- and quad-core processors accomplished the same feat, while virtualization technologies allow the additional power to be used in new and unique (and ultimately more efficient) ways.
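The law itself is just compound doubling. A quick sketch of the arithmetic, starting from the Intel 4004’s 2,300 transistors in 1971 and assuming a two-year doubling period:

```python
# Moore's Law as arithmetic: transistor count doubling on a fixed cadence.
def transistors(year, base_year=1971, base_count=2_300, doubling_years=2):
    return base_count * 2 ** ((year - base_year) / doubling_years)

for year in (1971, 1989, 2009):
    print(year, format(transistors(year), ",.0f"))
# 1971 2,300
# 1989 1,177,600
# 2009 1,205,862,400  -- about 1.2 billion, roughly the right ballpark for 2009 chips
```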

So why spend any cycles on a “law” that continues to provide new and different ways to integrate technology into everything we do and create new ways to extend efficiencies? It appears that science is again running hard into the hurdles of physics, which may mean the ultimate end of Moore’s Law. As manufacturing processes continue to shrink toward 20 nanometers, it’s increasingly clear that traditional silicon materials will fail as a foundation for delivering electron-based bits. As a matter of fact, it becomes easier and easier to see that existing processes will reach their natural conclusion as early as 2014. (See http://news.cnet.com/8301-13924_3-10265373-64.html.)

Of course there have been countless naysayers over the years. The wonderful thing about technology is that whenever you tell a group of engineers that something can’t be done, they automatically get to work on proving you wrong. This has worked in Moore’s Law’s favor so many times that it becomes difficult to count. Innovation, the hallmark of the technology industry, is a wonderful thing.

Don’t believe me? Check this out: http://www.mercurynews.com/topstories/ci_12596666

Ultimately this leads to one last question: Do we rename Silicon Valley? Somehow “Bismuth Valley” just doesn’t have the same ring to it. I guess we’ll all happily grow into it.

Wednesday, June 17, 2009

Don't read this if you're easily depressed!

One of my regular morning coffee stops on the Internet Highway is The Register. If you’re in Tech then you’re probably very familiar with that site; if you’re not, you absolutely need to bookmark it (http://www.theregister.co.uk/). Yes, it’s a UK-based news site, but it covers the Tech Industry like no other news organization I’ve seen. (Note that this isn’t a blog, so, while there are occasional opinion articles -- and I’m completely aware of the outspoken opinions sprinkled throughout the coverage -- I’m not comparing this site to the many tech blogs out there...) So, where was I? Oh, yeah, I’ve been seeing a bit of general coverage that may lead one to believe that the recession may be easing and a recovery may be at hand.

And then I saw this, from “IDC: Server market to decline through 2010” by TP Morgan:
http://www.channelregister.co.uk/2009/06/17/idc_server_forecast/

Quote:

“The server market could show a 29.6 per cent revenue decline to $10.6bn - which will be the largest decline in the history of systems and servers, worse than even the first quarter - given that the global economy is still down and the second quarter of last year was pre-meltdown and therefore not bad. And everyone knows even if the third quarter is flat and the fourth quarter is flat or even up a bit, those are easier compares. So there will be no joy if this comes to pass, but not much panic, either.”

Ouch. Sorry, but I don’t see much of anything to be pleased about in this announcement. I was expecting flat to -10% revenue in the server space. An almost 30% decline says nothing to me short of “wow, we’ll be screwed just a little bit longer...” Reminds me of Bob Uecker’s Harry Doyle in Major League describing an errant pitch as “... just a little bit outside” as it pelts the crowd way out of reach of the catcher. I’m thinking “euphemism depression!” As someone directly challenged by the employment market, I struggle to find the silver linings day-in and day-out. While I do fancy myself a Master Networker and I do have some irons in the fire, if things don’t turn around sooner rather than later I’m afraid for my friends. I’m afraid for Silicon Valley.

Sorry. I know I’m probably breaking a cardinal rule in blogging by ending the week on a down note.


Tuesday, June 16, 2009

Geography of Job Loss

Over the last few days I’ve caught a number of headlines for research on the recession and its impact on employment throughout the country. The research has rekindled an interest I had many years ago at SGI in data visualization. Economic data, web data, crash simulation data: whatever quantitative measures you have, the big research and marketing challenge has always been how to quickly and easily identify trends and communicate ideas backed by overwhelming numbers. During my tenure at SGI there were a number of initiatives looking at how to weave the data visualization challenge into market opportunity. The engine behind what became E.piphany, among other technologies, came from SGI.

Over the last ten years the tools, access and programming models have changed so radically that solving this problem has become much easier. Adobe’s Flash animation technology alone has dramatically improved and simplified anyone’s ability to create animations and graphics, changing the way we think of data-based research and trend analysis. In the last few days I’ve run across the following two visualized unemployment statistics analyses:

http://tipstrategies.com/archive/geography-of-jobs/

and

http://www.slate.com/id/2216238

Same data, different metrics, very cool and understandable animations!

Now compare those charts mapping the data over time to the following charts from the Brookings Institution:

http://www.brookings.edu/metro/MetroMonitor.aspx

Animation and time-lapse comparisons are so much more compelling and communicative than point-in-time analysis. With that said, I’m reminded that strategic marketing and executive-level folks have always been challenged with how to both identify and then communicate trends. My old friend Kevin Strohmeyer, product marketer extraordinaire, long had a graphic in his office illustrating the Napoleonic campaign into Russia. He and I talked about it at length. It represents certainly one of the finest graphic illustrations of data I’ve ever seen:


Link: http://en.wikipedia.org/wiki/File:Minard.png

Now if it were only that easy to do with all the data!
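For the marketing folks who want to try the time-lapse approach themselves, here’s a bare-bones sketch using Python’s matplotlib animation support (the data series is made up):

```python
# Animate a metric over time instead of showing one point-in-time chart.
import matplotlib.pyplot as plt
from matplotlib.animation import FuncAnimation

months = list(range(24))
jobs = [100 - 0.8 * m for m in months]  # placeholder employment index

fig, ax = plt.subplots()
line, = ax.plot([], [])
ax.set_xlim(0, 23)
ax.set_ylim(75, 101)
ax.set_xlabel("month")
ax.set_ylabel("employment index")

def update(frame):
    # reveal one more month of data per animation frame
    line.set_data(months[:frame + 1], jobs[:frame + 1])
    return (line,)

anim = FuncAnimation(fig, update, frames=len(months), interval=200)
plt.show()
```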

More retrospection on Sun Microsystems

I just read about a speech that former Sun Microsystems Sales Chief Masood Jabbar gave at the Silicon Valley Historical Museum the other day. You can read it for yourself at: http://www.theregister.co.uk/2009/06/15/jabbar_sun_regret/

So many regrets, like many of us former Sun employees, but there was something interesting that took me back. Understanding that the Silicon Valley Historical Museum is now located in the former SGI Building 20, Mr. Jabbar waxes nostalgic in recalling how focused Sun was on putting SGI out of business. Here we are in 2009 and both companies have disappeared or are disappearing (although it now appears that Rackable, which purchased all remaining SGI assets for $25M, is going to change its name to SGI). The funny thing about history is that Masood, Scott and the rest of the executive management at Sun may have been focusing on the wrong competitor to put out of business. Now I know that SGI was an initial and very successful competitor and a likely target for Sun, but with 20/20 hindsight, keeping SGI viable and successful may have helped Sun far more than they realized at the time. The demise of the focused and successful Unix vendors has not only hurt IT but I think will hurt consumers as well. Those who are familiar with operating systems will say that Unix will always be around and will always have a place in the technology market (and it remains at the core of Linux and Mac OS), but the death of focused Unix vendors will only weaken the case for Unix versus Windows. And, let’s face it, that is the real battle for the hearts and minds...

As a long time Solaris and Irix user I chuckle when I read the numerous threads comparing and contrasting Windows vs. Mac OS. I laugh loudly once a quarter when I see IBM’s earnings statement and the amount of revenue they steal from their customers in order to maintain and integrate everything and anything in a datacenter (IBM is as far from technological focus as any company in our industry has ever been). The remaining vendors maintain a “we don’t care” attitude about recommending operating environments in a datacenter or (as we used to do it) by workload. Windows to so many is the same as any other operating system. While this logic and rationale makes perfect sense for SMB and many mid-market customers, any business above $500M in revenue should really think long and hard about implementing a Windows-only solution in the datacenter. Of course VMware, recognizing this, makes bundles of cash supporting Linux and Solaris along with Windows in their hypervisor implementation.

The long and the short of this rant is that we seem to have lost a bit of the workload-to-OS implementation methodology of the late ’90s. This relegates the OS to a tactical (or licensing) role versus the strategic role it should play in a datacenter. (Remember the good old days of “-ilities”? As in reliability, scalability, availability... Whatever happened to that marketing? Nowadays I’m not sure anyone understands that just because a vendor claims these capabilities doesn’t mean that their OS can provide them.)

Funniest thing is that I’m sure as we load and grow our datacenters once again (knock on wood that this recession ends sooner rather than later) these very same issues will start to crop up again. (But don’t get me started on how the Cloud Computing promise may simply move the problem from corporate IT out to the service provider...)

Monday, June 15, 2009

Counterpoint to giving it away for free

A couple of days ago I wrote in this blog about the challenges of anything being given away for free. Now comes the following article summarizing a recent speech by Wired editor-in-chief Chris Anderson:

http://www.techcrunch.com/2009/06/15/chris-andersons-counterintuitive-rules-for-charging-for-media-online/

I agree that something needs to happen with regards to the media / news business model. This challenge only highlights my previous entry on this subject: once you give something away, it’s virtually impossible to start charging for it later. With that said, I agree that determining what is unique is the key: what someone can’t get somewhere else has explicit value that can be charged for. This is what the current news organizations are challenged to determine. I expect that the big networks -- CBS, NBC, ABC and FOX, along with the AP and a few of the larger, more prestigious newspapers such as the NY Times, the Washington Post and the Wall Street Journal -- will fight it out for national and international news coverage prominence, while the local papers and news outlets will focus on better local coverage. The problem is that it will only take one of these outlets distributing for free to kill the others’ business models. And they can’t work together, as that would be illegal collusion. Hmm... this isn’t going to work. Thank god I’m not in the news business.

(Thanks to Dali Kilani for the link!)

Thursday, June 11, 2009

Giving it away for free

My first jobs out of college were in politics. I was hired onto two campaigns for two candidates seeking statewide office: first for Lt. Governor and second for US Senate. As I was responsible for Silicon Valley fundraising, I got a ton of experience asking people for money. Every week we’d open the donation envelopes and inevitably find a stack of donations of between 50 cents and $5. At the time I didn’t really understand -- nor had I read Hardball -- why we were wasting time depositing such small donations, as it usually cost more to deposit a check than the donation was worth. Fortunately I worked for a real professional Campaign Manager who taught me an interesting lesson about human behavior that I have never forgotten: regardless of the amount of money, you always graciously accept the donation. It took a lot of conscious thought and effort to write that check, put it into an envelope and, in many cases, even put a stamp on the envelope. Whatever their reasons for writing a small check -- they could be on food stamps, living in a shelter or on the street -- they felt it important to make the donation. The most important outcome of the donation was that it created a tangible and significant emotional attachment between the donor and the recipient. The donor feels invested in the candidate and will see the candidate’s success as equal to their own. They will tell their friends to vote, wear a button and go door-to-door for votes. In many cases there was an inverse correlation between the size of the donation and their fervor for the candidate.

There is an important lesson here for technology and other businesses: think twice before deciding to give something away for free. While there is value in a time-limited or capability-limited free product, if the customer doesn’t have to invest in the product, their fervor and enthusiasm for it will be minimal at best. If however they are invested -- even just minimally -- they can become strong advocates and even salespeople.

I know that this isn’t always true, but I hope it serves as a cautionary reminder to think long and hard about making your product or service free. One truism remains: it’s a lot easier to make something free than it is to try and charge for it later!

Tuesday, June 9, 2009

This isn't a post about the iPhone



For those of you not in the tech business, you probably didn’t notice that yesterday was the first day of the annual Worldwide Developers Conference in San Francisco. Actually, many of you in the tech business may not have noticed either (and certainly there are many of you who don’t care).

For many people any conversation about Apple is too much about Apple. How is it that a company with just 10% market share in PCs and 30% market share in a SMALL segment of the cell phone business can get so much coverage? Sure, there is the coolness factor and the innovation thing, but come on! None of these justify the amount of media coverage and fanboy-ism that Apple gets over its competition.

I’m not going to talk about that today either.

What I am going to comment on is that, as successful as Apple has been over the last ten years and as innovative as their products and vision have been for consumer electronics (which have contributed inarguably to a sky-high stock price, tremendous revenue growth and a growing stockpile of cash -- $30B at last count), they have completely sidestepped a much larger revenue opportunity: corporate IT. I also know that Apple simply doesn’t care that they are leaving billions on the table. This I know because I have been personally rebuffed over and over again in trying to sell my services, proposals for business and other overtures to build a business value proposition for the company and its products. I have also spoken to many others who have been similarly turned away.

Think about it: Apple already has a limited server line. Their OS is built on top of enterprise-ready Unix / BSD. With only some minor tweaking they could build a solid enterprise value proposition around innovation, quality and reliability. Yes, it wouldn’t be low cost, which is a major reason why Dell has been successful in selling into the enterprise. However, with a focus on ease of use they could make immediate inroads into the SMB / Mid-Market business space which, while very sensitive to price, is just as overhead constrained, and ease of use would relieve a strained IT staff (or a complete lack of IT for that matter).

There is precedent for this move on Apple’s part. There are numerous system integrators and resellers working to make Apple enterprise-ready. There is no reason for Apple to build a direct sales force focused on an enterprise sale either; there are many ready channel partners who can build the GTM and the focused solutions from the desktop through to the datacenter. Apple has added Exchange support to their applications and will extend this critical business interoperability to the OS itself with the forthcoming Snow Leopard release. Now all they need to do is hire a few people with B2B experience to focus a GTM (and grab even a few points of the largest IT market).

If Apple only took a small part of this advice imagine what their stock price would do then!

As I’ve tried to tell Apple many times before... I’m available if they want to talk about this opportunity!

Monday, June 8, 2009

Palm Pre

Today is the opening day of the Apple World Wide Developer Conference in San Francisco. This morning they announced new Macintosh laptops, a significant upgrade to the Mac OS and, what everyone was really waiting for, a software and hardware upgrade to the widely popular iPhone.

In honor of the iPhone announcement I thought I would take a few minutes to write about the other phone getting all the news coverage right now: the new Palm Pre. Most of the coverage leading up to the launch of the Pre over last weekend was about how Ed Colligan (Palm CEO) lured Jon Rubinstein (the brains behind the iPod and iMac) out of retirement to take on his old company in the phone space. The coverage -- and there was a lot of it -- focused on this battle between the old leader and the new leader in the handheld / smart phone market segment.

Now I should be fair and disclose that I was a huge Palm fan throughout the 90’s. I was an early Palm Pilot adopter and ended up purchasing three or four over the course of a decade to upgrade or simply replace an older unit. (I have a bad habit of dropping my phones and PDAs, usually with a bad result.) I invested in Palm apps and used my PDA for everything I could. When the Treo smart phone was introduced I was ecstatic. I owned my Treo 600 for years. With a few hardware exceptions -- that annoying buzzing sound aside -- I was a happy user. That is, until Palm abandoned me wholesale. They introduced Windows CE phones and no longer advanced the Palm OS. They were happy to take my money but were never happy to advance me along the product lifecycle.

With that said, the one thing that impressed me most about the iPhone -- no, not the interface or multimedia or handwriting recognition or applications -- was the fact that despite owning the first generation of the phone I could upgrade to the 2.0 software for free. Yes, the hardware doesn’t support all the bells and whistles, but Apple was going to continue to advance the technology for me. More shocking is that they were going to advance my software capabilities for FREE. I was happy to pay for it, but the upgrades (with dot revisions there have been four or five now) continue to come for free.

Today Apple announced the iPhone 3.0 software. Guess what? I’ll download it when it becomes available on June 19th, for free. I’ll flash my first generation iPhone and get additional functionality. Guess what? I’ll be happy...

Palm will never again get my business. I can’t afford their model. The Pre may be cool, but if you’re interested in buying one you should ask for a commitment to future upgrades (paid or free) before you walk out the door! Palm is the number five phone OS vendor, behind Apple, Nokia, Blackberry and Android. Do you really want to take the risk that they’ll be in business long enough to provide the support you deserve for something as business-critical as your phone?!

Thursday, June 4, 2009

This just in: Intel is buying Wind River Systems

Intel is buying an embedded OS vendor. Hmmm... think they’re going to start inching even closer to the system vendors? There is more money to be made in system integration than in hardware?! Where have we heard that before? Oh, there will be more to write on this subject!

Wednesday, June 3, 2009

Wake Up Facebook!

So my topic for the day was going to be commoditization. There are so many things that are increasingly moving to lowest-cost, genericized offerings. Then I saw this news article about advertising revenue:

http://www.btobonline.com/apps/pbcs.dll/article?AID=/20090602/FREE/906029995/1078/newsletter011

No surprise really, considering how many options are now available to advertisers. I was going to focus today’s blog on how advertising-based business models are bound to fail and how new sources of revenue based on value add are critical. I’ll save that for another day and focus on something related that I feel particularly emotional about today:

WAKE UP FACEBOOK!!!

I can’t say that loudly enough! They have become so blinded by an investor valuation that they’re missing the real opportunity for revenue, sustainability and, ultimately, long term relevance. Facebook will go the way of, dare I say, MySpace -- I don’t know anyone on MySpace anymore -- or Orkut (remember them?) unless they focus on their value add and build a business model that creates value from their service.

I know, I know that many will scoff at this opinion, but regardless of the success they’re having today at building an infrastructure for communities, some completely unknown service or technology will come around tomorrow, next month or next year that is cooler, hotter or just easier, and it will totally subvert their success, eliminating any perceived market valuation.

So what is the answer?! The funniest and most aggravating thing about the Facebook service is that there is a straightforward and powerful value with distinct revenue-generating potential that has not been leveraged. The largest and wealthiest market in dire need of Facebook’s easy-to-use collaboration and knowledge management solution is business -- the Fortune 1000. The problem with Facebook is that they’re so caught up in a “cool” consumer cycle that they miss the business opportunity right in front of them.

Facebook should be able to relatively easily use their existing service as the basis for a corporate SaaS collaboration, knowledge management and auditing service. I can’t speak for every corporate intranet, but certainly the ones I’ve used have been a nightmare aggregation of mixed systems providing limited functionality along with horrible search, auditing and collaboration tools that never quite get optimized. Worse yet, as we’ve all likely experienced, as soon as we get comfortable with one tool a new one is introduced with an entirely different interface and capabilities. Facebook is terrific at creating user-controlled groups and integrating IM, email, web services and digital media for sharing, controlling and collaborating. The technical challenges would be minimal (at least initially as a SaaS-only offering), with a mid-market sized business customer seeing immediate and significant value from an enterprise-type offering.

More importantly for Facebook, they could charge business customers a recurring subscription fee that is predictable and not insignificant. It’s not advertising-based. With their advanced feature set and brand recognition, Facebook could quickly gain market share and significant customer success.

The fact of the matter is that, with Facebook absent from this space, a number of Facebook-type services have popped up to serve this B2B need, including Grou.ps, TamTamy, Jive and Igloo. Facebook has a competitive advantage today, but it must move quickly to address this need before the market opportunity passes them by...

Tuesday, June 2, 2009

Desktop virtualization

As a follow up to yesterday’s blog on the hypervisor revolution, I wanted to spend a few cycles talking about the desktop. For the last 30 years we have become accustomed to personally managing our desktop OS. From business to consumer and back to business desktops, our sense of ownership and entitlement has come at a very high cost: the cost of management, security, patching and corruptions / conflicts, and ultimately the cost of the hardware upgrade (or replacement) necessary to support the next / latest version of the software we need to get our jobs done.

Now, the idea of running your desktop OS, or at least a subset of applications, from the server is nothing new. The first desktops were dumb terminals running mainframe sessions. The evolution to thin clients was somewhat revolutionary in that you had your own GUI-based desktop image. However, because the network wasn’t fast or efficient enough, thin client solutions were forced to make compromises that in most cases negated the benefits of moving off a traditional PC in the first place. You still had an OS image on the desktop to manage, patch and secure. In addition, the customer often had a set of legacy peripherals that they were forced to scrap because they weren’t supported by the thin client. Many solutions also had architectural issues -- single points of failure -- or unique management interfaces that admins had to learn and certify. Often it was more challenging than simply sticking with the tried and true. Now, I’m generalizing, and for simplicity I’m not even dealing with application virtualization as an option. In addition, many customers moved to a bare-bones Windows virtualization solution from Microsoft called Terminal Services. This functions fine unless the user needs some basic things like support for sound. ;)

Which brings us back to the hypervisor. The wonderful thing about virtualization is that you can run any OS (x86-based, that is) on a hypervisor on a server and deliver it via the network. The benefit of this model -- beyond centralized imaging, managing, patching, service and support -- is that you can optimize the network to the point that it minimizes the technological footprint of the client yet delivers almost full desktop capabilities to an end user. Thus, in the last two years, the introduction of the Zero Client. (I take some credit for that concept!) Yes, a Zero Client may have some small firmware or bootware, but for all intents and purposes we’re talking about a client-side box that provides ports for network and peripheral access. (A place to plug in your monitor, mouse and keyboard.) There are a number of Zero Clients on the market today, led in large part by Teradici’s OEM suppliers (such as ClearCube), Pano Logic and nComputing. Now, for the initiated, I don’t want to get into a discussion of whether Teradici or nComputing are indeed desktop virtualization plays at all considering their unique architectures, or whether they (today) utilize a hypervisor backend solution -- in those two cases they don’t.

The background was important for my point: while this is revolutionary it ultimately doesn’t matter!

None of this matters for two reasons: commoditization and cheap alternatives.

  1. The three most significant components of desktop virtualization and its alternatives are: connection brokers (software that authenticates remote users and connects them to their desktops -- see the sketch after this list), hypervisors (disaggregators of the OS) and protocols (optimized network connections). Connection brokers were commoditized two years ago. If you want a connection broker, call any vendor in the space and they’ll give it to you for free. There are no companies specializing in connection brokers left; VMware, Microsoft, Citrix and all the smaller vendors provide the connection broker at no cost. The hypervisor is being commoditized as I write this blog. VMware tried to sell ESX for $999 and Microsoft countered with their hypervisor, Hyper-V, for $28. Now you can get either for free. Xen, the open source hypervisor, has always been available for free. Sun Microsystems’ Xen derivative, xVM, is also available for free. The highest-potential hypervisor, KVM, will begin shipping as part of Red Hat any quarter now as well. Their business, like VMware’s and Microsoft’s, is to build a business on management tools and not on the underlying technology. Lastly there is the protocol. This is where most vendors in the space are focused on creating added value and differentiation. However, with Microsoft working diligently to significantly upgrade RDP (provided for free) and VMware partnering with Teradici to distribute PCoIP later this year, I predict that soon the protocol will also be free. This leaves tertiary players such as Pano Logic, Sun Microsystems, Wyse and even Citrix, with its robust ICA, stuck between a rock and a hard place, trying to get customers to pay for little additional value-add.
  2. Devices ultimately are unimportant because the value of any server-centric desktop solution should be to deliver an optimized experience to ANY device. Yes, there are benefits to a Zero Client, but the installed base of PCs, netbooks and, most importantly, cell phones is vast, and each has a very unique value that will never be eliminated by thin clients or zero clients. Zero clients in particular have other challenges: since they have no state they need a secure network connection, which eliminates their ability to connect outside the intranet (at least easily and cheaply). This is yet another major obstacle.
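
For the curious, here is a minimal sketch of what a connection broker actually does, boiled down to a few lines of Python. Everything in it -- names, addresses, the in-memory “directory” -- is hypothetical; a real broker authenticates against Active Directory or LDAP and hands the client off to a display-protocol session.

    # Minimal connection-broker sketch; all names and data are hypothetical.
    DESKTOP_POOL = {
        "vm-101": {"host": "10.0.0.11", "assigned_to": None},
        "vm-102": {"host": "10.0.0.12", "assigned_to": None},
    }

    USERS = {"alice": "secret"}  # stand-in for a real directory service

    def authenticate(username, password):
        # A real broker would do an Active Directory / LDAP bind here.
        return USERS.get(username) == password

    def broker_connection(username, password):
        if not authenticate(username, password):
            raise PermissionError("authentication failed")
        for vm_id, vm in DESKTOP_POOL.items():
            if vm["assigned_to"] in (None, username):
                vm["assigned_to"] = username
                # The client now opens an RDP/ICA/PCoIP session to this host.
                return {"vm": vm_id, "host": vm["host"], "protocol": "RDP"}
        raise RuntimeError("no desktops available")

    print(broker_connection("alice", "secret"))
    # -> {'vm': 'vm-101', 'host': '10.0.0.11', 'protocol': 'RDP'}
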
The net of this discussion is that there is no doubt the desktop computing platform will change dramatically from the traditional PC most of us use today (and that includes many existing laptop users). Will it mostly look like a Thin or Zero Client? Not likely.

Another view of virtualization

I’ve spent a large part of the last five years looking at server and desktop virtualization. It was only a matter of time before I spent cycles in this blog on the role, impact and future of virtualization in datacenter IT and ultimately on the desktop. I will likely continue coming back to this subject or one of its derivations (Cloud Computing, SaaS, etc.) over the coming months.

I thought I’d start with my view of the hypervisor’s impact on IT. The hypervisor is a disaggregation technology: it disaggregates hardware from software, x86 platforms from operating systems. For those Macintosh or Linux fans out there, the hypervisor is what easily brings Windows applications to your beloved platform. For IT, the hypervisor allows applications to continue to run as you migrate to new platforms, or lets you seamlessly add additional workloads to existing platforms.

My contention has long been that for the obvious utilities -- consolidation, optimization, business continuity -- the hypervisor has real value, but only evolutionary value. Best practices don’t fundamentally change. Architectures evolve but aren’t revolutionized. Management architectures do change, sometimes even radically, but the tools of the trade don’t really do more than evolve.

During my tenure at VMware I was struck by the fact that there are two areas that are revolutionarily impacted by the hypervisor -- two areas in which, at the time, VMware was only minimally invested. These two areas -- virtual appliances and desktop virtualization -- have become, under new leadership, more heavily invested in at VMware, but are still, I believe, entirely under-invested in by IT.

The first area, virtual appliances, seems to have been the least impactful over the last few years. There seem to be only a few companies trying to create a business around this concept. Virtual appliances -- the use of a hypervisor by a software developer to disaggregate OS decisions for their customers -- could potentially have a profound impact on software development models and dramatically change the call-to-arms between the warring .NET and J2EE camps. The ultimate potential of a virtual appliance was captured, albeit only briefly, by BEA (since acquired by Oracle). Since BEA had a fast derivation of an OS (their implementation of the JRockit JVM), they wedded ESX with an optimized JRockit and their app server to create a “bare metal” implementation of their software stack for Intel-based systems. No need for Windows, Linux, Solaris or any other OS. Simply install the stack with a single click and you have an optimized software solution installed and ready to run. BEA could pick and choose which OS-level components it wanted to optimize and deliver as part of the stack. While the utility for the customer is obvious, the ultimate savings for BEA -- no longer having to maintain separate development paths for different OS platforms -- were never realized, as the company was acquired shortly thereafter.

There are some companies out there trying to build a software development business around this technology; rPath is one that comes to mind. It’s also clear that VMware has increasingly invested in software distribution for their partners using the hypervisor in this manner, and they have built a software distribution community through their corporate site focused on this. However the long-term impact on the “traditional” developer model hasn’t yet developed. I have faith that at some point it will. Perhaps it needs a bit more consumer exposure?

The other area, desktop virtualization, has been an area of heavy VMware investment over the last couple of years. It’s also an area that I know quite a bit more about having spent a year in thin clients with Sun Microsystems and the last couple of years with a desktop virtualization start up (which I am no longer associated with). I have also done quite a bit of writing about this space (and will obviously continue to) -- see my very first blog entry. Stay tuned for tomorrow’s update for more on the evolution of the desktop and the impact of the hypervisor on OS delivery.

Monday, June 1, 2009

The End of an Era - Part 2

So I just finished my entry on Sun Microsystems when I realized that there was another very sad passing that hasn’t gotten the kind of press that Sun - Oracle has: SGI. Good old Silicon Graphics! Yes, I spent time at SGI as well... They too had some religion, albeit without the same passion for it that Sun had. SGI was willing to change and grow their solution to fit customer demand. Unfortunately their leadership (post-McCracken) was interim in every way and didn’t seem to care too much about the impact their short-term thinking would have on the long-term business. Just for clarification, I’m not speaking about the SGI leadership of 2000. I’m specifically speaking about the year of “Rocket Rick.” This would-be savior came from HP’s printer division, where he was hailed as a visionary leader who knew how to manage commoditized technologies and could take SGI’s graphics leadership to the next stage.

What he did instead was give away the farm.

SGI had already committed to a Windows path and was already working on an NT-based workstation. One small mistake there... the wonderful, industry-leading, 35M-transistor graphics engine designed specifically for the Visual Workstation was hardwired to the desktop’s motherboard. This meant that for six months the customer had top-of-the-line performance. After that they were stuck; there was no easy way to upgrade the components. Oops. Trip. Not fatal but certainly embarrassing. It was hard for many engineers to foresee the commoditization of graphics. But it was happening in real time.

No, that wasn’t fatal. What was significantly contributory, however, was SGI’s stewardship of the graphics API known as OpenGL. This was the cornerstone of SGI’s IP leadership. Yes, it was open to everyone, but it was so advanced that SGI had a hand in virtually every big graphics and big data solution on the planet. (Big data was critical too.) Rick Belluzzo, eager to please his future employer Microsoft, engaged in Project Fahrenheit. This was supposed to be a graphics interoperability project between OpenGL and Direct3D. What it ended up becoming was a way for Microsoft to successfully stall OpenGL development for a year or two while Microsoft enriched Direct3D to make up some of the gap in technology.

By itself this wasn’t enough to kill SGI, but couple that with the decisions to 1. spin off the technology that would ultimately become business intelligence visualization pioneer E.piphany and 2. adopt the Itanium processor as the successor to MIPS, and you have the recipe for disaster.

For me the funniest and most tragic moment of my career at SGI occurred on the day Rick Belluzzo was introduced to the employees. First, he accidentally referred to “us employees at HP...” (I’ll give him that mistake), and then he remarked that if we didn’t do our jobs and execute we’d all be walking around with Sun Microsystems badges by year end. That was something! Most of us were asking what was so wrong with that?! In retrospect, Sun gets sold for $7B and SGI for $25M.

The End of an Era - Part 1

It’s amazing how many people look at my CV and immediately ask the very same question: “So what about Sun Microsystems?” At least we have the next step defined: Oracle. The only remaining question is what’s next? First, as a former Sun employee, I too drank the Kool-Aid. Funny metaphor in that it had the very same net result as the Jonestown cult... death. Ultimately Sun’s undoing is due in no small part to that Kool-Aid. Technology is not a religion. It should never be treated as such.

As the former Director of Marketing for the Sun-Microsoft Collaboration, and later as Director of Partner Operating System Marketing (my team and I were responsible for marketing all non-Solaris OS implementations on Sun’s hardware -- including Windows, Red Hat, SUSE, Ubuntu and VMware), I can tell you from first-hand experience that Sun’s problems largely parallel how slowly the company decided that selling the solutions customers ask for is what it should be doing. It took over two years for the company to formally OEM VMware’s hypervisor and solutions stack. The Sun channel partners who wanted to sell VMware virtualization on Sun’s Opteron-based server hardware usually sold an HP SKU. Yes: sell Sun hardware, and HP gets a cut. That’s no way to run a business! Some may even find it shocking that there was actually a Partner Operating System marketing organization. That, by itself, was progress for Sun. You should know that I reported to the VP of Solaris Marketing. Not an organizational structure designed for success.

So, what happens now? Oracle can quickly become one of the handful of end-to-end IT solutions providers -- along with IBM, HP, Microsoft/Dell, and perhaps Cisco at some point (I’ll blog on that later). Or Oracle can decide to simply continue to focus on the software footprint and sell, close or spin off the server and storage hardware parts. If I were a betting man I’d say that Oracle will look to use hardware to create appliances from much of their software stack and attempt to optimize the remaining hardware for the database and applications stack. (Which, by the way, they will fail to accomplish, just as everyone else in the commoditized platform business has failed to accomplish it. That, after all, is what commoditized means: everyone can build -- or has access to -- the same thing.)

Sun Microsystems will ultimately go the same way as DEC... remembered fondly in alumni groups until those too die off.

Saturday, May 30, 2009

Getting Started

Every long journey has a beginning. While I’ve already begun with a reprint of an article I recently wrote for Business Management magazine on the tremendous TCO / ROI benefits of desktop virtualization it probably makes sense to provide some personal background and focus for future entries.

I’m a 15+ year veteran of high technology. I’ve served in a wide variety of roles including public relations, analyst relations, business strategy, product management and product marketing for companies such as SGI, Applied Materials, Apple, Cypress Semiconductor, Sun Microsystems, VMware and a variety of start-ups. I’ve seen products -- good and not as good -- come and go. Some spark the imagination for what’s possible and some are simply derivative. I can’t tell you what will succeed, but in retrospect I can certainly tell you why something did or didn’t.

Just for full disclosure I should mention that technology wasn’t my first love -- and it wasn’t my first job foray following school either. My first love is politics. I spent four years working for the Lt. Governor of the State of California, followed by a stint as a Silicon Valley political fundraiser for Senator Dianne Feinstein. I only disclose this as I’m very apt to fall into a political discourse at the drop of a hat. Ultimately everything that we do, see or experience is impacted by our local, state or federal government, so it’s important to remember how everything impacts everything else we do. With that said, I’m a big fan of free enterprise and capitalism. I’m not a fan of unregulated enterprise run amok; we’ve seen what that can do. Nothing is truly free (unfortunately).

On with the show...

Wednesday, May 13, 2009

Desktop Virtualization: Smart IT Infrastructure for Challenging Economic Times

(Originally published in Business Management (Q2 2009):  http://www.busmanagement.com/article/Issue-15/Data-Management/Desktop-Virtualization--Smart-IT-Infrastructure-for-Challenging-Economic-Times/ )

by Benjamin Baer

With the recent economic downturn, almost every industry sector has reported a decrease in IT budgets. Reshuffling priorities for the new year, organizations will focus on inefficiencies and redundancies in their IT infrastructure.

With analysts expecting IT spending to slow or decrease in 2009[i], IT managers must discover which technologies will enable them to cut their costs, leverage existing investments and improve overall management.

Over the last ten years, server virtualization - data center optimization products from VMware, Microsoft, Citrix and others - has materialized as a technology that offers flexibility and greater return on investment from IT resources. In these tough economic times, virtualization can provide great advantages to companies by significantly cutting operational and infrastructure costs, and in many cases, increasing productivity. Today, new approaches to virtualization are emerging that leverage existing infrastructure and can optimize desktop infrastructure in much the same way.

Server-based desktop virtualization moves the management and security-sensitive components of a PC - software, memory and drivers - to the data center. By using this approach, organizations can centralize and improve desktop management while concurrently cutting maintenance and deployment costs, improving security and reducing energy consumption. Desktop virtualization can also help create a computing infrastructure that enables quick growth, flexibility and scalability as opposed to traditional PCs, which because of their infrastructure costs and deployment model, have limited flexibility and much higher operational costs.

In any desktop architecture, there are three components of measured cost: initial acquisition cost, short- and long-term maintenance costs (which, when added to acquisition costs, are often referred to as "total cost of ownership"), and the harder-to-quantify green and security costs. With IT budgets decreasing, one of the greatest advantages of desktop virtualization solutions is that organizations do not need to invest heavily upfront before obtaining short- and long-term benefits such as flexible deployment models, higher endpoint security, longer endpoint lifespan, and centralized management, to name a few.

Debunking Desktop Virtualization Myths
As with other emerging technologies, a number of myths surround the implementation of desktop virtualization. These myths include beliefs that virtual desktop deployments 1) are more expensive because of the cost of the back-end infrastructure, 2) do not offer end users the same user experience as a traditional PC, and 3) cannot support peripherals and changing user demands.

What IT often does not realize is that with newer server-based approaches to desktop virtualization, virtual desktops work within the same infrastructure as traditional PCs and are often less expensive to implement despite the back-end server and platform virtualization needs. The cost of a server plus the cost of a virtual desktop deployment is the same or less than the cost for the same number of PCs with the same amount of computing power. Additionally, by deploying desktop virtualization instead of traditional PCs, IT managers realize additional time, green and cost savings that traditional PCs do not offer.
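
To make that cost comparison concrete, here is a back-of-the-envelope sketch in Python. Every figure is a hypothetical placeholder, not vendor pricing; plug in your own quotes.

    # Hypothetical acquisition-cost comparison for a 40-seat deployment.
    seats = 40

    pc_unit_cost = 700                   # assumed price of a traditional PC
    pc_total = seats * pc_unit_cost      # 40 PCs

    server_cost = 8000                   # assumed server hosting all 40 desktops
    client_unit_cost = 300               # assumed thin/zero client per seat
    virt_total = server_cost + seats * client_unit_cost

    print(f"Traditional PCs:  ${pc_total:,}")    # $28,000
    print(f"Virtual desktops: ${virt_total:,}")  # $20,000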

While traditional thin client computing has netted many customers the benefits of centralized management and more secure endpoints, many new built-from-the-ground-up desktop virtualization solutions have far exceeded what older thin client computing alternatives could achieve. New solutions exist that can provide a seamless Windows user experience, access to all the peripherals and devices an end user may want to use along with all the benefits of centralized deployment and management. Some of the solutions can even provide a more robust security component than the thinnest thin client.

Improved Management with Desktop Virtualization
Gartner recently announced that effective management of the PC can reduce the total cost of ownership for desktop PCs by 42 percent.[ii] By centralizing control of desktops within the data center, server-based desktop virtualization makes management significantly more effective while leveraging virtualization so that organizations can get more out of underutilized servers.

With central management, IT managers can easily roll out software, updates and security patches. Instead of attending to each computer on-site, which can take hours or even days for some remote branch offices, IT can manage desktops from the data center and instantly provision new desktops for users - eliminating the break-fix cycle and required on-site repair work. During difficult economic times, this means lower operational costs, quicker time to deployment and a far more flexible infrastructure that can adjust quickly to end user demands. Central management also enables desktops to plug into a unified disaster recovery and business continuity infrastructure. If a user's desktop crashes, the user can easily be provisioned and returned to the last working instance of their Windows desktop.

With server-based desktop virtualization, security is better managed - desktops and their users can be assigned and identified regardless of where users are accessing their desktop.

This secure and reliable log-on process can integrate tightly with directory services such as Microsoft Active Directory. Rather than maintain a separate database of users that might fall out of sync, desktop virtualization connection brokers typically query Active Directory directly for each log-on attempt. IT administrators only need to perform user account management in one place: Active Directory. The same authentication process can also be built on an LDAP directory infrastructure.
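
As an illustration of that per-log-on directory check, here is a minimal sketch using the third-party ldap3 Python library. The domain controller address and UPN format are hypothetical, and a production broker would add TLS certificate validation, error handling and group-membership checks.

    # Sketch: authenticate a log-on attempt with a direct bind against AD.
    # Hypothetical server and account; requires the third-party ldap3 package.
    from ldap3 import Server, Connection

    def ad_authenticate(username, password):
        server = Server("ldaps://dc.example.com")   # hypothetical domain controller
        user = f"{username}@example.com"            # AD accepts UPN-style binds
        conn = Connection(server, user=user, password=password)
        ok = conn.bind()    # True only if AD accepts the credentials
        conn.unbind()
        return ok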

With so much time saved through centralized management, IT is able to drive strategic technology initiatives focused on contributing to the bottom-line rather than tactical initiatives such as maintaining an end user's desktop.

Going Green
Although most organizations will decide to implement desktop virtualization because of its initial and operational cost and time savings, green benefits are another significant way that organizations can save with server-based virtual desktop solutions.

According to an Environmental Protection Agency (EPA) report to Congress, data centers consumed about 60 billion kilowatt-hours (kWh) in 2006, which was around 1.5 percent of the total U.S. electricity consumption for that year.[iii] For organizations, lowering energy consumption is not just about benefitting the environment, but also about saving money. With desktop virtualization, the client device at the desktop draws far less power than the 350-400 watts drawn by an average traditional PC. Even counting the energy consumed by the server hosting multiple desktop instances in the data center, the energy spent with desktop virtualization is still far less than with PCs.
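
A rough worked example (my usage and wattage assumptions, not measured figures) shows the scale of the difference per seat:

    # Hypothetical per-seat annual energy comparison.
    hours_per_year = 8 * 250          # 8-hour day, 250 working days (assumed)

    pc_watts = 375                    # midpoint of the 350-400 W figure above
    client_watts = 30                 # assumed thin/zero client draw
    server_share_watts = 15           # assumed per-desktop share of the server

    pc_kwh = pc_watts * hours_per_year / 1000
    virt_kwh = (client_watts + server_share_watts) * hours_per_year / 1000

    print(f"Traditional PC:  {pc_kwh:.0f} kWh/yr")   # 750 kWh/yr
    print(f"Virtual desktop: {virt_kwh:.0f} kWh/yr") # 90 kWh/yr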

For further savings, organizations may also begin to audit the amount of energy they use to determine if cuts can be made and if departments are meeting organizational energy standards. Many energy companies[iv] have created rebate programs as incentives and allow organizations to submit green IT initiatives for possible rebate.  In most cases, they also provide free energy audits, so that companies are able to track what they save. By addressing the energy use of PCs and deploying desktop virtualization, companies will find an excellent way to reduce consumption while cutting costs.

Security and Additional Benefits Realized
Acquisition cost savings may be the impetus for adopting desktop virtualization, but they are only the beginning of the realized savings. Many companies that deploy virtual desktops report many additional benefits - the most important being increased security. Because of centralized management, IT is able to control who is accessing which computer, and when. In addition to access and identity management, IT has complete control over USB ports and can determine how certain users may or may not use them. This works twofold: preventing the spread of malware or a virus from an infected end-user USB peripheral like a flash drive, as well as preventing the unauthorized removal of data onto these types of devices. Even when users have authorized access, certain solutions record USB operations, allowing businesses to keep track of all their information assets. Central management also enables companies to more easily comply with the Sarbanes-Oxley Act's email retention requirements by making it easier for IT managers to determine what remains in storage.

These capabilities will only enhance desktop utility in an era where flexibility and transient workforces become de rigueur. As temporary workforces ebb and flow, desktop virtualization can provide a powerful security solution for IT by ensuring that any employee who has access to company resources doesn't also provide an insecure portal to the company's intellectual property.

Because of the security-enabling component of desktop virtualization solutions, end users also have more mobility. If IT can ensure that end users only access their particular desktop image no matter which desktop device is used, then IT can also enable the end user access to their desktop from multiple locations. This ability increases the flexibility of desktops and enables IT to support a variety of Windows usage models depending on the business. For example, a call center consisting of multiple agents performing the same tasks and running the same applications will use a pooled Windows collection model. In this model, a common set of identical virtual machines is maintained so that when a user logs in and starts a new session they are assigned an arbitrary virtual machine. Alternately, a cloned Windows collection model can be used for deployments where workers require a dedicated virtual machine, but can still benefit from simplified management and use of a common template. This model permanently assigns each user a specific virtual desktop; however, the Windows image originates from the same template. The Windows collection models also enable far better licensing control and management for IT to ensure that not only are software licenses current but also optimized for the number of end users.
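
A small sketch may make the two collection models clearer. The VM names and template are hypothetical, and a real broker would persist these assignments rather than keep them in memory.

    # Sketch of pooled vs. cloned desktop assignment; names are hypothetical.
    free_pool = ["pool-vm-3", "pool-vm-2", "pool-vm-1"]   # identical images
    dedicated = {}                                        # user -> cloned VM

    def assign_pooled(user):
        # Pooled: the user gets an arbitrary VM from a set of identical images;
        # on log-off the VM would be returned (or reset) to the pool.
        return free_pool.pop()

    def assign_cloned(user, template="win-gold"):
        # Cloned: each user keeps a dedicated VM, cloned once from a template.
        if user not in dedicated:
            dedicated[user] = f"{template}-{user}"
        return dedicated[user]

    print(assign_pooled("agent1"))     # pool-vm-1 (any identical VM will do)
    print(assign_cloned("analyst1"))   # win-gold-analyst1
    print(assign_cloned("analyst1"))   # same dedicated VM on every log-on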

Finally, if an organization does decide to adopt desktop virtualization, it can take comfort in the fact that the solution has longevity and will not quickly become obsolete like traditional PCs. Many studies have concluded that the average PC will have to be upgraded or depreciated over three years. The useful life of a virtualized desktop client solution can be twice that of a traditional PC. As long as the desktop hardware device is capable of handling the software updates made on the server, the solution will still work. This means that a company has the ability to purchase now and easily scale out in the future, without worrying that its technology will be outdated.

Pano Logic & Pano Virtual Desktop Solution
A truly unique, built-from-the-ground-up desktop virtualization comes from a company called Pano Logic[v].  The Pano Virtual Desktop Solution (VDS) provides a complete Windows desktop with support for USB peripherals and a native video and audio experience. This level of Windows support and feature-rich Windows user environment is even more significant when you consider that Pano VDS is delivered through the industry's only zero client. That is, the Pano desktop is a completely stateless, CPU-less, memory-less endpoint; the device doesn't even have a fan. To the IT manager this means a device with the highest levels of security, robustness and deployment flexibility.

Additionally, the Pano VDS provides self-help options and the ability to make quick and easy desktop configuration changes, creating flexibility that offers users a superior Windows experience compared to traditional PCs. This capability is enabled by IT managers in minutes and provisioned remotely from the data center. In case of a Windows or application error, Pano VDS can also give end users the ability to return to the last working Windows instance. This capability improves help desk operations and empowers end users by giving them the means to help themselves.

The Pano Logic green story is differentiated as well.  The Pano zero client desktop consumes only 3% of the energy consumed by a traditional PC - just three watts - and only 18% of the energy when including its share of the server's power. The Pano VDS also provides a smaller packaging footprint, resulting in savings on waste disposal. The average PC weighs 30 pounds and requires 5-10 pounds of packaging to ship. The Pano VDS client is 18 cubic inches in volume, with a shipping weight of 1.1 pounds, of which only 3.5 oz is packaging - reducing shipping and waste disposal costs.

As the Pano client does not have any software, operating system or moving parts, its useful life can be twice the length of a traditional PC. By leveraging a server-based Windows instance, the Pano solution is future-proofed.

Conclusion
While the future of the economy remains uncertain, organizations should look to invest in technologies that will protect and leverage their existing investments as well as streamline IT processes to cut costs and make IT more nimble. A desktop virtualization solution like Pano VDS is a great investment because it cuts initial and maintenance costs while improving management and green savings. However, it also gives organizations desktop flexibility and results in a host of secondary benefits which often can save as much time and money as the primary benefits.

References:
[i] www.gartner.com/it/page.jsp?id=776112
[ii] www.gartner.com/it/page.jsp?id=636308
[iii] yosemite.epa.gov/opa/admpress.nsf/0de87f2b4bcbe56e852572a000651fde/4be8c9799fbceb028525732c0053e1d5!OpenDocument
[iv] For example, Pacific Gas & Electric in Northern California (http://www.pge.com/hightech/)
[v] www.panologic.com