November 19, 2004
Microsoft has invested in a small Utah-based company whose software extends Windows-based authentication, management, and monitoring capabilities to non-Windows environments, specifically UNIX, Linux, and Macintosh. The exact amount of the Microsoft investment in Vintela was not announced, but it was said to be less than $10 million. Vintela is a startup that intends to seek another round of funding in the next year, the company said, and its core technology is built on intellectual property acquired from former Microsoft rival Caldera. Vintela is shipping products, including its Vintela Systems Manager, which allows Microsoft Operations Manager 2005 to provide management, authentication, and monitoring capabilities for UNIX, Linux, and Macintosh OS X systems.
For the most part, the bridge between the Windows and non-Windows worlds has been engineered and built by folks who do not reside in Redmond. To these engineers and builders, the reality is simple: Microsoft has a significant footprint in the marketplace, and a dominant one on desktops, so the river of IT development must flow through it. In this regard, Microsoft is a large boulder in the river that water must find a way around. And of course, Microsoft sees things largely the same way, except that it hopes to be not merely a boulder in the river but a complete dam from which all IT resources will flow. From this point of view, it is not surprising that Microsoft has engaged in so little bridge building to this point.
So does the investment in Vintela mean that Microsoft recognizes it lives in a heterogeneous world and must do its part to at least provide an anchorage toward which its rivals, the bridge builders, can build? Are we seeing a new realpolitik from Redmond? Before we go too far out on that rhetorical limb, let’s remember that this investment is a mere grain of sand in the Microsoft treasury. It is not the stuff of a major repositioning of Redmond’s strategy. Think of it as more of a hedge bet. Microsoft detractors might go so far as to argue that this is just so much more FUD from the company, a seemingly conciliatory gesture to customers with heterogeneous environments that will evaporate at about the same time these customers seek major upgrades to their non-Windows IT infrastructure. Perhaps. But given the undeniable market presence of Linux (and yes, Macintosh), one would think the world’s dominant software company might have finally recognized the inevitable: there will be competition. We will know the answers to these questions in years, not months, and in the meantime we’ll keep an eye on the relative success of companies like Vintela that seem to be taking up Microsoft’s bridge-building work for it.
Sun will offer a new computing service by the end of 2004, called the N1 Grid Service, which will allow customers to buy computer processing power on an hourly basis. The customer will pay $1 for every hour of CPU usage. No service contract will be required, and customers can purchase computing power with a credit card, using a Web browser. The service will be purchased on Sun’s web site, and the company will let customers upload their software and data onto one of two 1,000-CPU Sun grid-computing facilities. These data centers are designed to harness some or all of the processors for one or several large processing tasks. The system will support software running atop Sun’s Solaris operating system and various versions of the Linux open-source OS. When the processing is complete, Sun will remove all traces of the user’s software and data. The company plans eventually to expand the program to auction off excess processing capacity on eBay, at prices that could be less than $1 per CPU per hour.
Renting time-shared computer services at a fixed price per hour and auctioning excess capacity through eBay are new. Further, the utility computing market is just emerging. For example, a recent market survey found that only one in five IT managers understood grid computing, and only 6.8% of respondents indicated they were implementing or planning to adopt a grid computing model. In view of these facts, N1 Grid customers will be technically more sophisticated and will fall into three main categories: those with highly compute-intensive applications, those with limited IT resources, and those that need extra capacity to meet cyclical computing demands. Up to this point, utility computing has been targeted at large enterprises, and utility computing implementations have been primarily within the four walls of the enterprise. However, IBM and other vendors offer utility computing services hosted in their data centers, external to the enterprises they serve. These pay-as-you-go services typically require, at a minimum, an annual contract, with multi-year terms more common. Contracts usually combine a fixed annual fee with a pay-per-use component. The Sun offering is a pure pay-as-you-use model. The payment and financial scenarios for utility computing may differ, but the customer value proposition remains the same. From this perspective, Sun’s N1 Grid offering is not so new.
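The pricing contrast is easy to put in back-of-the-envelope terms. The sketch below uses Sun's announced $1-per-CPU-per-hour rate from above; the contract figures (annual fee, discounted per-use rate) are purely illustrative assumptions for comparison, not published vendor pricing.

```python
# Sun's announced N1 Grid rate: a flat $1 per CPU per hour, no contract.
RATE_PER_CPU_HOUR = 1.00  # USD

def pay_per_use_cost(cpus: int, hours: float) -> float:
    """Pure pay-as-you-use cost: CPUs x hours x hourly rate."""
    return cpus * hours * RATE_PER_CPU_HOUR

def contract_cost(annual_fee: float, cpu_hours: float, per_use_rate: float) -> float:
    """Typical hosted utility model: fixed annual fee plus a per-use component.
    The fee and rate here are hypothetical, chosen only to illustrate the shape."""
    return annual_fee + cpu_hours * per_use_rate

# Example: a batch job that keeps 200 CPUs busy for 48 hours.
cpu_hours = 200 * 48  # 9,600 CPU-hours
print(f"N1 Grid, pay-as-you-use: ${pay_per_use_cost(200, 48):,.2f}")
print(f"Hypothetical contract:   ${contract_cost(50_000, cpu_hours, 0.50):,.2f}")
```

For light or sporadic usage the pure per-hour model wins easily; the fixed-fee contract only pays off at sustained high volume, which is exactly why the no-contract offering targets cyclical and overflow demand.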
The N1 Grid Service, in a sense, makes the computing resource a commodity and would seemingly undercut Sun's business. That said, Sun remains a value-added computer seller; in fact, Sun would claim that is its differentiation. Whether an ERP solution runs in-house or on a leased compute grid does not change the value of that solution to the enterprise using it. This is similar to the choice between leasing an internal phone system and owning it outright: the value lies in the use of the system, not in how it is delivered and financed. Hence, we don't think Sun is about to become a commodity solutions provider; rather, it will deliver some of the solution with what the company hopes become commodity components (CPU cycles), with a favorable bottom line for the company.
Microsoft has announced that it and SBC Communications have entered into a $400 million, ten-year deal in which SBC will use Microsoft’s TV Internet Protocol Television (IPTV) Edition software platform. SBC has been testing an IP-based television service using the Microsoft IPTV product since June and will begin field trials in the middle of next year, with commercial availability in late 2005. SBC plans to begin construction of Project Lightspeed, in which the company hopes to deploy fiber-optic networks closer to customer locations in order to offer IP-based services, including IP telephony and TV. The company hopes to have the network in place to serve 18 million households by the end of 2007.
At first glance, this would seem to be largely a consumer-directed offering, with the greatest impact felt by the various market participants delivering bits to households. Certainly cable companies can expect to feel a squeeze as SBC and other regional phone companies begin offering a host of IP services, including entertainment, over an IP network. Of course, the phone companies are looking for ways to eliminate, or at least substantially reduce, the enormous glut of unused bandwidth capacity they own, and delivering feature-rich TV signals and services is certainly one way to do so. Microsoft, for its part, gets another opportunity to move its franchise out of the home office and into the living room, just as it is trying to do with its Internet gaming services.
On the enterprise side of things, the development of a successful IP network that can reliably deliver video at today’s required standards will have an impressive impact on the perceived viability of IP services, especially the nascent IP telephony market. Reliable, high-quality VoIP will be a mere afterthought once the bugs in IPTV are worked out, driving it further into the enterprise mainstream. If IPTV is good enough to deliver the Lord of the Rings trilogy to discerning viewers, one suspects it will be good enough for enterprise videoconferencing (and a host of other features). The changes these developments could foment are not abstract, nor are they very far in the future. We call that moving at the speed of light.
At its IT Forum Europe, a Microsoft user event, Microsoft revealed more of its vision of management for the Windows universe, called the Dynamic Systems Initiative (DSI), which is meant to deliver improved technologies to simplify and automate the tasks associated with creating and maintaining existing Windows infrastructure. At the heart of DSI is the intended combination of creating relevant knowledge bases and modeling the appropriate architecture. It begins with the implementation of management packs, which monitor server health indicators within applications and call attention to problems preemptively when possible, through Microsoft Operations Manager (MOM); longer term it will be driven by use of the System Definition Model (SDM), which is available in Visual Studio 2005.
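The preemptive-monitoring idea behind management packs can be sketched as a simple threshold rule: watch a health indicator and raise a warning before it crosses into critical territory. The indicator names and thresholds below are hypothetical illustrations of the concept; they are not MOM's actual management-pack format or API.

```python
# Minimal sketch of the kind of rule a management pack embodies: classify
# a sampled server health indicator against warning and critical thresholds,
# so problems are flagged preemptively rather than after failure.
# All names and threshold values here are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class HealthRule:
    indicator: str        # e.g., "disk_free_pct" (hypothetical metric name)
    warn_below: float     # preemptive warning threshold
    critical_below: float # failure-imminent threshold

def evaluate(rule: HealthRule, value: float) -> str:
    """Return "critical", "warning", or "healthy" for a sampled value."""
    if value < rule.critical_below:
        return "critical"
    if value < rule.warn_below:
        return "warning"  # flagged before the problem becomes critical
    return "healthy"

disk_rule = HealthRule("disk_free_pct", warn_below=20.0, critical_below=5.0)
print(evaluate(disk_rule, 12.0))  # warning: disk is low but not yet critical
```

DSI's longer-term ambition, per the article, is to move beyond such hand-set thresholds toward models (via SDM) that encode how systems should be built and behave.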
Microsoft’s plans are ambitious and involve yet another round of new standards (SDM) to be introduced and adopted by partners and developers, as well as massive shifts in how IT departments think about their Microsoft environments. At the same time, Microsoft has listened to its customer base and understands that management of the Windows environment is the first and most important goal. Microsoft has made it clear that it is interested in managing things Microsoft rather than seeking to replace the management applications already in place within the data center. Microsoft is adopting an architectural approach to management, seeking to accumulate and exploit the enormous knowledge base that exists in systems and to use models that demonstrate how systems should be built. The idea of a knowledge management engine at the core is attractive and much needed in the highly distributed world that is Microsoft.
At the same time, Microsoft’s approach reveals both the strength of its software heritage and its relative lack of experience with business processes. Microsoft’s approach is to understand the relationship between IT professionals, application developers, and information workers. What is obviously missing from this picture is the role of business processes, the inherent service value, and the explicit potential for cost savings. Microsoft’s response is that it is starting with the plumbing at the core and will move out to encompass business issues over time. And yet, an application’s importance and performance are not determined by technical factors alone. Business issues and politics are equal players in the battle for resource allocation at all levels of a company, including the deployment and use of technical resources. We expect these components will be added into the modeling process over time, but in the interim businesses will be unable to achieve optimal efficiency unless they find technical ways to express those elements. We believe that one of the primary benefits of adopting this technology is that it will lower costs, both in how IT personnel spend their time and in overall efficiency, as Windows systems are notoriously underutilized and under-managed in many environments. However, Microsoft continues to focus most of its cost-of-ownership resources on combating Linux. It would do well to help customers understand the important potential cost benefits of incremental increases in efficiency as part of its launch message to a user base with widely divergent needs and experience levels. Finally, the complexity of management means that although the tools are shrink-wrapped, the solutions must be tailored to the specific user environment. Microsoft will also need to demonstrate partner programs that assure users that their service provider is capable of using Microsoft tools effectively.
Poor implementation of this product would reflect back on Microsoft itself. This product has great potential and is sorely needed. Microsoft’s success in driving DSI will be an indicator to the rest of the IT world that Microsoft remains a key competitor for the heart and soul of the data center and an industry thought leader for the next wave of computing.