November 7, 2003
Microsoft announced this week that it has entered into a semiconductor technology agreement with IBM. Under the agreement, Microsoft will license IBM’s semiconductor processor technology for use in future Microsoft Xbox products and services to be announced later. According to IBM’s Technology Group, the new Xbox technologies will be based on the latest in IBM’s family of state-of-the-art processors.
While IT analysts are well accustomed to press releases that studiously avoid the subjects they supposedly address, a select few approach a level of opacity that suits them for reuse as nuclear bunkers. Given the growing economic and market impact of computer gaming, the subtlety of the Microsoft/IBM announcement is not especially surprising. The gaming industry’s revenues eclipse Hollywood’s traditional entertainment bastion, and any tip of the hand offers the competition a chance to plan or play early catch-up. That said, there are some assumptions one can make about the brains of next-generation Xboxes that may not be too far from what gamers scribble in their letters to Santa.

First off, we believe it likely that the Xbox will use some iteration of IBM’s PowerPC (PPC) as its core processor. IBM’s Microprocessor organization has been particularly adept at customizing the PPC for a wide variety of applications (including the “Gekko” chip that runs Nintendo’s GameCube), and there are indications that POWER will play a significant role in the “Cell” consumer device processor architecture IBM is co-developing with Sony and Toshiba. The real question is whether one interprets “latest” to mean the PPC 970 (the core of Apple’s new G5 machines) or something yet unveiled.

While some dismiss the notion that Microsoft would abandon Intel’s venerable x86 architecture for the Xbox, it makes simple sense to us. For over a year, media stories have examined the relative ease with which an Xbox can be hacked with “mod” chips, changing it from a proprietary Microsoft platform into an entertainment terminal capable of playing a wide variety of games, media, and applications. By simply shifting processors, Microsoft can leave the controversy behind and move on.
This is not to say that a PPC-based Xbox could not be hacked, but mod chips for altering the platform are in far shorter supply than for the ubiquitous x86, and we believe that altering a re-engineered, second-generation Xbox will be a far tougher and more complex task than the first time around.
So who are the winners and losers in this deal? We see Microsoft as the primary beneficiary. Not only does working with IBM offer Microsoft a respected, dependable platform (déjà vu anyone?) for its future gaming efforts, but IBM’s strategy of driving the development of robust, integrated technologies for emerging consumer and home computing applications is similar to Microsoft’s vision of the future. Microsoft will face some challenges in migrating existing Xbox customers to the new platform, but significantly improving popular games and offering upgrade rebates should remove some of the sting. IBM is also an obvious winner. Though the Xbox does not own a huge portion of the game market, inking the Microsoft deal means IBM technologies are likely to soon become the de facto platform for the computer gaming industry. That is a street rep any IT vendor would give its eyeteeth for, especially at a time when technology seems finally to be catching up with visionary notions of consumer computing.

If there are obvious winners, there are also obvious losers; in this case, Intel. Losing the Xbox will cost Intel more in prestige than revenue, but the strategic fallout will last longer than many might expect. When the fully wired, media-rich, Internet-enabled home is finally realized, the brains of the household are likely to reside in a ubiquitous device that is gaming console, PC, cable box, TV, and consumer server/storage appliance all in one, branded by companies including Sony, Toshiba, Nintendo, and Microsoft, along with IBM. Losing out on a substantial piece of that market is going to hurt no matter how you try to spin it.
IBM's Lotus Software Group has announced plans for future releases of Lotus Domino (the current release is 6.5) with the goal of providing full rich-client access to applications whether the user is connected to or disconnected from the Web. Domino 7.0, due in the fourth quarter of 2004, promises DB2 support, enhanced portlet support, and integration with Lotus’s other Workplace offerings. One of the key design goals for this release is to provide email support for disconnected users and to facilitate rapid synchronization when they reconnect. Domino 8.0 will extend the same class of support to applications and introduce a plug-in approach for adding or extending them. The company also announced that the future Notes client would be based on open-source Eclipse code to give users and developers greater flexibility, and that over time the Domino Collaboration Server would be blended into the company's overall Workplace scenario. The company reiterated its pledge of continued support for current Notes and Domino development and applications in future versions of Notes/Domino.
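IBM did not describe how disconnected support will be implemented, but the stated goal — work offline, then synchronize rapidly on reconnect — is classically handled with a store-and-forward replica: changes made offline are queued locally and replayed against the server when connectivity returns. A minimal sketch of that pattern follows; all class and method names are hypothetical illustrations, not Domino APIs.

```python
# Store-and-forward replication sketch: operations made while offline are
# queued locally and replayed against the server on reconnect.
# Hypothetical names for illustration only -- not Lotus Domino's actual design.

class OfflineReplica:
    def __init__(self):
        self.local = {}        # local copy of documents
        self.pending = []      # operations queued while disconnected
        self.connected = False

    def save(self, doc_id, body):
        self.local[doc_id] = body
        if not self.connected:
            self.pending.append((doc_id, body))  # defer until reconnect

    def reconnect(self, server):
        """Replay queued changes, then pull server-side updates."""
        self.connected = True
        for doc_id, body in self.pending:
            server[doc_id] = body          # push local edits upstream
        self.pending.clear()
        self.local.update(server)          # pull anything new from the server


server_store = {"memo-1": "agenda"}
replica = OfflineReplica()
replica.local.update(server_store)

replica.save("memo-2", "drafted on a plane")   # made while disconnected
replica.reconnect(server_store)

print(server_store["memo-2"])   # drafted on a plane
```

Because only the queued deltas travel on reconnect, synchronization stays fast even after long offline sessions; real systems add conflict detection, which this sketch omits.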
IBM is clearly committed to bringing the Notes/Domino brand, technology equity, and 100-plus-million user base into its growing Lotus Workplace efforts, while not losing compatibility with the tens of thousands of custom applications running on earlier Notes/Domino releases. The company’s plan to shift Domino from proprietary code to a standards-based J2EE foundation illustrates IBM’s belief in a flexible, component-based architecture enabling appropriate reuse and integration throughout its middleware. The decision to build the Notes client on open-source Eclipse code indicates a commitment to deliver more flexibility, impact, and value over time than the former proprietary model could provide. IBM is not alone in making open source an elemental part of its product development plans: at the same time as the IBM announcement, Novell took another step in its evolving Linux strategy with the acquisition of SuSE Linux.
As the integration of the computing infrastructure has matured sufficiently to complement vendors’ utility-like IT delivery and management aspirations, those same vendors are awakening to the opportunities in truly integrating, uniquely delivering, and thus enhancing the value of their clients’ information. Many smaller ISVs are joining IBM, Microsoft, and potentially Novell in crafting standard access and information management methods at the client level, using Web services and middleware business processes to deliver highly actionable, well-integrated information rather than just shuffling bits and bytes from high-volume data centers. It is a classic view/model/implementation problem from Computer Science 201. With the emerging model of utility computing growing in relevance, we believe that basic and ubiquitous tools for knowledge workers (as opposed to highly specialized and often isolated systems) are the next step in delivering the rich context and clear value proposition of business computing integration.
Veritas has announced the immediate availability of CentralCommand Service 3.5 software, which the company described as the foundation for utility computing offerings including storage, backup, and recovery services. The new solution is integrated with Veritas NetBackup and Backup Exec software, and allows IT organizations to define, manage, and monitor services through a Web-based portal. CentralCommand Service 3.5 also provides users with business-level reporting as well as chargeback and cost allocation features.
In addition, Veritas announced new releases of NetBackup 5.0 and Backup Exec 9.1 for Windows, which include desktop and laptop options. Finally, Veritas introduced Data Lifecycle Manager 5.0, which adds regulatory compliance features to NetBackup and Backup Exec. According to Veritas, Data Lifecycle Manager 5.0 automates the placement and management of data in virtual archives, provides search and index technologies for data retrieval, and allows historical information to be automatically swept into archives. CentralCommand Service 3.5 is available immediately starting at $22,000. NetBackup 5.0 is scheduled for general availability in December 2003 with pricing beginning at $5,000, while pricing for the desktop and laptop options begins at $2,500. Data Lifecycle Manager 5.0 is scheduled for availability in Q1 2004. No pricing information was included.
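Veritas did not detail Data Lifecycle Manager’s internals, but "automatically swept into archives" describes a familiar retention-policy pattern: records older than a cutoff are moved from primary storage into an archive, with an index kept for later search. The sketch below illustrates that general pattern only; the names and policy are our own assumptions, not Veritas’s design.

```python
# Toy retention sweep: records older than a cutoff age are moved from
# primary storage into an archive, and an index is kept for retrieval.
# Illustrative pattern only -- not Veritas Data Lifecycle Manager's code.

from datetime import datetime, timedelta

def sweep(primary, archive, index, now, max_age_days=365):
    """Move records older than max_age_days into the archive."""
    cutoff = now - timedelta(days=max_age_days)
    for name in list(primary):              # list() so we can pop while iterating
        created, payload = primary[name]
        if created < cutoff:
            archive[name] = primary.pop(name)   # relocate the record
            index[name] = created               # record metadata for later search

primary = {
    "q1-report": (datetime(2002, 3, 1), b"..."),
    "fresh-log": (datetime(2003, 10, 1), b"..."),
}
archive, index = {}, {}
sweep(primary, archive, index, now=datetime(2003, 11, 7))

print(sorted(archive))   # ['q1-report']
print(sorted(primary))   # ['fresh-log']
```

For compliance use, the interesting policy questions are the ones a sketch like this glosses over: which regulation sets `max_age_days`, and when archived data may legally be destroyed.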
Veritas’s press releases follow similar announcements by companies including EMC and IBM. Given that, the company’s new solutions can be regarded as indicators of shifts occurring in the greater storage marketplace, suggesting that vendors believe enterprise storage customers want to deliver utility-style storage services to end users and need help dealing with regulatory compliance issues. Gauging the demand for such offerings requires both tactical and strategic considerations. Veritas’s traditional depiction of itself as the “best of breed” storage software solution provider is being challenged from two quarters. On the storage-centric side, EMC’s enhancement of its myriad hardware products and aggressive development of its software solutions, through internal work and the acquisition of companies including Legato, have shifted the playing field. Meanwhile, IBM’s continuing development of integrated Tivoli storage offerings has bolstered its greater systems strategy. Additionally, both companies’ message about the critical role storage plays in overall IT and business productivity appears to be resonating among many end users.
So what is Veritas to do? Tactically, the company’s new products seem to be adequate upgrades of existing offerings. While we are uncertain about the demand among enterprises for “utility” buzzwords, centrally managed storage is a winner among businesses trying to improve storage utilization and ease data management. With the looming deadlines for Sarbanes-Oxley, HIPAA, and other regulations demanding compliance-approved solutions, Veritas Data Lifecycle Manager 5.0 would seem to be a slam-dunk in a ready net. However, we are curious why the announcement lacked concrete information about the regulatory agencies Veritas is working with or the regulated areas where the new solution can be applied. Overall, while these announcements appear to be adequately aimed at obvious targets, they leave the reader with nearly as many questions as answers. That may have been fine in the past, but given the current state of the storage market and Veritas’s competitors, it will not be nearly enough today.
This week IBM announced the latest additions to its Express portfolio of products designed for the mid-tier enterprise market. Among the new offerings are IBM Life Sciences Express Solution for SAS, IBM ERP optimization services, Siebel CRM on Demand, WebSphere Business Integration Express for Item Synchronization, SurfAid Analytics, and DB2 Everyplace Express. The company indicated that the offerings were designed to help mid-tier enterprises continue their efforts to become on demand businesses and noted that mid-tier companies need integration solutions and infrastructure management capabilities. The company also stated that mid-tier enterprises are expected to spend over $100 billion in the next three years on IT products related to integrating their IT footprints.
Since June, when IBM announced its strategy to build a line of products specifically designed for the mid-tier market, it has delivered dozens of products to that end under its Express portfolio. We have previously noted that the company has what appears to us a cogent, company-wide vision of what needs to be done to capture the revenue opportunities of the mid-tier market, one that not only is growing but also is demanding IT offerings that only a few years ago would have been the stuff of large enterprise IT deployments. Mid-tier companies need an ever-growing array of products and IT capabilities just to remain competitive and to offer themselves — and their customers and partners — the ability to play in the modern marketplace.
At the same time, IBM is pushing forward its vision of IT in the coming years, calling this new vision On Demand computing. IBM is not alone in this market; others, like HP, talk of the Adaptive Enterprise and utility computing. While these concepts all sound fine and dandy, it remains clear to us that the market is still scratching its collective head about what they actually mean in practice. In our minds these concepts comprise two sets of capabilities: the outward-facing elements of On Demand or the Adaptive Enterprise, and the internal issues of integration and infrastructure management that allow the outward-facing elements to occur. IBM seems to understand this, although the company’s efforts to drive the idea permanently into the market’s consciousness still have a way to go. That said, we believe IBM’s efforts directed at the mid-tier enterprise — especially its Express offerings — remain a sound strategy effectively executed. We believe the success of these efforts will, however, be greatly enhanced as the mid-tier market gains a clearer understanding of the practical value propositions of On Demand computing, an understanding that will benefit both IBM and its customers.
Microsoft announced this week that it has established a $5 million Anti-Virus Reward Program in an effort to aid law enforcement agencies in tracking down the writers of malicious code. Microsoft said it would provide reward money for information that leads to the arrest and conviction of those writing and distributing viruses and worms. The first specific award announced by the company was a $250,000 reward for information leading to the arrest and conviction of the individual or individuals responsible for the MSBlast.A worm. The company also has offered a second $250,000 reward for information leading to the arrest and conviction of the individual or individuals responsible for the dissemination of the Sobig viruses, variants A, B, and C. Law enforcement agencies participating in the program with Microsoft include the FBI, Interpol, and the United States Secret Service. Individuals with information on the release of the MSBlast.A worm or the Sobig viruses can contact any of the named agencies.
On one hand, we cannot really say we are all that surprised at this development. Given the widespread havoc that these particular viruses created and the fact that to date only peripheral players in their release have been apprehended, it must seem like a good idea to begin offering large wads of cash to entice someone to rat out a fellow coder who has privately taken credit for sowing so much worldwide disruption. Will it work? Perhaps. Law enforcement has historically made some of its most high-profile arrests through information provided by informants.
But just offering cash for hackers is not enough, in our minds. While law enforcement agencies are slowly getting up to speed on the technical issues surrounding various forms of cybercrime, we would be very comfortable arguing that they remain far behind most hackers and security specialists. The recent instance of the U.S. Justice Department releasing online a heavily redacted PDF report concerning sensitive internal policy matters — only to have an individual easily remove the electronic redactions and publish the full and embarrassing report online — is the most recent example of an apparent lack of sophistication concerning computer security among law enforcement professionals.

With many law enforcement agencies now occupied with the hunt for terrorists, it is only a matter of time before cyber attacks of various kinds are publicly traced to designated terrorist or criminal organizations. Will the FBI, CIA, etc. be ready? Considering their past efforts, perhaps the best and wisest answer is to assume they won’t be. Since disruptive and criminal online activity is only going to continue in the coming years, security issues will not be resolved by creating an “Internet Most Wanted” program complete with big cash prizes.

For these reasons and others, we believe the responsibility for providing the first line of secure and resilient IT infrastructures largely lies with vendors like Microsoft. Considering the ever-growing trend toward IT deployment integration that vendors are actively promoting, security breaches of all sorts have the potential to become quickly catastrophic, since gaining entry to a well-integrated network can mean gaining entry to its every nook, cranny, and resource. At that point, a cash reward for lawbreakers will be of little consolation to devastated enterprises, who are as likely to blame inactive vendors as proactive hackers for their problems.
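The Justice Department’s redaction failure illustrates a common mistake: drawing an opaque rectangle over text hides it visually but leaves the underlying text in the file, where any extraction tool can still read it. The toy model below captures the pitfall in miniature; it mimics the general idea of a drawing-operation list, not the actual PDF format.

```python
# Toy model of why overlay "redaction" fails: a black box drawn over text
# changes only the rendering, while text extraction reads the original
# content operations underneath. Illustrative only -- not real PDF internals.

class Document:
    def __init__(self):
        self.ops = []   # ordered drawing operations

    def draw_text(self, text):
        self.ops.append(("text", text))

    def draw_black_box(self):
        self.ops.append(("rect", "black"))  # visually covers earlier drawing

    def extract_text(self):
        # Extraction tools read the text operations and ignore graphics,
        # so the "covered" text comes straight back out.
        return " ".join(t for kind, t in self.ops if kind == "text")


doc = Document()
doc.draw_text("Name of informant: J. Doe")
doc.draw_black_box()   # looks redacted on screen, but the text op remains

print(doc.extract_text())   # Name of informant: J. Doe
```

Proper redaction deletes the text operation itself before publication; anything short of that merely hides the secret from human eyes, not from software.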