August 22, 2003
The blackout that rolled through the Great Lakes and Northeast regions — including Ohio, Michigan, New York, parts of New England, and the Canadian province of Ontario — seriously affected major cities including Cleveland, Detroit, Toronto, and New York City. According to reports, a key contributing factor was the region’s aging electrical grid infrastructure. Along with its impact across myriad consumer and business activities, the blackout provided a real-world pop quiz on the resilience of the Internet and the effectiveness of enterprise disaster recovery plans. Keynote Systems, which monitors Internet performance, said that the major Internet backbones in the twenty-five largest U.S. metropolitan areas functioned normally following the event. SunGard Availability Services, which delivers disaster recovery support for enterprises, announced that it had received disaster declarations from sixty-two companies and that an additional hundred had put the company on alert. This was the largest number of declarations SunGard has supported since the World Trade Center terrorist attacks, when the company received seventy-seven. IBM’s disaster recovery and business continuity center in Sterling Forest, NY declared a “BizCon red” emergency and activated the company’s call center in Boulder, CO to handle overflow traffic. A number of businesses also discussed the impact of the blackout on their IT infrastructures. According to reports, Commerzbank’s primary datacenter and two disaster recovery sites were all affected by the blackout, but generators allowed the company to continue replicating data on its EMC Symmetrix systems. Canada-based Telus likewise used generators to power HP and Sun systems in its two Toronto datacenters.
Beyond the massive insurance claims and political finger-pointing that typically follow a FUBAR event like last week’s blackout, some serious issues are worth considering. If one sets aside the dotcom bubble and its associated hysteria, the fact is that the Internet continues to inspire one of the largest business revolutions in history. Over the past two decades, as computers and computing have moved far beyond the confines of the datacenter to desktops, laptops, and pocketbooks, the dream of pervasive computing has slowly become real. Remote facilities, home offices, and field personnel have secure, high-speed access to real-time company data whenever they need it. Engineering change orders are delivered electronically, saving time, effort, and money for vendors and manufacturers alike. Enterprise storage arrays can contain warehouses’ worth of printed data and provide near-seamless access to that information. In other words, Information Technology is an integral and growing part of virtually every enterprise, and has become business-critical to an extent that few people truly comprehend. If that is the case, how important is it to protect the infrastructure that supports that data and those processes?
We would say nearly beyond measure. Reports have suggested that last week’s power debacle cost businesses and consumers billions of dollars and countless headaches. While the power industry has enjoyed the free market joys of deregulation since 1992, it has failed to adequately maintain an infrastructure that is critical to the lives and jobs of tens of millions of people. After the devastating events of 9/11, there was a great deal of discussion of what constitutes safe IT practice. Some enterprises even discussed placing disaster recovery/business continuity facilities overseas, away from obvious terrorist targets. We see that approach as somewhat extreme, especially since last week’s events demonstrated that laxity and greed can be every bit as dangerous and threatening as radical extremism. Overall, we believe that enterprises serious about these issues should follow the lead of companies like Commerzbank and Telus, which maintain largely autonomous disaster recovery capabilities, or consider enlisting services such as those provided by SunGard and IBM. If nothing else, last week’s events should serve to remind enterprises that unless their IT infrastructures can survive the dark designs of terrorists and the uncertain capabilities of a deteriorating public infrastructure, they run the risk of being left quite literally in the dark.
The California State Senate has passed a financial privacy bill that is designed to restrict what financial institutions can do with information collected from their customers. The bill was passed by a 31-6 vote after passing through the state assembly earlier in the week. The bill was initially introduced in 1999 in response to federal legislation that allowed various financial institutions, such as banks, insurance companies, and stock brokerages, to merge, creating financial supermarkets. These financial supermarkets, supporters of the privacy bill argue, would have unfettered access to virtually every detail of a consumer’s financial life. The new law would allow consumers to prevent the selling or sharing of information among these financial entities. The bill passed both houses of the California legislature with support from both Democrats and Republicans. News reports indicate that financial institutions are considering challenging the law in the courts. The bill will take effect July 1, 2004.
Americans — and Californians in particular — seem to be becoming much more conscious of the issue of data privacy. The fact that this bill passed through the nearly gridlocked California legislature with support from both Republicans and Democrats indicates that opposition to consumer privacy rights is politically infeasible. The fact that such a law passed in California — the home of much of the nation’s Internet and IT innovation and development — indicates that consumers there are becoming much more aware of how much information is being gathered about them by their various credit card companies, banks, and insurance carriers. That said, it should be pointed out that this awareness — and its translation into political sentiment — lags far behind the actual practices of financial institutions. Financial institutions have been gathering data for quite some time, and will continue to do so.
In our mind, the passage of this bill is, in the words of Winston Churchill, “not the beginning of the end but the end of the beginning.” Not only does this bill fail to address the amount of information already gathered, shared, and cross-tabulated, it will not stop ongoing information collection. Consumers will have to proactively request that such uses of their personal data be limited. Regardless of the existing political sentiment, we suspect not every Californian will remember to do so every time it is necessary. This, of course, assumes the bill will indeed take effect in July 2004. We see numerous scenarios in which implementation could be delayed, not the least being legal challenges from the financial institutions, if not from the federal government itself. In the United States, the actual effect of such legislation is largely determined by case law and court rulings — in short, precedent. It could take years for such a body of functional case law to be in place. While European nations may not find the precedent of such privacy legislation bogged down in courts, the same time lag will come into play as evolving social custom lays the groundwork of precedent. Either way, such processes take time. And in our mind, that is the core issue at hand. The Internet and IT technology are prime examples of the acceleration of the rate of change in modern societies. The legislative process — in this case a five-year process — cannot keep pace with the rate of change being spearheaded in so many ways by the ongoing IT revolution. Perhaps greater awareness on the part of lawmakers of the implications of this IT revolution will make for more forward-looking legislation in the future. Perhaps not. Either way, the revolutionary aspects of IT and the Internet continue to make their way into the public domain, far from what many envisioned back in the 20th century.
IBM has introduced a series of initiatives designed to clarify the role the company’s zSeries mainframe solutions play in IBM’s On Demand strategy and enhance the business value they deliver to IBM customers. Included was a new Mainframe Charter that reiterated IBM’s plans for continuing to improve mainframe solutions, extend and enhance their business value, and support key zSeries and Open Source community efforts. To substantiate this charter, IBM announced new pricing initiatives that will roll out over the next few weeks. Immediate price reductions on memory across the entire zSeries product family and on zSeries IFL capacity offer mainframe customers significant cost benefits. Additionally, IBM announced an On Demand business investment promotion that will deliver rebates to z990 customers that can be applied to the purchase of additional IBM products. Changes in base configurations and the introduction of daily On/Off Capacity on Demand (CoD) software options are planned for implementation in September. In October, IBM plans to implement software price/performance improvements for the z990, WLC pricing enhancements, NALC price reductions, and On/Off CoD for Linux. Later in 2003, the company plans to offer sub-capacity pricing for select WebSphere for zSeries products.
In its signature mainframe product family, IBM possesses what might be considered both a blessing and a potential curse. On the plus side, it is difficult to think of an enterprise IT solution that is better known than the mainframe, or more clearly associated with and attributed to a single vendor. This perception has been further strengthened as IBM’s mainframe competitors dropped out of sight or into different markets. On the negative side, it is hard to think of an enterprise IT solution that is more severely misunderstood than the mainframe, a situation that has been exploited and even expanded upon by IBM competitors who want a piece of the mainframe market and its hosts of large enterprise customers. To accomplish this, some vendors have resorted to claims that the mainframe is an out-of-date, out-of-step, expensive, and moribund technology better suited for the ash heap of history than for corporate datacenters. So do purported “mainframe-like” computers really offer the same reliability, stability, and flexibility as true-blue mainframes? They do not, but the evolving capabilities of non-mainframe server architectures, including IBM’s own product lines, provide IT choices that simply did not exist a decade or even five years ago. The result of all this has been understandable confusion for both existing and potential mainframe customers.
Given this, what is the point of IBM’s new Mainframe Charter and zSeries pricing initiatives? A couple of things, really. The first is to clarify a contemporary context for the company’s mainframe solutions. Putting aside the dissembling of its competitors, IBM is systematically reinventing the mainframe. The company’s z800 “mini” mainframe delivered fully updated capabilities including Linux support in a smaller, affordable package. This spring, the introduction of the flagship z990 “T-Rex” redefined high-end mainframe capabilities, doubling the number of processors and I/O channels and tripling the system capacity of the z900. Perhaps most importantly, mainframe-derived autonomic technologies lie at the heart of IBM’s On Demand initiative and notably enhance the stability and manageability of the company’s other product lines. While some cynics might suggest that IBM’s new pricing initiatives signal difficulties zSeries solutions are encountering in the market, we believe they reflect a different reality. The fact is that customers and vendors alike are quite used to seeing dramatic price/performance fluctuations in PCs, servers, storage, and other IT solutions, all part and parcel of the ongoing primacy of Moore’s Law. That mainframe solutions should follow this same path of self-improvement is inevitable, and further cements the position of IBM’s zSeries products as fully contemporary and continually evolving IT solutions.
HP has reported financial results for its third fiscal quarter ending July 31, 2003, with revenues totaling $17.35 billion, a 4% decline from the previous quarter, and non-GAAP operating profits totaling $858 million, down 25% from the previous quarter. Non-GAAP diluted earnings per share were $0.23, which missed analyst expectations of $0.26 per share and was down from $0.29 per share the previous quarter. Among specific results discussed by HP: Personal Systems revenues totaled $4.97 billion, down 3% sequentially. Strength in notebook and consumer PCs was offset by a double-digit year-over-year decline in commercial desktop revenues. Imaging and Printing revenues totaled $5.24 billion in the quarter, down 5% sequentially. Enterprise Systems revenues were $3.71 billion, down 4% sequentially, with operating losses of $70 million for the quarter, up from a loss of $7 million the previous quarter. HP said the losses were due in part to seasonal issues and top-line weakness, along with planned expenses from HP’s Integrity server launch, investments in management software, and the acceleration of HP’s Alpha processor transition. HP also said that the layoffs it plans to complete by October will be increased from 3,500 employees to 4,800.
When HP announced its acquisition of Compaq in September 2001, we saw the deal as one that could help the companies buoy up lagging sections of their businesses, since both possessed discrete products that filled gaps in each other’s solution sets. However, it has become increasingly clear that HP does not work particularly well as a trifurcated entity that attempts to be many things in many markets. To begin, HP’s imaging and printing business exists almost as a separate entity and is largely responsible for the perceived success of the greater company. Without those healthy imaging and printing revenues, Ms. Fiorina would have prepared or delivered an exit speech by now. Second, the continuing game of PC market leader chicken HP has been playing with Dell has more downside than upside. To win, HP must successfully master the arcane details and razor-thin margins of profitable PC manufacturing, a game at which Dell is particularly adept and which HP is still learning. Finally, the company’s Enterprise Systems group has had a very hard course to follow in assimilating and discarding a host of new Compaq products, and in convincing its customers that a planned migration to Intel’s Itanium2 platform is in their best interests. The fact is that any number of products have gotten lost in the jumble, and since migrating to Itanium is neither more nor less painful than migrating to any other 64-bit platform, HP has left itself dangerously exposed to the competition.
Is there any upside here? We believe so. Despite lingering uncertainties, it looks as if the economy may be stabilizing a bit, with people and businesses loosening their pocketbooks. Additionally, enterprise and developer interest in Linux is accelerating, which is good news for a company that owns as big a piece of the Linux server pie as HP. Finally, Intel appears to have solved many of the performance problems of earlier iterations of the Itanium processor, and recent benchmarks have been promising. The real question is: is this enough to make a real difference? We have our doubts. For the past two years, HP’s management has pursued a course of shrinking the company towards profitability. This may work to quiet Wall Street and HP’s myriad institutional investors, but it is not a great strategy for a company whose long-term hopes rest largely on its traditional status as an IT innovator. Say what you will about seasonal issues, tough times, and events beyond one’s control. At this stage of the game, the new and improved HP Way appears markedly unclear.
This week, the UK House of Commons Trade and Industry select committee, which is responsible for investigating such matters, took up growing concerns about broadband availability across the UK. A report in Silicon.com indicated that the committee intends to examine the overall health of the UK technology industry, including broadband penetration, e-government progress, and the UK’s overall ecommerce fitness.
Plenty of economists, using all the fingers on multiple hands, have debated the intent, impact, and outcome of free market economics, including the shifting of labor, capital, and knowledge over time and geography. Some would argue that these shifts are inevitable unless and until government intercedes, which then opens a smorgasbord of issues around regulation, privatization, and the inherent value of industry at a national level. Sageza has observed the recent infrastructure issues plaguing both the US and Europe with these thoughts in mind. America’s recent power outage and Great Britain’s broadband availability issues are just two examples of the national version of the innovator’s dilemma. If companies (or governments) simply give their installed base (i.e., citizens and enterprises) more of the same products they’ve always had, with only incremental or residual benefits, they run the risk of becoming slaves to their base and missing opportunities to provide real innovation. Even worse, if customers come to like another brand better, they switch, leaving their original supplier in the lurch. At a corporate level, this can lead to M&A or closure events. Its impact on a nation is a bit more complex, but arguably does not bode well for the government in charge. It would behoove the powers that be in developed nations and mature companies to rethink their value proposition and look for analogous disruptive technologies that can move them forward before start-ups (China and India?) steal their best resources out from under them.
What this means varies from country to country. Germany came to the biotech game five to seven years later than the UK or U.S., but it has since made up for it by dedicating significant time, energy, and publicity to creating specialized centers for biotech firms. Ireland offers tax advantages and other benefits to corporations to encourage businesses to place new plants and technologies in its territory. The government of Dubai has spent years establishing Dubai Media City and Dubai Internet City, physical areas where foreign ownership is allowed, taxes are lowered, infrastructure is state-of-the-art, and large companies have come in droves. Dubai is a particularly interesting case because Sheikh Mohammed of Dubai has approached this effort by strategically positioning the emirate for future technology (a risk!) and investing the capital to make it a reality. Time will tell if it works, but his is an example of government actively facing the innovator’s dilemma head on. The U.S. and Western Europe have many opportunities to leverage new technology over old, to invest in new skills and research and development for their populations, and to allow their citizens to make the most of their lives. In order to do this, some politicians are going to have to behave the way businesspeople are supposed to: with vision and a willingness to take risks and challenge the status quo. In addition, they will need to actively dismantle or alter the rules that prohibit new areas of business from truly taking off. For broadband is not about subsidizing rural folks so they can become city slickers, as many in the UK seem to believe. It is about investing in a communications infrastructure that allows a nation to embrace the industries that will drive the world well into the 21st century. In essence, this is about long-term investment vs. short-term profits.
Overall, developing a new (or maybe old) view of the value of investing will be necessary to succeed, especially where established systems are beginning to crack under the weight of their own successful history.