June 24, 2005
IBM has announced that Cisco Systems has selected its WebSphere middleware as one component of Cisco's Application Oriented Network (AON) initiative, which is designed to give networking hardware application intelligence. Cisco announced the AON initiative this week, and CEO John Chambers stated his belief that the new architecture represents a fundamental change in IT deployment. Cisco chose IBM's WebSphere MQ messaging software as part of this effort. Officials from both companies said the addition of intelligent application message handling will allow the network and applications to work as a much more integrated system in areas such as security, routing, and application performance.
While the AON initiative is still for the most part announcement-ware, the potential to offload routine and mundane computing tasks to elements of the network makes a great deal of sense. To date, intelligence has resided mainly at the originating and arrival points of the network, but with a continuing litany of new issues facing IT environments, it makes sense to continue the push toward more distributed computing environments. Of course, Cisco hopes that by creating a new generation of intelligent network products, it will open a new wave of network upgrades, with new Cisco products installed for the sake of keeping up with the Joneses, IT style.
And Cisco will in all probability succeed in doing so. A host of issues has plagued IT deployments, but by putting more intelligence into routers and switches, many of these issues can be addressed in a more holistic and distributed fashion. By allowing application performance to be monitored and improved in the network itself, tuning the network for better performance comes much closer to reality, not to mention opening up new tiered-pricing service arrangements. Furthermore, intelligence in the network could be a boon to security, turning thousands of switches and routers into sentries that could detect and counter threats such as viruses, denial-of-service attacks, and other intrusions, thus boosting security on the network by orders of magnitude. For IBM, the opportunity to take its middleware into the network itself offers a significant revenue opportunity, as well as the means to actually improve the quality of services and performance afforded by that network. Sounds like a good idea to us.
EMC announced this week that it is expanding its consulting and services business to meet a growing demand from customers for Information Lifecycle Management (ILM) capabilities and deployments. EMC said it will be offering these services worldwide not only directly but also through business partners. Among the services offered will be EMC Classification and Policy Services, EMC Architecture and Consolidation Services, EMC Storage Management Optimization Services, and EMC Information Protection Services. EMC's existing consulting services have already completed more than 600 customer engagements in the past two-plus years.
EMC has been successful in executing a strategy that adds value to storage hardware by adding new management and virtualization software, among other offerings, to its product line. By doing so, the company hopes to avoid being caught in a commodity market of storage hardware devices. Given EMC's healthy financial reports over the past few quarters, it would seem that this strategy is paying off handsomely. If the company can execute with similar focus on providing services to customers, we expect those quarterly reports to continue to shine.
EMC is certainly not going out on a limb in expanding these services. Given the ongoing demands to improve, enlarge, consolidate, secure, and optimize storage environments in enterprises of all sizes, we expect the market for the company's services to be robust. With demand for storage capacity doubling each year, and with greater regulatory requirements and more rapid response to partner and customer demands, storage environments are not only getting more complex but are also being required to perform much more granular and refined information management. Alas, for many companies the expertise to deploy such storage footprints is scarce, so turning to vendor expertise and its knowledge base can remove many of these headaches before they actually occur. We also note that EMC plans to offer services through business partners, a smart move in our mind because it can only expand the market opportunities for such services. Based on EMC's execution of its software strategy, we believe that its services offerings will be in great demand indeed.
As part of a celebration recognizing the shipment of its 10-millionth ProLiant server, HP has announced that its new entry-level ProLiant servers will be equipped with AMD's dual-core Opteron processors. HP has been offering dual-core Opteron solutions in its high-end DL585 and BL45p models, but this announcement represents the first time that the Palo Alto giant is bringing dual-core down scale. The newly announced systems are as follows: The ProLiant DL385 is designed for business-critical applications including databases, mail and messaging, and enterprise resource planning. The ProLiant DL145 G2 is the second-generation DL145 server that now supports dual-core Opteron processors and is targeted at high-performance technical applications and entry-level remote management for SMBs. HP also announced dual-core support for its BladeSystem with the ProLiant BL25p and BL35p. In addition, the company announced an enhanced Factory Express configuration and installation service for the BladeSystem and a new HP Care Pack service offering for the HP BladeSystem. The dual-core HP ProLiant DL385, DL145 G2, BL25p, and BL35p are expected to be available June 27. The dual-core DL385 starts at $3,299, the DL145 G2 at $1,219, the BL25p at $3,099, and the BL35p at $2,599.
For most industries, with the exception of hamburgers and tacos, selling 10 million of anything is a pretty good achievement. In the case of servers it is no world record, but it does merit a salute. But perhaps what is more interesting here than the past 10 million servers shipped is what the future of servers could look like and the impact it could have on the marketplace. While dual-core processors and the Opteron are not new, what we find interesting is that this processor, which so many vendors tried their darnedest to pigeonhole or somehow relegate to niche solutions, is finally being treated as a respected member of the general computing community. Add to this the leading-edge dual-core capability, and we see that a state-of-the-art technology is neither being relegated to the rarefied air of high-performance computing nor limited to some other niche. This must be making the folks at AMD happy, and proving to be a continuing thorn in Intel's side.
The speed at which dual core has begun its move into the mainstream is remarkable, as these two-CPUs-in-one solutions have already wreaked havoc on software pricing models, as well as on customer expectations. The historic notions of what constitutes a CPU, a multi-CPU system, or a cluster of systems are also becoming archaic. Nevertheless, these new systems point to a future in which innovation moves more rapidly from specialty and high-end solutions to the entry level. For customers this is a boon, as new technologies will come to market more rapidly and inexpensively. For vendors, it is a reminder that building solutions from industry-standard components is not a differentiator in and of itself. Rather, how these components and other proprietary expertise are woven together, financed, and supported creates added value and competitive advantage. This is something that HP has demonstrated it understands, and it may prove to be an important arrow in its competitive quiver against the entry-level Opteron servers being plied by Sun Microsystems.
In the wake of increasing identity theft, California is in the process of adopting legislation to close a loophole in consumer protections. Currently, credit card companies must notify cardholders if their information is electronically hacked, but not if paper or tape records are compromised. The proposed legislation introduced by state Senator Debra Bowen (D) has been initially approved by a committee vote of 6-3.
Long accustomed to dealing with confidential and sensitive information, individuals within the financial world are generally not crooks, and only a finite number of people have access to consumer records. The problem comes mainly from outside hackers and rogues who gain access. At the moment, keeping track of buying habits and notifying the customer of anything unusual is a valuable service for both the cardholder and the financial company, as reductions in fraud make it easier for companies to lower credit card clearing fees. However, preventing a mess is certainly preferable to cleaning one up, and to that end, a more robust yet more transparent approach to security may be warranted. When the Internet first became widely used, consumer confidence in Web-based transactions was very slow in gaining a foothold. Cardholders were suspicious of hackers, and rightly so. However, the IT industry spent considerable time and money on security measures to convince consumers that their credit card information was secure, and the payoff was an almost exponential increase in Internet business. But since hackers will not be daunted for long, it is now time for the industry to again step up its measures to keep cardholder information secure.
To our way of thinking, consumers need to know who is compiling their information, what they are doing with that information, and with whom they are sharing it. In addition, consumers need to have some control over how personally identifiable information about them is ultimately used. A finer-grained way of managing cardholder information is desirable, perhaps incorporating a consumer-driven "Do Not Compile" list on the order of the national telephone Do Not Call list. This would mean giving credit card customers more power over their own information, but that would not necessarily be a bad thing for the industry. If cardholders felt that they had more control over their own information, it naturally follows that they would have more confidence in the process. A massive loss of trust would be far more devastating to the industry than the initial monetary outlay for increased security and a partial transfer of power. At a time when the number of Internet-based and other automated transactions only continues to grow, maintaining and nurturing the trust of cardholders is in the best interest of all involved.