July 22, 2005
This week HP stirred up a flurry of analysts and press as it announced yet another restructuring. This time the company announced it was going to lay off some 14,000 employees worldwide — although the company couldn’t give us a better idea of what the geographic breakdown would be — and it also broke up the Customer Solutions Group (CSG), merging it back into the product groups: Technology Services Group (TSG), Imaging and Printing Group (IPG), and Personal Systems Group (PSG). HP claims most jobs lost will be in back-office functions and not in sales or research and development.
Since Mark Hurd took over the helm, he has spent a lot of time focused internally. What we’ve seen externally has been the reversal of several of the last big moves of Carly Fiorina’s regime. These changes are meant to help HP find its footing so it can move forward, but because the renovation is a work in progress, it has raised more questions at this point than it has answered. In particular, there has been a lot of focus on the channel. HP has made several moves in the channel, and the channel was briefly discussed this week. Hurd indicated that he is interested in partners offering full HP solutions, and reaffirmed that within that context, partnership and the indirect model remain important assets to HP. At the same time, all is not rosy in resellerland. There has been friction between HP’s direct telemarketing group, based in Colorado Springs in the U.S., and the channel; however, there was no mention that any changes would take place in the telemarketing group in light of this announcement. As HP continues to grow its business in the consumer markets in particular, this will be an interesting area to watch. However, once again we must caution HP that shrinking to growth is not a viable long-term strategy.
Despite the churning, HP is not alone in recent reorganizations and layoffs. IBM also announced staff reductions and a reorganization of its European infrastructure. At the heart of both HP’s and IBM’s staffing shuffles is a desire to present better routes to market for their products and to provide better experiences for their customers. The problem is endemic in the industry but appears on a grander scale at companies such as HP and IBM because they sell a greater number of products and solutions to a wider customer base. At issue is a high-tech industry with a product-oriented structure selling into a solution-oriented market, where solutions involve a mix of products and services adapted to a particular vertical industry or horizontal application. To sell efficiently, vendors ideally should know their clients’ businesses and vertical needs, and should carry a broad portfolio of offerings beyond the products they create themselves. The difficulty, of course, is that a vendor must have both product development capabilities and a competent, locally oriented sales team, and must work out how those two groups are meant to work together internally. This is the challenge that will continue to vex the vendors in the near term, and to some degree the channel will continue to serve as a virtualization layer between the two, as will professional services organizations. Companies will continue to reorganize, and hopefully will either get closer to their objective or have the insight to know when they’ve erred and be agile enough to change. Businesses don’t sit still, so we expect reorganizations to happen as much as a sign of healthy growth as out of a need to fix problems. We hope HP has found the right mix to get itself back on track.
Documents from the SCO lawsuit against IBM for copyright infringement have been unsealed by the court and were viewed publicly for the first time this week. Among the documents released was an August 13, 2003 email from one of the SCO employees charged with determining how much SCO-copyrighted UNIX code had made its way into the Linux kernel and various shell components. The memo indicates that after a four- to six-month investigation, it was determined that there was “absolutely nothing” in the way of evidence that Linux vendors like Red Hat had copied SCO-controlled UNIX code. The memo’s date preceded the SCO court filings and the public threats against Linux users and vendors that came in the years following. The judge overseeing the case did not dismiss it even though this memo was presented to the court by IBM, which obtained it through the discovery process.
Over the course of the past few years SCO has been making menacing noises about suing Linux users and vendors and has proposed that these companies buy licenses, which many vendors hinted was extortion. It is unknown how many companies decided to seek protection from potential copyright infringement legal action, but the community at large has repeatedly asked in full throat that SCO present the evidence. And for the same amount of time SCO has refused to do so. Now we know why.
SCO decided it would pursue a strategy of generating revenue through licensing of what it claimed was its property. Now it becomes clear not only that no such claim can be made, but that SCO knew it all along. We would not be in the least bit surprised if SCO became the target of legal action itself, possibly even reaching the criminal level. And it remains to be seen what in fact is left of SCO itself, which seems to be relying more on copyright licenses than on actual product for revenues. We suspect the company is going to have a very hard time continuing as revenue and income continue to fall year over year. SCO may also find it harder to forge working partnerships with other vendors, especially any it may have threatened in the past. Karma can be a real pain, especially on the “comes around” part of the cycle.
Microsoft has filed a lawsuit against Google in an effort to prevent a former Microsoft executive from joining Google. The former executive, vice-president Kai-Fu Lee, was one of the leading developers of the Redmond company’s search engine efforts, as well as being instrumental in establishing a Microsoft research facility in China. Mr. Lee had worked for Microsoft for seven years, five of those as vice-president. The lawsuit claims that Lee violated the terms of his Microsoft contract, which included a one-year non-compete clause. Lee is expected to set up a China-based research center for Google.
Threats of these types of lawsuits, meant to enforce non-compete clauses, are frequent in the high-tech world, where the intellectual property contained in people’s heads can be the crown jewels of the company they leave. Lee may well have a great deal of information to share with his prospective new employer, if indeed he is allowed to work at Google. Lee told his former employer in June that he had decided to leave Microsoft, so the question for the court will be the enforceability of that non-compete clause.
We’re not lawyers; we’re not going to speculate about such matters. What is more interesting to us is the re-emergence of an increasingly aggressive Microsoft when it comes to defending its turf. It wasn’t all that long ago that Microsoft found itself focusing its efforts on blunting the momentum of tiny Netscape, which was giving away free Web browsers and promoting the notion that desktop operating systems like Windows would soon be obsolete. Thus began the now-quaint “browser wars,” in which Microsoft overcame a 4-to-1 Netscape advantage in browser share to the point where it now dominates the browser market. Microsoft has been working hard in recent years to build up not only content but ways to index and order that content. Its efforts to build its own search technology continue apace. Google, meanwhile, has not only built up a huge, searchable index of content; it has also been providing services like email and storage that compete directly with Microsoft offerings. Microsoft’s concern with Lee’s new place of employment suggests that the company is taking the competitive threat from Google very seriously, to the point where Google may now be competitive threat number one at Redmond. Stay tuned; more fireworks are sure to come.
Teleste, a Finnish broadband equipment maker, has announced that early next year it will offer Ethernet service to homes, promising connection speeds of up to 100Mbps. This is up to fifty times faster than the average speed most broadband-connected homes now enjoy. Fiber networks can already deliver such superfast connections, but because fiber is so costly, the economics of service delivery haven’t brought it to the average home consumer, especially in North America. Teleste, however, has devised a way to run Ethernet over cable television networks. The technology is currently being tested in the Netherlands, and will soon become available to consumers at a cost of between $61 and $241.
This is all well and good. Nobody wants their pipeline to the Internet to suffer from congestion, but with connection speeds of up to 100Mbps, the pipeline isn’t going to be the problem. Today’s technology gives the average user a connection of undreamt-of speed, but still won’t produce a difference in information throughput because of the constraints of the consumption device, be it a PC, laptop, PDA, or something else. In the early days of dialup Internet, the constraint was definitely the connection speed. Desktops could certainly handle more information at a faster rate, but the information was merely trickling in, and people were frustrated by the amount of time it took to load a Web page. Now connection speeds have risen, and even though most desktops are equipped with 100Mbps or even Gigabit Ethernet, the Internet connection is a mere 1.5 to 6Mbps. This is an easy load for most desktops today; hence some balance between network throughput, local processing throughput, and human patience has been achieved for most users. With Teleste’s new technology, however, things could quickly become unbalanced again, as the logjam comes not from the network wire speed but from the I/O capabilities of the device and the capacity of the network’s routers and servers. The promise of the technology is compelling, but the reality is more likely to be akin to driving a bright red Ferrari through a school zone at 25 mph. The potential is there, but with some notable constraints.
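The balance described above is easy to see with back-of-envelope arithmetic. A minimal sketch, using the link speeds mentioned in the text plus old-fashioned dialup for contrast (the 5MB page payload is an illustrative figure we chose, not a measured one):

```python
# Raw transfer time for a hypothetical 5MB Web page payload
# at several link speeds. Illustrative figures only.

PAYLOAD_BITS = 5 * 8 * 1_000_000  # 5 MB expressed in bits

link_speeds_mbps = {
    "dialup (56Kbps)":            0.056,
    "typical broadband (1.5Mbps)": 1.5,
    "fast broadband (6Mbps)":      6.0,
    "Teleste Ethernet (100Mbps)":  100.0,
}

for name, mbps in link_speeds_mbps.items():
    seconds = PAYLOAD_BITS / (mbps * 1_000_000)
    print(f"{name:30s} {seconds:8.1f} s")
# dialup: ~714 s; 1.5Mbps: ~26.7 s; 6Mbps: ~6.7 s; 100Mbps: ~0.4 s
```

At 100Mbps the wire delivers the page in a fraction of a second, so any remaining wait is the device and the far-end servers, not the connection, which is exactly the rebalancing we describe above.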
If you build it, will they come? Maybe, but not yet. There is a segment of the consumer population, seduced by the latest and fastest, that will indeed sign up for 100Mbps even if they never see that level of throughput, but many more will probably see this as a neat new gadget without much relevance to them yet. Right now we see Teleste’s new technology as something that could be meaningful for enterprises rather than consumers, as it may prove an inexpensive way to wire short-distance or campus-based WANs, or even an improvement over long-distance WAN connections currently in place. How long will it be before desktops can drink continuously from a 100Mbps firehose? Who knows? It may prove exciting when one can drink that fast, but the number of consumers with the stomach for that level of imbibing would seem limited indeed.
A bill pending before Canada’s Parliament, which is designed to amend Canada's Copyright Act to bring it more in line with the 1996 World Intellectual Property Organization treaty, could make the caching of Web pages or other Internet content a violation of copyright. Bill C-60, which addresses file sharing, anti-copying technology, and ISP liability, would tighten the Copyright Act and is scheduled for debate and initial voting in the House of Commons after Parliament's summer break. C-60 would enhance the Copyright Act in many aspects favorable to the entertainment industry and make it similar to the protections now being offered by the Digital Millennium Copyright Act in the United States. Present Canadian law allows copyright holders to sue without any previous notification when they believe that their material is being infringed. Under C-60, prior notice to the infringing party would be required before a suit for monetary damages could be filed.
One of the really neat things about the Internet is that just about anyone can post information from most anywhere, and it can be read from most anywhere. While this is a boon to researchers and academics, some of the original users, to many a copyright holder the Internet has come to be viewed as a giant black hole into which intellectual property is sucked and reproduced at great speed, to the detriment of the copyright holders. While some would still argue that the Internet is a free frontier with no need to respect any sort of restraint on information, the reality is that most people, and the governments representing them, realize that taking the intellectual property of others without legitimate compensation is theft, plain and simple. But what of the doctrines of fair use that exist in various copyright codes throughout the world? And perhaps more importantly, is the relatively new ability to temporarily copy something, arguably for fair use, actually an infringement of the same sort as wholesale counterfeiting, piracy, and other nefarious acts?
In case the reader needs reassurance, we believe strongly in intellectual property rights and the protections of copyright. However, we are concerned when a standard emerges holding that caching a document is in effect copying it and potentially infringing on its owner’s copyright. If a cache is in fact an infringing copy, does this apply to every individual who happens to view a page (legitimately) and then caches it in their browser? What about Google or Yahoo!? Or, for that matter, what about Web page accelerators or even cache-enabled networking hardware? Hopefully the Canadian bill will be adjusted so as to offer realistic protection while not crippling the uniquely valuable capabilities of the Internet. If every disgruntled Web page publisher started suing major search indices, corporations, or even private users, the chilling effect could cause vendors simply to forgo caching altogether, gutting a substantial portion of the value proposition. It seems to us that caching a document that is publicly available and copyrighted is a fair use that denies the owner neither revenue nor distribution control over the content. If the content is so valuable that it should not be cached by Google or anyone else, then it should not be posted in a public place on the public network, whether copyright-protected or not.
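It is worth remembering that publishers already have technical means to discourage caching without recourse to the courts. A sketch of the standard HTTP/1.1 response headers a server can send to ask downstream caches not to retain a copy (the body values shown are the conventional ones; the exact set a given site needs will vary):

```http
HTTP/1.1 200 OK
Content-Type: text/html
Cache-Control: no-store, no-cache
Pragma: no-cache
Expires: 0
```

In the page markup itself, the robots meta tag directive `noarchive` asks search engines such as Google not to offer a cached copy of the page. Whether content served to the public without any such hints should be treated as implicitly cacheable is, of course, exactly the question C-60 leaves open.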