Market Roundup, October 20, 2006
Microsoft Virtually Opens Interoperability
Microsoft has announced that its virtualization format
technology will now be available under its Open Specification Promise (OSP).
The OSP is an irrevocable promise from Microsoft, first published in September
2006, to every individual and organization in the world, allowing them to use
certain patented technology for free, now and forever, when implementing
specified open standards. Microsoft is now applying the OSP to its Virtual Hard
Disk (VHD) Image Format specification. Microsoft’s VHD format has been
available since May 2005, and captures the entire virtual machine operating
system and application stack in a single file. Microsoft believes that with
this move it is fostering interoperability among commercial software solutions,
including open source. VHD has been adopted by more than sixty vendors,
including Brocade, BMC, Fujitsu Siemens, Network Appliance, and XenSource. The
VHD format supports migration across Microsoft Virtual Server, Virtual PC, and
the Windows Server virtualization planned for the forthcoming Longhorn release of Windows Server.
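The VHD specification Microsoft is opening up is short and concrete: every VHD file ends with a 512-byte footer whose fields (stored big-endian, beginning with the ASCII cookie "conectix") identify the image, its size, and its type. As a rough illustration of what implementers now get to build against royalty-free, here is a minimal footer-parsing sketch; the field offsets follow the published VHD specification, while the function and dictionary names are our own.

```python
import struct

# Disk type values defined by the VHD specification.
DISK_TYPES = {2: "fixed", 3: "dynamic", 4: "differencing"}

def parse_vhd_footer(footer: bytes) -> dict:
    """Parse the 512-byte footer that ends every VHD file.

    All multi-byte integers in the VHD footer are big-endian.
    Only a few illustrative fields are decoded here.
    """
    if len(footer) != 512:
        raise ValueError("VHD footer must be exactly 512 bytes")
    cookie = footer[0:8]
    if cookie != b"conectix":
        raise ValueError("not a VHD footer: bad cookie")
    # Current Size is a 64-bit integer at offset 48; Disk Type is a
    # 32-bit integer at offset 60.
    current_size = struct.unpack(">Q", footer[48:56])[0]
    disk_type = struct.unpack(">I", footer[60:64])[0]
    return {
        "cookie": cookie.decode("ascii"),
        "current_size": current_size,
        "disk_type": DISK_TYPES.get(disk_type, "unknown"),
    }
```

Because the footer is self-describing and sits at a fixed place, any vendor's tool can identify and size a VHD image without Microsoft's code, which is precisely the interoperability the OSP move is meant to encourage.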
Just as Microsoft made a commitment to security in the last
few years, it has likewise spent 2006 increasing its efforts to make
interoperability easier. It is working with various groups, including
virtualization technical collaboration with XenSource and the release of the
OSP for twenty-five Web service protocols. Microsoft has not been the best
interoperability partner in the past, but computing complexity is driving it to
change. Complexity is driven by a couple of things. One is the sheer volume of
entities in the corporate network. Trying to manage all of the applications,
edge devices, servers, storage, and network components is difficult. One of the
ways IT managers are seeking to gain control is virtualization, which makes
many physical pieces look like fewer, larger, logical pieces. SANs, NAS, and blade servers are some of the hardware
solutions in place. Virtualization technology for servers and applications is
the software approach. The whole point of virtualization is to free managers
from worrying about individual resources and allow them to manage their
computing resources at aggregate levels. This means of course that Microsoft
must have virtualization capabilities, and that others’ software can interact
with it. Neither Microsoft nor any other vendor can be the sole provider of a
virtualized data center, so if Microsoft wants to be part of a larger solution,
then it must work with other players, particularly with other software
developers. And if the specifications it uses require licensing, contracts, and
fees, the potential costs to developers and users could translate into lower
market share for Microsoft. That is not a place Microsoft wants to be on the
eve of launching new products.
Another issue driving complexity is matching the applications to virtual hardware and then providing it to users. As resources are pooled, IT managers need to rethink how they charge for and deliver IT to internal customers. This has given rise to the idea of service-oriented architectures (SOA) and Web services. The traditional building blocks of server and OS, network, and application are being broken into smaller component blocks that are combined in various ways with other software to deliver application services to users.
This evolution requires that some technology companies rethink the way their products are sold, licensed, used, and developed. Microsoft needs to make sure that certain specifications are adopted by as broad an audience as possible. By releasing them to the developer community for free use, Microsoft increases the likelihood that developers will incorporate its specifications into applications, particularly those in open source. Perhaps Microsoft has learned the adage: if you can’t beat ’em, join ’em, then beat ’em. With OSP, Microsoft is demonstrating that it understands where the industry is headed, and that it wants to play the game, even if it isn’t always writing all the rules.
LANDesk: Turning IT Management into Business Value
Recently LANDesk, now a wholly owned subsidiary of Avocent Corporation, released the latest version of its
process management solution, LANDesk Process Manager 3.0. New templates in the
release are targeted to help organizations define and follow processes covering
essential operations such as software distribution, operating system imaging,
configuration management, security and patch management, vulnerability scanning
and remediation, server management, software license monitoring, asset and
inventory management, and endpoint security management. New capabilities also
include interactive process support for a wide range of mobile devices,
including Blackberry and smart phones. In addition, the software includes tight
process integration with LANDesk Service Desk to support incident, problem, and
change management operations as well as new process actions covering the
management of security and software patch processes. To help organizations
comply with audit and governance requirements, the software now supplies
sophisticated Time-to-Complete and history reporting. The latest release now
offers out-of-the-box integration with other LANDesk management and security
tools while also supplying enhanced import/export capabilities of process
workflows to make the sharing of processes more straightforward. LANDesk
Process Manager 3.0 is available now through the company’s Expert Solution
Providers with recommended retail pricing starting at $35,000 for organizations
with fewer than 1,000 employees and rising to $190,000 for those with over
10,000 employees. The pricing includes limited initial professional services.
While the new platform contains many technical updates and
new features, it is perhaps the inclusion of new process templates that could
prove its most attractive feature. Many IT organizations today run their operations
without formal, documented procedures, making it difficult, if not impossible,
to standardize and accurately monitor key IT management and change processes. Today
many organizations are looking to run IT operations based on their
implementation of best practice models such as ITIL and COBIT. However, while
most employ many tools to manage various components of the technical IT
infrastructure, few have adopted tools that support the management and,
increasingly important, formal reporting of the many manual or semi-manual IT
processes still widely used in daily operations. The inclusion of process
templates could well form a base from which organizations lacking documented IT
processes can build their own tailored library of procedures.
There is a clear need for process management and process automation tools to become more tightly entwined in daily IT operations. Manual processes may suffer from occasional human errors, and it can be difficult to ensure that such processes are implemented uniformly within an organization. Tools that can automate many of these operations, either entirely or partially, now exist, and to our way of thinking should be utilized. Automation can save time and free up human resources, and automated IT processes are almost certain to be delivered more quickly and reliably.
All business operations ultimately depend on procedures, making process management unavoidable. Better management of IT processes gives organizations an opportunity to establish good procedures, allowing them to focus not only on the everyday administration of IT but also to begin addressing the holy grail of demonstrating how effectively IT is delivering business value. It could also help make auditing slightly less traumatic.
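To make the point concrete, even a bare-bones automated process runner buys an organization two things auditors care about: uniform step ordering and an execution record with Time-to-Complete for free. The sketch below is hypothetical and in Python; it is not LANDesk’s API, merely an illustration of the kind of audit record a process engine can produce.

```python
import time
from dataclasses import dataclass, field
from typing import Callable, List, Tuple


@dataclass
class ProcessRun:
    """Audit record for one execution of a defined IT process."""
    name: str
    steps_completed: List[str] = field(default_factory=list)
    started: float = 0.0
    finished: float = 0.0

    @property
    def time_to_complete(self) -> float:
        """Wall-clock duration of the run, in seconds."""
        return self.finished - self.started


def run_process(name: str, steps: List[Tuple[str, Callable[[], None]]]) -> ProcessRun:
    """Execute named steps in a fixed order, recording each for audit reporting."""
    run = ProcessRun(name=name, started=time.time())
    for step_name, action in steps:
        action()  # the automated work itself, e.g., push a patch
        run.steps_completed.append(step_name)
    run.finished = time.time()
    return run
```

A real process engine adds approvals, branching, escalation, and persistence, but the principle is the same: because the steps are defined once and executed by machine, the process runs identically every time and documents itself as it goes.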
Sun Microsystems: Lighting Up the Blackbox
Sun Microsystems has announced Project Blackbox, a simple,
instant-on modular datacenter targeting companies that are seeking rapid
deployment of datacenter infrastructure. Project Blackbox packages computing,
storage, and network infrastructure, along with high-efficiency power and
cooling, into modular units based on standard shipping containers. Project
Blackbox, as envisioned and engineered today, is a pre-configured, fully
contained datacenter, optimized for maximum density, performance, and
efficiency, as well as complete recyclability that is
designed to be rapidly and flexibly deployed anytime, anywhere. Sun states that
Blackbox has one-hundredth the initial cost, one-fifth the cost per square
foot, and 20% better power efficiency than a standard datacenter installation, and
can hold 250 Sun Fire servers, provide 2PB of storage,
or supply 7TB of memory. The company states that the unique form factor and underlying
technologies offer a range of potential new uses, including: rapidly deployed
Web 2.0 build-outs for organizations that have an ongoing need for datacenter
space but lack the time to design or build it; advanced military applications
with deployments anytime, anywhere; support for developing nations seeking to
deploy computing facilities to locations that lack the traditional power and
networking infrastructure; and oil exploration and seismic modeling that need
high-performance computing in remote locations, from offshore oil rigs to
underdeveloped regions of the world.
Project Blackbox is currently in the late prototype phase.
Sun has begun working with early customers, with early commercial availability
scheduled for mid-2007.
OK, sometimes Sun is on to something, and sometimes its ideas
are just plain weird. This time, we are happy to say we believe the company is
crafting another creative way to bring its products to market, or more
importantly, to new markets. Sun is a company that tends to thrive when it is
writing the rules of the marketplace. It tolerates early-stage competition, but
when the leading-edge ideals that Sun extols reach the mainstream, the company
often has trouble keeping up with the margin-eroding power of commoditization and
the sheer size of its trailing competitors. Thus, when Sun can define a new
market niche, or a strategy where it can stand on its own, in the lead, the
company quite often does very well with its first-mover advantage. It is ironic
that one of the most commoditized icons of the late 20th century, the standard
shipping container, is the place where Sun stands a chance to make a highly
differentiated entry in the data center marketplace.
By focusing on a self-contained modular approach, Blackbox
is independent of the supporting context of electricity, wired
telecommunications, a permanent building, and perhaps even running water. Much
like modular/temporary buildings that can be rapidly deployed to meet overcrowding
at local schools, the modular datacenter could be deployed to cover a short-term
need, e.g., a trade show or major athletic event, or as a longer-term solution
in a very remote locale, or even as a “drop in” to an un- or underdeveloped
building site. Deploying a traditional data center—perhaps aesthetically more
pleasing than a shipping container—certainly takes much longer and many more
resources and thus represents a significant fixed cost overhead, especially if
the term of use is relatively short or cyclical. Perhaps the container itself
could become a unit of IT processing capacity, marked not only with
height, width, depth, and weight capacities, but with TFLOPS
and TB as well. Further, for nomadic activities
such as oil exploration or seismic or volcanic study, being able to bring a
powerful datacenter up close to the center of activity, and then to move
out almost as easily as firing up the tractor and pulling the trailer away, is a
capability that would likely be well received.
While we might not want to be left inside one of these
containers in the middle of the desert if it were to run out of power, with
adequate supplies this biosphere of a datacenter changes the fundamental limits
of what can and cannot be done with existing datacenter technology. For the
military, we see obvious potential, as this is a group of organizations that
understands how to set up, tear down, and relocate rapidly on a worldwide
basis. Many of the same benefits would apply to civilian disaster recovery
efforts and other first-response scenarios. Given its frugal cost and
operational structure, the innards of the Blackbox might also prompt
consideration of how to improve the operations of existing traditional datacenters
as well.
Overall, we are impressed with Sun’s ingenuity and creativity on this one. Although at first glance this may appear to be another niche play for a ruggedized military solution, we believe it offers much more. Nevertheless, this will remain a simple exercise in “what if” if the company does not back up its creativity with solid market cultivation and sales performance. But given Sun’s prowess in thriving on what often seem at times to be strange business strategies, we are not so quick to write off Project Blackbox as a foolhardy endeavor. Just remember that fifty years ago Western shipping companies scoffed at the standardized shipping container, and note just how wrong they turned out to be.
Big Brother or Ramped-Up Security?
A consortium of European Union companies and University
College London have jointly developed a new airport security system called Optag. Based on RFID technology, the system was designed to
track passengers, staff, and crew inside crowded airports. Using video images
and transponders, the system can pinpoint the location of a person—or maybe
just the RFID tag—to within one meter’s accuracy. Regular RFID tags have a
range of only a few centimeters, but the new Optags
have a range of 10 to 20 meters. The tags don’t store any data, but they might
incorporate biometrics in the future, including scanning faces to match the RFID
information. Beyond tracking applications, it is claimed that the system could
be used to help evacuate airports and find lost children. It has not yet been
determined how the tags will be carried; wristbands are currently the
most talked-about method of application, but testing will determine the
viability of that approach. The Optag system is
scheduled to be tested in an airport in Hungary next month; if the tests are
successful, the system could begin to show up in airports all over the world
within the next two years.
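The announcement does not say how Optag combines reader and video data to reach one-meter accuracy, but the standard way to pinpoint a tag from several fixed readers is trilateration: intersecting range estimates from readers at known positions. The following two-dimensional sketch is our own illustration, not Optag’s algorithm; it linearizes the three circle equations and solves the resulting 2x2 system.

```python
def trilaterate(readers, ranges):
    """Estimate a tag's (x, y) position from range estimates to three
    fixed readers at known positions.

    Subtracting the first circle equation from the other two turns the
    problem into a 2x2 linear system, solved here by Cramer's rule.
    """
    (x1, y1), (x2, y2), (x3, y3) = readers
    r1, r2, r3 = ranges
    # Row for reader 2 minus reader 1.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    # Row for reader 3 minus reader 1.
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    if abs(det) < 1e-12:
        raise ValueError("readers are collinear; position is ambiguous")
    x = (c1 * b2 - c2 * b1) / det
    y = (a1 * c2 - a2 * c1) / det
    return x, y
```

With the noisy 10-to-20-meter ranges real RFID readers produce, a deployed system would fuse many readers (and, in Optag’s case, video) with a least-squares or filtering approach rather than exactly three range circles, but the geometry is the same.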
There are, of course, huge problems to be overcome with any
people-tracking technology. The first is: short of a permanent
application, how do you keep the tags attached to the people they are supposed
to be tracking? A wristband would be simple to slip off and exchange, even
without the other exchanger’s knowledge: the pickpocket updated to the pickband. Perhaps the biometric enhancements would help to
keep the correct tag with the correct person, but that would be another expense
to add on top of the already difficult and potentially expensive Optag installation. Sanctions against removing the tags
wouldn’t deter a wayward teenager bent on escaping observation, let alone a
determined terrorist on a suicide mission; it seems once again that law-abiding
citizens would be the ones to pay the price with their civil liberties.
Speaking of civil liberties, we can’t see this system being hugely popular. Recent to-the-death fights over instituting ID card systems show that people are not livestock and actively resist being branded. Perhaps Grandma doesn’t want anyone to know how many times she has visited the bathroom since checking in, and perhaps Dad is trying to slip away for a quick drink before his flight and needs some private time. Some societies still hold that citizens have the right to do such things without someone tracking and noting their movements. However, in this day and age of terrorismitis, it is increasingly easy for authorities to demand (and get from the public) ever greater invasions of privacy, especially in high-profile public spaces such as airports.
Then there is the slippery-slope argument: if the powers that be have no problem tracking their citizens in airports, how long will it be before tracking systems start sneaking into other areas of citizens’ lives? The next logical step would be to install them in schools “for the children’s safety.” Then sports arenas, subway systems, workplaces.... We speculate that if children become used to being tracked at school, then as they mature they won’t fight so hard when the systems start showing up elsewhere. Singapore, perhaps, is jumping for joy at the development of this technology. We see tracking people as a dangerous step into the territory of civil liberty violations. Sure, catching criminals would be relatively easy with an RFID tag on every citizen, but do we as citizens really want to trade that much freedom for safety? Anyone who would do that, so the saying goes, deserves neither.