As many of you are aware, Alex Brown will be the "Convenor" of the OOXML Ballot Resolution Meeting (BRM) that will run from February 25 through 29 in Geneva, Switzerland. Alex has a variety of unenviable tasks, including:
Trying to interpret various standing Directives and other ISO/IEC JTC1 rules and practices that were created for what might be described as kinder, gentler times (not to mention for shorter specifications).
Figuring out how to process c. 1,000 comments (after elimination of duplicates) during a 35-hour meeting week, with no possibility of an extension currently contemplated.
Herding 120 cats, some of whom will have strong opinions on individual points, others of whom will have alternative suggestions on how to resolve a given point, and many of whom may be just plain bewildered, due to the lack of time to be fully prepared.
For better or worse, the rules that Alex will be interpreting and applying are not as comprehensive, and certainly not as detailed, as the situation might demand to put everyone on exactly the same page regarding what should (or at least could) be done at many points in time. As a result, knowing how Alex's thoughts are shaping up is both interesting and important. To his credit, he has been generous about sharing those thoughts, and often how he arrived at them, at his blog, which can be found here.
While I've often linked to Alex's blog and have had a permanent link in the "Blogs I Read" category for some time, I'd like to point to Alex's latest entry, which covers several important points that others have recently blogged on. In many cases, Alex comes to different conclusions than some others who have stated firm opinions, and since Alex has the gavel, his opinion will be the one that counts.
On Wednesday, the Federal Trade Commission (FTC) announced the most important resolution of a standards-related enforcement action since Rambus, and possibly since its landmark settlement with Dell Computer in 1995. At issue was whether a licensing promise made by a patent-owning participant in a standards development process is binding upon someone that later owns the same patent. In a split 3 – 2 decision, the FTC has ruled that it is, when the later owner exploits the “lock in” of the marketplace by dramatically increasing the cost to license the patent in question.
The decision is significant for a number of reasons. First, the marketplace has long worried over whether such promises can be relied upon in the long term. Second, the sole business of the defendant in the action, Negotiated Data Solutions (N-Data), is licensing patents – in other words, a “troll,” in market parlance. Trolls are viewed by vendors and end users alike as a pernicious and increasing threat.
I was pleased to see the formal announcement yesterday of the OpenSAF Foundation, a new open source project that I've been helping form for the past several months. You can find the launch press release here, and I've also pasted it in at the end of this blog entry for archival purposes. The Web site for OpenSAF is here.
Here are the high level details, from the press release:
This organization is created to promote the development and adoption of an open source implementation of a high availability base platform middleware based on Service Availability™ Forum Specifications. The OpenSAF Foundation leverages the collective expertise of the telecom and enterprise computing industries to create a best-of-breed implementation. This open source project uses the widely accepted LGPLv2.1 license (GNU Lesser General Public License). Membership in the foundation is open to all organizations interested in the development and broad adoption of the OpenSAF code base.
The goals of the OpenSAF Foundation include evolving a broadly adopted base platform middleware not tied to any commercial implementation. This includes support for the SA Forum™ Application Interface Specification and alignment with the requirements of the SCOPE Alliance. Additional services will be developed that may be leveraged by enterprise computing companies, network equipment providers, industries requiring high availability, and independent software vendors.
As you can see, OpenSAF is an implementational chicken to an already existing open standard egg developed by the Service Availability Forum, as well as a continuing buildout of an important telecommunications ecosystem in which other organizations (such as the SCOPE Alliance) are already active. The need to create such interdependent organizations, working on a collaborative, additive basis, is increasingly important in today's converging and expanding, telecom- and Internet-based world.
If you're reading this blog entry, you've probably been following the battle between ODF and OOXML. If so, you may be thinking of that conflict as a classic standards war, but in fact, it goes much deeper than that label would suggest. What is happening between the proponents of ODF and OOXML is only a skirmish in a bigger battle that involves a fundamental reordering of forces, ideologies, stakeholders, and economics at the interface of society and information technology.
Today, open source software is challenging proprietary models, hundreds of millions of people in emerging societies are choosing their first computer platforms from a range of alternatives, major vendors are converting from product to service strategies, and software as a service is finally coming into its own - to mention only a few of the many forces that are transforming the realities that ruled the IT marketplace for decades. When the dust settles, the alignments and identities of the Great Powers of the IT world will be as different as were the Great Powers of the world at the end of the First World War.
It is in this light that the ODF vs. OOXML struggle should really be seen, and for this reason I've dedicated the latest issue of Standards Today to exploring these added dimensions on the eve of the OOXML Ballot Resolution Meeting that will begin on February 25 in Geneva, Switzerland.
Regulators in the EU today announced that they are opening two new investigations against Microsoft, this time focusing not on peripheral functionalities like media players, but on the core of Microsoft's business: its operating system and office suite software. The investigations are in response to a recent complaint filed by Norwegian browser developer Opera Software ASA and a 2006 complaint brought by the European Committee for Interoperable Systems (ECIS), which includes Microsoft rivals IBM, Nokia, Sun, RealNetworks and Oracle among its members.
Both investigations focus on the benefits that Microsoft gains by combining features, such as search and Windows Live, into its operating system. But the investigation sparked by the Opera complaint also includes some novel and interesting features, based upon Opera's contention that Microsoft's failure to conform Internet Explorer to prevailing open standards puts its competitors at a disadvantage (Opera also asks that either IE not be bundled with Windows, or that other browsers, including its own, should be included as well, with no browser being preset as a default).
The investigations will also look into whether Microsoft has failed to adequately open OOXML, or to take adequate measures to ensure that Office is "sufficiently interoperable" with competing products. This would seem to indicate that Microsoft's strategy of offering OOXML to Ecma, and then ISO/IEC JTC1, may fail to achieve its objective, whether or not OOXML is finally approved as a global standard.
Yesterday, the Linux Foundation posted the first in what will be an ongoing series of podcasts with open source leaders, visionaries and pioneers. The interviews will each be conducted by LF Executive Director Jim Zemlin, and naturally, the first interview is with Linus Torvalds. Appropriately, the series is called "Open Voices." In this first podcast, Linus shares his thoughts on Life, the Open Source Universe, and Everything. Here are a few samples (I'll add more later in this blog entry):
Regarding Linux on the desktop: “The Linux desktop is why I got into Linux in the first place. I have never, ever cared about really anything but the Linux desktop… I don’t worry about the desktop on a technical level because I think that’s the first thing that most kernel developers will really put their efforts in.”
On patent trolls: “Yeah, they’re kind of like the terrorists that you can’t bomb because there’s nothing there to bomb. There are just these individuals that don’t have anything to lose. That breaks the whole cold war model and seems to be one of the reasons that even big companies are now starting to realize that patents and software are a really bad idea.”
Other topics include the Linux development process (including internationalization), the commoditization of the desktop, cracking the code for mobile Linux, GPLv3, OpenSolaris, the future of Linux, and much more.
This is the fifth chapter in a real-time eBook writing project I launched and explained in late November. Constructive comments, corrections and suggestions are welcome. All product names used below are registered trademarks of their vendors.
Chapter 5: Open Standards
One of the two articles of faith that Eric Kriss and Peter Quinn embraced in drafting their evolving Enterprise Technical Reference Model (ETRM) was this: products built to "open standards" are more desirable than those that aren't. Superficially, the concept made perfect sense – only buy products that you can mix and match. That way, you can take advantage of both price competition as well as a wide selection of alternative products from multiple vendors, each with its own value-adding features. And if things don't work out, well, you're not locked in, and can swap out the loser and shop for a winner.
But did that make as much sense with routers and software as it did with light bulbs and lamps? And in any event, if this was such a great idea, why hadn't their predecessors been demanding open standards-based products for years? Finally, what exactly was that word "open" supposed to mean?
To answer these questions properly requires a brief hop, skip and jump through the history of standards, from their origins up to the present. And that's what this chapter is about.
It's not often I find myself at a loss for words when I read something, but this is one of those times.
Or perhaps it would be more accurate to say that it isn't really necessary for me to add any words to the following news, other than to characterize them with a Latin phrase lawyers use: Res ipsa loquitur, which translates as "the thing speaks for itself." I'll give one clue, though: I've added this blog post to the "ODF and OOXML" folder. That's "OOXML" as in "the world must have this standard so that our customers can open the billions of documents that have already been created in older versions of" a certain office productivity suite.
So without further ado, here's the news, along with what a few other people have had to say about it [Update: see also the comments that readers have added below interpreting the original Microsoft information]:
This is the fourth chapter in a real-time eBook writing project I launched and explained in late November. Constructive comments, corrections and suggestions are welcome. All Microsoft product names used below are registered trademarks of Microsoft.
Chapter 4 – Eric Kriss, Peter Quinn and the ETRM
By the end of December 2005, I had been blogging on ODF developments in Massachusetts for about four months, providing interviews, legal analysis and news as it happened. In those early days, not many bloggers were covering the ODF story, and email began to come my way from people that I had never met before, from as far away as Australia, and as near as the State House in Boston. Some began with, "This seems really important – what can I do to help?" Others contained important information that someone wanted to share, and that I was happy to receive.
One such email arrived just before Christmas in 2005. In its entirety, it read:
Enjoy reading your consortiuminfo blog ... keep it up.
Happy New Year,
Eric Kriss
This was a pleasant and welcome surprise. Until the end of September, Eric Kriss had been the Massachusetts Secretary of Administration and Finance, and therefore Peter Quinn's boss. Together, they had conceived, architected and launched the ambitious IT upgrade roadmap that in due course incorporated ODF into the state's procurement guidelines.
Back in March of 2006, I interviewed Alan Cote, the Supervisor of Public Records in the Public Records Division of the Massachusetts Secretary's office. Alan had testified back in October of 2005 in the hearing where Peter Quinn had been called on the carpet by Senator Marc Pacheco, the Chair of the Senate Committee on Post Audit and Oversight. At the Pacheco hearing, Alan had professed neutrality about ODF, but also doubts that document formats could provide a useful tool for document preservation.
What struck me most forcefully at both the hearing as well as the interview was that Alan presumably should have been one of the biggest proponents of open formats, rather than a doubting Thomas. Why? Because the process he now follows to preserve electronic documents seems almost comically cumbersome and tedious. Briefly summarized, it involves recopying every single electronic document every five years or so onto new media (electronic media degrade surprisingly rapidly) in multiple formats (because formats are regularly abandoned). Shouldn't someone stuck with such a chore be desperate to find a better way?
Apparently, preserving documents is child's play compared to preserving modern movies, especially those created initially in digital form. How bad - and expensive - is that? According to a 74-page study released by the Academy of Motion Picture Arts and Sciences (AMPAS) to a limited audience in early November, preserving a full-length digital movie can cost $208,569 (dramatic pause) per year. The reasons are exactly the same as for digitized documents, and the currently available means of preservation are the same as well. The amount of data involved, however, is vastly greater - and no commitment to remain faithful to a single standard (yet) exists to ensure that future technologies will be able to display movies created using today's techniques.
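To put that annual figure in perspective, here's a back-of-the-envelope sketch of what ongoing preservation adds up to. The $208,569-per-year cost is the figure quoted from the AMPAS study; the 50-year retention horizon and the roughly five-year media-refresh cycle (the same cadence described above for electronic documents) are my own illustrative assumptions, not numbers from the study:

```python
# Cumulative digital-preservation cost, sketched from the AMPAS figure above.
# The horizon and refresh cycle are illustrative assumptions.

ANNUAL_COST_USD = 208_569    # per full-length digital movie (AMPAS study)
HORIZON_YEARS = 50           # assumed retention horizon
REFRESH_CYCLE_YEARS = 5      # media recopied roughly every five years

total_cost = ANNUAL_COST_USD * HORIZON_YEARS
migrations = HORIZON_YEARS // REFRESH_CYCLE_YEARS

print(f"Media migrations over {HORIZON_YEARS} years: {migrations}")
print(f"Cumulative cost per movie: ${total_cost:,}")  # over $10 million
```

Even under these rough assumptions, a single title runs past ten million dollars over a half-century - which is why the absence of a stable, committed format standard matters so much here.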