With the full specification in that state, the PDF formats will once and for all abandon the rather confusing, schizophrenic existence that they have maintained to date. Over time, more and more (but not all) of the specification had …
It's been an unusually active week in the contest between the already ISO-adopted ODF and OOXML, as the latter moves through the first step of the ISO adoption process. More specifically, Ecma submitted OOXML to the ISO/IEC Joint Technical Committee 1 (JTC1) on January 5, starting the clock on the traditional one-month "contradictions" period that begins the "fast track" process in JTC1. OOXML, however, is no traditional specification, weighing in at over 6,000 pages. During this phase, eligible JTC1 members can note ways in which the proposed standard overlaps other standards, fails to incorporate available ISO standards, or otherwise does not meet ISO rules (a second, five-month period will begin on February 5, during which technical and other objections may be raised).
With OOXML formally launched within the JTC1, both sides have pulled out all the stops to influence the national bodies eligible to participate, as well as the public at large. Here's a chronology of the principal events of just the last seven days, and how they fit into the overall scheme of things:
I'm pleased to share some news that I expect you'll be reading about in lots of other places today: Open Source Development Labs (OSDL) and the Free Standards Group (FSG) signed an agreement yesterday providing for the two groups to combine forces to form a new organization – The Linux Foundation. The result of this consolidation will be to dedicate the resources of the combined membership to "accelerate the growth of Linux by providing a comprehensive set of services to compete effectively with closed platforms." You can read the press release here, as well as a detailed article by Steve Lohr of the New York Times here (the article will appear in Monday's print edition of the NYT).
Jim Zemlin, currently the Executive Director of FSG, will lead the new organization, which will include as members every major company in the Linux industry, including Fujitsu, Hitachi, HP, IBM, Intel, NEC, Novell, Oracle and Red Hat, as well as many community groups, universities and industry end users. The necessary member votes are currently being taken by OSDL and FSG, and the transaction is expected to be finalized on or about February 2, 2007. (Disclosure: I am a director of, and my firm is legal counsel to, FSG.)
Starting with the somewhat silly, OOXML does not conform to ISO 8601:2004 “Representation of Dates and Times.” Instead, OOXML section 3.17.4.1, “Date Representation,” on page 3305, requires that implementations replicate a Microsoft bug that dictates that 1900 is a …
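The bug in question is the long-standing spreadsheet convention, inherited from Lotus 1-2-3, of treating 1900 as a leap year, so the serial-number date scheme contains a phantom February 29, 1900. To make the implementer's burden concrete, here is a minimal sketch of a converter from a 1900-epoch serial number to an ISO 8601 date string; the function name and structure are mine for illustration, not anything prescribed by the OOXML text.

```python
from datetime import date, timedelta

def serial_to_iso(serial: int) -> str:
    """Convert a 1900-epoch spreadsheet serial number to an ISO 8601 date.

    In the legacy scheme, serial 1 is 1900-01-01, but because 1900 is
    (wrongly) treated as a leap year, serial 60 denotes the nonexistent
    1900-02-29. Every serial above 60 must therefore be shifted down by
    one day to land on a real calendar date.
    """
    if serial == 60:
        raise ValueError("serial 60 is the fictitious date 1900-02-29")
    if serial > 60:
        serial -= 1  # compensate for the phantom leap day
    return (date(1899, 12, 31) + timedelta(days=serial)).isoformat()
```

So `serial_to_iso(59)` yields `1900-02-28` and `serial_to_iso(61)` yields `1900-03-01` – any implementation that simply counts days from the epoch without the special case will be off by one for every date after February 1900, which is exactly the kind of quirk an international standard would normally avoid enshrining.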
Three news clips I posted yesterday highlight the disorderly, but ultimately productive, way in which products and standards evolve in tandem during times of innovation.
The first clip is a press release announcing the completion of a new USB connector design standard. A really small connector design. How small? That large object to the left of the connector in this picture looks like a flat camera battery to me. The new standard exists because mobile devices are getting smaller even as more and more features (camera, video, music, wireless, etc.) are being crammed into a single mobile device.
The second article provides an update on a delayed wireless USB standard that would eliminate the need for a connector at all. And the third announces the completion of a new WiFi-compatible standard to make mobile security as easy to set up as the push of a button – whether you are accessing data wirelessly or physically, using a USB connector. As the standard is extended, other wireless standards (like Near Field Communications – a very short range standard) will be supported as well.
You can expect that the devices you buy in the next year or two will use one, two or even all three of these standards. Or you might see completely different connection and data-transfer standards in use, all chosen from what might be called a "swarm" of overlapping standards that are continuously being developed around mobile devices. Logically, you might wonder whether this is a good thing, or simply yet another case of too many standards being created to do the same job.
In this case, I think it's the former rather than the latter. Let's see if evolution in the physical world provides an example of why this would be so.
A long story by David Strom at InformationWeek called Five Disruptive Technologies to Watch in 2007 couldn't help but catch my eye on New Year's Day. The reason is that all five technologies, and the strategies of the vendors that are promoting them, rely upon standards – in most cases, fundamentally. That's no surprise, because disruption by definition is painful, and no one in the supply chain (including end users) likes pain. Providing a convincing argument why the resulting pleasure will more than offset the pain is therefore imperative.
One type of pain that vendors and end users particularly dislike is the risk inherent in buying into a new technology or service model. For a vendor, that means investing heavily in R&D, setup, and marketing for a new product that may not sell. For an end user, it means buying into a new product or service that may (at best) prove to be poorly supported, with little competition or choice, and may (at worst) be abandoned entirely, forcing yet another switch.
One of the best ways to mitigate this type of risk for both sides of the sales equation is to create standards that everyone agrees to adopt, ensuring that there are many stakeholders with a vested interest in seeing the disruptive model succeed. In today's increasingly interoperable and networked world, those standards are in any case often essential for the model to work at all, blunting the competitive urge to try to own the whole opportunity. The result (only up to a point, of course) is an "all for one and one for all" coordinated promotional campaign that seeks to project the image that the new technology is one that you need and just have to get, whether it be a new architecture being pitched to Fortune 100 companies, or yet another entertainment format being promoted to consumers.
There's a comprehensive update of the long-raging Wireless Wars at the IEEE site right now, written by Greg Goth, and aptly titled This Little Standard Went to Market; This Little Standard Blew Up. Those wars, you may recall, have been raging for years. Most recently, attention has focused on a new and hotly-contested wireless personal area network standard intended not to replace WiFi, but to allow other types of devices – like stereo equipment – to be connected wirelessly. IEEE chartered the 802.15.3a short-range ultra wideband (UWB) standard task group to create a specification to satisfy this need, and many were the proposals offered by the task group participants to serve as a basis for that specification. Although those many proposals were eventually winnowed down to two, the task group ultimately gave up in January of this year, when the final two warring camps couldn't agree on a compromise.
And then there is the 802.20 long-range mobile wireless standard, which will provide a long-range equivalent to WiFi. The IEEE itself shut down that task group last June to conduct an investigation, after charges were leveled of conflict of interest and favoritism on the part of the chair, as well as vote-stacking by some members. The IEEE standards board conducted an investigation, and found "a lack of transparency, possible 'dominance,' and other irregularities in the Working Group."
I've written about each of these battles frequently over the years, dedicating the March issue of the Consortium Standards Bulletin to Standards Wars generally, and to the wireless head-butting in particular, as well as writing a number of blog entries that you can find here. You can also find a file of over a hundred news articles on the same topic here. Most of these battles have played out in, or around, the IEEE, and in particular within the 802 technical committee, which manages protocol development for local and metropolitan area networks.
A year ago, many words were written (including by me) on why Microsoft may have chosen Ecma to package Microsoft's Office Open XML formats as a standard. Now that Ecma has finished that project and adopted the result, there's additional data to examine that sheds some light on that question. That will be my topic today, and for the next several entries.
About two weeks ago I wrote a related entry called Ecma Approves OOXML – What Does it all Mean? In that entry, I tried to give an even-handed overview (admittedly, as I see it) of how the Ecma approval of Microsoft's XML formats fits into the grand scheme of things. The bottom line was that I did not think that the Ecma action was very significant, given the circumstances under which it had been achieved.
That post elicited the following question from a reader:
Maybe I'm a bit naive . . . but does this mean that Microsoft is trying to get the various standards authorities eating out of their hands - i.e. uncritically approving everything pumped out by the behemoth?
I tried to give that question an even-handed response as well – because in fact it's common practice for most companies to engage in the equivalent of what a lawyer would call "forum shopping": looking for the court and judge they think will be most likely to rule in their favor. My response therefore read as follows:
For those with an interest in how accessibility can be achieved at the technical level, the go-to expert is Peter Korn (you can find a link to his blog in the "Blogs I read" list in the right column). Yesterday …
Updated 12:00 ET to include information from the IBM and FSG press releases (both have been appended as well)
Elizabeth Montalbano at ComputerWorld wrote a piece yesterday about a thus far little noticed project with the enigmatic name "Project Missouri." How little noticed? I just tried a Google search of "'project Missouri' IBM ODF" and found…just Elizabeth's article.
What is Project Missouri, and why the odd name? The title of the ComputerWorld article will give a first clue: IBM project aims to help blind use ODF applications. As you will recall, ODF accessibility has been a big issue in Massachusetts, with the community of the disabled, as well as ODF opponents, challenging the Commonwealth's decision to convert to ODF-compliant software products until such time as these products become as accessible as Microsoft Office.
In response, a number of ODF proponents – including IBM – pledged to make ODF not only the equal of, but superior to, Microsoft Office with respect to accessibility. Opponents scoffed, and hence an accessibility project that puckishly borrows its name from the hard-to-convince Midwestern state that refers to itself as the "Show Me State."
Project Missouri is only one of a number of ongoing initiatives intended to improve accessibility for ODF-compliant products. OASIS, which developed and maintains ODF, is supporting several of these efforts, and version 1.1 of ODF, already adopted as a Committee Standard at OASIS, includes features based upon them. This new project supported by IBM specifically addresses the needs of visually impaired users, and is developing APIs (application programming interfaces) that have been named "IAccessible2." As reported by Montalbano: