It's not often I find myself at a loss for words when I read something, but this is one of those times.
Or perhaps it would be more accurate to say that it isn't really necessary for me to add any words to the following news, other than to characterize it with a Latin phrase lawyers use: Res ipsa loquitur, which translates as "the thing speaks for itself." I'll give one clue, though: I've added this blog post to the "ODF and OOXML" folder. That's "OOXML" as in "the world must have this standard so that our customers can open the billions of documents that have already been created in older versions of" a certain office productivity suite.
So without further ado, here's the news, along with what a few other people have had to say about it [Update: see also the comments that readers have added below interpreting the original Microsoft information]:
This is the fourth chapter in a real-time eBook writing project I launched and explained in late November. Constructive comments, corrections and suggestions are welcome. All Microsoft product names used below are registered trademarks of Microsoft.
Chapter 4 – Eric Kriss, Peter Quinn and the ETRM
By the end of December 2005, I had been blogging on ODF developments in Massachusetts for about four months, providing interviews, legal analysis and news as it happened. In those early days, not many bloggers were covering the ODF story, and email began to come my way from people that I had never met before, from as far away as Australia, and as near as the State House in Boston. Some began with, "This seems really important – what can I do to help?" Others contained important information that someone wanted to share, and that I was happy to receive.
One such email arrived just before Christmas in 2005. In its entirety, it read:
Enjoy reading your consortiuminfo blog ... keep it up.
Happy New Year,
Eric Kriss
This was a pleasant and welcome surprise. Until the end of September, Eric Kriss had been the Massachusetts Secretary of Administration and Finance, and therefore Peter Quinn's boss. Together, they had conceived, architected and launched the ambitious IT upgrade roadmap that in due course incorporated ODF into the state's procurement guidelines.
Back in March of 2006, I interviewed Alan Cote, the Supervisor of Public Records in the Public Records Division of the Massachusetts Secretary's office. Alan had testified back in October of 2005 in the hearing where Peter Quinn had been called on the carpet by Senator Marc Pacheco, the Chair of the Senate Committee on Post Audit and Oversight. At the Pacheco hearing, Alan had professed neutrality about ODF, but also doubts that document formats could provide a useful tool for document preservation.
What struck me most forcefully at both the hearing as well as the interview was that Alan presumably should have been one of the biggest proponents of open formats, rather than a doubting Thomas. Why? Because the process he now follows to preserve electronic documents seems almost comically cumbersome and tedious. Briefly summarized, it involves recopying every single electronic document every five years or so onto new media (electronic media degrade surprisingly rapidly) in multiple formats (because formats are regularly abandoned). Shouldn't someone stuck with such a chore be desperate to find a better way?
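The migration routine described above can be made concrete with a short sketch. This is a hypothetical illustration only: the interval, formats, and record names below are invented for the example, not drawn from the Records Division's actual procedures.

```python
from datetime import date, timedelta

# Assumed parameters: media are recopied roughly every five years,
# and each record is kept in several redundant formats at once,
# as a hedge against both media decay and format abandonment.
REFRESH_INTERVAL = timedelta(days=5 * 365)
TARGET_FORMATS = ["pdf", "odt", "txt"]

def due_for_refresh(last_copied: date, today: date) -> bool:
    """A record is due when its last copy is older than the interval."""
    return today - last_copied >= REFRESH_INTERVAL

def refresh_plan(archive: dict, today: date) -> list:
    """List every (record, format) copy that must be made this cycle."""
    return [(doc, fmt)
            for doc, last in archive.items()
            if due_for_refresh(last, today)
            for fmt in TARGET_FORMATS]

# Two hypothetical records: one overdue, one not yet five years old.
archive = {"town_minutes_1998": date(2000, 3, 1),
           "budget_2004": date(2004, 6, 15)}
plan = refresh_plan(archive, date(2006, 1, 1))
```

Even this toy version shows why the chore compounds: every record must be touched again on every cycle, in every format, forever, which is exactly the burden a durable open format would relieve.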
Apparently, preserving documents is child's play compared to preserving modern movies, especially those created initially in digital form. How bad - and expensive - is that? According to a 74-page study released by the Academy of Motion Picture Arts and Sciences (AMPAS) to a limited audience in early November, preserving a full-length digital movie can cost $208,569 (dramatic pause) per year. The reasons are exactly the same as for digitized documents, and the currently available means of preservation are the same as well. The amount of data involved, however, is vastly greater - and no commitment to remain faithful to a single standard (yet) exists to ensure that future technologies will be able to display movies created using today's techniques.
One of the topics I've fallen behind in writing about is the state of IPR concerns and standard setting in China in general, and the current status of UOF – China's "Uniform Office Document Format" entry in the document format sweepstakes – in particular. I recently spoke at two conferences in Beijing, and got back up to speed in this regard direct from the source.
While ODF and OOXML continue to generate news and heat, the progress of UOF has proceeded with much less fanfare and reportage. I gave a keynote presentation at the Beijing 2007 Open Standards International Conference, and also moderated a panel on IPR and interoperability. That conference was organized by the dominant software industry standards association in China, the Changfeng Open Standards Platform Alliance, and was co-sponsored by the China National Institute of Standardization and the China Electronic Standardization Institute. Several panels were dedicated entirely or in part to open document formats.
Or so, at least, Google would like you to conclude. Significant differences include single-author control (but the freedom for other authors to set up competing pages as well), bylines for page authors, reader ranking, and - oh yes - Google ads (authors interested in allowing ad placements would get a "substantial" share of the resulting revenues).
Here's how Google introduces the concept on a somewhat higher level:
The web contains an enormous amount of information, and Google has helped to make that information more easily accessible by providing pretty good search facilities. But not everything is written nor is everything well organized to make it easily discoverable. There are millions of people who possess useful knowledge that they would love to share, and there are billions of people who can benefit from it. We believe that many do not share that knowledge today simply because it is not easy enough to do that. The challenge posed to us by Larry, Sergey and Eric was to find a way to help people share their knowledge. This is our main goal.
The question, of course, is how well the Knol project will compete with the Wikipedia, both for author input and for readers.
And, of course, in quality. The Google announcement of the Knol project is pasted in below in full, but I'll also provide my review and reactions to the concept, not only on the knol's prospects for fulfilling Google's goals, but also on its potential for providing a well-supported, highly visible testbed for individuals to experiment with a wide variety of models for the collaborative creation of Web-based content - something the Wikipedia does not offer.
As the date for the February BRM (Ballot Resolution Meeting) on ISO/IEC JTC1 DIS 29500 (a/k/a Ecma 376, a/k/a Microsoft OOXML) approaches, more and more attention is being paid to how Ecma will propose the disposition of the comments submitted during the general voting period. This heightened interest is legitimately urgent, due both to the great number of comments that need to be resolved, even after elimination of duplicates, and to the late date on which the proposed resolutions will be made public (the deadline, if memory serves, is January 19, while the BRM will commence its deliberations on February 25 of next year).
The words are therefore flying fast and furious at the many blogs covering this question, and tempers are rising in the comments appended to the posts of bloggers who have a direct interest in the outcome. A particularly contentious issue has been whether Ecma is trying to make it as easy as possible for interested parties to view the proposed dispositions of comments, or as difficult as possible while still scoring PR points, and whether it does, or does not, have the latitude under ISO rules to be more transparent. The fairly opaque, and sometimes contradictory, nature of those rules has not made the debate any easier, and gives rise to the possibility of confusion, at best, and serious mistakes, at worst, as Pamela Jones pointed out at Groklaw this morning.
The result is that there will be very little real data available to the general public until Ecma opens the curtains on January 19. And the import of what little data does become available is usually the subject of instant disagreement.
With that as prelude, I've pasted in the text of a press release at the end of this blog entry that Ecma issued yesterday. The release gives only a peek at some of the issues addressed in the new dispositions, giving varying degrees of detail on each area highlighted - but that's more than we've had to go on so far. Here is my summary of the press release and its significance, when viewed in the context of other reliable, available information:
This is the third chapter in a real-time eBook writing project I launched and explained in late November. Constructive comments, corrections and suggestions are welcome. All Microsoft product names used below are registered trademarks of Microsoft.
This chapter was revised at 8:30 AM on 12/11/07, most significantly by adding the "Lessons applied" section.
Chapter 3: What a Difference a Decade Can Make
In 1980, Microsoft was a small software vendor that had built its business primarily on downsizing mainframe programming languages to a point where they could be used to program the desktop computers that were then coming to market. The five-year-old company had total revenues of $7,520,720, and BASIC, its first product, was still its most successful. By comparison, Apple Computer had already reached sales of $100 million, and the same year launched the largest public offering since the Ford Motor Company had itself gone public some twenty-four years before. Microsoft was therefore far smaller than the company that Steve Jobs and Steve Wozniak had formed a year after Bill Gates and Paul Allen sold their first product.
Moreover, in the years to come, PC-based word processing products like WordStar, and then WordPerfect, would become far more popular than Microsoft's own first word processing product (originally called Multi-Tool Word), providing low-cost alternatives to the proprietary minicomputer-based software offerings of vendors like Wang Laboratories. IBM, too, provided a word processing program for the PC, called DisplayWrite, based on a similar program that IBM had developed for its mainframe systems customers. More importantly, another program was launched at just the right time to dramatically accelerate the sale of IBM PCs and their clones. That product was the legendary "killer app" of the IBM PC clone market: Lotus 1-2-3, the spreadsheet software upon which Mitch Kapor built the fortunes of his Lotus Development Corporation.
This is the second chapter in a real-time eBook writing project I launched and explained last week. The following is one of a number of stage-setting chapters to follow. Comments, corrections and suggestions gratefully accepted. All Microsoft product names used below are registered trademarks of Microsoft.
Chapter 2 – Products, Innovation and Market Share
Microsoft is the envy of many vendors for the hugely dominant position it enjoys in two key product areas: PC desktop operating systems – the software that enables and controls the core functions of personal computers - and "office productivity software" – the software applications most often utilized by PC users, whether at work or at home, to create documents, slides and spreadsheets and meet other common needs. Microsoft's 90% plus market share in such fundamental products is almost unprecedented in the technical marketplace, and this monopoly position enables it to charge top dollar for such software. It also makes it easy for Microsoft to sell other products and services to the same customers.
Microsoft acquired this enviable position in each case through a combination of luck, single-minded determination, obsessive attention to detail, and a willingness to play the game fast and hard – sometimes hard enough to attract the attention of both Federal and state antitrust regulators. Early on, Bill Gates and his team acquired a reputation for bare-knuckle tactics that they sometimes seemed to wear with brash pride. Eventually, these tactics (as well as tales of Gates' internal management style) progressed from industry rumors to the stuff of best sellers, like Hard Drive: Bill Gates and the Making of the Microsoft Empire.
With the emergence of the Web, of course, the opportunity for widely sharing stories, both real (of which there were many) and apocryphal, exploded. Soon Web sites such as Say No to Monopolies: Boycott Microsoft enthusiastically collected and posted tales of alleged technological terror and dirty deeds. More staid collections were posted at sites such as the Wikipedia. The increasing tide of litigation involving Microsoft, launched not only by state and federal regulators but by private parties as well, generated embarrassing documents. Such original sources were not only difficult to deny, but almost impossible to repress in the age of the Web - and of peer to peer file sharing as well.
Moreover, while Bill Gates and his co-founders rarely displayed the creative and innovative flair of contemporaries like Apple's Steve Jobs, neither were they troubled by the type of "not invented here" bias that led other vendors down unique roads that sometimes ended in dead ends.
For some time I've been considering writing a book about what has become a standards war of truly epic proportions. I refer, of course, to the ongoing, ever expanding, still escalating conflict between ODF and OOXML, a battle that is playing out across five continents and in both the halls of government and the marketplace alike. And, needless to say, at countless blogs and news sites all the Web over as well.
Arrayed on one side or the other, either in the forefront of battle or behind the scenes, are most of the major IT vendors of our time. And at the center of the conflict is Microsoft, the most successful software vendor of all time, faced with the first significant challenge ever to one of its core businesses and profit centers – its flagship Office productivity suite.
Some twenty years ago, information technology vendors began opting out of the accredited standards system with increasing frequency in order to form organizations they called fora, alliances, and (most often) consortia. The reasons for the schism were several, but the development was remarkable in that the separatists presumed that standards could become ubiquitous whether or not they acquired the imprimatur of one of the "Big Is": the International Organization for Standardization (ISO), the International Electrotechnical Commission (IEC), and the International Telecommunication Union (ITU). And they were right.
Today, there are hundreds of consortia, and many of these organizations have achieved a size, work output, membership, influence and respect that rival those of their accredited peers. Along the way, the information and (to a lesser extent) communications technology industries have come to rely heavily upon consortia to supply their standards needs. But even as this parallel universe of standard setting has achieved respectability, an interesting trend has developed: more and more standards that have been created by consortia are being submitted to one of the "Big Is" for adoption.