One of the topics I'm behind in writing about is the state of IPR concerns and standard setting in China in general, and the current status of UOF – China's "Uniform Office Document Format" entry in the document format sweepstakes – in particular. I recently spoke at two conferences in Beijing, and got back up to speed on this topic directly from the source.
While ODF and OOXML continue to generate news and heat, the progress of UOF has proceeded with much less fanfare and reportage. I gave a keynote presentation at the Beijing 2007 Open Standards International Conference, and also moderated a panel on IPR and interoperability. That conference was organized by the dominant software industry standards association in China, the Changfeng Open Standards Platform Alliance, and was co-sponsored by the China National Institute of Standardization and the China Electronics Standardization Institute. Several panels were dedicated entirely or in part to open document formats.
Or so, at least, Google would like you to conclude. Significant differences include single-author control (but the freedom for other authors to set up competing pages as well), bylines for page authors, reader ranking, and - oh yes - Google ads (authors interested in allowing ad placements would get a "substantial" share of the resulting revenues).
Here's how Google introduces the concept on a somewhat higher level:
The web contains an enormous amount of information, and Google has helped to make that information more easily accessible by providing pretty good search facilities. But not everything is written nor is everything well organized to make it easily discoverable. There are millions of people who possess useful knowledge that they would love to share, and there are billions of people who can benefit from it. We believe that many do not share that knowledge today simply because it is not easy enough to do that. The challenge posed to us by Larry, Sergey and Eric was to find a way to help people share their knowledge. This is our main goal.
The question, of course, is how well the Knol project will compete with the Wikipedia, both for author input and for readers.
And, of course, in quality. The Google announcement of the Knol project is pasted in below in full, but I'll also provide my review of and reactions to the concept, not only on the knol's prospects for fulfilling Google's goals, but also on its potential for providing a well-supported, highly visible testbed for individuals to experiment with a wide variety of models for the collaborative creation of Web-based content - something the Wikipedia does not offer.
As the date for the February BRM (Ballot Resolution Meeting) on ISO/IEC JTC1 DIS 29500 (a/k/a Ecma 376, a/k/a Microsoft OOXML) approaches, more and more attention is being paid to how Ecma will propose the disposition of the comments submitted during the general voting period. This heightened interest is legitimately urgent, due both to the great number of comments that need to be resolved, even after elimination of duplicates, and to the late date on which the proposed resolutions will be made public (the deadline, if memory serves, is January 19, while the BRM will commence its deliberations on February 25 of next year).
The words are therefore flying fast and furious at the many blogs covering this question, and tempers are rising in the comments appended to the posts of bloggers that have a direct interest in the outcome. A particularly contentious issue has been whether Ecma is trying to make it as easy as possible for interested parties to view the proposed dispositions of comments, or as difficult as possible while still scoring PR points, and whether it does, or does not, have the latitude under ISO rules to be more transparent. The fairly opaque, and sometimes contradictory, nature of those rules has not made the debate any easier, and gives rise to the possibility of confusion, at best, and serious mistakes, at worst, as Pamela Jones pointed out at Groklaw this morning.
The result is that there will be very little real data available to the general public until Ecma opens the curtains on January 19. And the import of what little data does become available is usually the subject of instant disagreement.
With that as prelude, I've pasted in at the end of this blog entry the text of a press release that Ecma issued yesterday. The release gives only a peek at some of the issues addressed in the new dispositions, with varying degrees of detail on each area highlighted - but that's more than we've had to go on so far. Here is my summary of the press release and its significance, when viewed in the context of other reliable, available information:
This is the third chapter in a real-time eBook writing project I launched and explained in late November. Constructive comments, corrections and suggestions are welcome. All Microsoft product names used below are registered trademarks of Microsoft.
This chapter was revised at 8:30 AM on 12/11/07, most significantly by adding the "Lessons applied" section.
Chapter 3: What a Difference a Decade Can Make
In 1980, Microsoft was a small software vendor that had built its business primarily on downsizing mainframe programming languages to the point where they could be used to program the desktop computers that were then coming to market. The five-year-old company had total revenues of $7,520,720, and BASIC, its first product, was still its most successful. By comparison, Apple Computer had already reached sales of $100 million, and that same year launched the largest public offering since the Ford Motor Company had itself gone public some twenty-four years before. Microsoft was therefore far smaller than the company that Steve Jobs and Steve Wozniak had formed a year after Bill Gates and Paul Allen sold their first product.
Moreover, in the years to come, PC-based word processing products like WordStar, and then WordPerfect, would become far more popular than Microsoft's own first word processing program (originally called Multi-Tool Word), providing low-cost alternatives to the proprietary, minicomputer-based software offerings of vendors like Wang Laboratories. IBM, too, provided a word processing program for the PC called DisplayWriter. That software was based on a similar program that IBM had developed for its mainframe systems customers. More importantly, another program was launched at just the right time to dramatically accelerate the sale of IBM PCs and their clones. That product was the legendary "killer app" of the IBM PC clone market: Lotus 1-2-3, the spreadsheet software upon which Mitch Kapor built the fortunes of his Lotus Development Corporation.
This is the second chapter in a real-time eBook writing project I launched and explained last week. The following is one of a number of stage-setting chapters to follow. Comments, corrections and suggestions gratefully accepted. All Microsoft product names used below are registered trademarks of Microsoft.
Chapter 2 – Products, Innovation and Market Share
Microsoft is the envy of many vendors for the hugely dominant position it enjoys in two key product areas: PC desktop operating systems – the software that enables and controls the core functions of personal computers - and "office productivity software" – the software applications most often utilized by PC users, whether at work or at home, to create documents, slides and spreadsheets and meet other common needs. Microsoft's 90% plus market share in such fundamental products is almost unprecedented in the technical marketplace, and this monopoly position enables it to charge top dollar for such software. It also makes it easy for Microsoft to sell other products and services to the same customers.
Microsoft acquired this enviable position in each case through a combination of luck, single-minded determination, obsessive attention to detail, and a willingness to play the game fast and hard – sometimes hard enough to attract the attention of both Federal and state antitrust regulators. Early on, Bill Gates and his team earned a reputation for bare-knuckle tactics that they sometimes seemed to wear with brash pride. Eventually, these tactics (as well as tales of Gates's internal management style) progressed from industry rumors to the stuff of best sellers, like Hard Drive: Bill Gates and the Making of the Microsoft Empire.
With the emergence of the Web, of course, the opportunity for widely sharing stories, both real (of which there were many) and apocryphal, exploded. Soon Web sites such as Say No to Monopolies: Boycott Microsoft enthusiastically collected and posted tales of alleged technological terror and dirty deeds. More staid collections were posted at sites such as the Wikipedia. The increasing tide of litigation involving Microsoft, launched not only by state and federal regulators but by private parties as well, generated embarrassing documents. Such original sources were not only difficult to deny, but almost impossible to repress in the age of the Web - and of peer to peer file sharing as well.
Moreover, while Bill Gates and his co-founders rarely displayed the creative and innovative flair of contemporaries like Apple's Steve Jobs, neither were they troubled by the type of "not invented here" bias that led other vendors to pursue unique roads that sometimes ended in dead ends.
For some time I've been considering writing a book about what has become a standards war of truly epic proportions. I refer, of course, to the ongoing, ever expanding, still escalating conflict between ODF and OOXML, a battle that is playing out across five continents and in both the halls of government and the marketplace alike. And, needless to say, at countless blogs and news sites all the Web over as well.
Arrayed on one side or the other, either in the forefront of battle or behind the scenes, are most of the major IT vendors of our time. And at the center of the conflict is Microsoft, the most successful software vendor of all time, faced with the first significant challenge ever to one of its core businesses and profit centers – its flagship Office productivity suite.
Some twenty years ago, information technology vendors began opting out of the accredited standards system with increasing frequency in order to form organizations they called fora, alliances, and (most often) consortia. The reasons for the schism were several, but the development was remarkable in that the separatists presumed that standards could become ubiquitous whether or not they acquired the imprimatur of one of the "Big Is:" the International Organization for Standardization (ISO), the International Electrotechnical Commission (IEC), and the International Telecommunication Union (ITU). And they were right.
Today, there are hundreds of consortia, and many of these organizations have achieved a size, work output, membership, influence and respect that equal those of their accredited peers. Along the way, the information and (to a lesser extent) communications technology industries have come to rely heavily upon consortia to supply their standards needs. But even as this parallel universe of standard setting has achieved respectability, an interesting trend has developed: more and more standards created by consortia are being submitted to one of the "Big Is" for adoption.
Those of us who live in America are currently in the midst of that most protracted, expensive and (often) tedious of all democratic processes: the quadrennial quest to find, and perhaps even elect, the most able leader to guide the nation into the future. Part and parcel of that spectacle is a seemingly endless torrent of printed words and video, emanating from more than a dozen candidates, each of whom is trying to convince the electorate that he or she is The One, while at the same time hoping to avoid offering any point of vulnerability that can be exploited by the opposition.
It is an overwhelming and leveling experience for all concerned, electorate and candidates alike.
Out of the campaign cacophony of the last week emerged a handful of words from Senator and Democratic party hopeful Barack Obama that could not fail to catch my attention. He used them during the presidential debate held in Las Vegas, and they also appear in the "Innovation Agenda" that Obama had released a few days before. He announced that agenda in a speech delivered on November 14 at an aptly selected venue: the Google campus in Mountain View, California. One of the pledges he made in the course of that speech reads in part as follows:
To seize this moment, we have to use technology to open up our democracy. It's no coincidence that one of the most secretive Administrations in history has favored special interests and pursued policies that could not stand up to sunlight. As President, I'll change that. I'll put government data online in universally accessible formats. [emphasis added]
A presidential candidate who is including "universally accessible formats" in his platform? How did that come about?
This is the latest in a series of occasional essays that I call The Monday Witness. This series focuses on social rather than technical issues, for the reasons explained in the first entry in the series. As always, the opinions expressed here are mine alone.
I find it difficult to write about the war in Iraq with the same degree of detachment that I strive to bring to other subjects. Given the dual lessons delivered by the conflicts of the 20th century – that the horrors of war defy imagination, and that it is far easier to begin a war than to end one – it seems incomprehensible that we would have launched a preemptive war against a nation on the other side of the world with no immediate ability to harm us, even if all of our inaccurate pre-war intelligence had proven to be true.
Notwithstanding the fact that Iraq was thought to be at best several years from possessing a nuclear weapon; despite the fact that its regime could not deliver weapons of any type more than a few hundred miles beyond its borders; notwithstanding America's historical and moral stand against preemptive war; and regardless of the fact that the nations of the world assembled in the UN were not convinced of the basis for war.
In part, I fear, it is because Americans do not value foreign lives as highly as their own.
Wednesday I attended the W3C Technical Plenary Day festivities, which included a brief press conference with Tim Berners-Lee, interesting insights into the W3C's work in progress and future plans, and much more. It also gave me a chance to sit down with Chris Lilley, a W3C employee whose roles include Interaction Domain Leader, Co-Chair of the W3C SVG Working Group, W3C Graphics Activity Lead, and Co-Chair of the W3C Hypertext CG. What that combination of titles means is that he is the "go to" guy at W3C to learn what W3C's CDF standard is all about.
CDF is one of the many useful projects that the W3C has been laboring on, but not one that you would be likely to have heard much about. Until recently, that is, when Gary Edwards, Sam Hiser and Marbux, the management (and perhaps sole remaining members) of the OpenDocument Foundation, decided that CDF was the answer to all of the problems that ODF was designed to address. This announcement gave rise to a flurry of press attention, which Sam Hiser has collected. As others (such as Rob Weir) have already documented, these articles gave the Foundation's position far more attention than it deserved.