
Andy Updegrove

Hands Across the Water: Open Source and Open Standards

6/14/2006

Item:

Having the latest computer technology is great. But what e-government users from the public sector as well as citizens really want is software interoperability. Unfortunately IT managers still only pay lip service to such interoperability, concludes a European project assessing today’s open-source movement.

So reads the lead paragraph of an article on FLOSSPOLS, a European project launched to provide Free/Libre/Open Source Software policy support.  The summary continues as follows:

“Open standards provide independence, not traditional vendor lock-in. They are good for users, purchasers and government from both the economic and competition standpoint,” says Rishab Ghosh, from the Merit/Infonomics research institute in The Netherlands.

Given how often the words "open source" and "open standards" are bandied about in the technical press, it surprises me how little many in the open source community seem to care about the second half of this pairing of related tools.  It's especially surprising since open standards don't need open source, but open source, like all other software, does need open standards.  That's reason enough for the open source community to want to know more about them.

WAPI Standards War to Reach Washington (Again)

6/10/2006

Standards wars are not usually considered to be very newsworthy outside of the technical niche in which they occur, but occasionally there's a breakout story that receives much wider attention.  A recent example was the frontal assault by multiple countries aimed at wresting control of the root directory of the Internet - a small but very important standard - from the United States.  That skirmish completely monopolized press coverage of the Tunis meeting of the World Summit on the Information Society (WSIS) last November.

But before that, there was an even higher-level confrontation, this time over a close-range wireless networking standard.  That dispute involved the IEEE 802.11 standards family commonly referred to as WiFi, on the one hand, and a Chinese-origin standard called WLAN on the other.  More particularly, it involved a security protocol (called WAPI) included in WLAN that China contends provides superior security protection to that of WiFi.

The original WiFi-WAPI conflict arose from China's announcement that only WAPI-enabled equipment could be sold in the People's Republic of China beginning in 2004, which would reverse the situation from one in which Chinese manufacturers must pay high patent royalties to companies like Intel to one in which Western vendors would find themselves in the opposite position.

Publicly, Intel and other chip vendors announced that they would refuse to sell microprocessors in China if the requirement was imposed.  Privately, they headed to Washington for help, since of course they had no desire to lose access to so vast a market.  The dispute worked its way up through diplomatic channels until ultimately then-Secretary of State Colin Powell became involved.  Eventually, China postponed the effectiveness of the home-grown standards requirement.

But what had been brokered was only a truce, and not a final resolution of the dispute.  Now, that dispute is flaring up again, and this time it is the Chinese standards delegation that is calling in the diplomatic corps to engage with the opposition.

Microsoft, Adobe and the Murky World of “RAND”

6/07/2006

For a week now, the IT world has been scratching its collective head over the breakdown of PDF licensing negotiations between Microsoft and Adobe.  At issue is why Adobe has allowed OpenOffice.org and Apple to bundle native support for saving documents in PDF format without any economic hooks, but apparently is requiring Microsoft to charge its customers more for Vista if this new release includes the same capabilities - even though PDF is a standard that should be available to all on the same terms.

There are, I think, two logical explanations.  One is relatively straightforward, while the other represents a convoluted and little-discussed weakness in the traditional way of creating standards.  In this entry, I'll describe both possible explanations, but I'm betting that the convoluted alternative will prove to be the explanation for what is happening here.

Let's knock off the easy explanation first, which relates to exactly what it is that Microsoft would like to license from Adobe.  At one end of the spectrum, Microsoft may only be interested in licensing those Adobe patent claims that would be infringed if Microsoft were to write software to implement the PDF specifications that have been adopted as standards by ISO/IEC.  Next up the value chain would be wanting to license the actual code that Adobe itself uses in Acrobat to save documents in the PDF format (not too likely a scenario, given that Microsoft has written a few lines of code over the years).  And finally there would be a Microsoft request to license additional features that are included in Acrobat, but which are not described in the ISO/IEC standard.  Such "proprietary extensions," as I discussed in my last blog entry, can be quite desirable - as Microsoft is well aware.

In the first case, Adobe would have an obligation to license the patent claims to Microsoft on RAND terms (on which more below).  But in the latter two cases, Adobe would be fully justified in imposing whatever unique requirements it wished on the extra code and/or patents, as the case may be, since it is not bound by the specific undertakings it entered into with AIIM (the actual standards body that developed the PDF standard, and then submitted it to ISO/IEC) with respect to actual code or any additional functionalities.

So the first possibility is simply that Microsoft wants more than Adobe is required to give it under its standards-related undertakings.

The Emerging ODF Environment Part III: Spotlight on StarOffice 8

6/05/2006

In this third in-depth interview focusing on ODF-compliant office productivity suites, I interviewed Erwin Tenhumberg, Sun's Product Marketing Manager, Client Systems Group (Erwin's own blog is called Erwin's StarOffice Tango). This series of interviews, and the other activities I have planned to follow, are intended to illustrate the rich environment of applications and tools that are evolving around the OpenDocument Format (ODF) specification developed by OASIS, and now adopted by ISO/IEC. (You can find the previous interview with Inge Wallin of KOffice here, and with Louis Suarez-Potts and John McCreesh of OpenOffice.org here).

Each of these interviews illustrates the unique attributes of each flavor of ODF-compliant software. KOffice is an open source component of the KDE desktop, and thus represents an interesting contrast and alternative to the Office on Windows environment. OpenOffice is also an open source suite, and is the lineal descendant of the code that formed the starting point for the ODF development effort. And StarOffice in many ways is OpenOffice. As you will read below, the same code that represents OpenOffice is utilized in StarOffice as well, which in turn is distinguished by additional value-added features, available support and conversion tools. Like KOffice, StarOffice runs on its developer's operating system — Solaris, representing another alternative to a Windows/Office environment. But unlike MSOffice, StarOffice (and KOffice) will also run on GNU/Linux — and OpenOffice will also run on FreeBSD and Mac OS X.

This series of interviews will include IBM's Workplace as well, which is already in the interview queue. I do not personally have contacts at Google Writely, Abiword, Textmaker or Gnumeric, but if you do — please tell them that I'd be delighted to run an interview with them as well, as the hope is to provide a complete, comparative picture of the entire ODF ecosystem.

Four Points of Interoperability and Adobe

6/04/2006

I had the opportunity last week to not only share a stage with Microsoft's Jason Matusow in Geneva, but also to enjoy dinner with him and some time to chat before that as well.  Jason is a good guy, and I enjoyed the opportunity to kick things around face to face that we've sometimes taken different positions on in our blogs.  Obviously, we have different opinions on the ODF/Open XML debate, but fewer than you might expect (and perhaps fewer than he expected). 

While we were exchanging views on open standards and open source with others at the conference, though, things were apparently falling apart in the ongoing negotiations between Microsoft and Adobe.  Frankly, although there may be more to the story that I don't know, it seems to me as if Adobe is in the wrong on this one.  Adobe wanted its Adobe Acrobat product to become the dominant product in the marketplace for saving documents in a locked format - the de facto standard for creating such documents.  In order to achieve that goal, it allowed aspects of its product to become the basis for what became an ISO/IEC adopted standard, which would involve providing licenses in a non-discriminatory fashion to whoever wanted one.  Once that happened, Adobe came under an obligation to make such a license available to everyone on substantially equivalent terms - which in this case would presumably be free, given that Apple and OpenOffice.org are bundling "save to PDF" capabilities in their offerings.

What more could there be to the story?  One possibility is that the ISO/IEC PDF standard (what the standards world calls a "de jure" standard, or a standard "as law," because it has been adopted by a formal standards body) sits inside a larger "de facto" standard created by Adobe that the market likes.  Since only the elements that are within the ISO/IEC standard are subject to obligatory license, Adobe would be free to charge whatever it wants, and to whom, for the additional de facto elements.  Such "proprietary extensions," added by the dominant market player, are one way that a company that offers royalty-free technology for inclusion into a standard makes money.  In effect, the technology included in the de jure standard that is available under a free license is used to bait the hook for the royalty-bearing license required to implement the full de facto standard.  If this is the rest of the story, there is some irony in the fact that Microsoft has frequently been criticized for adding proprietary extensions to products it has built to a standard, thereby sometimes making them non-interoperable with other products built to the same standard.

Interestingly enough, this latest speed bump for Vista highlights a point that Jason and I had crossed blog swords over not long ago, which is my view of the importance of standards as a means of achieving interoperability, as compared to other means of achieving the same end.  Back on May 8, I posted an entry where I objected to Microsoft's promotion of other techniques as a surrogate for, rather than a supplement to, standards as a means to achieve and guarantee interoperability.  My starting point was a quote in an eWeek article to the following effect:

"You can achieve interoperability in a number of ways," said [Microsoft's] Robertson. Among them: joint collaboration agreements, technology licensing and interoperability pacts.

I had heard the same basic statement from another Microsoft representative, which troubled me, because while these other means can result in interoperability, they don't guarantee equality of opportunity and access the way that a standard adopted by a standards organization with an effective IPR policy does - as has just been amply demonstrated by Adobe's apparently discriminatory withholding of a free PDF output license from Microsoft.

StarOffice Attracts its First Virus (and the press notices)

6/01/2006

I was sitting in the audience of an ISO/IEC meeting in Geneva, Switzerland about to give a presentation on the intersection of open source and open standards when I received an email with a link to this story at CNET News.com:

Researchers at Kaspersky Lab have spotted what they believe is the first virus for OpenOffice, the open-source rival to Microsoft's Office productivity suite.

The virus, dubbed Stardust, is capable of infecting OpenOffice and StarOffice, which is sold by Sun Microsystems, a Kaspersky Lab researcher wrote on the Russian company's site on Tuesday. "Stardust is a macro virus written for StarOffice, the first one I've seen," the researcher wrote. "Macro viruses usually infect MS Office applications."  The pest is written in Star Basic. It downloads an image file with adult content from the Internet and opens that file in a new document, according to Kaspersky's posting.

So far, Stardust is a proof-of-concept virus, which means that it was created to demonstrate that an OpenOffice virus is possible. The virus has not been sent out in the wild and is not actually attacking people's systems.

I did a quick and dirty blog entry then, which I'm updating now. 

Hmm.  Now that I have time to give this a closer reading, it appears that this bit o' malware has hit only StarOffice, not OpenOffice.  So while it may (or may not) be correct to say that it can be adapted to hit OpenOffice, it's a stretch to assume that it was "created to demonstrate that an OpenOffice virus is possible."  If that was the goal, why not _start_ by hitting an OpenOffice user, eh?

Be that as it may, you may recall that only a week ago I did an article about the Word Trojan, and how the press reported it.  Through this new hack on StarOffice we have the opportunity to see how the press reports an equally insignificant (or even less significant) attack on one version of ODF-compliant software, so let's see what we see.

The Free Standards Group: Squaring the Open Source/Open Standards Circle

5/30/2006

Before there was Linux, before there was open source, there was (and still is) an operating system called Unix that was robust, stable and widely admired.  It was also available under license to anyone who wanted to use it, because it had been developed not by a computer company, but by personnel at AT&T's Bell Labs, which for a time was not very aware of its status as the incubator of a vital OS.

Mighty was the rise of that OS, and regrettably, so was the waning of its influence.  Happily, Unix was supplanted not only by Windows NT, but also by Linux, the open source offshoot of Unix.  But today, Linux is at risk of suffering a fate similar to that suffered by Unix.  That risk is the danger of splintering into multiple distributions, each of which is sufficiently dissimilar to the others that applications must be ported to each distribution - resulting in the "capture," or locking in, of end-users on "sub brands" of Linux.

The bad news is that the rapid proliferation of Linux distributions makes this a real possibility.  The good news is that it doesn't have to, because a layer of standards called the Linux Standard Base (LSB) has already been created, through an organization called the Free Standards Group (FSG), that allows ISVs to build to a single standard, and know that their applications will run across all compliant distributions.  And happily, all of the major distributions have agreed to comply with LSB 3.1, the most recent release.
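The build-to-one-standard idea is concrete enough to check from a shell.  Here is a minimal sketch (assuming the `lsb_release` utility that LSB-conforming distributions ship, with a fallback to the common `/etc/lsb-release` file) of the kind of check an ISV's install script might run to see what the host platform declares:

```shell
#!/bin/sh
# Query the LSB metadata a Linux distribution declares, so an install
# script can see whether it is landing on an LSB-conforming platform.
if command -v lsb_release >/dev/null 2>&1; then
    # -a prints the supported LSB modules plus distributor, release
    # and codename; stderr is silenced in case no modules are declared
    lsb_release -a 2>/dev/null
elif [ -r /etc/lsb-release ]; then
    # Many distributions also record the same fields in this file
    cat /etc/lsb-release
else
    echo "No LSB metadata found on this system"
fi
```

Note that these strings are advisory: actual LSB 3.1 compliance is certified through the FSG, not inferred from what a distribution happens to report about itself.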

I recently interviewed Jim Zemlin, the Executive Director of FSG, as well as Ian Murdock, the creator of Debian GNU/Linux, and the FSG's CTO and Chair of the LSB Working Group.  That interview appears in the May issue of the Consortium Standards Bulletin and covers a great deal of ground.  Some of the most interesting details, though, relate to how this open standards process interacts with, and serves, the open source process that creates Linux itself.  Below, I've excerpted those parts of the interview, so that you can see how it's done.  [Disclosure:  I am on the Board of Directors of the FSG, and am also FSG's legal counsel.]

FSG — Linux Interface

1.  Which open source projects does FSG actively engage with?

Primarily the Linux distributions but also many of the constituent projects, particularly if those projects provide a platform that developers can target that could benefit from better integration with the broader Linux platform. Good examples here include the GNOME and KDE desktop environments. Each of these desktop environments is a platform in its own right, but a desktop isn't much use unless it is well integrated with the operating system underneath. Furthermore, ISVs targeting the Linux desktop ideally want to provide a single application that integrates well regardless of which environment happens to be in use.


Everybody’s Talking about Open Source/Open Standards (but what are they saying?)

5/29/2006

IT vendors these days are in love with the phrase "open source and open standards" (as in "you should buy our open source/open standards-based solutions"). 

If you haven't already noticed that fact, here's a little experiment to try that will make the point:  Google the phrase "open source" AND "open standards."  I just did, and got 4,700,000 hits, with 284 leading to recent news articles.  Odds are, though, that few of those hits would take me to a knowledgeable discussion of how open source and open standards can and should work together.

In fact, while open source and open standards each work well in isolation, those who create them haven't always played very well together when they find themselves in the same sandbox.  As those 4,700,000 hits suggest, however, there's a great need for both sides to learn how to optimize the relationship between open standards and open source.  And that's the theme of this issue of the CSB, revisiting a topic I last covered in detail in the March 2005 issue.  That issue was called What Does "Open" Mean? and my editorial then was titled A Call for Communication Between Communities. 

A year has now gone by, and I decided that it was high time to return to this topic, so I did, and made it the theme of the May issue of the Consortium Standards Bulletin, which was sent to subscribers last week.  Here's what it had to say.

The Word Trojan: Anatomy of an Online Story

5/27/2006

Lately I've been blogging quite a bit on the state of on-line journalism.  One aspect of that topic that I haven't touched on for awhile is the way in which a story breaks, builds, morphs and spreads electronically.  The recent announcement of the Backdoor.Ginwui virus provides an interesting opportunity to do this once again, in order to see who addressed the story and how (including by me), and what, if anything, it all means.

Cutting to the bottom line: it doesn't matter how little impact a virus may have; if it targets Microsoft (a/k/a the world's desktop), it's likely that every conceivable and theoretical angle of the story will get poked and prodded, whether it deserves to be or not.  The reasons are twofold: first, the threat of a really bad virus is akin to an IT bird flu epidemic, so the mere possibility of a massive breakout captures the mind.  And second, it offers an opportunity for authors to explore other current issues in the marketplace that are directly or tangentially related.

Taking a look at how stories are written on line is also illuminating.  In fact, it's been my consistent experience when I've conducted a survey like this that only a small percentage of the on-line articles written on any story are the product of actual first-hand research by the author.  The vast majority are either short rehashes of information taken from other people's stories (and research) and/or from readily-available on-line alerts, press releases and public statements.

Monocultures and Document formats: Dan’s Bomb Goes Off

5/23/2006

Dan Geer is an extremely well respected security expert.  When he worries about something, people listen.

One of the things he has worried - and warned - about is the danger represented by IT "monocultures" - the situation that arises when everyone uses the same software, for example, and therefore everyone shares the same vulnerability to a computer virus or other security threat. 

Just as the word "virus" has been borrowed from biology and provides an apt and vivid descriptor for its IT analogue, so also does the word monoculture function: think of the consequences of Irish potato blight, or of the wiping out of the American Chestnut tree, which once numbered in the billions in the forests of the American East and is almost extinct as a mature species.

Well, last November, Dan wrote a perspective piece for CNET News.com, called Massachusetts Assaults Monoculture.  In that article, he wrote:

As a matter of logic alone: If you care about the security of the commonwealth, then you care about the risk of a computing monoculture. If you care about the risk of a computing monoculture, then you care about barriers to diversification. If you care about barriers to diversification, then you care about user-level lock-in. And if you care about user-level lock-in, then you must break the proprietary format stranglehold on the commonwealth. Until that is done, the user-level lock-in will preclude diversification and the monoculture bomb keeps ticking.

As it happens, Dan's bomb went off a few days ago, with the breakout of the "Backdoor.Ginwui" virus, a malicious bit of code that Symantec introduced in an alert as follows: 

It has been reported that Backdoor.Ginwui may be dropped by a malicious Word document exploiting an undocumented vulnerability in Microsoft Word. This malicious Word document is currently detected as Trojan.Mdropper.H.
