Any old standards hand forced to choose the single most disputed issue in standard setting over the past decade would likely respond with a deceptively simple question: "What does it mean to be an 'open standard?'" A similar debate rages in the open source community between those who believe that some licenses (e.g., the BSD, MIT and Apache licenses) are "open enough," and those who would respond with an emphatic Hell No! (or less printable words to similar effect).
That's not too surprising, because the question of what "open" means subsumes almost every other categorical question that information and communications technology (ICT) standards and open source folk are likely to disagree over, whether it be economic (should a vendor be able to implement a standard free of charge, or in free and open source software (FOSS) licensed under a version of the General Public License (GPL)); systemic (are standards adopted by ISO/IEC JTC 1 "better" than those that are not); or procedural (must the economic and other terms upon which a necessary patent claim can be licensed be disclosed early in the development process)?
This background level of disagreement is relevant today because the Obama Administration has pledged to use technology to bring an "unprecedented" level of transparency and interaction in government to the people. If that's going to happen, though, it means that the platforms the new administration adopts to provide open government will have to be open as well. Which brings us at last to the question of just what, exactly, "open" should mean, when it comes to "open government."
It's said that imitation is the sincerest form of flattery, but I guess being the kind of organization that people love to leak news about might be the next. That seems to be the case with the Linux Foundation, which for the second time in a matter of weeks has seen an enterprising reporter scoop the opposition (and our own internal planning) by releasing a story ahead of our planned schedule. Who knew that an open source foundation could attract paparazzi?
Last time, it was Steven Vaughan-Nichols announcing our acquisition of the Linux.com site, and this time it's the New York Times (no less) announcing a day ahead of time the fact that the Linux Foundation has taken over stewardship of Intel's Linux-based Moblin mobile operating system. If you've been following the mobile space for a while, this is news worth noting, on which more below.
It would be an understatement to observe that Microsoft's patent suit against Dutch GPS vendor TomTom has been closely watched. Why? Because Microsoft alleges that several of the patents at issue are infringed by TomTom's implementation of the Linux kernel. In this first month of the dispute, the most urgent question has been this: will TomTom fight or fold? Now we have the answer: TomTom has decided to fight - and perhaps fight hard. Yesterday, it brought its own suit against Microsoft in a Virginia court, alleging that Microsoft is guilty of infringing several of TomTom's own patents.
The question that many Linux supporters are now asking is this: is this good news for Linux, or bad? Here are my thoughts on that important question.
This morning I got an email from a regular Standards Blog reader with some unwelcome news - he informed me that the RSS, Atom and other feeds at my blog were dead, and that he hadn't gotten a new posting notice in a month. Sigh. Not the type of email you like to get, so I'm hoping that this posting reaches everyone that it's supposed to.
The reason for the problem is that the developer who supports ConsortiumInfo.org has been upgrading the version of Geeklog upon which this site is based. Unfortunately, it hasn't been going well at all, in part because the developer isn't familiar with Geeklog, and in part, frankly, because they aren't checking the things that they should as they make changes (like syndication feeds). Complicating things is the fact that this is a "second generation" blog, which began as home-grown software. Later, it was migrated to Geeklog when I wanted to make it more sophisticated than the original setup could support (e.g., by adding the News Picks to the right). Along the way, a few weirdnesses were built in, all of which (naturally) are undocumented. So it's been a challenge for the developer trying to figure it all out.
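For what it's worth, checking that syndication feeds survived an upgrade doesn't require anything fancy. Here's a minimal sketch of the kind of post-upgrade smoke check one might run against each feed URL; the function and sample feed below are illustrative, not anything from this site or Geeklog itself:

```python
# Hypothetical post-upgrade smoke check for syndication feeds.
# Parse a fetched feed document and confirm it is well-formed and non-empty.
import xml.etree.ElementTree as ET

def feed_entry_count(xml_text: str) -> int:
    """Return how many items/entries a feed document carries.

    Raises ET.ParseError if the feed is not well-formed XML, which is
    exactly the kind of breakage a botched upgrade tends to produce.
    """
    root = ET.fromstring(xml_text)
    if root.tag == "rss":                      # RSS 2.0: <rss><channel><item>
        return len(root.findall("./channel/item"))
    if root.tag.endswith("}feed"):             # Atom: namespaced <feed><entry>
        ns = {"atom": "http://www.w3.org/2005/Atom"}
        return len(root.findall("atom:entry", ns))
    raise ValueError(f"unrecognized feed root: {root.tag}")

# A tiny inline Atom sample standing in for a fetched feed.
sample_atom = """<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom">
  <title>Standards Blog</title>
  <entry><title>Post one</title></entry>
  <entry><title>Post two</title></entry>
</feed>"""

print(feed_entry_count(sample_atom))  # prints 2
```

In practice one would fetch each live feed URL (e.g., with `urllib.request`) and alarm if parsing fails or the entry count drops to zero after a change.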
I chose Geeklog in part because it's a really powerful tool, but also because it's the product of a FOSS project. Given that I write a lot about FOSS, I thought I ought to be using FOSS to support this blog (that "walk the talk" thing). That's had some downsides, though, because Geeklog is a developer's tool, and not a mass-market application. Consequently, so far as I'm aware, the type of "Geeklog for Idiots" user manual that would be very useful to someone like me just doesn't exist. On the other hand, the Geeklog community has been great about answering the questions my developer has posted at the Geeklog site.
So if anyone out there is a Geeklog ace and would be willing to answer my questions from time to time, that would be great: it would save me a lot of grief, and help me become a more efficient and productive Geeklog user, without having to run up my developer tab.
Economic downturns have a tendency to accelerate emerging technologies, boost the adoption of effective solutions, and punish solutions that are not cost competitive or that are out of synch with industry trends.
So begins a new white paper from research analyst IDC. History supports the logic of the statement, but applying the same logic to predict the future is a dangerous game. Having good starting data can help considerably in that regard, though, and that's what makes this report interesting. Its title is Linux Adoption in a Global Recession, and it marshals some impressive data to predict that Linux will be a significant gainer, while others are punished by the current global meltdown.
The report bases that conclusion in part on its finding that: "Linux users are clearly satisfied about their choice to deploy Linux, and during trying economic times, the potential for those same customers to ramp up their deployment of Linux is strong." In other words, unlike the last recession, in which the free OS had to establish itself in environments where it had never been deployed before (its market share increased dramatically anyway), this time it need only increase its beachhead among existing users in order to post impressive gains. But IDC predicts that it will also do quite well with new, missionary sales, promising that this time around its competitive position should strengthen as well as broaden - including on the desktop.
The headline act, if you will, was announced this morning for the third annual Linux Foundation Collaboration Summit, and it promises to be an interesting show: the Foundation's Jim Zemlin, Microsoft's Sam Ramji, and Sun's Ian Murdock, each giving their predictions on the future of the operating system they represent - and, I expect, the others' as well. Jim will moderate the exchange, which will take place on the first day (May 8) of this year's Summit in San Francisco. As with previous Collaboration Summits, there is no fee to attend, but attendance is by confirmation only, as the size is limited to a few hundred to maximize the interactivity of this annual gathering of the elite of the Linux clan.
While the OS debate provides the most provocative portion of the program, the rest will provide a great deal of substance - and, who knows, perhaps a few surprises as well.
I’m pleased to report (although a bit earlier than anticipated, on which more later) that the Linux Foundation has acquired the Linux.com URL, and will be hosting a new community site at that address. Needless to say, it’s the premier address to have on the FOSS highway, and we’re delighted to be standing up a new site at that location soon.
Those of you who are long-time Linux.com visitors know that late last year SourceForge, the owner of Linux.com, quit posting new content at what had for years been one of the most visited Linux sites on the Web, posting only a cryptic message or two to explain what was going on. Soon that dearth of content will end, as we dedicate LF resources to relaunching the site, which is scheduled to occur in a couple of months. Until then, your content contributions will be most welcome (more on that below as well).
So what’s the deal with all the parentheticals?
The standard setting organizations (SSOs) from which specifications have been drawn to create Electronic Health Records (EHRs) are legion, due to the complex nature of the goal. Some of the standards utilized are generic, and common to any sophisticated Internet-enabled commercial system. Others are specific to science, but usable generally in paper as well as information technology (IT) based health care systems. Only a few SSOs, however, have taken up the challenge of developing the major components essential and unique to EHRs. One of the oldest and most important is Health Level 7, more commonly referred to as HL7.
HL7 has been at the center of global EHR development since 1987, as well as a key player in the more recent U.S. efforts to design and implement a national EHR system by 2014, a commitment made by President George W. Bush in his State of the Union Address in January of 2004.
With the Obama Administration's pledge to meet that commitment, and to direct massive amounts of funding towards ensuring its success, it is critical that the standards needed to support this ambitious goal are not only available, but the right tools for the job as well.
In this interview, HL7 CEO Charles Jaffe, M.D. shares his perspective on what's been accomplished, what remains to be done, and where the critical decisions that will lead to success or failure in creating a national EHR system must be made.
There are not a few commentators who would tell us that the latter half of the 20th century will best be remembered as the Computer Age, a time when advances in information technology truly transformed the way we live our lives. If medical science continues to advance at current rates, I believe that the first half of this century will as likely be recalled as the Age of Life Science — the time when our lives were transformed at the metabolic level. Indeed, on every front, whether it be genomics or oncology, neurology or stem cell research, reports of dramatic discoveries arrive almost daily, many suggesting the promise of cures that only a short time ago would have seemed little short of miraculous.