Have you discovered The Alexandria Project?
Some time back, I wrote a blog entry called "The Wikipedia and the Death of Archaeology." The thesis of the piece was this: archaeologists study periods as recent as a hundred years ago, because even with newspapers, magazines and photographs, a substantial percentage of everyday reality still slips through the historical cracks. If that sounds bogus, consider the fact that several times in your life (at least if you're an American), you've read an account of someone sealing or opening up a "time capsule" intended to preserve everyday objects for about the same time frame.
The humbling lesson is that much of our life is made up of trivial things - office supplies, flower pots, talk shows, you name it - that don't become the stuff of newspapers or novels. It follows that if you lose the trivia, then you lose much of the texture and context that makes everyday life real and understandable, if not always exactly inspiring. That's why archaeologists (or at least archaeologists lacking travel budgets) are wont to dig up such recent garbage - to recapture our trivia, in order to backfill our knowledge of who we were after the conscious memory has been lost. But, as I pointed out in my earlier entry, understanding the realities of the past by counting peach pits in privies at best allows you to make guesses about a single homeowner, and not about society writ large.
Cloud computing is all the rage today, with everyone from the U.S. Federal government to Apple herding us into a brave new world of remotely hosted data and services. There are, of course, many advantages to the cloud concept. But as usual, this new IT architecture has some inherent and serious risks that cloud proponents hope potential customers will not dwell on.
There's nothing new about that, of course - except for the stakes. Innovation usually outruns caution and comprehensive consideration of concerns like safety and unintended consequences. But if we want to put all of our computing resources and data into one bucket, we had better make damn sure that it's got a pretty strong bottom.
Here's a nightmare scenario of what could happen otherwise. And it's not pretty.
Oh my goodness. It's happening again. Will there be anywhere to hide this time, or are we already trapped — tied like poor little Pauline to the railroad tracks as the engine of another high-tech bubble barrels down upon us?
Until last Thursday, there was some cause for hope. True, the day before, the New York Times had published a piece reporting on the growing prevalence of "acqhire" transactions. That's where a company (like Facebook) buys another company for millions of dollars, only to promptly shut it down. Why? Because it wants the employees — $500,000 to $1 million per engineer is the current going rate. That's not quite as high as it was during the Internet Bubble years, but the same companies are doing lots of big-ticket acquisitions as well. Whether or not these transactions pay off in new revenues, the dilution to existing stockholders will be the same.
At intervals, the Federal Trade Commission (FTC) and Department of Justice (DoJ) have undertaken public initiatives intended to support the standards development process from the antitrust perspective. In each case, I've found the regulators to be open minded and genuinely interested in understanding the marketplace. Often, the goal of their information gathering efforts is to later issue guidelines that encourage good behavior, and make clear what they consider to be over the line. This makes it easier and safer for stakeholders to participate actively in the standards development process. Regulators in the European Union follow the same practice.
Last week, the FTC launched a new standards development fact-finding effort, this time in the form of a workshop intended to help it better understand whether "patent holdup" is causing a problem in the marketplace. The workshop is open to the public, and you're free to submit written comments as well.
Long time readers will know that I have been reporting on the Semantic Web for many years - since June of 2005, in fact, when I dedicated an issue of my eJournal to The Future of the Web. The long interview I included with Tim Berners-Lee remains one of the most-read articles on this site of all time. Ever since then, I've periodically given an "attaboy" to the Semantic Web. And guess what? It's that time again.
Why? Because the more the Web is capable of doing, the more we can get out of it. And given how much we now rely on the Internet and the Web, we can't afford to allow either to be less than they are capable of being.
It’s very rare for me to write a blog entry directed solely at what someone else has written, but there’s an exception to every rule. This one is directed at a posting by Alex Brown, entitled UK Open Standards *Sigh*.
The short blog entry begins with Alex bemoaning the hard, cruel life of the selfless engineers that create technical standards:
It can be tough, putting effort into standardization activities – particularly if you're not paid to do it by your employer. The tedious meetings, the jet lag, the bureaucratic friction and the engineering compromises can all eat away at the soul.
Poor Alex. It does sound tough, doesn’t it?
If you’re a regular reader of The Standards Blog, there’s an excellent chance that you already know that Pamela Jones – "PJ" to one and all – announced on Saturday that she would post her last article at Groklaw on May 16. Certain aspects of the site will remain available indefinitely.
It’s difficult to know where to begin in saying “goodbye” to Groklaw. What PJ and her many cohorts accomplished there has been unique in my experience. In many ways, Groklaw exemplifies the transformational power that the Internet has brought to law, society, technology, and the advancement of all things open.
This morning brings news of what may become another new and important consortium – the Open Networking Foundation (ONF). This time the goal is to adapt network architecture to streamline its interoperation with cloud computing. And while the news is intriguing, the way in which it has been broken is a bit odd, on which more below.
According to the press release announcing the new entity:
Six companies that own and operate some of the largest networks in the world — Deutsche Telekom, Facebook, Google, Microsoft, Verizon, and Yahoo! — announced today the formation of the Open Networking Foundation (ONF), a nonprofit organization dedicated to promoting a new approach to networking called Software-Defined Networking (SDN). Joining these six founding companies in creating ONF are 17 member companies, including major equipment vendors, networking and virtualization software suppliers, and chip technology providers.
This week a new consortium was launched that may signal who will finally own the last great, unclaimed consumer computing platform – the automobile. The new organization is the Car Connectivity Consortium, and the winner is . . . well, we'll come back to that a little later. Suffice it to say for now that the fifteen-year battle to control the digital future of the automobile could be at an end, and that its resolution may tell us something about the future of the digital desktop as well.
As I noted last week, the British government recently came out strongly in favor of royalty-free standards and active consideration of open source software in public procurement. As expected, this decision was greeted with less than universal enthusiasm. One trade group that predictably reacted negatively was the Business Software Alliance (BSA), an association that supports the robust preservation of rights in patents. The BSA had previously lobbied actively (and successfully) to delete a similar position from the new version of the European Interoperability Framework (EIF).
According to a story posted at ZDNet.co.uk this Tuesday, the BSA released objections to the new British government policy, stating in part:
BSA strongly supports open standards as a driver of interoperability; but we are deeply concerned that by seeking to define openness in a way which requires industry to give up its intellectual property, the UK government's new policy will inadvertently reduce choice, hinder innovation and increase the costs of e-government.