A little while back I pointed out at some length how flimsy was the logic found in a white paper that claimed Microsoft's Vista would bring "benefits" of $40 billion to six European countries - conveniently forgetting the fact that those $40 billion of "benefits" were actually a cost.
And now, what do we find, but a study from the US film industry that purports to show:
movie piracy causes a total lost output for U.S. industries of $20.5 billion per year, thwarts the creation of about 140,000 jobs and accounts for more than $800 million in lost tax revenue.
But fortunately, there's someone else on hand who isn't taken in by this Vista-like logic:
It's important to remember, however, that even though piracy prevents money from reaching the movie industry, those dollars probably stay in the economy, one intellectual property expert said.
"In other words, let's say people are forgoing paying for $6 billion in movies by downloading or consuming illegal goods but end up spending that $6 billion on iPods, computers and HDTV sets on which to watch the movies, which leads to $25 billion in job creation in the computer/software/consumer electronics field," Jason Schultz, staff lawyer at the Electronic Frontier Foundation, wrote in an e-mail.
29 September 2006
Some time back I wrote about the European Digital Library. But it seems that this isn't enough: now we have the European Archive, too, which seems even more ambitious. For as well as providing access to digital versions of traditional content, it seems to be aiming to become a European mirror of the wonderful Internet Archive:
The European Archive is a non-profit foundation working towards universal access to all knowledge. The archive will achieve this through partnerships with libraries, museums, other collection bodies, and through building its own collections. The primary goal of collecting this knowledge is to make it as publicly accessible as possible, via the Internet and other means.
As the web has grown in importance as a publishing medium, we are behind in bringing into operation the archiving and library services that will provide enduring access to many important resources. Where some assumed web site owners would archive their own materials, this has not generally been the case. If properly archived, the Web history can provide a tremendous base for time-based analysis of the content, the topology including emerging communities and topics, trends analysis etc. as well as an invaluable source of information for the future.
The foremost effort to archive the Web has been carried out in the US by the Internet Archive, a non-profit foundation based in San Francisco, which has taken large snapshots of the surface of the web every two months since 1996.
This entire collection offers 500 terabytes of data of major significance in all domains that have been affected by the development of the Internet, that is, almost all of them. This represents a large amount of data (petabytes in the coming years) to crawl, organise and give access to.
By partnering with the Internet Archive, the European Archive is laying down the foundation of a global Web archive based in Europe.
Obviously, all this begs scads of questions to do with access and copyright, but at least it's a start.
'McPatent' McCreevy is at it again:
In the context of the debate about the resolution of the European Parliament on future patent policy the EU Commission will press ahead with an official communication and an action plan of its own and will thereby seek to support the much criticized European Patent Litigation Agreement, Charlie McCreevy, the European Commissioner for Internal Market and Services, told the European Parliament during a plenary session in Strasbourg on Thursday.
And there was a telling quotation:
"compared to our major trading partners, Europe is losing ground," Mr. McCreevy, referring to the patent systems in, for example, the United States and Japan, critically observed.
He's clearly referring to the number of patents in Europe (too low), and their quality (too high) compared to those in the US and Japan. But don't worry, Chas'll fix it....
Here's an interesting take on open access.
The benefits of this kind of openness for scientists and the public have been rehearsed many times; but this paper by Paul Peters, the Senior Publishing Developer of Hindawi Publishing Corporation, one of the leading open access outfits, presents some pretty compelling reasons why opening up is good for publishers - well, the smaller ones, at least:
While advocates of open access publishing have tended to focus on the benefits that it can offer authors and readers, there are equally important benefits that an open access publishing model can provide for small and mid-sized publishers. Within the existing subscription-based publishing industry there are a number of market forces that work against smaller publishers, and this is making it increasingly difficult for these smaller publishers to stay competitive. However, by adopting a business model based on publication charges, smaller publishers can overcome many of the difficulties that they currently face in the subscription market.
There are three main advantages that open access can provide for smaller publishers. One important advantage is that it makes the growth of both new and existing journals much easier. In addition, a shift to open access will promote more competition between publishers, which will enable many smaller publishers to gain a competitive edge over the largest and most well-established publishing houses. Finally, an open access publishing model will make a journal far more attractive to potential authors, since they can avoid many of the unnecessary limitations imposed by subscription-based models.
28 September 2006
Bad, bad BBC:
The BBC has signed an agreement with Microsoft to explore ways of developing its digital services.
The only thing that Microsoft understands is control; if the BBC teams up with Bill Gates' company in any way, we can kiss goodbye to our televisual heritage.
...because they will have no choice. As this Reg piece explains, the worlds of computing and the environment are inexorably becoming more intertwined. From the story:
"Today, energy costs typically form less than 10 per cent of an overall IT budget. However, this could rise to more than 50 per cent in the next few years. The bottom line is that the cost of power on this scale would be difficult to manage simply as a budget increase and most CIOs would struggle to justify the situation to company board members."
Enough to make anyone go green.
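The quoted numbers are worth running through. A back-of-envelope sketch (my own illustration, not figures from the article): if energy is 10 per cent of an IT budget today and grows to the predicted 50 per cent while all other spending stays flat, the whole budget balloons.

```python
def budget_after_energy_shift(budget, share_now, share_later):
    """Return (new_total, new_energy_cost), assuming non-energy
    spend is held constant while energy's budget share grows."""
    other = budget * (1 - share_now)       # non-energy spend, unchanged
    new_total = other / (1 - share_later)  # energy now fills the rest
    return new_total, new_total * share_later

total, energy = budget_after_energy_shift(1_000_000, 0.10, 0.50)
print(total)   # 1,800,000: an 80% increase in the total budget
print(energy)  # 900,000: energy spend up ninefold from 100,000
```

In other words, the shift from 10 to 50 per cent is not a fivefold rise in energy costs but a ninefold one, with the overall budget nearly doubling. No wonder CIOs would struggle to justify it.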
Here's another example of open source being invoked in the context of helping to solve environmental problems:
Non-patentable shared "open energy technology" has the potential to have a profound impact on the reduction of the greenhouse gases that cause global warming, in the same way that open source software has changed computers and the Internet.
There's an interesting twist, in that it also suggests
Possibly the most ideal person to do it would be South African billionaire, Mark Shuttleworth, who is currently taking on Microsoft's domination of the operating system market through the development of the open source operating system, Ubuntu Linux.
The reason being that Shuttleworth is a local boy for the publication in question. A nice idea, despite the nepotism.
The Direct Economy:
In a system of direct democracy, sovereignty is lodged with the citizens - or at least, with those among them that choose to actively participate in the system. They can not only pick among prepackaged options (vote) or candidates (election) but they also can deeply co-shape the policy process. Switzerland is probably the strongest case: here new laws can be put forth, and even the Constitution modified, by citizens’ initiative. Translate that into business terms and we have a description of a system where consumers have a direct influence on what companies develop and produce for them. The more informed, opinionated and wired (socially connected) they are, the more they are likely to make use of this influence and to try to organize it - exactly as in a direct democracy system.
Sounds like a business using Web 2.0 to me. Still, good to see the memes spreading.
Talking of the past, do you remember Henry Blodgett?
Amazon was selling for about $275 a share when a little-known analyst, Henry Blodgett, predicted it would go to $400 - even though Amazon had never made a profit. Amazon did go to $400 and beyond.
Amazon's backer, Merrill Lynch, responded by replacing its pessimistic Amazon analyst. His replacement? Henry Blodgett. While this was great for Blodgett, it proved not so good for investors, many of whom got soaked when Amazon's value fell 75 percent.
Blodgett has said his prediction was based on sound analysis using new ways to measure a company's performance. Wall Street coined a new verb: to "blodgett" a stock.
Now what do we hear?
MySpace, the social-networking Web site, could be worth around $15 billion within three years, measured in terms of the value created for shareholders of parent company News Corp., a Wall Street media analyst forecast Wednesday.
Those who cannot remember the past....
Update: Sometimes truth is stranger than fiction....
27 September 2006
I must have blinked. Bloglines has started offering package tracking and weather forecasts. Makes sense of course: as blogs and RSS feeds in general become the common purveyors of information, you may as well use your aggregator to pull in info from all kinds of sources - even those that have nothing to do with the blogosphere.
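Mechanically there is nothing blog-specific about any of this: an aggregator just fetches RSS XML and pulls out the items, whatever the source. A minimal sketch (the feed below is invented, not a real Bloglines endpoint):

```python
# Parse an RSS 2.0 feed and extract the item titles, which is the
# core of what any aggregator does with a package-tracking or
# weather feed just as much as with a blog.
import xml.etree.ElementTree as ET

feed_xml = """<rss version="2.0"><channel>
  <title>Package tracking</title>
  <item><title>Parcel left depot</title></item>
  <item><title>Out for delivery</title></item>
</channel></rss>"""

def item_titles(xml_text):
    root = ET.fromstring(xml_text)
    return [item.findtext("title") for item in root.iter("item")]

print(item_titles(feed_xml))  # ['Parcel left depot', 'Out for delivery']
```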
Only really, really old-timers - and sad ones at that - remember one of O'Reilly's less well-known products, called simply Internet in a Box, which came out in 1995. Well, it was clearly an idea ahead of its time.
Now, though, we have a real Internet in a box - or rather, Internet dans
la première « box » associant l’accès à l’Internet haut débit et les principales fonctionnalités d’un ordinateur.

[that is, "the first 'box' combining broadband Internet access with the main functions of a computer"]
For 40 Euros a month, and a deposit of 150 Euros, you get EasyGate, which is effectively a super-router that handles not just the Internet side, but the entire PC side as well. If you add a screen, mouse, keyboard and webcam, it'll cost you an extra 99 Euros. But since you end up with a GNU/Linux-based system, complete with Firefox and OpenOffice.org and a GNOME-like environment, that's not bad.
As hardware prices plummet, this was bound to come. But having come, it does look extremely attractive as an all-in-one, technophobe-safe system - rather like the highly-successful Amstrad PCW8256.
I predict we're going to see more and more of these systems, which means that GNU/Linux and open source software in general is going to start popping up in all sorts of unexpected places. (Via LXer and The Inquirer.)
New Scientist reports:
Linguists are calling for an online public database, similar to the human genome project, that would allow researchers to collaboratively share different studies of language impairment.
By gathering together studies of developmental disorders that cause communication impairments – such as autism or Down’s syndrome – they hope to provide new clues about the origins of language.
Aside from the interesting nature of the project, what is striking is that the key element is not creating new knowledge, but consolidating it in a database, allowing higher-level knowledge to emerge. Clearly, for this to work in an optimal way, all the data and papers need to be open access. Whether it will be, assuming the project goes ahead, remains to be seen.
Update: Wow, the original article behind the NS story is not behind the usual paywall. So from this I can read:
We close by illustrating how systematic analyses within and between disorders, suitably informed by evolutionary theory—and ideally facilitated by the creation of an open-access database—could provide new insights into language evolution.
I don't want to push the analogy too far, but it's amazing how the following passage could almost be talking about free software:
The interest and investment and technological momentum in all of these technologies is not only accelerating, but I think reaching some kind of tipping point where the markets really get to be transformed in a way that the next stage of change is actually a lot easier. Even though the amount of growth is going to be a lot larger, it actually gets easier to do, because the whole political equation is different. You've got big energy companies that are supporting renewable energy rather than opposing it, because it's part of their business plan, as opposed to before when they saw it as an unwanted competitor. That in turn changes the political equation, which means that it's easier to get new laws enacted, which in turn tends to speed up investment, so it becomes a real self-reinforcing circle.
Gilberto Gil is something of an icon in the open content world, and with good cause. He's a big name that backs the idea of others creating around his own art. And as Minister of Culture, he's also an influential politician in his native Brazil and far beyond.
Put the two together and you have a man who is in a unique position to talk to powerful people about important things. For example:
I had a meeting with the president of WIPO [on 25 September], and I was very enthusiastic about the future role we think WIPO should play in terms of interpreting the trends, the tendencies, of intellectual property flexibility, inclusion, as the president himself puts it. Meaning, not just including as many as possible number of countries in the functioning of the institution today, but also inclusion in the sense that we should include the new themes, the new demands, and intellectual property flexibilities is one of the main things today. Not only considering the protection of the authors and of the authors’ rights, but also taking care of the public domain, of the social role of intellectual property, democratisation, universalisation, all of those contexts that should be referential to the work of an organisation like WIPO today already but mainly in the future. So like horizon, we were discussing horizon ahead of us for the next years. This is, I think, besides the regular day-to-day process of the subjects, and the multilateral and bilateral situations for WIPO, we should consider this advancing in terms of substance, of policy, I would even use the word ideology.
Not many people could have that conversation.
Sony is a strange company. Despite its numerous mis-steps - remember that DRM rootkit? - people still seem to harbour a certain affection for the outfit. Maybe it's that all those years spent playing on the Playstation have addled their brains...(well, it couldn't be because of the Walkman, could it?).
Me, well, I never played on the Playstation. I did own a Vaio laptop once (The horror! The horror!): I hated it, and I have sworn never to buy another. So this story about the possibility of Sony going permanently down the tubes rather warmed the cockles of my heart. Pathetic, I know. (Via Monkchips.)
26 September 2006
With the jolly kerfuffle over GNU GPL v3, it's easy to overlook the fact that the less well-known GNU Free Documentation Licence is also being updated, and that the first draft of version 2 is available. So why is this important? Because Wikipedia uses the GFDL.
Let's hope Jimmy Wales doesn't feel the same way Linus does over this process....
IBM's announcement of a new patent policy is obviously important, if only because Big Blue has a big collection of the critters. Whether it will do much to help fix a deeply broken system is another matter:
The worldwide policy, built on IBM's long-standing practices of high quality patents and transparency of ownership, is designed to foster integrity, a healthier environment for innovation, and mutual respect for intellectual property rights. IBM encouraged others in the patent community to adopt similar policies and practices, more stringent than currently required by law.
For a good first analysis, see Andy Updegrove's blog.
Update 1: And here's a salutary reminder from Andy on why it's best to get all the facts before you express your enthusiasm.
Update 2: Richard Poynder also makes some good points about the move.
One of the problems with DRM is that it can override traditional copyright to forbid anyone ever having access to content: effectively it is removed from the intellectual commons forever. If that's too abstract, here's a concrete example of something that it's going to be hard to open up - with equally serious problems for the environmental commons.
...Each little tyrant with his little sign
Shows where man claims earth glows no more divine
But paths to freedom and to childhood dear
A board sticks up to notice ‘no road here’
And on the tree with ivy overhung
The hated sign by vulgar taste is hung
As tho’ the very birds should learn to know
When they go there they must no further go...
Any pamphlet that begins with a quotation from John Clare about the first enclosure movement is clearly doing something right. As it happens, Rosemary Bechler's Unbounded Freedom, nominally "A guide to Creative Commons thinking for cultural organisations", does just about everything right. It is probably the single best short introduction to intellectual monopoly issues I have ever read. It is well written, accessible, packed with good examples and surprisingly comprehensive.
What's even more amazing is that it comes from the British Council, a body that used to be even stodgier than the British Library. Clearly - or Clarely - stodge ain't what it used to be. (Via OpenBusiness.)
Update: There's now a blog for discussing this book and its ideas. Sadly, there are already some rather obtuse comments that wilfully misrepresent the idea of open content. We've still got a long way to go....
I've written extensively - some would say too extensively - about Microsoft's long tradition of FUD. This has gone through many incarnations in a desperate attempt to find something that might convince people to stay away from that nasty GNU/Linux stuff. It appears that even the fertile minds of Microsoft's FUDmeisters are running out of ideas, since they've resurrected the old TCO argument.
I won't even bother going through why this PDF is a waste of electrons - even I'm bored with refuting these tired old arguments. But I would like to point out the underlying flaw with all these studies: that traditional TCO fails utterly to take into account things like the cost of vendor lock-in that the Microsoft route implies.
Even when the TCO for Windows is lower than that for GNU/Linux - and yes, it happens - there is the problem that Microsoft will always bring out a new version of Windows that requires massive software and hardware investments over and above those budgeted for in simplistic TCO analyses (i.e. all those prepared by analysts). Of course, according to Microsoft, this isn't a problem, since it represents a huge economic "benefit".
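The flaw is easy to see in a toy model (the numbers below are entirely my own, purely for illustration): a TCO snapshot that ignores the forced-upgrade cycle can flip the comparison.

```python
# Naive TCO counts only recurring licence and support costs; the
# lock-in-aware version adds the periodic refresh a vendor can
# mandate with each new release.

def naive_tco(licences, support, years):
    return (licences + support) * years

def tco_with_lockin(licences, support, years, upgrade_cost, upgrade_every):
    upgrades = years // upgrade_every  # forced refresh cycles in the period
    return naive_tco(licences, support, years) + upgrades * upgrade_cost

# On the snapshot view the proprietary route looks cheaper...
windows = naive_tco(licences=100, support=20, years=6)   # 720
linux   = naive_tco(licences=0,   support=130, years=6)  # 780
# ...until one mandated hardware/software refresh every 3 years is counted:
windows_real = tco_with_lockin(100, 20, 6, upgrade_cost=400, upgrade_every=3)
print(windows, linux, windows_real)  # 720 780 1520
```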
25 September 2006
The British Library is not normally regarded as a beacon of enlightened thought when it comes to intellectual monopolies - it's in cahoots with Microsoft for much of its IT stuff. But this "IP Manifesto" is decidedly clueful:
1 Digital is not different – Fair dealing access and library privilege should apply to the digital world as is the case in the analogue one.
2 Contracts and DRM – New, potentially restricting technologies (such as DRMs /TPMs) and contracts issued with digital works should not exceed the statutory exceptions for fair dealing access allowed for in the Copyright, Designs and Patents Act.
3 Archiving – Libraries should be allowed to make copies of sound (and film) recordings to ensure they can be preserved for posterity in the future.
4 Term of copyright – The copyright term for sound recording rights should not be extended without empirical evidence and the needs of society as a whole being borne in mind.
5 Orphan works – The US model of dealing with orphan works should be considered for the UK.
6 Unpublished works – The length of copyright term for unpublished works should be retrospectively brought in line with other terms – life plus 70 years.
(Via Open Access News.)
Search lies at the heart of modern desktop computing (just ask Google). So if free software wants to make a breakthrough on the desktop, coming up with a better search tool might just be the way to do it. Perhaps this could help.
An interesting meditation on the way in which the application of the commons metaphor to information - something I've certainly been doing in these posts - commits the sin of ignoring the way in which computers, the creators of that metaphorical commons, are destroying the concrete commons of the environment through the toxic materials they habitually contain, and which are dumped when they reach the end of their life.
It therefore suggests:
Perhaps the time has come to revisit the metaphor of an 'environmentalism for the net' to talk not only about multiple forms of resistance to an ever expanding intellectual property regime, but quite literally of the ecopolitical implications of the very infrastructures that facilitate and sustain the net.cultural dynamic of collaborative creation. Such an environmentalism, articulated conceptually and organisationally in the challenging context of electronics manufacturing's 'global flagship networks', could significantly broaden existing efforts by labour unions and NGOs to develop a broader agenda of economic and environmental justice.
Food for thought. (Via OnTheCommons.)
One of the central themes of this blog is how the ideas at the heart of free software - collaborative, participatory, distributed development - are gradually seeping out into other areas, with dramatic effects. Mostly I write about the obvious examples - open access, open content, open genomics etc. - but occasionally I slip in instances popping up in areas that seem to have little to do with software and yet are obviously still highly germane.
An example is this report called People and Participation. It comes from the dubiously-named "Involve", which sounds like a front for some bunch of religious nutters, but as a page entitled "Connectivity" makes clear, it has some interesting ideas that, er, plug straight into the technological origins of these movements:
The 21st century is delivering endless opportunities to connect with one another wherever we are. The new technology that fills our pockets allows us 24/7 contact with friends, family and work; it is also central to Digital Britain, the second stage of the digital revolution that could transform the lives of everyone in the UK. But this new culture of connection is not limited to bluetooth and WiFi, connectivity also underpins the enabling state, the Government vision of a modern social contract.
(Via P2P Foundation.)
24 September 2006
It seems that the old feed at
is broken (I don't know whether this is a temporary glitch with the beta of the new Blogger or a permanent change). In any case, the following URL seems to work
Apologies for the inconvenience.
Update: As I rather suspected it might, the original address is now working again, so it was probably some problem at Google. The other address also seems to work, so you can take your choice. The bottom line is, whichever one you've subscribed to, you should be OK.
I normally try to avoid posting about politics, since it tends to bring out the worst in bloggers on both sides of the political spectrum. But I'll make an exception for this, since I think it makes an important point about politics in the age of blogs:
Labour is a party that won and held power by mastering mainstream media, and as Mr Dale puts it "Blogs are a spin doctor's worst nightmare come true". That's bad news for the current ruling elite.
And good news for us proles.
There's a fascinating story on WorldChanging examining the current outbreak of the potentially lethal bacterium E. coli O157:H7 in the US spinach industry. Interestingly:
A curious yet widespread claim is that, because some of the spinach so far identified as contaminated came from organic farms, organic farming is unsafe. It's a curious claim, because scientists understand pretty well where the O157:H7 is coming from: the bellies of factory-farmed cows. Their manure, as it turns out, is now crawling with the critters.
The piece then goes on to suggest:
But I think there's something bigger coming, which is a move towards not just buying local food, but knowing the backstory of the food we buy.
Here, the backstory is what happened to our food before we bought it. Who raised it? Where was it grown, and on what kind of land? Did the farmer use fertilizers and pesticides, or integrated pest management? Antibiotics or free-range grazing? Was the soil conserved, or is it eroding? How did it reach us, and how was the money we spent on it split up?
Another way of putting it is that food should be open source, not the current "black box" that has to be taken on trust - with sometimes fatal consequences.
23 September 2006
I have animadverted before upon the fact that I find TechCrunch - for all its undoubted virtues - just a little too breathless in its excitement over Web 2.0 startups. So a wry smile did play upon my lips when I came across the aptly-named Techcrush:
Techcrush will review the progress of web 2.0 startups 6 and 12 months after they debuted. Did their apps turn out to be a success or a failure?
No points for guessing which way most of them will turn out. (Via Alex Bosworth.)
22 September 2006
If you thirst for new ideas, try this:
Virtual water is the amount of water embedded in food or other products through their production. For example, to produce one kilogram of wheat we need about 1,000 litres of water, i.e. the virtual water of this kilogram of wheat is 1,000 litres. For meat, we need about five to ten times more.
The per capita consumption of virtual water contained in our diets varies according to the type of diet, from 1m3/day for a survival diet, to 2.6m3/day for a vegetarian diet and over 5m3/day for a USA-style meat-based diet.
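The quoted figures lend themselves to quick arithmetic. A sketch (my own illustration, taking the midpoint of the five-to-ten-times range for meat):

```python
# Litres of virtual water per kg, from the figures quoted above.
LITRES_PER_KG = {"wheat": 1_000, "meat": 7_500}  # meat: midpoint of 5-10x wheat

def virtual_water_litres(food, kg):
    return LITRES_PER_KG[food] * kg

print(virtual_water_litres("wheat", 1))   # 1000
print(virtual_water_litres("meat", 0.2))  # 1500.0, for one small steak
# The quoted diet range, converted to litres per day (1 m3 = 1,000 l):
print({diet: m3 * 1_000 for diet, m3 in
       {"survival": 1.0, "vegetarian": 2.6, "US meat-based": 5.0}.items()})
```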
In one of my random wanders, I came across this neat encapsulation of a key advantage that open source companies enjoy:
I sat in on a sales visit yesterday, and they were wowed by our demo and presentation. In fact, the results are getting so predictable with prospective customers that it’s almost boring - we show them the stuff, and they show us the money. Before coming to Hyperic, I had never seen sales calls this easy.
It's almost a truism that open source software sells itself; the knock-on consequence is that you don't really need salespeople, which in turn means more money for developers and support.
Kerala is probably best-known for its democratically-elected communist government, but its decision to go for GNU/Linux instead of Windows in its schools is probably now a close second. A few weeks back, Richard Stallman was explaining his role in the decision, and now here's a piece in Business Week that has some figures (alas, only made-up ones by analysts) about the broader Indian market. (With thanks to James Tyrrell for the link.)
Neither of these is new, but I've not mentioned them before, and I should have done.
Semapedia.org is a non-profit, community-driven project founded September 2005. Our goal is to connect the virtual and physical world by bringing the right information from the internet to the relevant place in physical space.
To accomplish this, we invite you to create and distribute Semapedia-Tags which are in fact cellphone-readable physical hyperlinks to the free online encyclopedia Wikipedia (or any of Wikipedia's sister projects such as Wikibooks, Wikinews, and Wikiquote). You can create such Tags easily yourself by choosing and pasting a Wikipedia URL into our creation form. Pressing the button will generate a custom PDF file to download and print. Once created, you put the Tags up at their corresponding physical location. Others can now use their cellphone to 'click' your Tag and access the information you provided them.
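Semapedia's own tag format isn't spelled out above, so as a hedged sketch, here is just the pipeline's first step: turning an article title into the URL a physical tag would encode. (In practice Semapedia renders that URL as a 2D barcode in a printable PDF; a third-party library such as the `qrcode` package could play the barcode role today.)

```python
# Build the Wikipedia URL that a printed Semapedia-style tag would
# carry; the tag is then just this URL encoded as a 2D barcode.
from urllib.parse import quote

def wikipedia_tag_url(title, lang="en"):
    # Wikipedia article URLs use underscores in place of spaces
    return f"https://{lang}.wikipedia.org/wiki/{quote(title.replace(' ', '_'))}"

print(wikipedia_tag_url("Brandenburg Gate"))
# https://en.wikipedia.org/wiki/Brandenburg_Gate
```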
SWiK.net is a project to help people collaboratively document open-source software
SWiK is visited by over 10,000 people daily; it’s a place to make notes and publish articles on software development and open source projects, tag projects to help organize the world of open source, or just browse around and find interesting stuff.
I particularly like the Zeitgeist page as a snapshot of what's hot.
21 September 2006
We know it works, and we know why it works, but somebody would like to know exactly how and why it works:
A group of UC Davis researchers has just received a three-year, $750,000 grant from the National Science Foundation to study how open source software such as the Apache Web server is built.
The researchers will focus on the Apache Web server, the PostgreSQL database and the Python scripting language. They will collect information from the message boards, bug reports and e-mail discussions to understand how design teams organize themselves and interact.
Am I the only one who finds it slightly ironic that $750,000 is being spent to write some papers about something that is written for nothing? (Via LXer.)
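For what it's worth, one flavour of what the UC Davis study describes is easy to sketch: mining a project's communication records for structure. This toy (invented data, not Apache's) counts how often each participant posts, a crude proxy for who anchors a design discussion.

```python
# Tally posting activity across mailing-list-style messages to see
# which participants dominate the project's discussions.
from collections import Counter

messages = [
    {"author": "alice", "thread": "mod_rewrite design"},
    {"author": "bob",   "thread": "mod_rewrite design"},
    {"author": "alice", "thread": "release plan"},
    {"author": "carol", "thread": "release plan"},
    {"author": "alice", "thread": "bug #1234"},
]

activity = Counter(m["author"] for m in messages)
print(activity.most_common(1))  # [('alice', 3)]
```

The real study would of course go much further, correlating this kind of signal across message boards, bug trackers and email to see how teams organize themselves.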
Here's a fascinating project: Open Prosthetics. It's exactly what it says, free designs for prosthetics, although the exact licensing isn't entirely clear (anyone?). The back-story is told in Wired. (Via BoingBoing.)
An interesting coupling of Google with China - but not for the usual reasons.
Dr. Kai-Fu Lee, the head of Google in China said:
Open source software affords Google the flexibility it needs to be able to respond to market demands. Since Google can redesign its software anytime, it can follow market changes quickly.
Open source also gives Google better control over sensitive business information. "If we buy software from other companies, they can tell how many servers we have from how many we pay. Now, that's only our own business," Lee said.
Meanwhile, Ni Guangnan, an academician at the Chinese Academy of Engineering, spoke of
"taking our fate into our own hands." Ni says that China is promoting open source as part of its strategy of being an innovative country, for national information security, and to solve the software pirate problem. He estimates China's open source industry will boom in upcoming years.
Yale has jumped on the open courseware bandwagon - with a twist:
Yale University is producing digital videos of selected undergraduate courses that it will make available for free on the Internet through a grant from the William and Flora Hewlett Foundation.
Happily, this is rather more than just open courseware for the YouTube generation:
The project will create multidimensional packages—including full transcripts in several languages, syllabi, and other course materials—for seven courses and design a web interface for these materials, to be launched in the fall of 2007.
20 September 2006
Good - if belated - news on the OOo front:
First, OpenOffice.org shall get Firefox-like extensions capabilities by 2.0.4. This release should be ready somewhere between the coming week and the end of the month. What this means is that besides the fact that OpenOffice.org could include extensions before, now the way to develop, include, select and manage them will be made easy. Aside from the traditional .zip and unopkg extension packages, a new and definitive extension format, .oxt, shall be used across the extensions that can be developed using a breadth of languages ranging from StarBasic to Java. New wizards and configuration tools shall be added for the benefit of our end users.
Second, and although we have no clear roadmap for this yet (besides, our version naming scheme is going to change once again), OpenOffice.org and StarOffice shall include the Mozilla Foundation's Thunderbird and Sunbird (calendaring application) in the future. Besides the inclusion of those two applications inside the office suite, connectors to Sun Calendar Server and Microsoft Exchange will also be developed accordingly.
Great, but why not Lightning instead, and then we'd be in complete harmony? (Via Slashdot.)
As I noted before, Munich's much-bruited migration to open source has not been entirely wrinkle-free; but the good news (well, for the free world at least), is that things seem to be moving at last, and definitively:
After a city-wide test and pilot phase, the [GNU/]Linux team of the IT Department gave the start signal on Tuesday for the first official version of the future workplace system.
Now, here's an interesting idea: a film about the idea of the commons:
Our idea is to make the film a kind of celebration of the commons and remix culture by making a hybrid film that uses public domain images and sounds to create animated sequences, archival sequences featuring Spooky as a Zelig-like character, archival sequences in which Spooky interacts with the people in the footage, and mashups.
And to blog about it in the process:
My wife says I'm crazy to publicly blog my process of writing my next film. Too much pressure? No one should see the sausage making of the creative process?
I think it is a worthwhile exercise. It is a routine, to get the writing going. An appointment with whomever reads this. It is a leap of faith, believing in the value of the internet as commons. It is novel...sort of.
The philosophical schism between open source and free software is well known, but there's another interesting split emerging between free software and the Creative Commons movement. This isn't exactly new, but as the open content movement begins to gain momentum, it's an issue that people are starting to worry about.
If you want a good introduction to the basics of the dispute, Intellectual Property Watch has a useful report from the recent Wizards of OS 4 conference, where these tensions were exposed.
Geek that I am, the only thing that really interests me about Lonelygirl15 is the technology behind the follow-on Web site:
On a shoestring budget themselves, the trio supports the Web site with open-source technologies like MySQL databases. "Our entire backend that supports the Web site is free because we use WordPress," Beckett said. "Five years ago, you would have had to buy UNIX boxes and build a custom content management system."
That is, a LAMP stack like just about every other Web 2.0 startup - not so lonely. In this respect, it feeds off the same forces that made the original videos possible:
The Lonelygirl15 episodes cost virtually nothing to create. All are shot with a $130 Web camera. The sound is recorded from the internal microphone. Two desk lamps provide the lighting. Beckett's laptop is the computer required to record the segment.
No wonder Hollywood is in trouble.
Because he talks the talk:
She's good to go, hoist anchor!
Here's some real booty for all you land-lubbers.
There's not too many changes, with t'bulk of the patch bein' defconfig updates, but the shortlog at the aft of this here email describes the details if you care, you scurvy dogs.
Header cleanups, various one-liners, and random other fixes.
Linus "but you can call me Cap'n"
(Via Tuxmachines.org and ZDNet Australia.)
Antony Mayfield, on the "other" Open blog, has the following wise words to say about blogging:
A large part of my job is about keeping up my knowledge of what is happening in media, technology and marketing. It's not enough to read all that's out there; I need to make sure I have digested it, understood it and put it in context for myself. When I blog that's exactly what I'm doing.
My thoughts exactly. In fact, I'd go further: blogging has become my notebook and general repository of digital bits and bobs. Whenever I find something of interest (to me), I usually bung it up; I hope that it will be of interest to others, but that's really secondary. A blog is as much a very practical tool for my everyday work as an exercise in itself.
19 September 2006
FON is such an obviously clever and right-on idea that I have struggled to articulate exactly why it is I have been reluctant to write about it. After all, the basic plan is brilliant:
FON is the largest WiFi community in the world. Our members share their wireless Internet access at home and, in return, enjoy free WiFi wherever they find another Fonero’s Access Point.
It all started as a simple idea. Why should you pay for Internet access on the go when you have already paid for it at home? Exactly, you shouldn’t. So we decided to help create a community of people who get more out of their connection through sharing.
We call members of the FON Community Foneros. It’s simple to become a Fonero. You just need to buy La Fonera, which enables you to securely and fairly share your home broadband connection with other Foneros.
Then when you’re away from home and you need Internet access, just log on to a FON Access Point, and you can use the Internet for free. You don’t need to take your router with you – you just need to remember your Fonero login and password.
But it then rises close to genius by making the following distinction:
# Most of us are Linuses. That means that we share our WiFi at home and in return get free WiFi wherever we find a FON Access Point.
# Aliens are people who don’t share their WiFi yet. We charge them just €/$ 3 for a Day Pass to access the FON Community.
# Bills are in business and so want to make some money from their WiFi. Instead of free roaming, they get a 50% share of the money that Aliens pay to access the Community through their FON Access Point.
And now, you can get La Fonera - a WiFi access point that joins you to the FON network - for just a few Euros.
So what's my problem? Maybe it's this:
Interestingly this video was shot with a Nokia N80 (disclosure I am on Nokia's Internet Board) and sent over wifi to a Fonera (disclosure I am the CEO of Fon) which automatically posted the clip in VPOD (disclosure I am an investor in Vpod.tv) which is then linked to my blog which is in Moveable Type (disclosure, two good friends of mine Loic Le Meur and Joichi Ito who are partners in Six Apart well known bloggers and members of the Japan and French Fon boards).
Disclosure: this makes me sick. (Via GigaOM.)
The Knowledge Commons is
a distributed network architecture that enables the culturing of knowledge through construction, distribution, and recombination.
This model provides:
* collaborative knowledge creation
* knowledge correlation through metadata
* identity and authentication brokering
* peer-based content distribution and retrieval
* automated commons management
Er, yes? Sounds interesting, but could we have some more details, please?
Now here's a spooky story:
PicksPal is a free sports site where people “bet” on upcoming games. No money is involved. If they win, their point total goes up and they have bragging rights around the office. Since launching about a year ago over 100,000 people have joined the site, making daily picks on just about every kind of sporting event in the U.S. - boxing, NFL football, pro football, bass fishing, ultimate fighting, basketball, baseball, etc. The site makes money from advertising.
Recently, however, the PicksPal team noticed that a very small percentage of users tend to be correct in their picks significantly more often than they should be statistically. When they grouped these special users they found them to be a powerful predictive force.
I care not a jot for sports or betting, but what is interesting here is that the idea can be generalised. You set up a site devoted to a particular domain with uncertain results, and invite visitors to predict the future. You then analyse the patterns over time and try to find groups of people who consistently beat random guesses.
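The pattern-finding step is easy to sketch. This is a hypothetical illustration, not PicksPal's actual method: it flags users whose hit rate would be very surprising under pure chance, using a one-sided binomial test.

```python
import math

def binom_sf(k, n, p=0.5):
    # P(X >= k) for X ~ Binomial(n, p): the chance of doing at least
    # this well by random guessing.
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i)
               for i in range(k, n + 1))

def skilled_users(records, alpha=0.01, p_chance=0.5):
    # records maps user -> (correct picks, total picks); keep users
    # whose record is too good to be plausible as guesswork.
    return sorted(u for u, (k, n) in records.items()
                  if n > 0 and binom_sf(k, n, p_chance) < alpha)

picks = {"alice": (70, 100), "bob": (52, 100), "carol": (6, 10)}
flagged = skilled_users(picks)
```

A real site would also have to correct for the number of users tested: with 100,000 members, a few will look "skilled" by luck alone.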
18 September 2006
I must confess my initial scepticism about the One Laptop Per Child project has waned a little, not least because it does seem to have some genuinely cool technology behind it.
But I was nonetheless intrigued to find that there is already something similar that is not just on the drawing board, but off it and deep in the field (or jungle/savannah as the case may be). It's called the Inveneo Communication System. It uses GNU/Linux (of course), and draws as little as 12 Watts of power, which can be supplied by sun, wind or bicycle.
I and several thousand other people have been writing about the open source enterprise stack for a while; now free software's Eminence Rouge has given its benediction:
Red Hat Application Stack is the first fully integrated open source stack. Simplified, delivered, and supported by the open source leader. It includes everything you need to run standards-based Web and enterprise applications. Red Hat Application Stack features Red Hat Enterprise Linux, JBoss Application Server with Tomcat, JBoss Hibernate, and a choice of open source databases: MySQL or PostgreSQL, and Apache Web Server.
Although my interest in art photography is more passing than passionate, here's an idea that brings together a number of threads in a novel way. JPG Magazine is a Web site and a magazine with a difference:
JPG Magazine is made by you! As a member, you can submit photos and vote on other members' submissions.
So it's a kind of Digg meets Flickr meets Worth1000.com, with more to come, apparently.
The Web was invented by Tim Berners-Lee while he was working at the European Centre for Nuclear Research (CERN) in Geneva. Now the boys and girls at CERN are at it again, with a radical proposal that will re-invent scientific publishing in their field.
Essentially, they suggest that enough of the big particle physics establishments get together to sponsor the publication of most of the main titles in their field for the next few years as part of a transition to an open access approach, funded in part by savings on subscriptions. At a stroke this solves the biggest problem with OA - getting there.
Major laboratories such as CERN will have to take a lead initially in steering the community through the OA transition – both politically and financially – but ultimately the particle physics funding agencies will have to provide the lion’s share of the financial support. This accounts in particular for the fact that about 80% of the original research articles in particle physics are theory papers.
Tentatively, the task force envisages a transition period of five years to establish a ‘fair share’ scenario between funding agencies and other partners, to allow time for funding agencies to redirect budgets from journal subscriptions to OA sponsoring, and to allow time for more publishers to convert journals to OA. At the end of this period, the vast majority of particle physics literature should be available under an OA scheme.
The sums involved are big for publishing, but puny compared to the cost of your average accelerator, so it's a good mix. And they're thinking strategically too:
With about 10,000 practising scientists worldwide, particle physicists represent a medium-sized community that is small enough for publishers and funding agencies not to take incalculable risks, yet big enough to provide a representative test bed and to set a visible precedent for other fields of science and humanities.
In other words, if this works, the hope is everything else will come tumbling down too. This is one experiment I'll follow with interest. (Via Open Access News.)
Squirreling away prior art in an attempt to stave off software patents sounds like a jolly sensible idea. But that old curmudgeon, Richard Stallman, points out some very cogent reasons why in fact this isn't such a jolly sensible idea. Essentially, the only solution to software patents is to abolish them.
17 September 2006
At the end of last year, I asked whether Wikipedia might fork.
The answer is "yes".
Update 1: Here's Clay Shirky on why he thinks it's doomed to fail.
Update 2: And here's Larry Sanger's response to those points.
There's an article on Language Log about Microsoft's use of the term "genuine":
Microsoft has a new advertising campaign focussing on their efforts to reduce "piracy" of their software, that is, the sale of their software in violation of license agreements. You can read about it here. They call this campaign the "Microsoft Genuine Software Initiative" and use the term "genuine" in contexts such as this:
In the month of May, 38,000 customers purchased genuine Windows software after being notified that they had been sold non-genuine software. Customers recognize that the value of genuine is greater than ever.
I find this use of "genuine" to be most peculiar. An unlicensed copy of Microsoft Windows is perfectly genuine. It has exactly the same functionality as a licensed copy and was made by the same company. In contrast, if you buy a "Rorex" watch, it is not genuine because it is not made by the Rolex company and does not have the aesthetics, functionality, and resale value of a real Rolex. What Microsoft is concerned about is the software equivalent of buying a refrigerator that fell off the truck. The problem is not that you are not getting the real thing - the problem is that the transaction is not legal.
I point this out not so much for the post itself, which seems a little thin - after all, the bits may be genuine, but the packaging certainly isn't, so in this sense Microsoft is right - but as an excuse to recommend Language Log itself. It's simply one of the best places to read interesting reflections on language in all its glory.
16 September 2006
In principle, open content applies to all kinds of materials, not just words. But it's certainly true that most open content is text-based. The basic problem is coming up with a framework that allows collaboration on other kinds of media. So here's an idea: WikiMusic.
The idea here is collaborative asynchronous recording of music, wherein you record your parts to a music editing software file, then upload it for others to add to. Each version of the file can be left online, so that people can revert to older versions.
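A minimal sketch of that versioning scheme (the names here are mine, purely illustrative): every upload is kept, and "reverting" simply republishes an older take as the newest version, wiki-style.

```python
class WikiTrack:
    """Toy version store for one collaboratively recorded track."""
    def __init__(self):
        self.versions = []              # every uploaded mix, oldest first

    def upload(self, audio_bytes):
        self.versions.append(audio_bytes)
        return len(self.versions) - 1   # new version number

    def revert(self, version):
        # Nothing is deleted: reverting adds the old mix back on top.
        return self.upload(self.versions[version])

    def latest(self):
        return self.versions[-1]

track = WikiTrack()
track.upload(b"drums")
track.upload(b"drums+bass")
track.revert(0)                         # back to the drums-only take
```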
Microsoft has a long and inglorious history of working closely with companies only to shaft them royally when it suits. Now it looks like the same is happening with music. According to this TechDirt piece:
Microsoft's super hyped up portable entertainment device, Zune, isn't even compatible with protected Windows Media files that use Microsoft's own "PlaysForSure" copy protection. Yes, that's right. All of the content that people bought on services like Napster, Rhapsody, Yahoo, Movielink or Cinemanow that they figured would continue to be supported by everyone outside of Apple... just discovered that Microsoft has cut them off.
So all those companies who thought they were among Microsoft's closest pals just found out why you should always use a long spoon to sup with the devil.
I've written about Vyatta, a company producing an open-source router, before. Now it's got some serious VC dosh: you don't have to be clairvoyant to see that this company is going to be very big. Start queueing for shares now. (Via Enterprise Open Source Magazine.)
15 September 2006
If you still don't believe me when I say that Cory Doctorow can deliver, try this harangue, one of the finest I've read in a long time. Like Doctorow, I order a lot of stuff from Amazon; like him too, I will never in a billion February 29ths order one of these terrible un-Amazon-like miscegenations.
The difference between Amazon and Amazon Unbox is like night and day. When you sign onto Unbox, you sign away all the amazing customer rights that Amazon itself is so careful to protect. Amazon Unbox takes away your privacy and every conceivable consumer right you have, and then tells you that the goods you buy from them don't belong to you, and they can take them away from you at any time, or change the deal you get from them without any appeal by you.
Amazon Unbox's user agreement isn't just galling for its evilness -- it's also commercially suicidal. No sane person will agree to this. Amazon Unbox user agreement is only a couple femtometers more dignified than being traded to another inmate for a couple packs of cigarettes.
I hate the For Dummies series. The idea that you buy a book because you're stupid is simply insulting. How about calling it For the Curious? Is that so much worse? Anyway, it seems that there is someone with intelligence at said book publishers, since they've come out with the utterly improbable Linux Smart Homes for Dummies:
364 pages and a CD-ROM that cover not only the typical X10 hardware and software characteristic of home automation, but also networking, video, audio, and even heating, ventilating, and air-conditioning (HVAC) control that can make your house the envy of your neighborhood.
Update: And here's the author's blog, with lots of useful stuff about Home Automation using GNU/Linux (with thanks to Neil for the comment below.)
The British Academy has traditionally been a rather staid institution, but this press release about a forthcoming report shows that they get all the main points about copyright and its wrongs:
A report from the British Academy, to be launched on 18 September, expresses fears that the copyright system may in important respects be impeding, rather than stimulating, the production of new ideas and new scholarship in the humanities and social sciences.
The situation is aggravated by the increasingly aggressive defence of copyright by commercial rights holders, and the growing role – most of all in music – of media businesses with no interest in or understanding of the needs of scholarship. It is also aggravated by the unsatisfactory EU Database Directive, which is at once vague and wide-ranging, and by the development of digital rights management systems, which may enable publishers to use technology to circumvent the exceptions to copyright which are contained in current legislation.
Let's hope the Gowers Review of Intellectual Property takes note and gets it too.
Remember Nupedia? No, not many people do. But it was the trail-blazing precursor of Wikipedia. Apparently the code is open source, and it's available from Larry Sanger, Nupedia's Editor-in-chief, and co-founder of Wikipedia.
One of Microsoft's favourite justifications for its monopoly is that any brake on it would be a brake on "innovation" - as if Microsoft were some hotbed of the latter. The danger with letting this kind of nonsense pass unchallenged is that others start using it. Here's a prime specimen:
Speaking in Washington, D.C., on Wednesday, Thomas Barnett, assistant attorney general at the DOJ’s antitrust division, warned that forcing companies to reveal their intellectual property stifles innovation. He used Apple as an example, in a nod to growing discontent in Europe regarding the way that music purchased from iTunes is tied to the iPod.
Well, no: if you read works like the splendid Against Intellectual Monopoly, you find in fact that
intellectual monopoly is not necessary for innovation and as a practical matter is damaging to growth, prosperity and liberty.
The book gives plenty of examples - like James Watt and the steam engine - that are eye-opening in this respect. And since the book is freely available, there's no excuse for not finding out about these fascinating things and helping to stamp out this wretched "innovation" meme.
14 September 2006
The downside of the European Union is that Europe-wide laws can be passed in a completely undemocratic way that affect everyone. The upside is that challenging such laws - and winning - can effectively knock them out all across Europe. This makes the effort by Digital Rights Ireland to fight the paranoid EU Data Retention Law critically important. Send them all your lucky shamrocks. (Via The Open Rights Group.)
I know that Cory Doctorow gets up some people's nostrils - there's even a site dedicated solely to his denigration - but you've got to allow that the man (a) knows what he's talking about when it comes to copyright and (b) can really write when he has a following wind.
I offer Exhibit A, entitled "How Copyright Broke", without doubt one of the most accessible introductions to where copyright came from and what's wrong with it. It's the perfect solution for explaining a tricky subject to aunts and uncles.
I've written before about the pernicious WIPO Broadcasting Treaty that is being discussed. Sadly, it's rumbling ever closer, and bringing with it a terrible cloud:
US industry was prominent at the meeting, as several representatives from information and communications technology (ICT) companies were there in opposition. Jeffrey Lawrence, director of digital home and content policy at Intel, said it would create a "whole cloud of liability issues."
"We have the patent cloud, the copyright cloud, and now we’re going to have a broadcast cloud," Lawrence said. He predicted such a treaty would "stifle innovation because it creates uncertainty." In addition, it has significant Internet ramifications, as it could impact cable and home networking, seen as critical to ICT industries. The movement of content is the "next killer application" for industries, he said. Lawrence called on industries to "stand up" to fight the treaty as it is proposed. Other opposed companies present at the meeting were Verizon and AT&T.
I've got a new column up on Linux Journal in which I talk about some of the various hacking blogs around. One thing that struck me was how many are called "Planet this" or "Planet that". Now a reader has kindly pointed out that this is down to some cool software called, er, Planet:
an awesome 'river of news' feed reader. It downloads news feeds published by web sites and aggregates their content together into a single combined feed, latest news first.
The listing of Planets - dozens of them - on the site is impressive.
Update: Here's some practical info on how to set up and use Planet.
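The core of such an aggregator is tiny. Here's a hypothetical Python sketch of the "river of news" merge; Planet itself is written in Python and fetches and parses real RSS/Atom feeds, whereas in this toy each feed is just a list of timestamped entries:

```python
from datetime import datetime

def aggregate(feeds):
    # Interleave all entries from all feeds into one combined stream,
    # latest news first.
    return sorted((entry for feed in feeds for entry in feed),
                  key=lambda e: e[0], reverse=True)

blog_a = [(datetime(2006, 9, 14, 9, 0), "Planet ships a new release")]
blog_b = [(datetime(2006, 9, 14, 11, 30), "A new hacking blog"),
          (datetime(2006, 9, 13, 8, 15), "An older post")]

river = aggregate([blog_a, blog_b])
```

Everything else in a real Planet - fetching, caching, templating the HTML - is plumbing around this one sort.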
...Access. Amazing, the Gates Foundation is giving to
Public Library of Science (PLoS), to launch a new medical journal on neglected diseases -- US$1.1 million: PLoS will launch PLoS Neglected Tropical Diseases, a new open-access, peer-reviewed medical journal covering science, policy, and advocacy on neglected tropical diseases.
Yup, that's "open access", as in practically the same as open source, but applied to academic papers.
Hm, Gates is piquantly close to getting it.... (Via Open Access News.)
...the techno-war for liberty, that is. At least, that's what the venerable Chaos Computer Club (CCC) reckons:
"We have lost the war ," is what it all boils down to according to the assessment delivered by Frank Rieger, the former CCC spokesman, at the last CCC Congress in Berlin in December. "We are living in that dark world of sci-fi novels we always sought to forestall. The police state is now."
The new technologies have opened up a plethora of possibilities for collecting, storing and linking data. The desire to mine these huge international repositories has become ever more intense, especially since 9/11. Scared people make for pliable populations, and governments have no hard time getting their hands on the information they want. Hence civil rights and liberties are being relinquished - in a creeping fashion.
The CCC has not been able to prevent these developments. But it saw them coming. Since its founding on September 12, 1981 the Club has sought to be more than a kindergarten for nerds and geeks. Very early on the CCC showed a commitment to educating the public. It has repeatedly warned of the downsides of the technology it so fervently embraces. An attitude that may appear a little schizophrenic but that is nonetheless indispensable.
I have been asked so many times "Is it possible to tell if someone has read my email?" And the answer, of course, is no: email gets sent; whether it is received or read is entirely unknown.
Unless you use DidTheyReadIt. I presume this works by adding an invisible HTML element to the email message that calls back to the company so they can track when messages are read. Although this might satisfy people who insist on knowing whether their masterpieces have been read, it's really Bad News because of the tracking it carries out. And just think of the field-day spammers will have: now, they won't have to guess whether an address works or not.
I hope that email clients will add the facility to block this kind of stupidity. It's not what email is about: if the person who receives your email can't be bothered to reply, either your message wasn't important enough - or maybe they aren't. (Via Lessig .)
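For the curious, the presumed mechanism is simple to sketch. This illustrates the web-bug technique in general, not DidTheyReadIt's actual code; the beacon URL is made up:

```python
import uuid

def tracked_html(body, beacon_base="https://tracker.example.com/open"):
    # Embed an invisible 1x1 image whose URL carries a unique ID.
    # When (if) the recipient's mail client fetches the image, the
    # server logs that ID - and hence that the message was opened,
    # when, and from which IP address.
    msg_id = uuid.uuid4().hex
    pixel = (f'<img src="{beacon_base}?id={msg_id}" '
             f'width="1" height="1" alt="" style="display:none">')
    return msg_id, f"<html><body>{body}{pixel}</body></html>"

msg_id, html = tracked_html("<p>Did you read this?</p>")
```

Which is exactly why a mail client that refuses to load remote images defeats both this service and the spammers' address-validation trick in one stroke.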
And talking of Web 2.0 trouble, the news that Bart Decrem is leaving Flock, the company he helped to found, and of which he was CEO, doesn't look too good, despite the corporate spin being put upon it. (Via The Inquirer.)
There is an iron rule in the Internet world: once people start launching services for pets, the End is Nigh. For Web 1.0, the classic case was Pets.com; and now, for Web 2.0, we have Dogster:
We are dog freaks and computer geeks who wanted a canine sharing application that's truly gone to the dogs. Such a site didn't exist, so we built it ourselves. The fluffy love is backed with serious technology and years of coding experience under our collars. Dogster has since become more contagious than kennel cough.
As the Website itself puts it: "for the love of dog...." (Via GigaOM.)
13 September 2006
Live CDs - bootable, self-configuring GNU/Linux distributions - are one of the free software world's secret weapons. They not only let you try out a distro before installing it, but they also let you swap between them as easily as swapping CDs. Try doing that with Windows (of which there is pretty much only one main variety, anyway).
So I was pleased to come across LiveCD News to satisfy my hunger for info on this front, one to put alongside the equally indispensable DistroWatch, which plays the same role for GNU/Linux distros in general. (Via Digg.)
Jimmy's done it again.
After my last encomium, Jimmy Wales has now taken on Dale Hoiberg, editor-in-chief of Britannica, in an email exchange that includes the following words from Wales (plus his elfin helpers, I presume):
You wrote: "I have had neither the time nor space to respond to them properly in this format. I could corral any number of links to articles alleging errors in Wikipedia and weave them into my posts, but it seems to me that our time and space are better spent here on issues of substance."
No problem! Wikipedia to the rescue with a fine article on the topic.
Fortunately, there is a vast army of volunteers eager to help good people like you and me who don't quite have enough time and space to do everything from scratch ourselves, and they are writing a comprehensive encyclopedic catalog of all human knowledge. They have quite eagerly amassed a fantastic list and discussion of dozens of links to such articles.
We are open and transparent and eager to help people find criticisms of us. Disconcerting and unusual, I know. But, well, welcome to the Internet.
And yes, this is an issue of substance and a fine demonstration of the strength of the new model.
I'm a huge fan of PubMed. If you've not visited it, it's well worth the effort. It's the nearest thing we have to a complete compendium of medical knowledge. Most of it, alas, is only at the abstract level, but as open access takes off, PubMed will become the natural gateway to the full texts (and there's a UK PubMed under construction, too).
So I was interested to see this mashup involving PubMed: a Greasemonkey script that adds blog post trackbacks to PubMed. Sounds like a brilliant idea.
12 September 2006
Here's a perceptive post that points out all is not well at the commercial Web 2.0 sites like MySpace:
The problem is that many social networks hosted by corporations are essentially appropriating – and monetizing – the socially created value of the commons for themselves. They entice users onto the faux commons by offering them recognition and attention to a large audience – but then they leverage the power of the assembled commons for their own profit, at the expense of users.
And it concludes with a chilling thought:
If the first enclosures were those of common lands in Great Britain, and the second enclosures were those achieved through expansions of copyright and patents (Cf. James Boyle), then the "third enclosures" (as Michel Bauwens calls them ) uses contract terms (with users of websites and software, and with trustees of public resources) to convert commons into proprietary monopolies.
Peter Murray-Rust has an interesting post on the concept of open data, its (short) history and its present status, with some good links. As he notes:
There seem to be several related threads:
* scientific data deemed to belong to the commons (e.g. the human genome)
* infrastructural data essential for scientific endeavour (e.g. GIS)
* data published in scientific articles which are factual and therefore not copyrightable
* data as opposed to software and therefore not covered by OS licenses and potentially capable of being misappropriated. (this is a very general idea)
He points out that "the current usages are sufficiently close that we should try to bring them together", a move that would help open data's future greatly.
It's probably only natural that we tend to hear about the headline pilot schemes for open source: after all, these are (usually) the big breakthrough. But in a sense what really counts is whether the pilots are successful and there is a wider roll-out of open source. So it's good to find that in Malaysia, at least, the pilots were successful and that the roll-out is indeed proceeding. (Via LXer.)
Update: Here is the excellent and self-explanatory Open Malaysia blog, which looks to be a good place to follow this story.
OASIS may not be the grooviest organisation, but it's certainly helped ODF achieve respectability remarkably quickly. Now it's set up something called OpenDocument XML.org:
This is a community-driven site, and the public is encouraged to contribute content. Use this site to:
* Learn. Knowledge Base pages provide reliable background information on OpenDocument.
* Share. OpenDocument Today serves as a community bulletin board and directory where readers post news, ideas, opinions, and recommendations.
* Collaborate. Wiki pages let users work with others online and add new pages to the site.
11 September 2006
I know this is only a "stylized mathematical model" of how Windows and GNU/Linux interact in the marketplace, but it's more akin to a Swiss cheese model, so many holes does it exhibit. For example:
The model captures what we believe are the most important features of the Linux-Windows competitive battle (faster demand-side learning on the part of Linux and an initial installed base advantage for Windows), but makes important assumptions regarding other aspects.
"Faster demand-side learning" has almost nothing to do with it these days: issues like control, stability and security are more to the fore.
And then this is a completely erroneous assumption, too:
Our paper introduces a dynamic mixed duopoly model in which a profit-maximizing competitor (Microsoft) interacts with a competitor that prices at zero (Linux), with the installed base affecting their relative values over time.
Nobody equates GNU/Linux with zero price anymore: even if TCO is a slippery concept, it is certainly more realistic than simply looking at the price tag, as this study does.
And so on, and so on. The fundamental problem is that open source is driven by so many complex - and often non-economic - factors that any simplistic mathematical modelling is doomed to fail from the start.
I can't say I see eye-to-eye with everything Mr. Wales does, but in this case he seems to be on the side of the angels:
The founder of Wikipedia, the online encyclopaedia written by its users, has defied the Chinese government by refusing to bow to censorship of politically sensitive entries.
Jimmy Wales, one of the 100 most influential people in the world according to Time magazine, challenged other internet companies, including Google, to justify their claim that they could do more good than harm by co-operating with Beijing.
Wikipedia, a hugely popular reference tool in the West, has been banned from China since last October. Whereas Google, Microsoft and Yahoo went into the country accepting some restrictions on their online content, Wales believes it must be all or nothing for Wikipedia.
An interesting piece by Om Malik (not on GigaOM) about the rise of the widget. But no surprise here really: the atomisation of programs is just another reflection of the tidal wave that is open source currently sweeping over programming in general. As I've written several times, modularity is key to free software's success: widgetification is simply the same idea applied to Web services.
Harald Welte, untiring defender of the GPL, has won a splendid victory in the German courts. No details yet, but this is what King Harald has to say in his royal blog:
Today I received news that we've won the first regular civil court case on the GPL in Germany. This is really good news, since so far we've only had a handful of preliminary injunctions granted (and an appeal case against an injunction), but not a regular civil trial.
The judge has ruled, but the details of the court order have not been publicised yet. I'll publish the full details as soon as they are available in the next couple of weeks.
Go, Harald, go. (Via Heise Online.)
Update: Details have now emerged, as has a clarification of what the court decided:
the judges in Frankfurt-on-the-Main also confirmed the fundamental validity of the GPL: "In particular, the provisions of the GPL cannot be read as a relinquishing of copyright or copyright-law legal positions," the judges write in their opinion. The court explicitly confirmed as valid paragraph 4 of the license, which prohibits distribution of any kind in the event of any GPL clause being violated. D-Link had therefore not been entitled to market the GPL-licensed software without abiding by the conditions imposed by the license, while Mr. Welte for his part had been entitled to send the warning notice and make his claims for reimbursement, the judges state.
I find that my way of working is becoming increasingly Webified: I use Gmail, Writely and (just occasionally) the odd bit of Firefox. One of the key apps still missing from that line-up is the spreadsheet. Google's online Spreadsheets wasn't a serious option, because it didn't offer ODF support - until now. If they could just get the charts sorted out, I would be too. (Via Tecosystems.)
Update: Google Spreadsheets is known as "Spreadly" among Googlers, apparently. I like it.
One of the central themes of this blog is that the opens - open source, open content, open genomics and the rest - share certain key characteristics, and thus form part of a broader movement, of great historical importance.
There's a fine articulation of just this viewpoint in another of Richard Poynder's splendid interviews. It's with Michel Bauwens, creator of the Foundation for P2P Alternatives. The whole thing is well worth reading, but here's a typical sample from the second part - it's a two parter:
We need to increase the scope of applications in which open and free principles are applied; we need to apply and experiment with peer governance, and learn from our mistakes; and, as I said earlier, we need to interconnect and learn from each other, in the understanding that all these efforts are related, and have a larger common purpose.
In addition, we have to defensively stop the destruction of the biosphere, and stop the new enclosures of the information commons we are witnessing. Instead, we need to be constructively building the new world, and in a way that ends and means are congruent with each other. If we do this then the P2P subsystem will continue to strengthen, and eventually reach a tipping point. At that juncture it will become the dominant model.
10 September 2006
If any proof were needed that open source has wide ramifications, consider this:
the open source software adage, known as Linus' Law (that "with enough eyeballs, all bugs are shallow") is coming to apply to international trade and the global behavior of multinational corporations. With enough observers, all trade is transparent, whether the interests involved want it to be or not.
This is important, because for certain products - diamonds, for example - lack of transparency is crucial:
Diamond merchants depended on a veil of secrecy about the origins of their stones to protect them from the consequences of their trade. Global Witness realized that if it could tear down that veil, consumers would react with horror and disgust to the reality they saw.
For example, they would learn that
the international trade in diamonds has destabilized whole regions and promoted criminal regimes. They have helped fuel the genocidal Congo wars and kept Angola in chaos. They are intimately tied to the black market in weapons. Terrorists even traffic in them to finance their plots. And these "blood diamonds" are sold in large numbers, by the billions of dollars, on the diamond bourses of Antwerp and other cities.
As well as diamonds, there is much to reveal about illegal logging and oil, and it's good to know that hackers have pioneered processes that are playing an important role here.
09 September 2006
DTP is not something you normally associate with the world of free software. But there is an open source DTP package, and a damn fine one. It's called Scribus, it's cross-platform, and there's a nice tutorial about some of its more advanced features in Tux - a great magazine marred by annoying pop-up ads.
...but I can't agree on this one. You write:
Check out webcitation.org -- a project run at the University of Toronto. The basic idea is to create a permanent URL for citations, so that when the Supreme Court, e.g., cites a webpage, there's a reliable way to get back to the webpage it cited. They do this by creating a reference URL, which then will refer back to an archive of the page created when the reference was created. E.g., I entered the URL for my blog ("http://lessig.org/blog"). It then created an archive URL "http://www.webcitation.org/5IlFymF33". Click on it and it should take you to an archive page for my blog.
This is the TinyURL problem all over again. It destroys one of the greatest features of the Web: its transparency. You can generally see where you are going and some of the structure of what you will find there. TinyURLs and Larry's recommendation do away with this.
Another point is that it's actually harder to enter gobbledygook like "http://www.webcitation.org/5IlFymF33" than even long, but comprehensible URLs, so this system doesn't even achieve the goal of making addresses easier to enter.
Agreed, we need an archive of the Web: but we already have one in the wonderful Internet Archive. What we really need to do is to support it better, with more dosh and more infrastructure.
08 September 2006
One of the key issues that needed to be addressed in order to promote free software in the early days was support: until mainstream companies like IBM and HP started to offer formal support there was a natural concern that users of free software would be left to sort out problems on their own. So when IBM announces a similar step for Eclipse, it's clearly of great symbolic importance, whatever the reality of the offering.
Web 2.0 is all about conversations, they say. So clearly what we need is a search engine for conversations. Enter Talk Digger:
Talk Digger is a web application developed by Frédérick Giasson that helps users to find, follow and join conversations evolving on the Internet.
Talk Digger greatly evolved in 2006. I[t] started as a comparative search engine using the link-back feature of many search engines. Then it evolved into a full-scale meta-search engine reporting web sites linking to another web site. Then it evolved into a search engine of its own: a "conversation search engine" with features helping the creation of communities around each conversation.
According to this story, the finance authorities in Belgium are starting a pilot project using OpenOffice.org instead of Microsoft Office. Nothing earth-shattering in that, of course, but another nail in the coffin (it's a big coffin). (Via Erwin's StarOffice Tango.)
07 September 2006
One of the great conundrums of the open world is how to make money by giving stuff away. The solution, as far as I can tell, seems to be to capitalise on the uniquely personal aspects that competitors can't replicate simply by copying. After all, as I've described elsewhere, openness demands that anyone can build on your work by simply taking what you have done and using it, so you can't depend on making money from the control of open content, for example.
Again, as I've written before, it's striking that many top pop stars, for example, now make more money from their concerts than from selling music: the latter is simply a marketing device for the former. This means that music could be given away - no DRM - and stars could still make lots of money.
Now here's the same idea applied in a very different field - Web 2.0 companies. As this interesting piece on a recent acquisition in this sector points out:
With a wide array of sources for private equity providers there is a great deal of competition for leadership and vision in spending their money effectively. Increasingly this calls upon both startups and developed properties and their management to be "hired" in effect to help the "winners" finance their next dreams.
It's a natural adaptation to an investment market that's much less likely to push half-baked ideas to a hasty IPO and far more likely to invest in people with the acumen to move quickly and effectively in rapidly shifting content markets driven by equally rapid shifts in technology.
The really innovative and unique thing that a Web 2.0 company has to offer is the intelligence and originality of the people that power it. Others might be able to copy and re-implement your ideas (you know, that sharing business), but if they can't come up with an equivalent flow of creativity, they are always a step behind.