28 October 2009

If a Sequoia Falls in the Forest...

...maybe somebody's listening:

Today, Sequoia Voting Systems officially introduced its latest revolutionary new offering – the Frontier Election System – the first transparent end-to-end election system including precinct and central count digital optical scan tabulators, a robust election management and ballot preparation system, and tally, tabulation, and reporting applications based on an open architecture with publicly disclosed source code developed specifically to meet current and future iterations of the federal Voting System Guidelines.


“Security through obfuscation and secrecy is not security,” said Eric D. Coomer, PhD, Vice President of Research and Product Development at Sequoia Voting Systems. “Fully disclosed source code is the path to true transparency and confidence in the voting process for all involved. Sequoia is proud to be the leader in providing the first publicly disclosed source code for a complete end-to-end election system from a leading supplier of voting systems and software. Sequoia’s Frontier Election System has been designed to comply with all the current Election Assistance Commission’s Voluntary Voting System Guidelines.”

As you may recall, Sequoia had previously been a big fan of precisely that "security through obfuscation and secrecy", so this is an astonishing turnaround. Kudos to the company for finally seeing the light - its rivals will doubtless follow suit soon, because no one can afford to be seen to be *against* transparency.

But even more kudos to all those people who fought the e-voting industry's attempts to shut them up and get on with murky business as usual. This will surely become a textbook example of how dogged and rigorous advocacy finally wins out.

Follow me @glynmoody on Twitter or identi.ca.

27 October 2009

Biophysical Economics: A Different View

One of the things that I have felt for a while is that mainstream economics isn't really the best way to look at free software, or any of the other intellectual commons or - even more importantly - the environmental commons, since economics is really about consumption. And now, it seems, some academics are beginning to call into question basic assumptions about that consumerist, consumptive viewpoint:

The financial crisis and subsequent global recession have led to much soul-searching among economists, the vast majority of whom never saw it coming. But were their assumptions and models wrong only because of minor errors or because today's dominant economic thinking violates the laws of physics?

A small but growing group of academics believe the latter is true, and they are out to prove it. These thinkers say that the neoclassical mantra of constant economic growth is ignoring the world's diminishing supply of energy at humanity's peril, failing to take account of the principle of net energy return on investment. They hope that a set of theories they call "biophysical economics" will improve upon neoclassical theory, or even replace it altogether.

Here's the heart of the problem:

Central to their argument is an understanding that the survival of all living creatures is limited by the concept of energy return on investment (EROI): that any living thing or living societies can survive only so long as they are capable of getting more net energy from any activity than they expend during the performance of that activity.
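
The EROI idea reduces to a one-line ratio, which makes it easy to illustrate. Here is a minimal sketch in Python (the activities and figures are invented for illustration, not taken from the article):

```python
def eroi(energy_returned, energy_invested):
    """Energy return on investment: survival requires a ratio above 1."""
    return energy_returned / energy_invested

# Invented illustrative figures, in arbitrary energy units.
activities = {
    "foraging": (50, 10),       # plenty of net energy: sustainable
    "marginal oil": (11, 10),   # barely above break-even
    "losing venture": (8, 10),  # costs more energy than it yields
}

for name, (returned, invested) in activities.items():
    ratio = eroi(returned, invested)
    verdict = "viable" if ratio > 1 else "a net energy loss"
    print(f"{name}: EROI = {ratio:.1f} ({verdict})")
```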

Great to see some new thinking in this area; I'm sure in time it will have knock-on consequences for the way we look at the commons, too.

Follow me @glynmoody on Twitter or identi.ca.

26 October 2009

How Proprietary JAWS Bites the Blind

Here's a heart-warming tale of those kind people who make proprietary software, specifically of the piquantly-named company Freedom Scientific, which produces a program called JAWS:

JAWS (an acronym for Job Access With Speech) is a screen reader, a software program for visually impaired users, produced by the Blind and Low Vision Group at Freedom Scientific of St. Petersburg, Florida, USA. Its purpose is to make personal computers using Microsoft Windows accessible to blind and visually impaired users. It accomplishes this by providing the user with access to the information displayed on the screen via text-to-speech or by means of a braille display and allows for comprehensive keyboard interaction with the computer.

Clearly, JAWS fulfils an important function for the visually impaired. One might presume it is a font of benevolence and altruism, doing its utmost to help a group of people who are already at a disadvantage. Maybe not, according to this petition:

Braille displays require a screen reader in order to work. Freedom Scientific has steadfastly refused to provide Braille display manufacturers with the driver development kit required to enable a particular Braille device to communicate with JAWS. Instead, the manufacturer must first pay an outrageous sum of money before support for the Braille device will be permitted. What's more, this charge to the Braille display manufacturer is not a one-time fee but is imposed annually.

Well, that doesn't sound very kind. So why on earth do people put up with this?

One might ask how Freedom Scientific can play the gatekeeper to its JAWS product where Braille driver support is concerned. The answer, simply and for no other reason, is because it can.

...

I for one am shocked, appalled, and amazed that Freedom Scientific would impose such limitations and restrictions not only upon its own customer base but also on those organizations which manufacture products that supplement the information that JAWS provides. This draconian and self-serving policy is not at all in keeping with the pro-Braille spirit exemplified by the Braille Readers are Leaders Initiative set into motion earlier this year by the National Federation of the Blind in honor of the Bicentennial celebration of Louis Braille. Instead of offering an additional opportunity to expand the usage of Braille, it stifles the ability of the blind consumer to choose the Braille display that will best meet his/her needs.

And the reason it can, of course, is because it is proprietary software, which means that nobody can route around the problem.
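
To see why no one can route around it, consider what a braille display driver actually is: a thin adapter between the screen reader's text output and the device's cell hardware. Below is a purely hypothetical sketch in Python of such an interface; JAWS's real driver kit is proprietary and undisclosed, so every name here is invented. The point is how small the gatekept piece is - with an open screen reader, any manufacturer could simply implement it:

```python
from abc import ABC, abstractmethod

class BrailleDriver(ABC):
    """Hypothetical driver interface a screen reader might expose.

    This reflects nothing of JAWS's actual (undisclosed) driver kit;
    it only illustrates how small the gatekept piece is.
    """

    @abstractmethod
    def cell_count(self) -> int:
        """Number of braille cells on the device."""

    @abstractmethod
    def write_cells(self, cells: bytes) -> None:
        """Send one dot pattern per cell (one byte each) to the display."""

class DemoDisplay(BrailleDriver):
    """A stand-in 40-cell display that logs instead of driving hardware."""

    def cell_count(self) -> int:
        return 40

    def write_cells(self, cells: bytes) -> None:
        # A real driver would talk USB or serial here; we just log.
        print(f"refreshing {len(cells)} cells: {cells.hex()}")

display = DemoDisplay()
display.write_cells(bytes([0x2F] * display.cell_count()))
```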

This episode shows once again why it is vital for such software to be open source so that there is no gatekeeper, and so that the community's needs come first, not the desire of a company to make as much money as possible regardless of the plight of the people it affects.

Follow me @glynmoody on Twitter or identi.ca.

23 October 2009

UK Government Blows it on Lobbying

If you wanted proof that the UK government is still an enemy of transparency, try this:

The Government is grateful to the Public Administration Select Committee for its examination of lobbying in the UK, which is the first Parliamentary inquiry on the subject since 1991.

It is right that the Government remains alert for signs of improper influence over any aspect of our public life and the Committee's Report provides a helpful opportunity to look again at arrangements and to ensure that it has the right framework in place to ensure confidence in the way outside interests interact with government.

In responding to the Committee's recommendations, it is first important to set out the context of this inquiry. While the Committee's Report focuses mainly on the relationship between the lobbying industry and Government, it must be remembered that lobbying goes much wider than this. Lobbying is essentially the activity of those in a democracy making representations to government on issues of concern. The Government is committed to protecting this right from improper use while at the same time seeking to avoid any unnecessary regulation or restriction. As well as being essential to the health of our democracy, its free and proper exercise is an important feature of good government.

What this conveniently glosses over is the difference between "making representations to government on issues of concern" - which is what you and I as citizens do all the time, mostly by sending emails to MPs and ministers - and *lobbying*, which is now an entire industry of people employed to use every trick in the book, from the most to least subtle, to get what their clients want.

The first - making representations - is just what it seems: someone expressing their view and/or asking for action. Lobbying, by contrast, is your typical iceberg, with most of its intent invisible below the surface. That is why a lobbyists' register is needed - so that others can map what lies beneath the waterline. The UK government's refusal to countenance this - and the pathetic excuse it offers for doing so - are yet another blot on this administration's record as far as openness is concerned.

And if you're wondering why it is so obstinate, maybe this has something to do with it:

The Government agrees that any system of regulation, whether it is voluntary self-regulation or statutory regulation, requires a register of lobbyists to ensure that lobbying activity is transparent. The Government agrees with most of the elements for such a register outlined by the Committee.

However, the Government does not agree that such a register should include the private interests of Ministers and civil servants. This should not be a matter for a register of lobbyists. Ultimately, major decisions are taken by Ministers. Information about Ministers' relevant private interests is now published as well as information in the Registers of Members' and Peers' Interests. In addition, relevant interests of departmental board members are also available publicly. However, the Government believes that the proposal for a Register of the private interests of civil servants would be a disproportionate requirement that would place a significant burden on departments and agencies while adding very little to the regulation of lobbying. Both Ministers and civil servants are already subject to clear standards of conduct for dealing with lobbyists.

But hang on a minute: if the argument is that such information is *already* made available, then there would be no *extra* burden in adding it to a register of lobbyists. It would only be information not already declared that might require effort - and that is precisely what should be made available. Yet more pathetic - and transparently false - logic from the UK government, which is still trying to keep us from scrutinising the engines of political power.

Follow me @glynmoody on Twitter or identi.ca.

The Utter Moral Bankruptcy of the DNA Database

This is staggering:

Detections using the national DNA database have fallen over the past two years despite the number of profiles increasing by 1m and its running costs doubling to £4.2m a year.

A report on the database covering the years 2007-09, published today, shows that crimes cleared up as a result of a match on the DNA database fell from 41,148 to 31,915 over the period. At the same time the number of DNA profiles on the database – already the largest in the world – rose from 4.6m to 5.6m. Duplicates mean that the police database now holds details of 4.89 million individuals.
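
It is worth checking the arithmetic on those figures (a quick calculation of my own; the UK population figure of roughly 61 million is a 2009 estimate, not from the report):

```python
detections_2007, detections_2009 = 41_148, 31_915
individuals_on_db = 4_890_000
uk_population = 61_000_000  # rough 2009 estimate (my assumption)

fall = (detections_2007 - detections_2009) / detections_2007
coverage = individuals_on_db / uk_population

print(f"fall in detections: {fall:.1%}")                   # about 22.4%
print(f"share of population on database: {coverage:.1%}")  # about 8%
```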

That is, despite the database growing to cover getting on for 10% of the UK population, the number of crimes cleared *fell* by more than a fifth. How pathetic is that? Not as pathetic as this statement from the truly Orwellian "National Policing Improvement Agency":

Nevertheless, Peter Neyroud, the head of the National Policing Improvement Agency (NPIA), which hosts the DNA database, says in the report that it continues to provide the police with the most effective tool for the prevention and detection of crime since the development of fingerprint analysis more than a century ago.

Against the background that this "most effective tool for the prevention and detection of crime since the development of fingerprint analysis more than a century ago" is getting ever less effective and more costly, and infringing on the rights of ever more people, this statement proves just one thing: that the British police are getting more and more incompetent, and have to rely on more and more draconian laws and tools just to stop their already footling success rate dropping even more precipitously.

This is an utter scandal on so many levels, but above all because the UK government is continuing to foist this intrusive, disproportionate, racist and morally repugnant approach upon us when its *own figures* demonstrate that it is failing more and more each year.

Follow me @glynmoody on Twitter or identi.ca.

22 October 2009

Of Open Source and Open Government

One of the key figures in open government in Australia - and indeed globally, given the paucity of such people - is Kate Lundy. She's been speaking at the Free and Open Source Software for Geospatial Conference 2009. Understandably, her talk was mostly about geospatial data, but there was also this nice section:

FOSS is like a living blueprint – a map if you will – for trust, sustainability and interoperability in the implementation of Gov 2.0 principles. FOSS principles and methodologies are the best case studies you can find for tried and proven online collaboration with constructive outcomes. FOSS applications provide reference implementations for pretty much anything, which government can build upon and customise. We have a lot to learn from FOSS methods and practices, and I would ask all of you to assist us, in government, to understand.

Would that more politicians were so perspicacious.

Follow me @glynmoody on Twitter or identi.ca.

Artists to Fans to Artists: Positive Feedback

One of the sad things about the current mess in the music industry is that artists are too often pitted against fans, when in fact both want the same thing: good music in a convenient format at a fair price. Here's a welcome initiative that's trying to bridge that gulf of misunderstanding:

Artists need to be paid, and fans want to pay them.

Our goals at a2f2a are:

* Help each community better understand the other;

* Help find a practical and workable system which offers artists fair remuneration in exchange for access to material by fans; and

* Help set the agenda for discussions about the role P2P can play within the emergent digital record industry.

Together, we can do it – artist to fan to artist.

What I particularly like about this - aside from the dialogue - is that it starts from the premise that people *do* want to pay for stuff. I think that's absolutely central: most people realise that artists need to be supported, and that if everyone pays, the overall price will be lower. But the music industry likes to portray the public as split in two: those who don't want to pay anything, ever, and those who will meekly pay whatever exorbitant price the labels demand. It ain't that Manichean, and if a2f2a can help to dispel that myth, that's got to be good news.

Follow me @glynmoody on Twitter or identi.ca.

21 October 2009

No Patents on Seeds...or We're Really Stuffed

Good to see that I'm not a lone voice crying in the wilderness:


The continuing patenting of seeds, conventional plant varieties and animal species leads to far-reaching expropriations of farmers and breeders: farmers are deprived of their rights to save their seeds, and breeders are under strong limitations to use the patented seeds freely for further breeding. The patent holder controls the sale of the seeds and the planting, decides about the use of herbicides and can even collect royalties at the harvest – up to the finished food product.

Our food security is increasingly dependent on a few transnational chemical and biotechnological companies.

The European Patent Office (EPO) has continuously broadened the scope of patentability and undermined existing restrictions, in the interest of multinational companies.

Although plant varieties and animal species are by law exempt from patentability, several hundred patents on genetically modified plants have already been granted. The basis for these decisions is the highly controversial EU Biotech Patents directive and a decision by the EPO's Enlarged Board of Appeal, which ruled in 1999 that in principle such patents could be granted.

Now the European Patent Office again has to deal with a basic question: Patents on conventional plants and animals!

The Enlarged Board of Appeal of the EPO will use a patent on broccoli (EP 1069819) for a fundamental ruling on whether or not conventional plants are patentable. The broccoli in question was merely diagnosed using marker-assisted breeding methods to identify its naturally occurring genes. The genes were not modified. Yet all other broccoli plants with similar genes are considered "technical inventions" by the patent, so even their use for breeding, and the plants themselves, are monopolised. In this way the provision which prohibits the patenting of "essentially biological processes" is being undermined. The EPO has already granted similar patents: only recently, for example, the company Enza Zaden Beheer received a patent on pathogen-resistant lettuce (EP1179089B1).

Should the Enlarged Board of Appeal uphold the patent, then this decision (case T0083/05) will be binding for all other pending patent applications and even for animals and their offspring.

This exactly parallels the situation with software patents, where the EPO is using every trick in the book to approve them - except that here it's even worse.

Follow me @glynmoody on Twitter or identi.ca.

Why not Participatory Medicine?

If this participation thing is so great, why don't we apply it to something really important, like medicine? Why not, indeed?

Welcome to JoPM, a New Peer-Reviewed, Open Access Journal

Our mission is to transform the culture of medicine to be more participatory. This special introductory issue is a collection of essays that will serve as the 'launch pad' from which the journal will grow. We invite you to participate as we create a robust journal to empower and connect patients, caregivers, and health professionals.

More specifically:

Because the Journal of Participatory Medicine is a new journal publishing multidisciplinary articles on topics within a new, not yet defined field, we have established draft parameters that define the journal’s range of interest. We anticipate that these parameters will change somewhat as the field develops. In the meantime, the following characterize the field of participatory medicine.

I particularly liked the following section, with its emphasis on openness:

New Knowledge Creation stems from the collaboration of researchers and patients, as individuals and as groups.

1. Health professionals and patients sharing in the discussion of scientific methods, including open discussion about the level of evidence of the research

2. Open, transparent process that demonstrates collaboration and participation in research

3. Patients with significant interest in a topic joining together to create repositories for research, including (but not limited to) registries, tissue banks and genetic databases; demonstrating mutual respect for the contributions of the data owners and health research professionals with the tools to gain insight from those repositories. Interpretation of results and conclusions including involvement of all stakeholders.

Important stuff, worth a read.

Follow me @glynmoody on Twitter or identi.ca.

Won't Somebody Please Think of the Orphans?

This is droll: the European Commission is finally waking up to the copyright orphans problem - thanks to some healthy panic induced by Google's book digitisation plans [.pdf]:


Orphan works are works that are in copyright but whose right holders cannot be identified or located. Protected works can become orphaned if data on the author and/or other relevant right holders (such as publishers, photographers or film producers) is missing or outdated. A work can be exploited only after obtaining prior permission from the right holders. In the case of orphan works, granting such authorisation is not possible. This leads to a situation where millions of works cannot be copied or otherwise used: e.g. a photograph cannot be used to illustrate an article in the press, a book cannot be digitised or a film restored for public viewing. There is also a risk that a significant proportion of orphan works cannot be incorporated into mass-scale digitisation and heritage preservation efforts such as Europeana or similar projects.

Libraries, universities, archives, some commercial users and several Member States claim that the problem of existing instruments, such as the Commission Recommendation 2006/585/EC or the 2008 Memorandum of Understanding on Orphan Works and the related diligent search guidelines, is that these are not legally binding acts and that the issue of mass digitisation has not been addressed. Since non-legislative initiatives neither provide sufficient legal certainty nor solve the fact that using orphan works constitutes a copyright infringement, they advocate a legislative approach at the European level to allow different uses of orphan works. It is also stressed that obstacles to intra-Community trade in orphan works may emerge if each Member State were to adopt its own set of rules to deal with the problem.

For publishers, collecting societies and other right holders, orphan works are a rights-clearance issue. They are sceptical about introducing a blanket exception to use orphan works. For them, the crucial issue is to ensure that a good faith due diligence search to identify and locate the right holders is carried out, using existing databases.

The utter cluelessness and fatuity of the publishers' response is breath-taking: the problem is that the rights-holders *cannot be found* - that's why they're called "orphans". Demanding "due diligence" just misses the point completely.

At least this proves that publishers simply have no credible arguments against introducing an "exception" to use orphan works, for example in digitisation projects. And against their non-existent downside, the upside is just immense. Let's hope the European Commission is beginning to understand this.

Follow me @glynmoody on Twitter or identi.ca.

20 October 2009

Racing to the Bottom of Openness

Here's some interesting news about Barnes & Noble's e-reader:

The reader, named the “Nook,” looks a lot like Amazon’s white plastic e-book, only instead of the chiclet-keyboard there is a color multi-touch screen, to be used both as a keyboard and to browse books, cover-flow style. The machine runs Google’s Android OS, will have wireless capability from an unspecified carrier and comes in at the same $260 as the now rather old-fashioned-looking Kindle.

Linux-based: no surprise there. But this is:

And over at the Wall Street Journal, somebody got a peek at an ad set to run in the New York Times this coming Sunday. The ad features the line “Lend eBooks to friends”, and this has the potential to destroy the Kindle model. One of the biggest problems with e-books is that you can’t lend or re-sell them. If B&N is selling e-books cheaper than the paper versions, then the resale issue is moot. And lending, even if your friends need a Nook, too, takes away the other big advantage of paper.

In fact, this loaning function could be the viral feature that makes the device spread. Who would buy a walled-garden machine like the Kindle when the Nook has the same titles, cheaper, and you can borrow? The Nook is already starting to look like the real internet to the Kindle’s AOL.

It's a classic "race to the bottom", where the bottom is total openness: see you there, Amazon.

Follow me @glynmoody on Twitter or identi.ca.

19 October 2009

Monsanto: Making Microsoft Look Good

Following my recent post about Bill Gates helping to push genetically-modified and patented seeds towards needy African farmers, Roy Schestowitz kindly sent me links to the follow-on story: Gates attacking anyone who dares to criticise that move:

Microsoft founder Bill Gates said on October 15 that environmentalists who are adamantly opposed to using genetically modified crops in Africa are hindering efforts to end hunger on that continent.

Gates was speaking at the annual World Food Prize forum, which honors those who make important contributions to improving agriculture and ending hunger. He noted that genetically modified crops, fertilizers, and chemicals could all help small African farms produce more food, but environmentalists who resist their use are standing in the way.

“This global effort to help small farmers is endangered by an ideological wedge that threatens to split the movement in two,” Gates told the forum. “Some people insist on an ideal vision of the environment. They have tried to restrict the spread of biotechnology into sub-Saharan Africa without regard to how much hunger and poverty might be reduced by it, or what the farmers themselves might want.”

This is, of course, a clever framing of the debate: if you're against patented GMOs it's because you're an "idealist" (now where have I heard that before?), with a hint of Luddite too. The same post - which is written from a very Gates-friendly viewpoint - quotes him as saying:

On one side is a technological approach that increases productivity.

On the other side is an environmental approach that promotes sustainability.

Productivity or sustainability — they say you have to choose.

It’s a false choice, and it’s dangerous for the field. It blocks important advances. It breeds hostility among people who need to work together. And it makes it hard to launch a comprehensive program to help poor farmers.

The fact is, we need both productivity and sustainability — and there is no reason we can’t have both.

Do genetically-modified seeds bring increased productivity? There seem to be doubts; but even assuming it's true, Gates sets up a false dichotomy: one reason GMO seeds aren't sustainable is because they are patented. That is, farmers *must* buy them year after year, and can't produce their own seeds. It's a situation that's relatively easy to solve: make GMOs patent-free; do not place restrictions on their use; let farmers do what farmers have done for millennia.

And look, there you have it, potentially: productivity and sustainability. But we won't get that, not because idealistic environmentalists are blocking it, but because the seed industry wants farmers dependent on its technology, not liberated by it. It is sheer hypocrisy for a fan of patents to accuse environmentalists of being the obstacle to productivity and sustainability: the real obstacle is the industrial model of dependence, enforced by intellectual monopolies, and espoused by big companies like Monsanto, the Microsoft of plant software.

I wrote about the human price paid in India as a result of these patented seeds and the new slavery they engender a few months back. The key quotation:

Tara Lohan: Farmer suicides in India recently made the news when stories broke last month about 1,500 farmers taking their own lives, what do you attribute these deaths to?

Vandana Shiva:
Over the last decade, 200,000 farmers have committed suicide. The 1,500 figure is for the state of Chattisgarh. In Vidharbha, 4,000 are committing suicide annually. This is the region where 4 million acres of cotton have been grown with Monsanto's Bt cotton. The suicides are a direct result of a debt trap created by ever-increasing costs of seeds and chemicals and constantly falling prices of agricultural produce.

When Monsanto's Bt cotton was introduced, the seed costs jumped from 7 rupees per kilo to 17,000 rupees per kilo. Our survey shows a thirteenfold increase in pesticide use in cotton in Vidharbha. Meantime, the $4 billion subsidy given to U.S. agribusiness for cotton has led to dumping and depression of international prices.

Squeezed between high costs and negative incomes, farmers commit suicide when their land is being appropriated by the money lenders who are the agents of the agrichemical and seed corporations. The suicides are thus a direct result of industrial globalized agriculture and corporate monopoly on seeds.

Here's an excellent, in-depth feature from Vanity Fair on the tactics Monsanto uses in the US. A sample:

Some compare Monsanto’s hard-line approach to Microsoft’s zealous efforts to protect its software from pirates. At least with Microsoft the buyer of a program can use it over and over again. But farmers who buy Monsanto’s seeds can’t even do that.

...

Farmers who buy Monsanto’s patented Roundup Ready seeds are required to sign an agreement promising not to save the seed produced after each harvest for re-planting, or to sell the seed to other farmers. This means that farmers must buy new seed every year. Those increased sales, coupled with ballooning sales of its Roundup weed killer, have been a bonanza for Monsanto.

The feature is from last year, but I don't imagine the situation has got better since then. Indeed, the picture it paints of Monsanto is so bleak and depressing that I'm forced to admit that Microsoft in comparison comes off as almost benevolent. Given Monsanto's size, methods and evident ambitions, I fear I shall be writing rather more about this company in the future.

Follow me @glynmoody on Twitter or identi.ca.

18 October 2009

Opencourseware Comes Under Attack

It was bound to happen: opencourseware is under attack:

While seeking to make college more accessible, the Obama administration has launched a largely unnoticed assault upon the nation’s vibrant market in online learning. As part of an ambitious bill designed to tighten federal control over student lending, the House of Representatives included a scant few sentences green-lighting a White House plan to spend $500 million on an “Online Skills Laboratory,” in which the federal government would provide free online college courses in a variety of unspecified areas. The feds would make the courses “freely available” and encourage institutions of higher education to offer credit for them. The measure is now before the Senate.

Ah yes, "freely available": that communistic cancer again.

It is not clear what problem the administration is seeking to solve. The kinds of online courses that the administration is calling for already exist, and are offered by an array of publishers and public and private institutions. Online enrollment grew from 1.6 million students in 2002 to 3.9 million in 2007. Nearly 1,000 institutions of higher education provide distance learning.

More than half a dozen major textbook publishers, and hundreds of smaller providers, develop and distribute online educational content. To take one example, Pearson’s MyMathLab is a self-paced, customizable online course, which the University of Alabama uses to teach more than 10,000 students a year. Janet Poley, president of the American Distance Education Consortium, doesn’t see the need for federal dollars to be spent “reinventing courses that have already been invented.”

Since it's "not clear what problem the administration is seeking to solve", allow me to offer a little help.

The article suggests that the kinds of online courses that will be created are already on offer: well, no, because those produced by "major textbook publishers, and hundreds of smaller providers" are neither "free of charge" nor "free" in the other, more interesting sense that you can take them, rework them, reshape them, and then share them. And why might that be a good idea? Well, most importantly, because it means that you don't have to "reinvent courses that have already been invented."

Oh, but wait: isn't that what the article says the current situation avoids? Indeed: and that, of course, is where the article goes wrong. The existing courses, which are proprietary, and may not be copied or built on, cause *precisely* the kind of re-inventing of the wheel that opencourseware is accused of. That's because every publisher must start again, laboriously recreating the same materials, in order to avoid charges of copyright infringement.

That's an absurd waste of effort: the facts are the same, so once they are established it's clearly much more efficient to share them and then move on to create new content. The current system doesn't encourage that, which is why we need to change it.

Given that the article gets things exactly the wrong way round, it's natural to ask how the gentleman who penned these words might have come to these erroneous conclusions. First, we might look at where he's coming from - literally:

Frederick M. Hess is director of education-policy studies at the American Enterprise Institute.

Here's what SourceWatch has to say on this organisation:

The American Enterprise Institute for Public Policy Research (AEI) is an extremely influential, pro-business, conservative think tank founded in 1943 by Lewis H. Brown. It promotes the advancement of free enterprise capitalism, and succeeds in placing its people in influential governmental positions. It is the center base for many neo-conservatives.

And if that doesn't quite explain why Mr Hess might be pushing a somewhat incorrect characterisation of the situation, try this:

In 1980, the American Enterprise Institute for the sum of $25,000 produced a study in support of the tobacco industry titled, Cost-Benefit Analysis of Regulation: Consumer Products. The study was designed to counteract "social cost" arguments against smoking by broadening the social cost issue to include other consumer products such as alcohol and saccharin. The social cost arguments against smoking hold that smoking burdens society with additional costs from on-the-job absenteeism, medical costs, cleaning costs and fires. The report was part of the global tobacco industry's 1980s Social Costs/Social Values Project, carried out to refute emerging social cost arguments against smoking.

So, someone coming from an organisation that has no qualms defending the tobacco industry is unlikely to have much problem denouncing initiatives that spread learning, participation, collaboration, creativity, generosity and general joy in favour of all their antitheses. And the fact that such a mighty machine of FUD should stoop to attack little old opencourseware shows that we are clearly winning.

17 October 2009

The Commons Meme Becomes More Common

One of the great knock-on benefits of Elinor Ostrom sharing the Nobel prize for Economics is that the concept of the commons is getting the best airing that it's ever had. Here's another useful meditation on the subject from someone who knows what he's talking about, since he's written a book on it:

Old fables die hard. That's surely been the history of the so-called "tragedy of the commons," one of the most durable myths of the past generation. In a famous 1968 essay, biologist Garrett Hardin alleged that it is nearly impossible for people to manage shared resources as a commons. Invariably someone will let his sheep over-graze a shared pasture, and the commons will collapse. Or so goes the fable.

In fact, as Professor Elinor Ostrom's pioneering scholarship over the past three decades has demonstrated, self-organized communities of "commoners" are quite capable of managing forests, fisheries and other finite resources without destroying them. On Monday, Ostrom won a Nobel Prize in Economics for explaining how real-life commons work, especially in managing natural resources.

As he notes:

Although Ostrom has not written extensively about the Internet and online commons, her work clearly speaks to the ways that people can self-organize themselves to take care of resources that they care about. The power of digital commons can be seen in the runaway success of Linux and other open-source software. It is evident, too, in the explosive growth of Wikipedia, Craigslist (classified ads), Flickr (photo-sharing), the Internet Archive (historical Web artifacts) and Public.Resource.org (government information). Each commons acts as a conscientious steward of its collective wealth.

And this is an acute observation:

A key reason that all these Internet commons flourish is because the commoners do not have to get permission from, or make payments to, a corporate middleman. They can build what they want directly, and manage their work as they wish. The cable and telephone companies that provide access to the Internet are not allowed to favor large corporate users with superior service while leaving the rest of us--including upstart competitors and non-market players--with slower, poorer-quality service.

In an earlier time, this principle was known as "common carriage"--the idea that everyone shall have roughly equivalent access and service, without discrimination. Today, in the Internet context, it is known as "net neutrality."

Neat: another reason we need to preserve Net neutrality is to preserve all the commons - past, present and future - it enables.

Follow me @glynmoody on Twitter or identi.ca.

15 October 2009

Open Sourcing America's Operating System

And how do you do that? By making all of the laws freely available - and, presumably, searchable and mashable:

Public.Resource.Org is very pleased to announce that we're going to be working with a distinguished group of colleagues from across the country to create a solid business plan, technical specs, and enabling legislation for the federal government to create Law.Gov. We envision Law.Gov as a distributed, open source, authenticated registry and repository of all primary legal materials in the United States.

This is great news, because Carl Malamud - the force behind this initiative - has been urging it for years: now it looks like it's beginning to take a more concrete form:

The process we're going through to create the case for Law.Gov is a series of workshops hosted by our co-conveners. At the end of the process, we're submitting a report to policy makers in Washington. The process will be an open one, so that in addition to the main report which I'll be authoring, anybody who wishes to submit their own materials may do so. There is no one answer as to how the raw materials of our democracy should be provided on the Internet, but we're hopeful we're going to be able to bring together a group from both the legal and the open source worlds to help crack this nut.

I particularly liked the following comment:

Law.Gov is a big challenge for the legal world, and some of the best thinkers in that world have joined us as co-conveners. But, this is also a challenge for the open source world. We'd like to submit such a convincing set of technical specs that there is no doubt in anybody's mind that it is possible to do this. There are some technical challenges and missing pieces as well, such as the pressing need for an open source redaction toolkit to sit on top of OCR packages such as Tesseract. There are challenges for librarians as well, such as compiling a full listing of all materials that should be in the repository.

What's interesting is that this recognises that open source is not just an inspiration, but a key part of the solution, because - like the open maths movement I wrote about below - it needs new kinds of tools, and free software is the best way to provide them.
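
To make the redaction example concrete: conceptually, such a toolkit just has to find sensitive strings in the OCR output and black out their bounding boxes. Here is a minimal sketch of that idea in Python, assuming the pytesseract and Pillow libraries; the file name and the social-security-number pattern are invented for illustration, and a real toolkit would need much more (reviewer workflows, PDF handling, audit trails):

```python
import re

import pytesseract
from PIL import Image, ImageDraw
from pytesseract import Output

# Illustrative pattern: US social security numbers.
SSN_PATTERN = re.compile(r"\d{3}-\d{2}-\d{4}")

image = Image.open("court_filing.png")  # invented file name
# OCR the page, asking for word-level bounding boxes.
data = pytesseract.image_to_data(image, output_type=Output.DICT)

draw = ImageDraw.Draw(image)
for i, word in enumerate(data["text"]):
    if SSN_PATTERN.fullmatch(word.strip()):
        left, top = data["left"][i], data["top"][i]
        width, height = data["width"][i], data["height"][i]
        # Black out the matched word's bounding box.
        draw.rectangle([left, top, left + width, top + height], fill="black")

image.save("court_filing_redacted.png")
```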

Now, if only someone could do something similar in the UK....

Open Source Mathematics

This is incredibly important:

On 27 January 2009, one of us — Gowers — used his blog to announce an unusual experiment. The Polymath Project had a conventional scientific goal: to attack an unsolved problem in mathematics. But it also had the more ambitious goal of doing mathematical research in a new way. Inspired by open-source enterprises such as Linux and Wikipedia, it used blogs and a wiki to mediate a fully open collaboration. Anyone in the world could follow along and, if they wished, make a contribution. The blogs and wiki functioned as a collective short-term working memory, a conversational commons for the rapid-fire exchange and improvement of ideas.

The collaboration achieved far more than Gowers expected, and showcases what we think will be a powerful force in scientific discovery — the collaboration of many minds through the Internet.

You can read the details of what happened - and it's inspiring stuff - in the article. But as well as flagging up this important achievement, I wanted to point to some interesting points it makes:

The process raises questions about authorship: it is difficult to set a hard-and-fast bar for authorship without causing contention or discouraging participation. What credit should be given to contributors with just a single insightful contribution, or to a contributor who is prolific but not insightful? As a provisional solution, the project is signing papers with a group pseudonym, 'DHJ Polymath', and a link to the full working record. One advantage of Polymath-style collaborations is that because all contributions are out in the open, it is transparent what any given person contributed. If it is necessary to assess the achievements of a Polymath contributor, then this may be done primarily through letters of recommendation, as is done already in particle physics, where papers can have hundreds of authors.

The project also raises questions about preservation. The main working record of the Polymath Project is spread across two blogs and a wiki, leaving it vulnerable should any of those sites disappear. In 2007, the US Library of Congress implemented a programme to preserve blogs by people in the legal profession; a similar but broader programme is needed to preserve research blogs and wikis.

These two points are also relevant to free software and other open endeavours. So far, attribution hasn't really been a problem, since everyone who contributes is acknowledged - for example through the discussions around the code. Similarly, preservation is dealt with through the tools for source code management and the discussion lists. But there are crucial questions of long-term preservation - not least for historical purposes - which are not really being addressed, even by the longest-established open projects like GNU.

For example, when I wrote Rebel Code, I often found it hard to track down the original sources for early discussions. Some of them have probably gone for ever, which is tragic. Maybe more thought needs to be given - not least by central repositories and libraries - about how important intellectual moments that have been achieved collaboratively are preserved for posterity to look at and learn from.

Talking of which, the article quoted above has this to say on that subject:

The Polymath process could potentially be applied to even the biggest open problems, such as the million-dollar prize problems of the Clay Mathematics Institute in Cambridge, Massachusetts. Although the collaborative model might deter some people who hope to keep all the credit for themselves, others could see it as their best chance of being involved in the solution of a famous problem.

Outside mathematics, open-source approaches have only slowly been adopted by scientists. One area in which they are being used is synthetic biology. DNA for the design of living organisms is specified digitally and uploaded to an online repository such as the Massachusetts Institute of Technology Registry of Standard Biological Parts. Other groups may use those designs in their laboratories and, if they wish, contribute improved designs back to the registry. The registry contains more than 3,200 parts, deposited by more than 100 groups. Discoveries have led to many scientific papers, including a 2008 study showing that most parts are not primitive but rather build on simpler parts (J. Peccoud et al. PLoS ONE 3, e2671; 2008). Open-source biology and open-source mathematics thus both show how science can be done using a gradual aggregation of insights from people with diverse expertise.

Similar open-source techniques could be applied in fields such as theoretical physics and computer science, where the raw materials are informational and can be freely shared online. The application of open-source techniques to experimental work is more constrained, because control of experimental equipment is often difficult to share. But open sharing of experimental data does at least allow open data analysis. The widespread adoption of such open-source techniques will require significant cultural changes in science, as well as the development of new online tools. We believe that this will lead to the widespread use of mass collaboration in many fields of science, and that mass collaboration will extend the limits of human problem-solving ability.

What's exciting about this - aside from the prospect of openness spreading to all these other areas - is that there's a huge opportunity for the open source community to start, er, collaborating with the scientific one in producing these new kinds of tools that currently don't exist and are unlikely to be produced by conventional software houses (since spontaneously collaborative communities can't actually pay for anything). I can't wait.

Follow me @glynmoody on Twitter or identi.ca.

Gates Gives $300 million - but with a Catch

It's becoming increasingly evident that Bill Gates' philanthropy is not simple and disinterested, but has woven into it a complex agenda that has to do with his love of intellectual monopolies - and power. Here's the latest instalment:


The Bill and Melinda Gates Foundation, which is donating another $120 million to boosting agriculture in the developing world, will focus on self-help aid for poor farmers to sustain and grow production, a top adviser to the world's leading charitable foundation said.

Sounds good, no? Here are more details:

The Gates Foundation, with a $30 billion endowment to improve health and reduce poverty in developing countries, began investing in agricultural projects three years ago. The latest grants bring its farm sector awards to $1.4 billion.

One of its first investments was in African seeds through the Alliance for a Green Revolution in Africa (AGRA). The group is expected to introduce more than 1,000 new seed varieties of at least 10 crops to improve African production by 2016.

"Alliance for a Green Revolution in Africa" also sounds good; here's a little background on that organisation:

It has not gone unnoticed that AGRA falls under the direct supervision of the Global Development Program, whose senior programme officer is Dr. Robert Horsch, who worked for Monsanto for 25 years before he joined the Gates Foundation. Horsch was part of the scientific team in the company that developed Monsanto’s YieldGard, BollGard and RoundUp Ready technologies. Horsch’s task at the Gates Foundation is to apply biotechnology toward improving crop yields in regions including sub-Saharan Africa. Lutz Goedde, another senior program officer of the Global Development Program, is also a recruit from the biotech industry, as he used to head Alta Genetics, the world's largest privately owned cattle genetics improvement and artificial insemination company, worth US$100 million.

That is, AGRA not only has close links with the Gates Foundation, but also with Monsanto - the Microsoft of the seed world.

If you read the rest of the document from which the above information was taken, you'll see that the AGRA programme is essentially promoting approaches using seeds that are genetically modified and patented. Here's the conclusion:

Sub-Saharan Africa represents an extremely lucrative market for seed companies. The development interventions by AGRA appear, on the face of it, to be benevolent. However, not only will AGRA facilitate the change to a market-based agricultural sector in Africa, replacing traditional agriculture, but it will also go a long way towards laying the groundwork for the entry of private fertilizer and agrochemical companies and seed companies, and more particularly, GM seed companies.

So Gates' donations are ultimately promoting an agriculture based on intellectual monopolies - just as Microsoft does in the software field. The latest $300 million doesn't sound quite so generous now, does it?

Follow me @glynmoody on Twitter or identi.ca.

14 October 2009

Who is La Rochefoucauld of Twitter?

Mozilla's Tristan Nitot has come up with a rather fine aphorism:

Twitter, c'est la version XXI°S des salons mondains, mais limitée à 140 caractères, et à l'échelle du globe.

(Twitter is the 21st-century version of the society salon, but limited to 140 characters, and on a global scale.)

So come on people, start polishing those tweets: somewhere out there is La Rochefoucauld of Twitter....

Follow me @glynmoody on Twitter or identi.ca.

12 October 2009

Windows Does Not Scale

Who's afraid of the data deluge?


Researchers and workers in fields as diverse as bio-technology, astronomy and computer science will soon find themselves overwhelmed with information. Better telescopes and genome sequencers are as much to blame for this data glut as are faster computers and bigger hard drives.

While consumers are just starting to comprehend the idea of buying external hard drives for the home capable of storing a terabyte of data, computer scientists need to grapple with data sets thousands of times as large and growing ever larger. (A single terabyte equals 1,000 gigabytes and could store about 1,000 copies of the Encyclopedia Britannica.)

The next generation of computer scientists has to think in terms of what could be described as Internet scale. Facebook, for example, uses more than 1 petabyte of storage space to manage its users’ 40 billion photos. (A petabyte is about 1,000 times as large as a terabyte, and could store about 500 billion pages of text.)
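
Those numbers are easy to sanity-check (a rough calculation of my own, not from the article):

```python
PB = 10**15  # bytes, using the article's decimal convention
facebook_photos = 40_000_000_000

bytes_per_photo = PB / facebook_photos
print(f"average per photo: {bytes_per_photo / 1000:.0f} KB")  # 25 KB

# "500 billion pages of text" per petabyte implies about 2,000 bytes a page:
print(f"bytes per page: {PB / 500_000_000_000:.0f}")  # 2000
```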

Certainly not GNU/Linux: the latest Top500 supercomputer rankings show that the GNU/Linux family has an 88.60% share. Windows? Glad you asked: 1%.

So, forget about whether there will ever be a Year of the GNU/Linux Desktop: the future is about massive data-crunchers, and there GNU/Linux already reigns supreme, and has done for years. It's Windows that's got problems....

Follow me @glynmoody on Twitter or identi.ca.

09 October 2009

Why Creativity Needs Shorter Copyright Terms

In response to a tweet of mine about shortening copyright to stimulate creativity, someone questioned the logic. It's an important point, so it seems useful to do some thinking out loud on the subject.

First, I should probably address the question of whether *longer* copyright stimulates creativity. The basic argument seems to be that longer copyright terms mean greater incentives, which mean greater creativity. But does anyone seriously create because their works will still be in copyright 70 years after their death? It won't do them any good, and probably won't do their descendants much good either, since by that point the income is generally close to zero.

Indeed, speaking as an author, I know that practically all my income from writing comes within the first couple of years; after that, it's dribs and drabs. If my copyright were cut down to even five years, it would make only a marginal difference to my total remuneration.

Now, clearly I'm not JK Rowling, but the point is, neither are 99.99999% of authors: I know from talking to other run-of-the-mill writers that the same holds for them, too. So in practical terms, reducing the copyright term would have little effect on the money that most creators earned as a result.

But let's look at the main part of my claim: that reducing copyright's term would encourage creativity. This is based on the rough syllogism that all artists draw on their predecessors in some way; making more prior creativity available would allow more artists to draw on it in more ways; and so this would increase overall creativity.

For the first assertion, look at history. Painters once began by mixing paints in another artist's studio, then painting the unimportant bits of the master's (usually his) works, learning how to emulate his style. Then they gradually painted more important parts in the style of that artist, often doing the low-cost or rush jobs that he didn't have the time or inclination to execute. Then, one day, the apprentice would set up on his own, building on all the tricks and techniques he had learned from his master, but gradually evolving his own style.

Today, would-be artists tend not to become apprentices in the same way. Instead, they typically go to art school, where they *copy* the masters in order to learn their techniques. Often you see them doing this in art galleries, as they strive to reproduce exactly the same effect in their own copy. It teaches them the basics of painting that they can then build on in their own work.

In music, something very similar happens: journeyman composers write pieces in the style of the acknowledged masters, often copying their themes and structure very closely. This is true even for extreme geniuses. For example, in order to learn how to write in the new early classical style, the eight-year-old Mozart arranged three piano sonatas from J C Bach's Op. 5 as keyboard concertos.

Mozart also "borrowed" entire themes - most famously in the overture to The Magic Flute, where he takes a simple tune from a piano sonata by Clementi, and transforms it. Some composers did this on a regular basis. Handel, in particular, was quite unscrupulous in taking themes from fellow composers, and turning them into other, rather better, works. Moreover, the widely-used form of musical variations is based generally on taking a well-known theme and subjecting it to various transformations.

That was in the past, when art was an analogue artefact. Copying took place through trying to reproduce an artistic effect, or by borrowing musical themes etc. Today, in the digital age, copying is not such an incidental act, but central to how we use computers. When we access something online, we copy it to our computers (even audio streaming has to be assembled into copies of small chunks of sound before we can hear it).
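
That point about streaming is worth making concrete: even "just listening" involves repeatedly copying chunks of the work into local memory before playback can begin. A minimal sketch in Python, assuming the requests library (the URL, chunk size and buffer threshold are invented):

```python
import requests

AUDIO_URL = "https://example.com/track.ogg"  # invented URL

buffer = bytearray()
with requests.get(AUDIO_URL, stream=True) as resp:
    resp.raise_for_status()
    for chunk in resp.iter_content(chunk_size=16_384):
        # Each chunk is a fresh local copy of part of the work,
        # made before a single sample reaches the audio hardware.
        buffer.extend(chunk)
        if len(buffer) >= 256_000:  # enough buffered to start playback
            break

print(f"buffered {len(buffer)} bytes locally before playback could begin")
```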

Digital plasticity - the ability to compute with any content - makes the clumsy copying and learning processes of the past trivially easy. A child can take a digital image of a work of art and cut and paste elements of it into his or her own work; anyone can sample music, distort it and mix it with their own; texts can be excerpted and juxtaposed with others drawn from very diverse backgrounds to create mosaics of meaning.

All these amazingly rich and innovative things are now very easy to do practically, but the possibilities of doing so are stymied by laws that were drawn up for an analogue age. Those laws were not designed to forbid artists from learning from existing creations, but to stop booksellers producing unauthorised copies - a totally different issue. The idea of using just part of a work was not really a concern. But it is today, when the cut-and-paste metaphor is central to the digital world. That is why we need to reduce copyright to the bare minimum, so that the legal obstacles to creating in this new, inherently digital way are removed.

If we don't, one of two things will happen. Either we will fail to realise the full creative potential of computing, or else the younger generation of artists will simply ignore the law. Either outcome is clearly unsatisfactory. What is needed is a copyright regime that is balanced. That is far from being the case today. As the media industry (sic) ratchets up copyright terms again and again, creation has become subservient to the corporation, and creators are cut off from their past - and hence their future.

Follow me @glynmoody on Twitter or identi.ca.

07 October 2009

EU Consultation on Post-i2010 - Please Do It

Stupidly, I thought this EU consultation would be the usual clueless nonsense, unredeemable even by witty comments from people like me. I was wrong. It's actually an incredibly wide-ranging questionnaire about very important topics. Indeed, it's not even obvious to me what my "correct" answers should be - it actually makes you think.

Here's a small sample of the deep questions it wants us to consider:

The future of sustained internet services growth - the internet to drive innovation

Challenges and issues here include:

- Design and development of the future internet - semantic web, Internet of Things, scalability, mobility, security etc.

- Keeping the internet open to competition, innovation and user choice - issues here include: interoperability, keeping the internet and internet-based services open and a level playing field for innovation (end-to-end connectivity, service level agreements, cross-platform services, net neutrality and open business models), open standards, low barriers to entry, etc.

...

Promoting access to creativity at all levels

In terms of expectations, Internet users and the creative content-providing sector have never been as much at odds as they are today. Creative industry players are struggling to find new viable business models that are able to ensure sufficient revenues for creators and to meet consumer expectations. The market for digital content is still fragmented, and broadcasters and other content providers, together with end-users, are prevented from benefiting from a true digital Single Market.

Participative platforms have grown as passive users (readers, viewers, consumers etc.) have become active producers (or "prosumers"). These users tend to ignore their statutory rights and their obligations towards rights holders for the content they transform and/or simply share in web 2.0 communities. Moreover, intermediaries generally impose complex, take-it-or-leave-it standard terms of use on their users. Against this background, users currently do not enjoy a clear set of rights balancing the conditions set by rights holders (with DRM [Digital Rights Management] and/or licence agreements) and by internet services or platforms imposing restrictive standard terms of use.

...

Openness as a global issue

The challenge is to keep the internet open, based on open platforms and open standards. Many issues can only be resolved through international cooperation. The ICT strategies in the EU have often been inward-looking, which is difficult to justify, given the globalisation of modern ICT and the internet.

...

Challenges of participatory web

The growth of the participatory web is adding new challenges and pressures on public administrations, as well as opportunities. Web 2.0 enables citizens to shift their relationship with government. There is increasing demand on administrations to become ever more transparent and open to citizen involvement both in the delivery of services and in the design of public policies. If managed correctly, these demands may lead to delivery of better, more personalised services at lower cost as well as more trust in the public administration. This also applies to key services such as health care and education, where practitioners and beneficiaries of the service alike can benefit from mutually enriching communities of interest.

This is all really important stuff, so if you are an EU citizen, please take part - you have until this Friday, 9 October. The good news is that you don't need to fill in the whole thing - you can just pick and choose the bits that matter to you. Usefully, you can download the questionnaire in a variety of languages before you fill it in online - I highly recommend doing so.

Follow me @glynmoody on Twitter or identi.ca.

Browser Ballot Screen: Time to Prepare

It looks like it's happening:


The European Commission will on 9 October 2009 formally invite comments from consumers, software companies, computer manufacturers and other interested parties on an improved proposal by Microsoft to give present and future users of the Windows PC operating system a greater choice of web browsers. The commitments have been offered by Microsoft after the Commission expressed specific concerns that Microsoft may have infringed EC Treaty rules on abuse of a dominant position (Article 82) by tying its web browser (Internet Explorer) to its client PC operating system Windows, and are an improved version of the proposals made by Microsoft in July 2009 (see MEMO/09/352). The improvements concern greater information to consumers about web browsers, the features of each browser, an improved user experience as well as a review by the Commission to ensure the proposals genuinely work to benefit consumers. Interested parties can submit comments within one month. The Commission welcomes Microsoft’s proposal as it has the potential to give European consumers real choice over how they access and use the internet. Following the market test, the Commission could decide to adopt a decision under Article 9(1) of Regulation 1/2003, which would make the commitments legally binding on Microsoft.

It's hard to comment on this until we see what form the ballot screen will take, but I'm prepared to accept that this may be done in a fair manner. Assuming it is, what might the implications be?

Perhaps the most important one is that Firefox needs to be prepared for a massive onslaught when this goes live. I have heard the slightly tongue-in-cheek suggestion that Microsoft is hoping to bring Firefox's servers to their collective digital knees by allowing such a ballot screen; even assuming that's not the case, it's certainly true that Mozilla must start planning for the sudden peak in interest that is likely to follow the ballot screen's implementation. It would be a terrible shame if people tried to download Firefox and failed because the Mozilla servers had keeled over.

Follow me @glynmoody on Twitter or identi.ca.

Meet Microsoft, the Delusional

This is hilarious:

Jean-Philippe Courtois, president of Microsoft Europe, speaking in Paris today, described the company as an underdog.

He said Bing had between three and five percent market share in search and could only grow - although he admitted it could take a long time.

...

Despite Microsoft having had to live with open source software for 10 years, it had retained its share of the marketplace, he said.

Er, what, like the browser sector, where Firefox now has nearly 24% market share worldwide, and Microsoft's share is decreasing? Or Apache's 54% in the Web server world, where Microsoft's share is decreasing? Or GNU/Linux's 88% market share of the top 500 supercomputers in the world, where Microsoft's share is static?

Microsoft the underdog? Or just a dog?

Follow me @glynmoody on Twitter or identi.ca.

Becta Says: Teach Us a Lesson...

...which is surely an offer we can't refuse.

For many years, Becta was one of the main obstacles to getting open source used within UK schools: it simply refused to budge from an almost pathological dependence on Microsoft and its products. Today, the situation is slowly improving, but it will take years to undo the harm caused by Becta's insistence on propagating the Microsoft monoculture in education.

At least Teach Us a Lesson seems to be starting off on the right foot:


Becta’s Teach us a Lesson competition launches today, Wednesday 7 October, following the speech that Kevin Brennan, the Minister for Further Education, made at the Learning Revolution Expo yesterday.

The competition seeks to find the brightest and best ideas for developing online resources for people to find informal learning opportunities that interest them. This will happen by having entries submitted to the competition website, where they will be commented on and rated by other site users.

This, then, is about opening up - drawing on ideas from outside Becta. More specifically:

There are some things we are trying to avoid:

* Using proprietary products which will not permit open sharing or which run counter to Government policy on open standards

At long last, Becta seems to have learned its lesson...

Follow me @glynmoody on Twitter or identi.ca.

06 October 2009

Postcodes: Royal Fail

Here's a perfect example of why intellectual commons should not be enclosed.

The UK Postcode data set is crucial information for businesses and ordinary citizens alike - vital to the smooth running of everyday life. But more than that, it is geographic information that allows all kinds of innovative services to be provided by people with clever ideas and some skill.

That's exactly what happened when the Postcode database was leaked onto the Internet recently. People used that information to do all sorts of things that hadn't been done before - presumably because the company that claims to own this information, Royal Mail, had been charging an exorbitant amount for access to it.
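
Since the services built on the leak didn't publish their workings, here's a purely hypothetical sketch in Python - the data file, its columns and the facility list are all assumptions - of the kind of thing the dataset makes possible: turn a postcode into coordinates, then find the nearest useful facility.

    import csv
    import math

    # Hypothetical file of "postcode,latitude,longitude" rows - the kind
    # of geographic lookup the Postcode database provides.
    def load_postcodes(path="postcodes.csv"):
        lookup = {}
        with open(path, newline="") as f:
            for postcode, lat, lon in csv.reader(f):
                lookup[postcode.replace(" ", "").upper()] = (float(lat), float(lon))
        return lookup

    def distance_km(a, b):
        # Equirectangular approximation - ample over UK-scale distances.
        lat1, lon1 = map(math.radians, a)
        lat2, lon2 = map(math.radians, b)
        x = (lon2 - lon1) * math.cos((lat1 + lat2) / 2)
        y = lat2 - lat1
        return 6371 * math.sqrt(x * x + y * y)

    def nearest(postcode, facilities, lookup):
        # facilities: a list of (name, (lat, lon)) pairs - job centres, say.
        here = lookup[postcode.replace(" ", "").upper()]
        return min(facilities, key=lambda fac: distance_km(here, fac[1]))

A job-centre or clinic finder is little more than this plus a web front end - which is why access to the underlying data is the whole game.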

And then guess what happened? Yup, the nasties started arriving:

On Friday the 2nd October we received correspondence from the Royal Mail demanding that we close this site down (see below). One of the directors of Ernest Marples Postcodes Ltd has also been threatened personally.

We are not in a position to mount an effective legal challenge against the Royal Mail’s demands and therefore have closed the ErnestMarples.com API effective immediately.

We understand that this will cause harm and considerable inconvenience to the many people who are using or intend to use the API to power socially useful tools, such as HealthWhere, JobcentreProPlus.com and PlanningAlerts.com. For this, we apologise unreservedly.

Specifically, intellectual monopolies of a particularly stupid kind are involved:

Our client is the proprietor of extensive intellectual property rights in the Database, including copyright in both the Database and the software, and database rights.

Here's what Wikipedia has to say about these "database rights":

The Directive 96/9/EC of the European Parliament and of the Council of 11 March 1996 on the legal protection of databases is a European Union directive in the field of copyright law, made under the internal market provisions of the Treaty of Rome. It harmonizes the treatment of databases under copyright law, and creates a new sui generis right for the creators of databases which do not qualify for copyright.

Before 1996, these sui generis "database rights" did not exist; they were created in the EU after lobbyists argued that such rights would provide an incentive to create more databases than the Americans - whose database publishers, strangely, didn't seem to need this new "right" in order to thrive - and so make the mighty EU even mightier, at least as far as those jolly exciting databases were concerned.

Rather wisely, the EU afterwards decided to do some research in this area, comparing database creation before and after the new sui generis right was brought in, to see just how great that incentive proved to be - a unique opportunity to test the theory that underpins intellectual monopolies. Here are the results of that research:

Introduced to stimulate the production of databases in Europe, the “sui generis” protection has had no proven impact on the production of databases.

According to the Gale Directory of Databases, the number of EU-based database “entries” was 3095 in 2004 as compared to 3092 in 1998 when the first Member States had implemented the “sui generis” protection into national laws.

It is noteworthy that the number of database “entries” dropped just as most of the EU-15 had implemented the Directive into national laws in 2001. In 2001, there were 4085 EU-based “entries” while in 2004 there were only 3095.

While the evidence taken from the GDD relies on the number of database “entries” and not on the overall turnover achieved or the information supplied by means of databases, they remain the only empirical data available.

So, the official EU study finds that the sui generis protection has had no proven impact on the production of databases; in fact, the number of databases went *down* after it was introduced.

Thus these "database rights" appear, if anything, to have stifled the production of databases - negating the whole claimed point of their introduction. Moreover, the Royal Mail's bullying of a couple of people who were trying to offer useful services that would not otherwise exist shows the danger of entrusting such a critical data commons to commercial entities, which then enclose it by claiming "database rights" in it: they will always be tempted to maximise their own profit rather than the value to society as a whole.

Giving the Royal Mail a monopoly on this critical dataset - one that for all practical purposes can never be created again - is like giving a genetics company a monopoly on the human genome. That was attempted (hello, Celera) but, fortunately for us, thwarted, thanks largely to free software. Today, the human genome is an intellectual commons (well, most of it), and the Postcode data should be, too.

Follow me @glynmoody on Twitter or identi.ca.