The Acer Aspire 5710Z has gone on sale in Singapore pre-loaded with Ubuntu Linux instead of Windows. Ubuntu is currently one of the world's most popular and easiest-to-use Linux distributions.
But a spokesperson for Acer told ZDNet.co.uk on Tuesday that the company — one of the world's top laptop manufacturers — had "no plans" to sell any Linux-based systems in the UK. "[Acer models] with Ubuntu pre-loaded are available at the factory level. However, there is no demand for it in the UK. Therefore, those configurations are not an option [for UK customers] at the moment," said the spokesperson.
Well, let's just compare that with Dell, shall we?
Dell creates a site where people can tell the company what they want. People ask for Dell systems running Ubuntu, and people get systems running Ubuntu in the US. People then ask for Dell systems running Ubuntu outside the US, and it looks like that may well happen.
Acer, by contrast, does not ask anyone what they want. As a result, it has no clue what anyone wants, but being a superior company that knows much better than mere customers what they want, it knows full well that people outside the US don't want Ubuntu running on their laptops - even though that is what they are telling Dell.
Guess which company I shall be buying from when Dell starts selling GNU/Linux systems in the UK? Guess which company I shall be recommending to people when they ask for advice about buying PCs in the UK? Guess which company can go and get knotted?
Update: Maybe there's hope.
31 July 2007
In a move that seems like a great model of public and private cooperation, the National Archives and Amazon.com have reached a pact under which Amazon will sell films and video footage gathering dust in the archives’ vaults. These videos and films, which capture some of our most intriguing and important moments in history, are already available at no charge to folks who want to visit the archives’ facilities in College Park, MD, but now they’ll become available to anybody via the Internet.
So let me get this straight. Amazon makes money selling digital copies of archive material, which is freely available, back to the people who own that material: and that's a great model? When will Amazon start selling digital Brooklyn Bridges, I wonder....
If you ever had any doubts about how amazingly wonderful the open source modelling package Blender was, take a peep at these highly impressive videos - made available as part of Tufts' opencourseware.
Yesterday I wrote about Knuth's wise words on software patents. In the course of trying to discover when exactly they were written (anyone know?), I found what seems to be their main source - a wonderful page entirely about patents and patent madness.
It only goes up to 2003, but nonetheless has lots of unusual links on the subject that are well worth exploring if, like me, you are a sad individual who finds this stuff both important and fascinating.
We went public with Lughenjo four weeks ago, primarily to test our idea on a wider audience. Since then we have continued our conversations with social entrepreneurs and NGOs and worked on producing a business plan.
The feedback that we received was overwhelmingly that Lughenjo was a good thing for us to do. There were, however, two problems. Firstly it was not obviously something that The Economist Group should do. Secondly, and more importantly, it became clear that there was not an immediate demand for a knowledge network from NGOs and social entrepreneurs.
The upshot was that we would have had to force the creation of the network from a demand point of view as well as marketing it to potential donors. This would have put a barrier in the way of us being able to grow the community quickly and therefore monetising it.
Well, anybody worried about "monetising" something deserves to fail in my book. Shame on you, Economist - whatever happened to style?
30 July 2007
Hey, music industry, I think the people formerly known as the audience are trying to tell you something:
Now in its fourth year, the survey - carried out by Entertainment Media Research in conjunction with media lawyers Olswang - found that 43% of UK consumers admitted to downloading music without paying for it, adding up to a hefty hike from 36% in 2006.
Commenting on the slowing growth of authorised downloads (up by just 15 per cent this year, compared to 40 per cent in 2006), Hart said that folks are donning their pirate’s hats and grabbing illegal downloads because official downloads are seen as too pricey.
The survey backs up that claim, with 84 per cent saying that older digital downloads should be made cheaper, while nearly half (48 per cent) said that they’d be happy to pay more for newer releases.
John Enser, Olswang's head honcho of music, added: “The music industry needs to embrace new opportunities being generated by the increasing popularity of music on social networking sites. Surfing these sites and discovering new music is widespread with the latest generation of online consumers but the process of actually purchasing the music needs to be made easier to encourage sales and develop this new market.”
If computing has a patron saint, it is the great and amazing Donald Knuth. Put another way, he is the god of computer algorithms, so I was particularly pleased to come across this definitive statement on their patentability:
In the period 1945-1980, it was generally believed that patent law did not pertain to software. However, it now appears that some people have received patents for algorithms of practical importance--e.g., Lempel-Ziv compression and RSA public key encryption--and are now legally preventing other programmers from using these algorithms.
This is a serious change from the previous policy under which the computer revolution became possible, and I fear this change will be harmful for society. It certainly would have had a profoundly negative effect on my own work: For example, I developed software called TeX that is now used to produce more than 90% of all books and journals in mathematics and physics and to produce hundreds of thousands of technical reports in all scientific disciplines. If software patents had been commonplace in 1980, I would not have been able to create such a system, nor would I probably have ever thought of doing it, nor can I imagine anyone else doing so.
I am told that the courts are trying to make a distinction between mathematical algorithms and nonmathematical algorithms. To a computer scientist, this makes no sense, because every algorithm is as mathematical as anything could be. An algorithm is an abstract concept unrelated to physical laws of the universe.
Nor is it possible to distinguish between "numerical" and "nonnumerical" algorithms, as if numbers were somehow different from other kinds of precise information. All data are numbers, and all numbers are data. Mathematicians work much more with symbolic entities than with numbers.
Therefore the idea of passing laws that say some kinds of algorithms belong to mathematics and some do not strikes me as absurd as the 19th-century attempts of the Indiana legislature to pass a law that the ratio of a circle's circumference to its diameter is exactly 3, not approximately 3.1416. It's like the medieval church ruling that the sun revolves about the earth. Man-made laws can be significantly helpful but not when they contradict fundamental truths.
Congress wisely decided long ago that mathematical things cannot be patented. Surely nobody could apply mathematics if it were necessary to pay a license fee whenever the theorem of Pythagoras is employed. The basic algorithmic ideas that people are now rushing to patent are so fundamental, the result threatens to be like what would happen if we allowed authors to have patents on individual words and concepts. Novelists or journalists would be unable to write stories unless their publishers had permission from the owners of the words. Algorithms are exactly as basic to software as words are to writers, because they are the fundamental building blocks needed to make interesting products.
Amen to that. (Via Coding Horror.)
How mad is this?
Advanced European countries are increasingly looking for channels to school their neighbours and worldwide free-trade agreement partners on the enforcement of western-style intellectual property rights.
And how bad is the premise:
The basic underlying assumption of the meeting was that a stronger intellectual property system is beneficial, and that UNECE members have knowledge and ideas to patent and protect. A source characterised the view as: A well-designed intellectual property regime increases national wealth and benefits consumers by stimulating research and investment into new technologies and innovative products, and by enabling the transfer of technology, including between countries at different stages of economic development.
Well, no, actually. As history shows, intellectual monopolies do nothing to increase national wealth overall: they just make the holders of the monopolies richer. Society as a whole loses out. Spreading this kind of misinformation is downright immoral.
29 July 2007
This is why it is utterly pointless for the BBC to go to all the trouble of wrapping DRM around its content - note, *its* content, not other people's - and inconveniencing most of the online world in the process:
we're hearing that FairUse4WM strips the files of their DRM -- anyone try it out yet?
And the answer is....
27 July 2007
Here's a worrying development over in the Mozilla community:
Mozilla has been supporting Thunderbird as a product since the beginning of the Foundation. The result is a good, solid product that provides an open alternative for desktop mail. However, the Thunderbird effort is dwarfed by the enormous energy and community focused on the web, Firefox and the ecosystem around it. As a result, Mozilla doesn't focus on Thunderbird as much as we do browsing and Firefox and we don't expect this to change in the foreseeable future. We are convinced that our current focus - delivering the web, mostly through browsing and related services - is the correct priority. At the same time, the Thunderbird team is extremely dedicated and competent, and we all want to see them do as much as possible with Thunderbird.
We have concluded that we should find a new, separate organizational setting for Thunderbird; one that allows the Thunderbird community to determine its own destiny.
Mozilla is exploring the options for an organization specifically focused on serving Thunderbird users. A separate organization focused on Thunderbird will both be able to move independently and will need to do so to deepen community and user involvement. We're not yet sure what this organization will look like. We've thought about a few different options. I've described them below. If you've got a different idea please let us know.
What's worrying about this is that it seems to demonstrate a tunnel vision, where Firefox (and making money from it) are foregrounded above everything else. The fact is, email is a critical application, even if more and more people use Web-based mail (as I do - but I still use Thunderbird too). Moreover, Mozilla is a foundation, and that implies looking at the bigger picture, not concentrating - as a company might - on the success of its main "product".
The open source world needs Thunderbird - indeed, the wider software community needs it. Although I accept that it lacks the community that Firefox has generated, that is not a reason to jettison it, and hope for the best. On the contrary: the very difficulties that Thunderbird has in firing up a community and in moving forward are precisely why the Mozilla Foundation should keep it under its wing.
Extraordinary column in BusinessWeek:
While Microsoft leads in India and China, Linux is mounting a strong challenge in both nations. The Linux community has signed a deal with Beijing to make Linux the default operating system for computers used by the Chinese government and many parts of the Chinese educational system. In India, the prices of Windows and Office are so high that Linux is the only practical, affordable choice for most of the population.
In this context, applying Western IP enforcement policies to stem the flood of illegal copies of Windows in China and India risks winning the battle (to deter and punish IP infringement) while losing the war (to become the dominant standard operating system on the desktop). As long as Linux remains a serious rival in China and India, Microsoft should welcome pirated copies of its software. Illegal versions of Windows are free, which helps Microsoft offset the initial cost advantage of "free" open-source software.
Every pirated copy installed on a Chinese or Indian computer brings one more person into the Microsoft ecosystem. This strengthens Microsoft's market for third-party developers of applications, tools, and other complementary products. Equally important, it denies Linux that next new customer who would strengthen the open-source ecosystem against Windows.
Maybe it's to be expected that arch-capitalist tool BusinessWeek would be offering free advice to Microsoft on how to crush that commie open source stuff. What I find harder to comprehend is the fact that the author of this piece is a self-styled "authority on open innovation, open business models, and more open approaches to intellectual property management" - all with a view to stamping it out, apparently.
As the post below indicates, one reason that open content strategies are working is that online advertising is increasingly profitable (just ask Google). Further proof that advertising is evolving rapidly is the rise of OpenAds, one of open source's better-kept secrets. Here's a piece by Matt Asay with some useful background:
OpenAds is one of the most interesting open source projects/companies on the planet. Period. It's an open source ad server. Like Doubleclick without the lock-in or fees. In other words, open source. 100% GPLv2. I guess it should be no surprise that the world's most popular ad server, powering Web 2.0 business models, is open source, just as the LAMP stack is the technological basis for Web 2.0 sites/services.
Amazingly, OpenAds is British, too.
One of the constant themes of this blog is that there's plenty of money to be made by giving away things for free. Here's an interesting study by Neil Thurman of the UK newspapers sector that confirms precisely that:
Advertising is relevant to the issue of content charging because, to a certain extent, there is a trade-off between them. Content charging, by limiting access, reduces the number of users to whom a page is exposed. When FT.com introduced a subscription barrier to parts of its content in May 2002, user numbers fell dramatically, as did its advertising revenue (Ó hAnluain, 2004). Conversely, when Times Online removed the subscription barrier it had imposed on overseas users, it experienced a “huge” increase in traffic (Bale, 2006).
Users are put off by having to pay, but traffic is also affected for technological reasons. Content charging can alienate sites from search engines and aggregators like Google (Outing, 2005). Similarly, imposing a subscription barrier also isolates newspaper websites like the Wall Street Journal’s WSJ.com from blogs, a growing source of traffic (Penenberg, 2005). In the current market, many newspapers feel that the revenue they could gain from content charging would be less than what they would lose in advertising. Even the UK newspapers who are currently charging for significant amounts of content — FT.com, Independent.co.uk, and Scotsman.com — can see the potential benefits of dropping these barriers.
A companion study indicates that opening up can bring with it some unexpected benefits:
Some British news websites are attracting larger audiences than their American competitors in US regional and national markets. At the British news websites studied, Americans made up an average of 36 per cent of the total audience with up to another 39 per cent of readers from countries other than the US. Visibility on portals like the Drudge Report and on indexes such as Google News brings considerable international traffic but is partly dependent on particular genres of story and fast publication times.
Opening up means that users get to decide whether to read you, and that quality often wins out. Newspapers with closed content are unlikely to attract this kind of passing trade, and will therefore lose global influence as well as advertising revenue. (Via Antony Mayfield.)
26 July 2007
Bring on the opens: here's a new foundation to support OpenBSD, the Cinderella of the open world, and a few other worthy projects:
The OpenBSD Foundation is a Canadian not-for-profit corporation which exists to support OpenBSD and related projects such as OpenSSH, OpenBGPD, OpenNTPD, and OpenCVS. While the foundation works in close cooperation with the developers of these wonderful free software projects, it is a separate entity.
Formally, the corporation's objects are to support and further the development, advancement, and maintenance of free software based on the OpenBSD operating system, including the operating system itself and related free software projects.
We all know that the ideas behind open source are out of this world, and now here's the proof:
Space enthusiast and engineer Paul Wooster wants to open the source code for outer space, because, he says, it should be easier for everyone who wants to contribute to human activities in space to do so, not just people with advanced degrees in rocketry. To that end, Wooster has established DevelopSpace, a community based on open source philosophies, designed to attract anyone interested in sharing their skills in order to make more space exploration possible.
"We're focused on building up the technical foundations of human activities in space, identifying the current barriers to those activities, and then coming up with engineered solutions to those barriers -- but doing so in an open source manner. If, for example, I design a solar-powered system for use on Mars and do some testing in the lab, rather than just writing up a paper and publishing it in a journal or a .PDF format where it's difficult to extract information, I would post all of the CAD files and the more detailed engineering analyses so someone else can come along and improve on my design -- they don't have to start from scratch. Over time, what will happen is that more and more people will get involved in these activities and we will make technical progress toward lowering the barriers to entry for someone who wants to set up a human base on Mars, or an orbiting outpost. I don't actually see the group in the near term doing those types of things. This is much more of laying the foundations."
(With thanks to James Tyrrell.)
Here's double good news:
SugarCRM Inc., the world’s leading provider of commercial open source customer relationship management (CRM) software, today announced the upcoming release of Sugar Community Edition 5.0 will be licensed under the new Version 3 of the GNU General Public License (GPL). The GPL is the most widely used free and open source (FOSS) license in the market.
Double because it sees yet another major open source enterprise stack company adopt the GNU GPL, and because it's gone straight to version 3, with no ifs and buts, which will only strengthen that licence's position. Interesting, too, are Eben Moglen's quoted comments:
"We believe that sharing knowledge is good. We encourage other important free and open source software projects to take this step and join us in making better software."
One of the most compelling applications of open content is in the educational sphere. After all, it's crazy for teachers to keep on creating the same content again and again: the whole idea of knowledge is to build on what has been learned. So it's good to see the Creative Commons setting up a new arm aimed specifically at promoting the re-use of materials here. It's called ccLearn: at the moment, there's not much to see there apart from the Open Education Search project, but I'm sure that things will grow quickly - the logic is compelling.
I was pleased to see that the story about Prince giving away CDs in various ways, and making money from live performances, is starting to get picked up by more news outlets. Obviously, when people are presented with a real-life alternative to making money from CDs, things become a little clearer.
But I was disappointed to come across this story about the Los Angeles Times killing a feature by Patrick Goldstein, one of its own reporters, that suggested it follow suit:
His The Big Picture column for Tuesday was killed, apparently by associate editor John Montorio. Goldstein's offense was to propose that the Times follow the lead of the U.K.'s Mail on Sunday (which distributed 2.9 million free Prince CDs) and partner with older artists to give away music in the paper. He argued it could help make the Times website a destination for fans and reduce the need for front page ads (which the editor of the Times himself calls a huge mistake.)
This was doubly stupid. First, it's a great piece. Here's the conclusion:
Giving music away doesn’t mean it has lost its value, just that its value is no longer moored to the price of a CD. Like it or not, the CD is dying, as is the culture of newsprint. People want their music — and their news — in new ways. It’s time we embraced change instead of always worrying if some brash new idea — like giving away music — would tarnish our sober minded image. When businesses are faced with radical change, they are usually forced to ask — is it a threat or an opportunity? Guess which choice is the right answer.
But spiking this piece was also stupid because it was bound to get out - both the piece and the story about its spiking - and people like me were bound to spread the news. Thanks to this new-fangled Internet thing, truth will out - eventually. (Via TechDirt.)
Update: And as further proof that you can't just bury this stuff, here's a New York Times piece about the incident.
25 July 2007
I've always felt rather ambivalent about Tim O'Reilly. On the one hand, he is undoubtedly a very shrewd reader of markets, and has undoubtedly contributed hugely to the rise of the open source movement. On the other, he always seems to take what might be called an extreme pragmatist position, where questions of making plenty of dosh always seem to be lurking in the background (and sometimes in the foreground).
I'm glad to see it's not only me:
At the O'Reilly Open Source Convention today, Software Freedom Law Center director Eben Moglen threw down the gauntlet to O'Reilly founder and CEO Tim O'Reilly. Saying that O'Reilly had spent 10 years making money and building the O'Reilly name, Moglen invited O'Reilly to stop being "frivolous" and to join the conversation about software freedom.
So it's really a matter of whether you're on Eben's side, or Tim's side....
Will this response from the UK Department for Culture, Media and Sport go down in history as the great turning point for copyright, when the constant extension ratchet was halted and eventually reversed?
Maybe I'm an incurable optimist, but I have to say I was pretty impressed by the generally sane tone of this document after years of industry-driven exaggeration about "piracy" and such-like. The best demonstration of this comes right at the end, where the earlier proposal by the House of Commons Culture Committee to extend the term of copyright in sound recordings is discussed. Here's what the report has to say:
The Government appreciates the work of the Committee and the deliberation it has given to this subject. As the Committee noted, the independent Gowers Review also considered this issue in detail and recommended that the European Commission retain a term of protection for sound recordings and performers of 50 years. The Review undertook a detailed analysis of all the arguments put forward, including the moral arguments regarding the treatment of performers. It concluded that an extension would not benefit the majority of performers, most of whom have contractual relationships requiring their royalties be paid back to the record label. It also concluded that an extension would have a negative impact on the balance of trade and that it would not increase incentives to create new works. Furthermore, it considered not just the impact on the music industry but on the economy as a whole, and concluded that an extension would lead to increased costs to industry, such as those who use music – whether to provide ambience in a shop or restaurant or for TV or radio broadcasting – and to consumers who would have to pay royalties for longer. In reaching such conclusions, the Review took account of the question of parity with other countries such as the US, and concluded that, although royalties were payable for longer there, the total amount was likely to be similar – or possibly less – as there were fewer revenue streams available under the US system.
This is doubly important, because it will have knock-on effects beyond the UK. As Becky Hogge of the Open Rights Group rightly points out:
This is significant, since the UK government is likely to have a disproportionately loud voice on this issue both because it is home to the most lucrative recording industry in Europe and because it has taken the time to review this issue in detail.
So we have the prospect of Europe following the UK's lead in halting the constant copyright extension. This, in its turn, will help to put a brake on such copyright extensions around the world, since there will no longer be the argument that "everyone else is doing it, we must follow suit". Maybe it's too much to hope that in due course copyright terms will start to be reduced - but then, as I said, I'm an incurable optimist.
23 July 2007
Programs are sets of instructions - rather like recipes. So if you can have open source code, why not open source food:
Open Source Food came to fruition because me and my father wanted to create a place for people like us. We’re not professional cooks, we just love food. We want to share, learn and improve ourselves with the help of like-minded food lovers. Open Source Food is a platform for that.
Truly right-on: not only does it adopt CC licences for the content, it also warns:
Please be aware that in legal terms, recipes count as a method or technique and therefore cannot technically be copyrighted.
Mind you, that hasn't stopped some sad individuals from trying. (Via eHub.)
The enterprise content management company Alfresco has cropped up a few times on these pages. It's increasingly clear to me that it is one of the leaders of the second-generation open source companies that are starting to make their mark in the wider world of business software - not least because it employs the one-man open source powerhouse that is Matt Asay.
A further sign of Alfresco's importance in this sector is the appearance of its Open Source Barometer:
The Alfresco open source barometer is a survey, conducted April through June 2007, using opt-in data provided by 10,000 of the 15,000 Alfresco community members with the aim of providing a global survey of trends in the use of open source software in the enterprise.
Users were asked about their preferences in operating systems, application servers, databases, browsers, and portals to capture the latest information in how companies today evaluate and deploy open source and legacy proprietary software stacks in the enterprise.
The report is valuable, because it's based on a serious, if necessarily skewed, sample size. Two results stand out: that people are increasingly developing on Windows and then deploying on GNU/Linux (something I'd noticed too), and that the UK lags behind other countries as far as Alfresco's products are concerned:
The survey found that the U.S. is leading open source adoption globally. We believe the Global 2000 is seeking innovation and better value for their technology investments whereas in Europe open source adoption is often driven by governments seeking better value for their citizens. The research also showed that the U.K. lags behind in the adoption of open source suggesting less government emphasis compared with other European countries such as France, Germany, Spain and Italy.
Apparently the survey will appear every six months, which is good news: tracking changes in its results should prove fascinating.
As I and a few other enlightened individuals have been banging on about for some time, allowing digital files to be copied is not the end of business - just of business as usual. Essentially, people selling physical things - like books or CDs - need to recognise how they differ from digital ones, and build on those differences positively.
Here's a good example:
At a time when CD price wars and music downloads are putting entire High Street chains at risk, independent retailers Rough Trade are opening what they say is the country's biggest music-only specialist store.
Despite the company's niche reputation, he feels it can fulfil what he sees as the "enormous demand" for a shop that offers expertise and can recommend music with authority - and he doesn't think downloads are killing the CD.
"With this store, we feel there's a dormant music shopper out there who's not buying music from the High Street simply because they don't like High Street retailers, not because they've gone off physical formats," he says.
"If anything, the people I talk to appreciate vinyl and CDs more than ever in this digital age. It's just that they've gone off the way it's sold.
Exactly. Shops are about the experience of shopping, not just of buying. Similarly, CDs and other analogue objects are about the experience of having and holding such objects, not just what they contain. As the new Rough Trade shop shows, some people are beginning to get this. (Via TechDirt.)
22 July 2007
Well, it had to happen:
Social networking site Bebo is likely to follow Facebook's lead and open up its site to developers to create applications that work within the site.
Mind you, the following comment is so wide of the mark that it makes you wonder whether Michael Birch, Bebo's chief executive, really understands why he's opening up, and what it really means:
"Obviously in social networks there's this conflicting thing of control, of being a closed network and us making all the money, and then opening up to the greater good of the social network.
No, no, no: opening up is how you make even more money, you twit. (Via Antony Mayfield).
Although much of the shine has worn off the Google halo, there's no denying that, regardless of whether it's acting purely from altruistic motives (probably not), it certainly gets the benefits of openness:
Google announced today that should the Federal Communications Commission adopt a framework requiring greater competition and consumer choice, Google intends to participate in the federal government’s upcoming auction of wireless spectrum in the 700 megahertz (MHz) band.
In a filing with the FCC on July 9, Google urged the Commission to adopt rules for the auction that ensure that, regardless of who wins the spectrum at auction, consumers' interests are served. Specifically, Google encouraged the FCC to require the adoption of four types of "open" platforms as part of the license conditions:
* Open applications: Consumers should be able to download and utilize any software applications, content, or services they desire;
* Open devices: Consumers should be able to utilize a handheld communications device with whatever wireless network they prefer;
* Open services: Third parties (resellers) should be able to acquire wireless services from a 700 MHz licensee on a wholesale basis, based on reasonably nondiscriminatory commercial terms; and
* Open networks: Third parties (like internet service providers) should be able to interconnect at any technically feasible point in a 700 MHz licensee's wireless network.
Today, as a sign of Google’s commitment to promoting greater innovation and choices for consumers, CEO Eric Schmidt sent a letter to FCC Chairman Kevin Martin, stating that should the FCC adopt all four license conditions requested above, Google intends to commit a minimum of $4.6 billion to bidding in the upcoming 700 MHz auction.
This could get interesting.
A little while back I wrote about the idea of using wikis for open government. Peter Suber - about whom Bill Hooker commented recently "not a sparrow falls in the OA world but PS knows about it!" - emailed me with some interesting news about an earlier project of his called Nomic:
Nomic is a game I invented in 1982. It's a game in which changing the rules is a move. The Initial Set of rules does little more than regulate the rule-changing process. While most of its initial rules are procedural in this sense, it does have one substantive rule (on how to earn points toward winning); but this rule is deliberately boring so that players will quickly amend it to please themselves. The Initial Set of rules, some commentary by me, and some reflections by Douglas Hofstadter, were published in Hofstadter's "Metamagical Themas" column in Scientific American in June of 1982. It was quickly translated into many European and Asian languages. Games were regularly played on, and kicked off from, the ARPANET, the Defense Department network which sired the Internet. Nomic has been used to stimulate artistic creativity, simulate the circulation of money, structure group therapy sessions, train managers, and to teach public speaking, legal reasoning, and legislative drafting. Nomic games have sent ambassadors to other Nomic games, formed federations, and played Meta-Nomic. Nomic games have experienced revolution, oppressive coups, and the restoration of popular sovereignty. Above all, Nomic has been fun for thousands of players around the world. For me, it was intended to illustrate and embody the thesis of my book, The Paradox of Self-Amendment, that a legal "rule of change" such as a constitutional amendment clause may apply to itself and authorize its own amendment. (Nomic is the third appendix of the book.)
The connection with open governance is clear. Peter passed on the news that people are trying to apply a Nomic-based approach to open source:
Just last week, by chance, a total stranger proposed a Nomic-variant as a serious system of "Open-Source Self-Governance" (his words).
Here's what that site has to say about the project, which is called Efficasync:
Efficasync is a method of open-source self-governance, where all the members of a group have the ability to examine, discuss and modify their group’s set of operational goals, reasonings, constraints, procedures and arrangements. In computer lingo, each member of such a group has both ‘read’ and ‘write’ permissions to this set of governing statements. As demonstrated by the previous two lines, this document occasionally recasts a few traditional views of governance into a computer programmer’s frame of reference. The programmer’s paradigm holds a new, and potentially valuable, perspective for democratic governance. This document’s purpose is to describe a specific way, based on this new perspective, that a directly-democratic group’s governing infrastructure could be arranged. In doing this, the three main components which constitute Efficasync are explained: a Nomic, a particular graphical interface, and a starting set of ‘rules.’ This document was written with the intention of presenting a prototype for emulation and extension by groups wishing to operate as open-source self-governing entities.
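Peter's description captures the essence: the rules are data, and the one primitive move is a rule change that the current rules must themselves sanction. Here's a deliberately tiny sketch of that self-amendment loop in Python - the rule texts and voting threshold are invented for illustration, not taken from Suber's actual Initial Set or from Efficasync:

```python
# A toy Nomic: the game state *is* its rule set, and the only move
# is proposing a rule change that the current rules must approve.
# Rule numbers and thresholds here are made up for illustration.

class Nomic:
    def __init__(self):
        # rules maps a rule number to its text; rule 1 is the
        # "rule of change" that governs how rules are amended
        self.rules = {
            1: "A rule change is adopted if a simple majority votes for it.",
            2: "Players take turns proposing exactly one rule change.",
        }

    def majority_needed(self):
        # In a fuller game this fraction would be parsed from the rules
        # themselves; here we hard-code the simple majority rule 1 states.
        return 0.5

    def propose(self, number, text, votes_for, votes_total):
        """Enact, amend or repeal a rule - itself governed by the rules."""
        if votes_total == 0 or votes_for / votes_total <= self.majority_needed():
            return False          # the motion fails under the current rules
        if text is None:
            self.rules.pop(number, None)   # repeal
        else:
            self.rules[number] = text      # enact or amend
        return True


game = Nomic()
# Self-amendment in action: rule 1 is used to change rule 1 itself.
adopted = game.propose(1, "A rule change needs a two-thirds majority.", 3, 5)
print(adopted, game.rules[1])
```

Note how the same machinery that rule 1 describes is used to replace rule 1 itself - Suber's "rule of change" applying to, and authorising, its own amendment.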
21 July 2007
Has everyone gone Facebook mad? It certainly seems so, and apparently I'm not the only one to think so. But whatever your views of Facebook now, it looks increasingly likely that it's going to be very big.
As I mentioned recently, the first sign that it had aspirations to being more than just another social network was when it opened up its platform. Now, it has underlined the platform aspect by purchasing Parakey.
Who? you might well say. Well, this might give you a hint of why it's an interesting move:
Parakey is intended to be a platform for tools that can manipulate just about anything on your hard drive—e-mail, photos, videos, recipes, calendars. In fact, it looks like a fairly ordinary Web site, which you can edit. You can go online, click through your files and view the contents, even tweak them. You can also check off the stuff you want the rest of the world to be able to see. Others can do so by visiting your Parakey site, just as they would surf anywhere else on the Web. Best of all, the part of Parakey that’s online communicates with the part of Parakey running on your home computer, synchronizing the contents of your Parakey pages with their latest versions on your computer. That means you can do the work of updating your site off-line, too. Friends and relatives—and hackers—do not have direct access to your computer; they’re just visiting a site that reflects only the portion of your stuff that you want them to be able to see.
Interested? You should be.
In explaining Parakey, Ross cuts to the chase. “We all know people…who have all this content that they are not publishing stored on their computers,” he says. “We’re trying to persuade them to live their lives online.”
"Live their lives online": well, that explains why Facebook bought the outfit. Among other things, Parakey will let Facebook users twiddle endlessly with their profiles even when they're offline.
Oh, and that "Ross" is Blake Ross, one of the moving forces behind Firefox. Parakey is based on Firefox technology, and will be (partly) open source. Assuming that Facebook keeps those parts open source (and it's hard to see how it could avoid doing so without rewriting the code from scratch), that means that Facebook could well become something of an ally for free software.
Well, I suppose that's a good reason to join the Facebook stampede.
Remember patents? They're those things that are supposed to promote innovation. Take surgeons, for example: they would never invent new ways of saving lives without some kind of financial incentive to do so - I mean, why should they?
So it's only logical that patent lawyers should be encouraging surgeons to patent anything that might save lives:
"What it does is it provides something for other companies to work around. The patent is out there. It's wide open. The whole world looks at it and thinks, 'How do I get around it?' That inspires more creativity and more development," Raciti said.
Well, that's logical: let's put obstacles in the way of people trying to save lives - it's more of a challenge.
The medical community is wary. "It's not clear that providing a monopoly over a certain process promotes innovation in the field of patient care delivery," said Aaron Kesselheim, a patent attorney and doctor who conducts health policy research at Brigham and Women's Hospital in Boston.
"The legal concern is that physicians won't do something because they're concerned that somebody will sue them, and if that affects the care that they are trying to provide to the patients, then that's a negative," he said.
Sometimes you get the impression that patent lawyers really want to be hated. (Via TechDirt.)
20 July 2007
Radio frequencies form a commons for each country. Mostly these have been enclosed through auctions selling them to the highest bidder. Whether that's a good idea is another matter, but assuming for a moment that you think it is, at the very least you'd try to get plenty of dosh for this precious resource.
Well, according to this fascinating, and extremely thorough, paper, that didn't happen in the US:
According to calculations presented in this paper, since 1993, the government has given to private interests as much as $480 billion in spectrum usage rights without public compensation. That comes to more than 90 percent of the value of spectrum usage rights it has assigned from 1993 through the present.
Now, admittedly "as much as $480 billion" includes zero, but I don't think that's the case here. We're talking about hundreds of billions of dollars that the US public won't be getting. Which means that there are some companies - and corporate fatcats - who are richer by the same amount.
So, how about if we start treating it like a commons instead? That way, you can be sure that everyone gets their fair share - unlike the situation in America.
19 July 2007
For anyone who is sceptical about the possibilities of Second Life - and virtual worlds in general - point them at this rather impressive video. It is a recreation, in 3D, of Van Gogh's Starry Night, which grows before our eyes. Interesting to note, too, that if copyright lasted for ever (even minus a day), this kind of creative re-use would never be possible.
I'm not a big fan of IP TV. After all, the Net is essentially everything TV isn't - interactive, non-linear, intelligent (well, some of it). But if you really must watch TV-like things online, the best thing to do is to check out Miro - the new name for Democracy Player (which always struck me as misleading). Speaking as a non-connoisseur of these things, it seems to do everything it should, it looks pretty cool - and it's free software. But only if you absolutely must.
Continuing his great series of interviews with key people in the world of business open source, Matt Asay (does this man never sleep?) talks to Matthew Szulik, CEO of Red Hat. I wrote a lot about Red Hat in the early days, but I've not followed it so closely recently (bad boy), so it was fascinating to get an update on what is arguably the most successful and most important open source company. In particular, I found this revealing:
In sum, our belief is that the best management is the peer process, just as in open source. If you measure up to your peers at Red Hat, you thrive. If you don't, you either change or self-select out. When you find people that can do things in an "honest way," without a mercenary view of their assignment, you win. A lot of people don't like this approach, and they leave.
In other words, the best way to run an open source company is to use the open source methodology. Imagine that.
18 July 2007
I've written before about Microsoft's Photosynth, which draws on the Net's visual commons - Flickr, typically - to create three-dimensional images. Here's another research project that's just as cool - and just as good a demonstration of why every contribution to a commons enriches us all:
What can you do with a million images? In this paper we present a new image completion algorithm powered by a huge database of photographs gathered from the Web. The algorithm patches up holes in images by finding similar image regions in the database that are not only seamless but also semantically valid. Our chief insight is that while the space of images is effectively infinite, the space of semantically differentiable scenes is actually not that large. For many image completion tasks we are able to find similar scenes which contain image fragments that will convincingly complete the image. Our algorithm is entirely data-driven, requiring no annotations or labelling by the user.
One of the most interesting discoveries was the following:
It takes a large amount of data for our method to succeed. We saw dramatic improvement when moving from ten thousand to two million images. But two million is still a tiny fraction of the high quality photographs available on sites like Picasa or Flickr (which has approximately 500 million photos). The number of photos on the entire Internet is surely orders of magnitude larger still. Therefore, our approach would be an attractive web-based application. A user would submit an incomplete photo and a remote service would search a massive database, in parallel, and return results.
In other words, the bigger the commons, the more everyone benefits.
Beyond the particular graphics application, the deeper question for all appearance-based data-driven methods is this: would it be possible to ever have enough data to represent the entire visual world? Clearly, attempting to gather all possible images of the world is a futile task, but what about collecting the set of all semantically differentiable scenes? That is, given any input image can we find a scene that is “similar enough” under some metric? The truly exciting (and surprising!) result of our work is that not only does it seem possible, but the number of required images might not be astronomically large. This paper, along with work by Torralba et al., suggests the feasibility of sampling from the entire space of scenes as a way of exhaustively modelling our visual world.
But that is only feasible if that "space of scenes" is a commons. (BTW, do check out the paper's sample images - they're amazing.)
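For the curious, the core idea of the paper - describe every scene coarsely, find the nearest one to the damaged query, and borrow its pixels - can be sketched in a few lines. This is a toy with an invented descriptor and a naive paste (the real system uses much richer gist-style features, millions of images and seam-aware blending), but it shows why a bigger database means better completions:

```python
# Toy data-driven image completion: describe each database image by a
# coarse grayscale "gist", find the scene closest to the incomplete
# query, and fill the query's hole from the best match. The descriptor
# and data here are invented for illustration.
import numpy as np

def gist(img, grid=4):
    """Coarse descriptor: mean intensity over a grid x grid tiling."""
    h, w = img.shape
    return np.array([
        img[i * h // grid:(i + 1) * h // grid,
            j * w // grid:(j + 1) * w // grid].mean()
        for i in range(grid) for j in range(grid)
    ])

def complete(query, hole_mask, database):
    """Fill masked pixels of `query` from the most similar database image."""
    known = ~hole_mask
    # Compare descriptors of the known region only (hole pixels zeroed)
    q = gist(np.where(known, query, 0.0))
    best = min(database,
               key=lambda d: np.linalg.norm(gist(np.where(known, d, 0.0)) - q))
    result = query.copy()
    result[hole_mask] = best[hole_mask]   # naive paste; the paper finds seams
    return result

rng = np.random.default_rng(0)
database = [rng.random((32, 32)) for _ in range(100)]
query = database[42].copy()
mask = np.zeros((32, 32), dtype=bool)
mask[8:24, 8:24] = True            # punch a hole in the middle
query[mask] = 0.0
filled = complete(query, mask, database)
```

Here the completion is perfect only because the query's twin happens to sit in the database; the paper's point is that, at web scale, almost every real scene has such a near-twin - which is exactly why the size of the commons matters.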
Maybe because films remain glamorous to some, applying open source ideas to cinema seems to be a perennial favourite. Here's another one, Jathia’s Wager:
Jathia’s Wager is a science fiction story about a young man living in an isolated community of humans, who must make a life changing decision about his future and his species.
Details of its open process:
Step 1: Initial Script is put online, press releases are issued and project is announced to the world (COMPLETED )
Step 2: Script changes and alternative versions are submitted and “hashed out in the forums.” Community votes (hopefully via digg if the community embraces this) on the scripts they like the most.
Step 3: Top 5 scripts are chosen from voting and resources (if you add tons of impossible effects, but no one donates resources to create those, then there’s not much we can do) and posted on the site.
Step 4: Casting / scene scouting starts in Los Angeles (of course everyone is free and encouraged to shoot their own versions as well). Videos and casting stuff will be posted online for the community to contribute to.
Step 5: Shooting, all raw video files are uploaded for the community to edit.
Step 6: Post production, editing, finishing touches, DVD authoring.
Step 7: 5 official versions of the same film are released. Links and posts to all derivatives will be posted in the forums and we’ll have successfully made a collaborative, open-source film that anyone can remake, reedit or reinterpret.
Although unintentional in this case, here's a good example of why we need open government:
"Big Brother" plans to automatically hand the police details of the daily journeys of millions of motorists tracked by road pricing cameras across the country were inadvertently disclosed by the Home Office last night.
Leaked Whitehall background papers reveal that Home Office and transport ministers have clashed over plans for legislation this autumn enabling the police to get automatic "real-time" access to the bulk data from the traffic cameras now going into operation. The Home Office says the police need the data from the cameras, which can read and store every passing numberplate, "for all crime fighting purposes".
Thank goodness there won't be any function creep.
Some while back I wrote a piece called "Parallel Universes" looking at the surprising similarities between the world of open source and open access. So I was interested to see that there's trouble at t'mill over the use and misuse of the term "open access":
I don't know and I don't care what [Nature editor] Maxine means by "open" or "free". I care what the BBB [Budapest-Bethesda-Berlin] Declarations mean. Peter is not defining terms however he likes; he is working with published, widely accepted definitions. He is well within his rights to expect that other people will indeed use the same definitions: that is, after all, the point of having developed and published them. Nature does NOT have "many open access projects and products", it has one (barely) OA journal and the excellent Precedings, together with a number of commendable free-to-read initiatives (blogs, Nature Network, the various free-to-read web special collections, etc). "Open Access" is not a fuzzy buzzword that Maxine is free to define as she sees fit, and if she is going to start abusing it as marketing for Nature then she most certainly does need telling off.
Which is all rather similar to a discussion taking place in the computer world about who has the right to call themselves "open source".
Talking of open government:
To Chance in particular, and the Greens in general, the promoting of FOSS is ultimately the promotion of the party's own values. Simply encouraging the use of FOSS in public institutions, he suggests, would improve government, "both because it would be more focused on a just, equitable, and sustainable future and because it would force government to be more open, transparent, and participatory. We suffer from an incredibly centralized, opaque, and disempowering government in England and Wales. We desperately need the participatory ethic of free software to transform government."
17 July 2007
Oh, this is rich:
A revised version of FairUse4WM reappeared on forums late last week, and the utility now effectively strips the DRM from iPlayer content allowing it to be copied and played into perpetuity rather than for the limited period intended by the BBC.
Which, of course, was inevitable. But what's droll is the BBC's spin:
"We know that some people can — and do — download BBC programmes illegally. This isn't the first piece of software to be hacked or bypassed. Nor will it be the last. No system is perfect. We believe that the overwhelming majority of licence-fee payers welcome this service and will want to use it fairly."
So, let's get this straight. The "overwhelming majority of licence-fee payers welcome this service and will want to use it fairly", while "some people can — and do — download BBC programmes illegally".
And yet the BBC insists on imposing DRM on the "overwhelming majority" who "want to use it fairly" - and so don't need DRM; meanwhile, the people who "can - and do - download BBC programmes illegally" will be able to get around the DRM anyway, as the BBC admits.
So DRM is pointless for both groups, and hence pointless for everyone. Moreover, it not only inconveniences the law-abiding majority, it locks some of them out entirely, in the case of Mac and GNU/Linux users.
God, what a mess the BBC is in - and not just logically.
In case you hadn't noticed, the last Harry Potter novel is coming out on Saturday. It's the end of an era - not just because it's the last, but also because, apparently:
Harry Potter and the Deathly Hollows ... has hit BitTorrent.
Assuming this is actually Harry Potter and the Deathly Hallows, and if the most closely-guarded text in the world of intellectual monopolies really is out, maybe it's time for the guardians of those monopolies to forget about them.
(Hint: the latest Harry Potter book does not consist purely of text, and the essence of reading it is not something you can download from BitTorrent.)
Wow, this was precisely the kind of thing I was calling for - but not expecting to happen:
In the wake of the recently concluded broadcasting negotiations at WIPO in June 2007 (Standing Committee on Copyright and Related Rights) where a proposed instrument for the protection of broadcasting organizations was put on cold storage but not terminated, a Chilean proposal on the examination of limitations and exceptions in the copyright area has come to the fore.
Chile has proposed that the WIPO copyright committee examine limitations and exceptions for the blind, educators and librarians. India has reinforced Chile’s reformist thrust by calling upon WIPO to consider socially relevant issues such as access to knowledge and education.
Chile’s multi-pronged endeavours to imbue the WIPO patent committee and the WIPO copyright committee with a more reflective and development-oriented approach are welcome and of significant strategic import to the Development Agenda and the access to knowledge (a2k) movement. In addition to the limitations and exceptions proposal tabled to the SCP, Chile’s proposal on patents and standards reinforces discussions that have begun to take place at the World Trade Organization and the Internet Governance Forum on remedies to mitigate the inherent tension between the public interest and patents in information and communications technology (ICT) standards.
These might seem tiny, tangential, even trivial issues, but don't be fooled: even raising them within the context of WIPO's hitherto hardline pro-intellectual monopolist framework is of huge symbolic significance. (Via IP Justice.)
Oh-oh, not good:
The emerging open source insurgency in Pakistan may have found its plausible promise [= alpha code release]: to defeat the Pakistani military establishment.
If true, Pakistan may devolve much faster than anticipated. Will we see it hollow out?
Let's hope Western governments have plenty of radiation detectors on order....
There is a scandal brewing over open standards in Europe:
On June 29 2007, the European Commission agency IDABC published a document, written under contract by Gartner, initiating the revision of the European Interoperability Framework (EIF) and the Architecture Guidelines (AG).
The first version of this very important document was published in 2004 and introduced strong support for open standards and XML for the exchange of data between administrations within Europe, as well as with citizens. It has been relayed and used in many countries to support open standards.
This is now threatened by the new report, EIF v2.0, written by Gartner.
This second version - not yet endorsed by the European Commission or the member states, though it could well soon enter such an endorsement process - is meant to update the previous version of the European Interoperability Framework but, contrary to the first version, it explicitly threatens the progress towards more open standards that has long been a push of IDABC.
The core of the problem is the following passage from Gartner's report:
Gartner acknowledges the importance of open standards. IT vendors and system integrators should also recognize that open standards are the way to go. The era where proprietary standards lead to a sure base of loyal customers is fading away. IT is becoming just like any other industry where true added value and competitive pricing determine the winners.
Yet, Gartner recommends not to focus on the use of open standards per se. Whether open or not, standards are to further the deployment of public services. EIF v2.0 should facilitate the most profitable business model(s) of cost versus public value, under proper recognition of intellectual property rights, if any. The support for multiple standards allows a migration towards open standards when appropriate in the long run.
The use of 'open source' software may further the deployment of public services. However again, whether open source or not, it is the most viable software that should be allowed to survive in the infrastructure. So again, EIF v2.0 should facilitate multiple options to co-exist, and to compete.
This is completely daft. Saying
Gartner recommends not to focus on the use of open standards per se. Whether open or not, standards are to further the deployment of public services. EIF v2.0 should facilitate the most profitable business model(s) of cost versus public value
is like saying
Gartner recommends not to focus on the use of moral standards per se. Whether moral or not, standards are to further the deployment of public services. EIF v2.0 should facilitate the most profitable business model(s) of cost versus public value
In other words, it fails to take into account that focussing narrowly on "the most profitable business model(s) of cost versus public value" is short-sighted, because by definition, "not to focus on the use of open standards per se" means allowing closed standards. And so the long-term costs are going to be greater because of vendor lock-in. In fact, Gartner itself says this:
To facilitate evolution over time and to support the migration from one standard to another and to avoid vendor lock-in it is therefore paramount to design for support of multiple standards.
But it confuses multiple standards of any kind with multiple open standards. There are no easy migrations between different closed standards, or closed standards and open ones. "To facilitate evolution over time", *all* the standards must be open.
Around this deeply flawed core thesis, the rest of the report reads like a puff for Gartner's methodology - including its tiresomely pretentious Hype Cycle (talk about hype). Pretentious and useless: at the "Peak of Inflated Expectations" it places - wait for it - IPv6. I hate to break it to Gartner, but IPv6 passed through that stage about eight years ago.
Given that the IDABC, which commissioned this study (who knows why), has hitherto been pretty sensible on open standards, we can only hope it consigns this whole report to the bin where it belongs. To help it on its way, do sign the petition and send your (polite) comments to the IDABC before September, as they have specifically requested:
Everyone who sees interoperability as an effective means to come to better pan-European eGovernment services is invited to read the document and reflect on its content.
IDABC is interested in your reactions.
A summary of reactions (that reach us before September 15, 2007) will be published on the IDABC web-site (http://ec.europa.eu/idabc) and will constitute another input into the revision process.
Really, an offer we can't - daren't - refuse.
Update: As I signed the petition I noticed that it insists on a full physical address - country isn't enough. This seems foolish to me, and is likely to lead to people not signing. Unless they were to enter random information in the unnecessary fields....
Although doing away with copyright altogether is probably not such a hot idea - after all, the GNU GPL, and the edifice of free software it supports, depends on it for its efficacy - there is increasing evidence that we should be limiting its scope.
Here's some more:
The 2001 Information Society Directive (2001/29/EC) is introduced thus: “If authors or performers are to continue their creative and artistic work, they have to receive appropriate reward for the use of their work…” (Recital 10). “A rigorous, effective system for the protection of copyright and related rights is one of the main ways of ensuring that European cultural creativity and production receive the necessary resources and of safeguarding the independence and dignity of artistic creators and performers”(Recital 11).
This study shows quite conclusively that current copyright law has empirically failed to meet these aims. The rewards to best-selling writers are indeed high but as a profession, writing has remained resolutely unprosperous.
Compared to the UK, writers’ earnings are lower and less skewed in Germany. This may reflect a more regulated environment for copyright contracts in Germany. It may also reflect the globalised nature of English language markets.
More about the study, and links to its constituent parts, can be found on this page.
What if there was a library which held every book? Not every book on sale, or every important book, or even every book in English, but simply every book—a key part of our planet's cultural legacy.
First, the library must be on the Internet. No physical space could be as big or as universally accessible as a public web site. The site would be like Wikipedia—a public resource that anyone in any country could access and that others could rework into different formats.
Second, it must be grandly comprehensive. It would take catalog entries from every library and publisher and random Internet user who is willing to donate them. It would link to places where each book could be bought, borrowed, or downloaded. It would collect reviews and references and discussions and every other piece of data about the book it could get its hands on.
But most importantly, such a library must be fully open. Not simply "free to the people," as the grand banner across the Carnegie Library of Pittsburgh proclaims, but a product of the people: letting them create and curate its catalog, contribute to its content, participate in its governance, and have full, free access to its data. In an era where library data and Internet databases are being run by money-seeking companies behind closed doors, it's more important than ever to be open.
Fine words, but turning them into reality is a monstrous undertaking. Not because any of the required technologies are that difficult to develop or implement, but simply because the current hypertrophied copyright system makes it impossible.
At best, the Open Library will provide us with a bunch of public domain texts like Project Gutenberg, but prettified, plus what looks like a wikified catalogue with tantalising info about all the other books we can't read online.
That's all great to have, and kudos is due to all those behind the project, but it is but a pale imitation of what we could - should - have if copyright did its job of encouraging new creation, and got out of the way of such laudable projects.
Given that it's clear what the source code of democracy is - its laws - an obvious thing to try would be to apply open source techniques to the process of drawing up legislation:
"In the world of open source, your contribution, vetted and approved by your peers, gets committed into the mainline in a completely transparent and accountable process," Amanda McPherson, director of marketing for the Linux Foundation, told LinuxInsider.
"If Joe Citizen could impact and view the legislative process in the way a Linux developer can, I believe the result would be superior legislation," she said. "Lawmakers would be judged on results, those with the most and best to contribute could do so, and special-interest groups working selfishly would be exposed."
Moreover, there's a technology just waiting for this kind of approach:
"Laws go through all kinds of markups, changes and amendments," Leyden said. "The process has evolved from making those changes on parchment to at least using word-processing documents, but it's not that big a step to think of moving to the next generation of tools and crafting a whole piece of legislation on a wiki."
Interestingly, one of the main voices quoted in these two articles on open legislation is Eben Moglen who - quite unsurprisingly - has many insightful comments on the idea. Yet another reason to read them.
The undisputed doyen of citizen media - aka open journalism - is Dan Gillmor. He's just published a splendid review of the field that is positively stuffed to the gunwales with links to the main sites and stories in this field. In fact, I'd go so far as to say that this is now the single best place to start for those wishing to understand open journalism.
16 July 2007
Here's a neat device:
With Exbiblio, you have seamless, direct access to digital information and the world of the Internet. Imagine once again that you are reading your newspaper, but instead of tearing out an ad or article, writing a reminder or recording a voice message, you use your portable, hand-held scanner to capture just a snippet from the article or ad, swiping it across the text as if using a highlighter.
When you connect your Exbiblio scanner to the digital world -- for example, by wirelessly connecting to the "smart" phone or PDA you are carrying -- the Exbiblio solution instantly searches for the information you have captured, and digital versions of the paper document are found and stored.
Sounds cool - but it depends critically on having free access to that cloud of information. In other words, it depends on the existence of a readily accessible knowledge commons that it can draw upon seamlessly. If such devices had to pay for every snippet they pull down, the knock-on cost and infrastructural complexity required to keep track of who is demanding their shilling will kill it. (Via Open Access News.)
Well, look at this. After all the high drama about the imminent death of Net radio because of the exorbitant licensing rates being demanded, we have an interesting twist:
SoundExchange announced yesterday new terms of a proposal to address the concerns regarding the minimum fees for webcasting set by the Copyright Royalty Judges (CRJs).
Under the new proposal, to be implemented by remand to the CRJs, SoundExchange has offered to cap the $500 per channel minimum fee at $50,000 per year for webcasters who agree to provide more detailed reporting of the music that they play and work to stop users from engaging in “streamripping” – turning Internet radio performances into a digital music library.
In other words, we won't kill you provided you enslave yourself and your listeners through DRM'd music. (Via Ars Technica.)
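To see who actually benefits from the cap, a quick bit of arithmetic helps. This is a sketch of my own, reading the proposal as a flat $500-per-channel minimum capped at $50,000 a year:

```python
# Sketch of the proposed SoundExchange minimum-fee cap: $500 per channel,
# capped at $50,000 per year for webcasters who sign up to the conditions.

def minimum_fee(channels: int, capped: bool = True) -> int:
    """Annual minimum fee in dollars for a webcaster with `channels` channels."""
    fee = 500 * channels
    return min(fee, 50_000) if capped else fee

# The cap only starts to matter above 100 channels, so it chiefly
# benefits the large multi-channel aggregators:
assert minimum_fee(10) == 5_000                    # small webcaster: no change
assert minimum_fee(100) == 50_000                  # exactly at the cap
assert minimum_fee(1_000) == 50_000                # capped
assert minimum_fee(1_000, capped=False) == 500_000 # uncapped comparison
```

In other words, a small webcaster pays the same as before; it is the big services that are being bought off, in exchange for the DRM conditions.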
And talking of 0.01 code and self-deprecation:
I have released AjaxLife’s (very ugly and hackish) code under the revised BSD license. :D
You can find it at http://code.google.com/p/ajaxlife/. As it says, the code is messy. But eh.
That’s what you get when you throw something together over the weekend in a language you don’t know. And for added fun, part of the code was lost at some point (file corruption) and had to be recovered by decompiling. So, as I said. Ugly code. :p
Linus, er, Katharine.
Hmm, I'm not really clear what's going on with this "European Interoperability Patent" (EIOP) stuff:
Essentially, it is an idea that is based on what he calls the concept of “soft IP”, which, he says, is encapsulated within the Blue Skies strand of the EPO’s Scenarios project. The EIOP would be an EU-wide patent granted by the EPO that would be “open”. In other words, EIOP owners would not be able to get injunctive relief – either preliminary or permanent – in cases of infringement; instead, EIOP owners would effectively be signing up to the concept of licences of right, so that anyone who wanted to use a patent would be able to do so as long as an appropriate licensing fee was paid (it is a concept that exists under the laws of some European countries already, including the UK). If a fee could not be agreed, then the matter would go to the courts, which would adjudicate on what amount would be reasonable.
Actually, I can understand where IBM is going with this, but I'm less sure about the FFII on the basis of the following hints:
“The FFII has a new leadership and we think that it has changed, and become more mature. The FFII is critical if Europe is going to develop as somewhere in which to build a patent system that can exist in a more facilitative and less conflicting nature with open innovation models … the FFII has influence and a strong voice; something it proved in the CII debate. We feel there is now an opportunity to engage and have a constructive dialogue.”
CII refers to the dreaded "computer-implemented invention", and is basically a trick to get European software patents in through the back door. I do hope that the FFII is not going to do something silly. I obviously need to investigate further. (Via The Inquirer.)
"Open notebook science" is a great term devised by Jean-Claude Bradley - great, because it makes explicit exactly where you can find, read and hack the source code that underlies open science. One of the best observers of that world, Bill Hooker, has an interesting comment on a fellow researcher's adoption of the open notebook approach:
It's also, to be honest, just plain fun to snoop around in someone else's lab notes! I was amused to note that Jeremiah talks to and about himself in his notebook, the same way I do -- "if I weren't so stupid I'd...", "next time load the control first, doofus", etc. I wonder if everyone does that?
Now, where have I heard this sort of thing before?
This is GOOD CODE!
Yeah, yeah, it's ugly, but I cannot find how to do this correctly, and this seems to work...Most of this was trial and error...Urghh
The programming comments of a very young Linus Torvalds as he hacked version 0.01 of a little program called Linux during the summer of 1991. Coincidence? I don't think so....
What has this got to do with open source and openness in general?
"Countries that have most closely followed the Anglo-Saxon, strongly market-led economic model show up as the least efficient," commented Nef's policy director, Andrew Simms.
"These findings question what the economy is there for. What is the point if we burn vast quantities of fossil fuels to make, buy and consume ever more stuff without noticeably benefiting our wellbeing?"
(Exercise left to the reader.)
14 July 2007
This looks very tasty:
The Asus Eee PC 701 notebook
* Display: 7"
* Processor: Intel mobile CPU (Intel 910 chipset, 900MHz Dothan Pentium M)
* Memory: 512MB RAM
* OS: Linux (Asus customized flavor)
* Storage: 8GB or 16GB flash hard drive
* Webcam: 300K pixel video camera
* Battery life: 3 hours using 4-cell battery
* Weight: 2lbs
* Dimensions: 8.9 in x 6.5 in x 0.82 in - 1.37 in (width x depth x thickness)
* Ports: 3 USB ports, 1 VGA out, SD card reader, modem, Ethernet, headphone out, microphone in
Even tastier is the price: with the dollar delightfully weak these days, we're talking just a smidge over a hundred quid each. Put me down for a brace.
This says it all, really:
An advertising framework may reside on a user computer, whether it's a part of the OS, an application or integrated within applications. Applications, tools, or utilities may use an application program interface to report context data tags such as key words or other information that may be used to target advertisements. The advertising framework may host several components for receiving and processing the context data, refining the data, requesting advertisements from an advertising supplier, for receiving and forwarding advertisements to a display client for presentation, and for providing data back to the advertising supplier. Various display clients may also use an application program interface for receiving advertisements from the advertising framework. An application, such as a word processor or email client, may serve as both a source of context data and as a display client. Stipulations may be made by the application hosting the display client with respect to the nature of acceptable advertising, restrictions on use of alternate display clients, as well as specifying supported media.
In other words, every app running on Microsoft's advertising-enhanced OS will spy on you so that those nice advertisers can push junk in your face. Thanks, Microsoft. (Via Slashdot.)
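Just to make the patent language concrete, here is how the described architecture might look as code. Every class and method name below is my own invention for illustration; nothing here is Microsoft's actual API:

```python
# Hypothetical sketch of the patent's scheme: applications report context
# tags to a framework, the framework fetches matching ads from a supplier,
# display clients render them, and data flows back to the supplier.

class LoggingSupplier:
    """Stand-in for the 'advertising supplier' the patent describes."""
    def __init__(self):
        self.log = []   # everything the framework reports back

    def request_ads(self, keywords):
        return [f"ad-for-{kw}" for kw in keywords]

    def record_feedback(self, app, keywords):
        self.log.append((app, keywords))


class AdvertisingFramework:
    def __init__(self, supplier):
        self.supplier = supplier
        self.display_clients = []

    def register_display_client(self, client):
        self.display_clients.append(client)

    def report_context(self, source_app, keywords):
        """Called by any app (word processor, email client...) via the API."""
        ads = self.supplier.request_ads(keywords)       # target ads on context
        for client in self.display_clients:
            client.show(ads)                            # push ads at the user
        self.supplier.record_feedback(source_app, keywords)  # and phone home
```

The point is visible in `report_context`: the very same call that fetches advertisements also ships the user's context back to the supplier, which is why "spy on you" is not much of an exaggeration.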
...would be fine. And look, it's not just me:
The repeal of IP might create for it an additional cost of doing business, namely efforts to ensure that consumers are aware of the difference between the genuine product and impersonators. This is a cost of business that every enterprise has to bear. Patents and trademarks have done nothing to keep Gucci and Prada and Rolex impersonators at bay. But neither have the impersonators killed the main business. If anything, they might have helped, since imitation is the best form of flattery.
That was always true, but now there's another reason for believing it:
The Internet age has taught that it is ultimately impossible to enforce IP. It is akin to the attempt to ban alcohol or tobacco. It can't work. It only succeeds in creating criminality where none really need exist. By granting exclusive rights to the first firm to jump through the hoops, it ends up harming rather than promoting competition.
But some may object that protecting IP is no different from protecting regular property. That is not so. Real property is scarce. The subjects of IP are not scarce, as Stephan Kinsella explains. Images, ideas, sounds, arrangements of letters on a page: these can be reproduced infinitely. For that reason, they can't be considered to be owned.
BTW, the Stephan Kinsella paper referred to above, called simply "Against Intellectual Property", is also fantastic stuff. Good to know that the we few - we happy few - are growing in number. (Via Against Monopoly.)
13 July 2007
What does AIX say to you? Correct: big, old, proprietary. And yet:
Openness, such as compliance with open standards, has always been an integral part of the AIX operating system (OS). The next release of AIX, Version 6.1, extends this openness to the product release process with the first ever AIX open beta. The open beta will allow a broad set of IBM clients to download and gain experience with AIX 6 before it becomes generally available.
An "open" beta for the next AIX release differs from the traditional beta in three key areas:
* Almost anyone who is interested will be able to download and install a pre-release version of AIX 6. By contrast, only a few clients would have the opportunity to test a new AIX release in a traditional beta.
* Participants in the open beta will not receive traditional support from IBM. Instead, you access a Web forum to discuss questions and issues.
* The only legal document required for participation in the open beta is a "click to accept" license agreement that clearly states all program conditions.
Well, I suppose that's progress. (Via The Inquirer.)
Hm, were this not on the European Patent Office's own site, I might have doubted its authenticity:
Where do we stand in the discussion about patents on computer-implemented inventions (CII patents) two years after rejection in the European Parliament? This was the perspective under which the EPO had invited members of the European Parliament, representatives from industry and enterprise, NGOs and IP specialists to review developments since the rejection of the CII directive.
The bottom line (literally)?
All speakers welcomed unequivocally the opportunity to discuss the issue at a high level and made clear that a new CII debate followed by legal modifications was neither necessary nor desirable.
Wearing my cynical journalist's hat, I suppose this might mean that companies in favour of software patents (like SAP, which emerges once again as the Big Baddie of Europe in all this) think they'll be able to squeeze through their wretched computer-implemented inventions under the present scheme.
Still, the EPO story's headline "No revival of software patents debate" is a good marker to have. (Via Slashdot.)
This is yet another reason why free software is so important: it lets people take control of their linguistic destiny, liberating them from the money-based decisions of companies that have no interest in such matters.
OpenOffice.org 2.3 is scheduled to be released in early September and will include locales for:
* Sango - Marcel Diki-Kidiri
* Lingala - Denis Moyogo Jacquerye
* Luganda - Martin Benjamin (and others)
* English (Ghana) - Paa Kwesi Imbeah
For speakers of these languages, an estimated 5.5+ million people, this work means that for the first time they can correctly choose dates and times for their language and country and adjust the behaviour of OpenOffice.org to cater for other cultural conventions.
But more critically in the long term it means that they can now create documents correctly tagged as having been written in that language. Most Africans who do not have locale support for their language will traditionally write a document in their language while the computer assumes it is written in American English. While this works, it causes inestimable long-term damage: search engines cannot find Lingala documents, and we cannot draw text from Sango documents to help build spell checkers or do language research. But now, for these languages, users of OpenOffice.org can create correctly labelled documents and in the future help researchers and users of their content access it correctly.
This is obviously good news, but I can't help finding the idea of an "official" spelling list rather quaint:
The OpenTaal project (Dutch for "OpenLanguage") has published the first open source word list to be certified by the Dutch Language Union as corresponding to official spelling. Simon Brouwer, project leader of OpenTaal, says, "This is a milestone. Users of open source software can trust their Dutch spell checker now. They have the guarantee that their word list is consistent with the official spelling."
In 2005, the Dutch language area got new spelling, which consists mainly of corrections to the spelling of 1995. Starting in August 2006, the new spelling would be mandatory for the government and schools. This revived the project of creating an open source word list. At the end of 2005 the Dutch government program Open Standards and Open Source Software (OSOSS) initiated the OpenTaal project to coordinate the various Dutch open source projects that had an interest in the new spelling, with the aim of developing a Dutch word list conforming to the new spelling. This would give users of open source software like OpenOffice.org, TeX, Thunderbird, and Firefox an up-to-date spell checker. OSOSS contacted the Dutch Language Union, which agreed to assist the project.
12 July 2007
There have been plenty of arguments over copyright and what an appropriate term for it should be, but, to my knowledge, precious few mathematical theories, especially those that take into account the impact of digital technologies.
Enter Rufus Pollock, with his splendid paper Forever Minus a Day: Some Theory and Empirics of Optimal Copyright. And if you get the feeling from the title that this may not be exactly beach literature, you are probably right:
Take any exogenous variable X which affects the welfare function (whether directly and/or via its effect on production N). Assuming that the initial optimal level of protection is finite, if d²W/dXdS is positive then an increase (decrease) in the variable X implies an increase (decrease) in the optimal level of protection.
Got that? Well, get this, at least:
Using a simple model we characterise optimal term as a function of a few key parameters. We estimate this function using a combination of new and existing data on recordings and books and find an optimal term of around fourteen years. This is substantially shorter than any current copyright term and implies that existing copyright terms are non-optimal.
Non-optimal: there you have it in a nutshell. (Via Boing Boing.)
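Pollock's comparative-statics claim (if d²W/dXdS is positive, a rise in X raises the optimal level of protection S) can be seen numerically with a toy welfare function. The function below is entirely my own devising, not Pollock's model: welfare gains from protection-induced production (X·√S) minus a deadweight loss that grows with protection (S²), so the cross-partial d²W/dXdS = 1/(2√S) is positive:

```python
import math

def welfare(S, X):
    # Toy welfare: X * sqrt(S) captures benefits that an exogenous
    # parameter X amplifies; S**2 is the deadweight loss of protection.
    # Cross-partial d2W/dXdS = 1/(2*sqrt(S)) > 0 for all S > 0.
    return X * math.sqrt(S) - S**2

def optimal_term(X):
    # Crude grid search over protection levels S in (0, 10].
    grid = [0.001 * i for i in range(1, 10_001)]
    return max(grid, key=lambda S: welfare(S, X))

# A positive cross-partial means optimal protection rises with X:
assert optimal_term(2.0) < optimal_term(4.0) < optimal_term(8.0)
```

(For this particular toy function the first-order condition gives S* = (X/4)^(2/3), so for X = 4 the optimum is exactly S = 1, which the grid search recovers.)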
I knew biofuels were environmentally bad news, but it seems that they are even worse than I imagined:
Glub, glub. The plant consumes over a million kilos of corn per day. That’s good news for area farmers, especially as the price has almost doubled due to high demand. The bad news is that our current agricultural system is petroleum-soaked. Chemical pesticides and fertilizers, machinery, irrigation pumps, and grain transport all depend on the stuff. Sustainable Table reports that each acre of corn requires 5.5 gallons of petroleum in chemical pesticides and fertilizers alone.
Glub, glub. The plant uses 275 tons of coal a day, trucked down from Wyoming. Five rail cars, powered by diesel engines, head east with the finished ethanol each day.
Shluurrp. The plant uses 600,000 gallons of water every day to produce 150,000 gallons of ethanol. This water figure doesn’t account for pumped irrigation water (requiring petroleum) during corn cultivation.
So nominally bio-friendly biofuels actually require large amounts of concretely polluting petrol and coal in order to be manufactured. Wouldn't it just be easier to spend a little more time working on electric cars, renewable energy and all that boring old stuff that might actually mitigate things, instead of creating this Escheresque staircase of pointless energy transmutation?
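For what it's worth, the quoted figures boil down to a few simple ratios. This is my own arithmetic, plant-level only, using just the numbers in the quote (which itself notes that irrigation water and field petroleum are excluded):

```python
# Ratios implied by the figures quoted above.
water_per_day_gal = 600_000      # gallons of water per day
ethanol_per_day_gal = 150_000    # gallons of ethanol per day
coal_per_day_tons = 275          # tons of coal per day
corn_per_day_kg = 1_000_000      # kilos of corn per day ("over a million")

water_per_gal_ethanol = water_per_day_gal / ethanol_per_day_gal
coal_per_1000_gal = coal_per_day_tons / (ethanol_per_day_gal / 1_000)
corn_per_gal = corn_per_day_kg / ethanol_per_day_gal

print(water_per_gal_ethanol)  # prints 4.0: four gallons of water per gallon
```

So every gallon of ethanol costs at least four gallons of water, nearly two kilos of Wyoming coal per thousand gallons times a thousand, and about six and two-thirds kilos of corn, before any of the farming inputs are counted.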
Good to hear:
The BBC Trust has asked to meet open source advocates to discuss their complaints over the corporation's Windows-only on demand broadband TV service, iPlayer.
The development came less than 48 hours after a meeting between the Open Source Consortium (OSC) and regulators at Ofcom on Tuesday. Officials agreed to press the trust, the BBC's governing body, to meet the OSC. The consortium received an invitation on Wednesday afternoon.
Since they had to be shoved into doing this by Ofcom, I somehow can't see the BBC actually doing anything as a result. But I'm willing to be proved wrong.
11 July 2007
Behold the African Cookbook Project:
whose goal is to archive African culinary writing and make it widely available on the continent and beyond. A database is being developed and copies of hundreds of cookbooks are already being catalogued at BETUMI: The African Culinary Network. Google has offered assistance in eventually digitizing some of the information.