31 October 2006
It was a custom for Romans, in their wills, to free some of their slaves. Neil Gaiman's post about problems with intellectual monopolies after the death of a writer prompts me to suggest a similar manumission for their works. It would be simple to arrange and a fitting point at which to liberate creations. (Via Copyfight.)
The European Computer Driving Licence is not a joke, despite its Monty Python-ish name. More to the point:
The ECDL Foundation will now include a module on the use of Sun's Star Office Writer, Calc and Base applications for word processing, spreadsheets and database work.
So, shame on me that I've never heard of it, and good on them for creeping out from under the Redmond shadow, albeit only a smidgeon.
Here's a man after my own heart:
I've never had an idea that couldn't be improved by sharing it with as many people as possible -- and I don't think anyone else has, either. That's why I have become interested in the various "Open" movements making increasing inroads into the practice of modern science. Here I will try to give a brief introduction to Open Access to research literature; in the second instalment I will look at ways in which the same concept of "openness" is being extended to encompass data as well as publications, and beyond that, what a fully Open practice of science might look like.
(Via Open Access News.)
30 October 2006
An entire site about Digital Rights Management sounds like some torture from the Spanish Inquisition. But the fact that DRM.info is not a site about Digital Rights Management but Digital Restrictions Management gives a clue as to why it's rather more tolerable: it's not exactly for the idea.
It comes from the Free Software Foundation Europe, and is presumably designed to catalogue the deleterious effects of DRM, offering them up as a warning and stimulus to remedial action.
An interesting post from Mr Carr, notable as much for its title - "Larrying Wikipedia" - as for the idea it encapsulates:
Why, in other words, hasn’t anyone done to Wikipedia what Larry Ellison last week did to RedHat?
Somewhat belatedly, scientists are localising the physical basis for the kind of altruism that lies at the heart of the opens:
They found that the part of the brain that was active when a person donated happened to be the brain's reward centre—the mesolimbic pathway, to give it its proper name—responsible for doling out the dopamine-mediated euphoria associated with sex, money, food and drugs. Thus the warm glow that accompanies charitable giving has a physiological basis.
Cory Doctorow has given some details about a course he is running:
an undergrad class about DRM, EULAs, copyright, technology and control in the 21st century, called "Pwned: Is everyone on this campus a copyright criminal?"
No, wait, even if you can't stand the Cory.
The course itself is pretty conventional. But this, frankly, seems brilliant:
The main class assignment is to work through Wikipedia entries on subjects we cover in the class, in groups, identifying weak areas in the Wikipedia sections and improving them, then defending those improvements in the message-boards for the Wikipedia entries.
What if every university course did the same, tidying up Wikipedia entries that were sub-par? Think about it.
The conclusions of the Stern Review will not come as any surprise to readers of this blog:
The scientific evidence is now overwhelming: climate change is a serious global threat, and it demands an urgent global response.
This Review has assessed a wide range of evidence on the impacts of climate change and on the economic costs, and has used a number of different techniques to assess costs and risks. From all of these perspectives, the evidence gathered by the Review leads to a simple conclusion: the benefits of strong and early action far outweigh the economic costs of not acting.
But as I commented before about a similar case, what makes this report so important is that it comes from the establishment, not from groups who would be expected to make statements like that above. It is also meticulous in detailing the situation. Kudos to the UK Government for commissioning it - and for making it freely available.
Despite its portentous message, I find its appearance - and of an increasing number of similar reports - strangely heartening: I can't help feeling that we are close to not one but two tipping points.
The first is catastrophic, when the earth's environmental system is so far out of kilter that it changes dramatically; the second is rather more positive - the moment when enough people get what is going on, and start doing something effective to avert or at least mitigate the effects of the first tipping point.
Maybe I'm just an incurable optimist, but I was particularly pleased to read this point:
The loss of natural forests around the world contributes more to global emissions each year than the transport sector. Curbing deforestation is a highly cost-effective way to reduce emissions; large-scale international pilot programmes to explore the best ways to do this could get underway very quickly.
Halting deforestation seems a way not only to slow down global warming, but to address many other issues like species loss and even poverty. I say let's do it. Please?
The Citizendium project is nothing if not intriguing. The drip-feed of information about it doesn't hurt in terms of provoking interest. Here are the two latest instalments from Dr. Sanger: Why Make Room for Experts in Web 2.0? and The Role of Content Brokers in the Era of Free Content. I've not had time to digest them yet, but Larry writes well and interestingly, so they're likely to be worth reading.
29 October 2006
As I've mentioned before, mashups are all about the underlying mesh. And what better mesh for knowledge than Google Earth? And what better to mash it with than Wikipedia? Here you are, then. (Via Openpedia.org.)
What do you want if you are worth $18 billion and have the third-largest motor yacht in the world? Simple: revenge.
Oracle's Unbreakable Linux is about revenge - for the fact that Red Hat dared to snatch JBoss from under Larry Ellison's nose. It's a warning that you don't mess with lovely Larry. It's also a bit of kite-flying: maybe offering support for Red Hat is a viable business, though I can't see it myself. In any case, even if Unbreakable fails as a service, it's already succeeded as a punishment.
28 October 2006
A post on Bob Sutor's blog points to IBM's mega-site devoted to open source. Interesting enough, but even more interesting is his comment on it:
It’s hard to think of any part of IBM’s business that is not now affected by open source
One of the first, but certainly not the last.
27 October 2006
I wrote some while back about the Open University's plans to offer its materials as open courseware. Its dedicated site, called OpenLearn, is now up and running, with lots of interesting content. The licence? - a Creative Commons Attribution-NonCommercial-ShareAlike 2.0 Licence.
Mixed news on the UK patent front:
The Court of Appeal has ruled on two cases involving software patents today. It rejected one and unfortunately granted the other. It was hoped that the ruling would confirm that software development which relates only to new business logic does not have to worry about patent threats. As more and more companies in the United States get tied up in business method patent litigation, this decision should be a big worry for UK companies.
Full details here.
As a big fan of the explanatory power of Darwinian evolution - which, for those still concerned about its "theoretical" status, is basically just maths - I have to say I'm impressed by this story:
SpamThru takes the game to a new level, actually using an anti-virus engine against potential rivals.
Of course, this is precisely the same strategy that baby cuckoos use. Self-standing, evolving computer viruses living across the Net are getting ever closer....
One of the central problems with DRM is that it is hard to know how to fight back. Boycotting DRM'ed goods is all very well, but needs lots of people taking part to make an impact. This means that getting out the fact that many consumer products are Defective by Design is crucially important. Against this background, here's a clever idea: tagging DRM'ed products on Amazon. Fight force with cunning. (Via Boing Boing.)
26 October 2006
...you wait for ages and then two come along at once.
First we had Citizendium, now here's Scholarpedia. The dynamics are slightly different, and it will be fascinating to watch their respective evolution. In particular, it will be great to see online Darwinism in action as these two and Wikipedia fight it out.
Although this Digital Freedom Campaign is highly partial - in both senses of the word - in that it's totally US-centric as far as I can tell, the groups supporting it seem to be the right ones. Whether its dinky Flash videos (grrr) make a fig of difference to what is, after all, a global problem, remains to be seen. (Via Open Access News.)
Oracle's announcement of its "unbreakable" GNU/Linux has provoked plenty of comment from around the blogosphere. I've not had a chance to mull it all over yet (not least because I've been up at the LinuxWorld show, where I spent some time talking to a man from Oracle....). In the meantime, you can find plenty of interesting analysis via Technorati.
25 October 2006
To my eternal shame the UK is not exactly at the forefront of free software adoption, not least because Our Glorious Tone seems as dazzled by the business and intellectual achievements of Bill Gates as he is by the social and political ones of George W. Bush. But apparently we are to get our very own National Open Centre. I'm not holding my breath for massive open source uptake, but it's a start.
Say it ain't true, Bruce:
Britain's BT Group has snapped up United States-based Counterpane Internet Security for a sum of more than $20 million as part of a continuing commitment to the security offering and overall growth of its Global Services business.
Counterpane provides managed network security services.
As part of the deal, Counterpane's founder, CTO and highly regarded security guru, Bruce Schneier, will join the BT payroll. Schneier will maintain his position as CTO within Counterpane, based in Mountain View, Calif.
Bruce Schneier, security god, meets BT, ex-monopolistic monster.
Ah, well, I suppose you deserve the dosh, if nothing else.
The high priest of open access, Stevan Harnad, has some thoughts on how open access and open source relate to each other - and how they don't. He also uses this analogy:
I am personally in favour of open-code pharmacology ("OP"): The formula for potential cures should not be kept secret, or prevented from being used to sell or even give away the medicine.
It does *not* follow from this, however, that if a commercial pharmaceutical company develops a non-OP cure for AIDS today that I will refuse to use it or promote it! Nor will I try to suppress or refuse to cooperate with OP research or OP researchers, while there are still diseases and patients, needing to be cured now.
The reason this is different from the situation with open source is that for the latter you (as in an idealised "you") always have the option of sitting down and writing some free code. You do not really have this possibility when it comes to inventing drugs.
It's all too easy to forget that free software is a truly global phenomenon. So these stories, one from Turkey, the other from Pakistan, are a timely reminder of how much is happening beyond the glare of the anglophone media.
Both are about the public sector being encouraged to turn towards open source software. Both have interesting sites associated with them. In Turkey, there is the home-grown Pardus distribution, while in Pakistan there is an information site called FOSSFP: The Free and Open Source Software Movement.
Talking of titles, this one sounds pretty germane: Source Code for Biology and Medicine. Here's some more information:
Source Code for Biology and Medicine is a peer-reviewed open access, online journal that publishes articles on source code employed over a wide range of applications in biology and medicine. The aim of the journal is to publish source code for distribution and use in the public domain in order to advance biological and medical research. Through this dissemination, it may be possible to shorten the time required for solving certain computational problems for which there is limited source code availability or resources.
24 October 2006
What a fab name for a journal. The sponsoring body:
IASC is an association devoted to understanding and improving institutions for the management of environmental resources that are (or could be) held or used collectively. Many will refer to such resources and their systems of usage as "commons".
Given the subject-matter, it will come as no surprise that the new title will be adding to the commons that is open content. (Via Open Access News.)
Here's an interesting little Google map, showing where StarOffice is being used in academic institutions in Italy. OK, so it's a little recondite, but the point is there's a lot of StarOffice about. And as we know from Apple's history, if you get them young, you get them old.... (Via Erwin's StarOffice Tango.)
Here's a tiny little straw in the intellectual monopoly wind:
During their next meeting the G8 governments should engage those of the five newly industrialized countries - the People's Republic of China, India, Brazil, South Africa and Mexico - in talks about intellectual property, representatives of the German federal government elucidating the proposed agenda for the next G8 summit meeting in June said during a briefing while commenting on the latter's motto "Growth and Responsibility."
And why is that interesting? Well, because
Oliver Moldenhauer of Netzwerk Freies Wissen [Free Knowledge Network] told heise online that he thought the G8 summit was not the proper forum for discussing intellectual property rights. "The G8 are made up almost exclusively of rich industrial countries. The interest they have in this dialogue is likely to consist above all in putting pressure on the newly industrialized ones," he said. In the opinion of Mr. Moldenhauer other forums such as the World Intellectual Property Organization (WIPO) or the World Health Organization (WHO) are better suited to the task.
And why has G8 suddenly gone off its darling WIPO? Well,
the industrialized countries suffered a setback at the WIPO General Assembly in September. The newly industrialized countries had countered calls for further harmonization by demanding that such harmonization go hand in hand with, for example, improvements in the quality of the process by which patents are granted and effective protection of traditional knowledge against possibly unfair exploitation by international companies.
Hm, maybe I need to change my views on WIPO....
Talking of RL and SL, this extremely witty piece is deeper than you might think:
Volumes have already been written about real life, the most accessible and most widely accepted massively multiplayer online role-playing game to date. Featuring believable characters, plenty of lasting appeal, and a lot of challenge and variety, real life is absolutely recommendable to those who've grown weary of all the cookie-cutter games that have tried to emulate its popularity--or to just about anyone, really.
(Via Web 2.0 Blog Network.)
This is getting seriously weird.
DestroyTV lets RLers watch an island in SL, using an embedded video camera (which is "in" both SL and RL). There are also screenshots (several thousand of them), over on Flickr, complete with a tag cloud. So which world are we in now?
It's interesting to note the number of pamphlets that are being written as primers, explaining this wacky open stuff to "normal" people. Here's another one:
How do you do business with an illusive network that belongs to nobody? How do you work with talented people you are likely never to meet and who are not motivated by money? How can businesses learn to “take advantage of” innovation when it comes along in an open-platform world?
And on another level, are we seeing the beginning of the end for the existing business models based on hierarchy, planning and management, and pure competition?
We are in fact seeing a new level of change. What are the indicators? The rise of open-source development communities. The growth of an on-line knowledge commons. A shift of power to socially and technologically connected “smart mobs.” And many more.
Nothing new, but handy for the boss.
23 October 2006
If you think laws are a problem in the real world, wait until you start thinking about the virtual one. Here's what one person has decided:
I think that the entire range of common law rights needs to be viewed as applicable to virtual worlds -- property included.
Heavy stuff - but not something that we can avoid confronting as our second lives start to take on ever-more importance alongside our first ones.
This is it. Just look at these dynamics:
The number of businesses allowing employees to download the Firefox Web browser soared this year, and at least one analyst believes the recently released Internet Explorer 7 could boost use of Firefox in companies.
Fully 44 percent of businesses with 250 employees or more allow workers to download Mozilla Corp.'s open-source browser at the office, according to a survey conducted this year by JupiterResearch. Last year, only 26 percent of such businesses were willing to do the same.
So, we're through the crucial stage, where Firefox is only downloaded by enthusiasts, to that of corporate acceptance. That's good, but even better is the timing:
For many businesses, the move to Vista could take a year and a half or more, analysts say.
As a result, many people who get IE 7 at home through Microsoft's automatic update service will likely find IE6 lacking. Without the option of installing IE 7 at work, they are likely to turn to Firefox, Wilcox said.
Yee-ha, as they say.
More than anyone else, Richard Stallman is driving the GPLv3 debate (although Eben Moglen is clearly another crucially important figure). What follows is a transcript of a short interview that took place on 6 October, 2006. In it, RMS talked about the issues that lie behind the GPLv3, and gave his thoughts on the concerns expressed by the Linux coders, some of which were raised in the posting below.
Could you give a little background to the drafting of the GNU GPLv3?
The purpose of the GNU GPL is to defend for all users the freedoms that define free software. It doesn't make sense in terms of open source. It's the result of implementing the philosophy of free software in the strongest way that we can. So all the versions of the GPL have prevented middlemen from restricting subsequent users by changing the licence. Some free software licences permit that; for example, the X11 licence permits that. The various BSD licences permit that. But the GPL was specifically designed not to permit that - you cannot add restrictions making the program non-free.
Now, what we didn't have 15 years ago was the threat of making the program effectively non-free by technical restrictions placed around it. That's what Tivoisation is. Tivoisation means taking a free program and distributing a binary of it, and also providing the source, because the GPL requires that. But when the user changes the source code and compiles it and then tries to install the changed program he discovers that that's impossible because the machine is designed not to let him.
The result of this is that freedom number 1, the freedom to study the source code and change it so the program does what you want, has become a sham. Tivoisation is essentially a way to formally comply with the requirement, but not in substance.
So we've come to the conclusion that this is more than just a minor issue. That this will be common, probably the usual case, if we don't do something to stop it. And therefore we've decided to do what is necessary so that our software will not be Tivoised. Our purpose is to deliver freedom to the user.
Why do you think there has been such an outcry in some quarters recently?
I don't know. A few people are upset.
A few people including most of the key kernel coders...
Their business. That's their program and they can decide whether to use this licence.
Seems clear they will stick with GPLv2?
I hope not, but if they do it's up to them.
If that happens, is that going to cause any problems for GNU?
It won't cause any problems for us, only for the public. The problem it will cause is Tivoisation. It will cause the problem that users don't have the freedoms that they should have. And that's a very big problem, but it's not a problem specifically for us, it's a problem for everyone. The problem is that many people will get machines in which Linux has been Tivoised. Which means that for practical purposes it won't be free for them.
If that happens, would you put more effort into the Hurd?
I don't think so, and the reason is that wouldn't achieve much unless we convinced everyone to switch to the Hurd from Linux, and that isn't too likely. The Hurd still has some technical problems, and who knows if it would ever become a competitor. But suppose somebody wanted to Tivoise, and he had available the Hurd and Linux to choose from, and Linux permits Tivoisation and the Hurd doesn't: the solution would be to use Linux.
Some people make the argument that if GPLv3 is applied to Linux, companies might simply adopt a different operating system for their products.
I don't think so.
You don't think they might use BSD or Windows?
They might, who knows? I don't think it's very likely, but the main point is it's no use giving up on a fight because you might lose, not when the fight is for something very important like freedom.
Is there anything you can do to assuage concerns of the kernel coders without giving up your principles?
I don't know. If they would just speak with us, we can explore that possibility.
Are they not doing that?
Basically no. Just recently we have had a couple of communications with them, not yet reaching the stage of being entirely civil in tone, but at least it's a start. We've been inviting them to talk with us since before we started publishing drafts, but they have not for the most part taken up that offer. In general they've made statements to the public instead of to us. And some of them are based on misunderstandings of the draft and of our intention. They're talking to each other not to us. But it's not too late for them to start if they wish to talk to us.
Is there scope to rephrase the clause that deals with Tivoisation?
We can rephrase it in a lot of different ways. We just recently decided on a change, which is that the requirement for keys would no longer work by calling them part of the corresponding source. This is a change in the details, but the substance is the same, the aim is the same - to change that would be giving up.
The two philosophies of free software and open source in some cases lead to similar conduct - in fact, in many cases. That's why it was so easy for the people who support open source to apply their label to what we're doing. Because if you're participating in a free software project it usually doesn't matter whether your goal is to give users freedom and to establish freedom in cyberspace or just have powerful and reliable software, because either way you could do the same things. And there's no need for people to ask each other: What's your philosophy, why do you want to contribute to this project? - they just start contributing, and they work on making the software better, and they focus on that.
But there are cases where these two different philosophies lead to different results. For instance, some people have proposed what they call “open source DRM” - DRM meaning “digital restrictions management”. This is a plan to develop software to put in machines that will restrict users, and then publish the source code of this. The idea is that programmers around the world will work together making that software do its job better, that is, restrict the user more inescapably, more reliably, more powerfully. Although the source code of this software will be published, they plan to use Tivoisation to make sure that the users can't escape from their power.
Now, if your goal is to give the users freedom, restricting the users through open source is no more tolerable than restricting the users any other way, because the users have to have the freedom.
Have you tried talking to TiVo about this?
You don't think it might be useful?
No, not really. And the reason is they're just the first example. If it were only that one company that were the problem, we probably wouldn't pay attention because it would be a small problem. But the idea is floating around, and there are many different plans to use it.
Couldn't you help TiVo do what they want to do with free software?
They initially did. This Tivoisation was not in the first TiVo box. The point is, it's pressure from Hollywood. And the best way to have a chance of negotiating something with those who are under the pressure is first to set up counter pressure.
The problem being that a hacked version of TiVo could circumvent any DRM?
Exactly. And the point is, DRM itself is evil. Restricting the user's freedom in other ways so that the user cannot change the software and get rid of DRM makes the software effectively not free for that user. So we have these two philosophies, and here they make a big difference. You can imagine open source DRM, and if all you care about are the philosophical values of open source, you might think it's great. If you only want software to be powerful and reliable, you might tend to apply that to software whose purpose is to go in somebody's machine and restrict it, and you might think, “Sure I'll help you make that powerful and reliable.” But if you believe in free software, and you think that the user whose machine it is should be in control of what that machine does and not somebody else, then the aim of that project becomes wrong in itself. Free software DRM makes no sense - it's a contradiction in terms.
Are you worried about the prospect of GPL projects forking?
It can happen. But again, there's no use not fighting, there's no use surrendering to this threat. It's too dangerous.
Are there any other points you'd like to make?
There are people who seem to imagine that some disaster will happen because some programs in the GNU/Linux system are using GPLv3 and some are using GPLv2, but in fact there are many programs with other licences in the system as well, and there's no problem there at all.
There are many people who would like to come across some disastrous flaw in GPLv3. If one person says he's found it, the others repeat it without stopping to make sure it is for real, because they consider it the answer to their prayers.
But you think they'll work together without problems?
I know they will, because these programs are separate programs, and the licence of one has no effect on the licence of another.
Now, I wish that everyone would switch to GPLv3 because that would give the strongest possible front to resist Tivoisation and ensure the freedom of the users. But I know that not everybody will participate, nonetheless we have to try to defend the freedom.
Richard Stallman has sent me a comment on Alan Cox's reply:
While I addressed the topic you proposed--version 3 of the GNU General Public License--Alan Cox chose instead to present a misleading picture of the history of GNU and Linux.
The GNU/Linux system comes out of the effort that I began in 1983 to develop a complete free Unix-like system called GNU. GNU is the only operating system that was developed specifically to respect computer users' freedom. Since our goal was to achieve freedom as soon as possible, we utilized the scattered existing free software packages that would fit. That still left most of the components for us to write. In those years, we of the GNU Project systematically developed the essential components of this system, plus many other desirable components, ranging from libraries to text editors to games.
In 1991, Linus Torvalds developed a kernel called Linux--initially not free software, but he freed it in 1992. At that time, the GNU system was complete except for a kernel. The combination of Linux and the GNU system was the first complete free operating system. That combination is GNU/Linux.
Cox says that Linux is not part of the GNU Project. That is true--of the kernel, Linux, that he and Torvalds have worked on. But the combined system that Cox calls "Linux" is more our work than his.
When Cox says that "FSF-copyrighted code is a minority in [GNU/Linux]", that too is misleading; he knows that just a fraction of the GNU packages' code is copyright FSF. What part do GNU packages compose in the whole system? Many are just as essential as Linux is.
In 1995, GNU packages were 28% of the system, while Linux was 3%. 28% is less than half, so that was a minority; but it is a lot more than 3%. Nowadays, after thousands of other groups have added to the system, both the GNU and Linux percentages are smaller than before; but no other project has contributed as much as the GNU Project.
Calling the combined system GNU/Linux is right because it gives the GNU Project credit for its work, but there are things more important than credit -- your freedom, for example. It is no accident that the GNU GPL existed before Linux was begun. We wrote the GPL to protect the freedom of the users of GNU, and we are revising it today so that it will protect against newer technical methods of denying that freedom. When you think about GPL issues, this is the background for them.
If the developers of Linux disagree with that goal, they are entitled to their views. They are entitled to cite their important work--Linux, the kernel--to be listened to more, but they should respect our right to cite the GNU system in the same way.
See http://www.gnu.org/gnu/gnu-linux-faq.html for more explanation.
Few subjects in the world of free software have provoked as much discussion as the new GNU GPLv3 licence. Mostly it's outsiders (like me) sounding off about this, but what do the people involved really think?
I decided to ask them, and write up the result for the Guardian. I was expecting a couple of lines back to my emailed questions if I was lucky, but badly underestimated hacker generosity. Linus and his mates sent me back long and typically thoughtful replies, while RMS wanted to talk about it at length.
Since I was only able to use a tiny fraction of this material in the Guardian article that resulted, I thought it might be a useful contribution to the GPLv3 debate to post it here (with the permission of those concerned).
For length reasons, I've split it up into two postings. Below are the replies of the kernel coders, which I received on 3 October, 2006 (placed in the order in which their authors appear in the recent GPLv3 poll). The interview with RMS can be found above.
I don't think there will necessarily be a lot of _practical_ fallout from it, so in that sense it probably doesn't matter all that much. It's not like we haven't had license "discussions" before (the whole BSD vs GPL flame-war seemed to go on for years back in the early nineties). And in many ways, it's not like the actual split between the "Open Source" and the "Free Software" mentality is in any way new, or even brought about by the GPLv3 license.
So while I think there is still a (admittedly pretty remote) chance of some kind of agreement, I don't think that it's a disaster if we end up with a GPLv2 and a new and incompatible GPLv3. It's not like we haven't had licenses before either, and most of them haven't been compatible.
In some ways, I can even hope that it clears the air for all the stupid tensions to just admit that there are differences of opinion, and that the FSF might even just stop using the name "GNU/Linux", finally admitting that Linux never was a GNU project in the first place.
The real downside, I suspect, is just the confusion by yet another incompatible license - and one that shares the same name (licenses such as OSL and GPL were both open source licenses and they were incompatible with each other, but at least they had clear differentiation in their names).
And there's bound to be some productivity loss from all the inevitable arguments, although in all honesty, it's not like open source developers don't spend a lot of time arguing _anyway_, so maybe that won't be all that big of a factor - just a shift of area rather than any actual new lost time ;)
One of the reasons the thing gets so heated is that people (very much me included) feel very strongly about their licenses. It's more than just a legal paper, it's deeply associated with what people have been working on for in some cases decades. So logically I don't think the disagreement really matters a whole lot, but a lot of it is about being very personally attached to some license choice.
On Tue, 2006-10-03 at 13:57 +0100, glyn moody wrote:
> Since it seems likely that the kernel will remain under v2, while the
> rest of GNU goes for v3, I was wondering whether you think this is
> going to cause you and others practical problems in your work on the
> kernel. What about companies and end-users of GNU/Linux: will there
There is no such thing as GNU/Linux. For an article like this it's really important to understand and clarify that (and from the US view also as a trademark matter).
I mean there is no abstract entity even that is properly called "GNU/Linux". It's a bit of spin-doctoring by the FSF to try and link themselves to Linux. Normally it's just one of those things they do and people sigh about, but when you look at the licensing debate the distinction is vital. (It's also increasingly true that FSF owned code is a minority part of Linux.)
Linux is not and never has been an FSF project. I would say the majority of the kernel developers don't buy the FSF political agenda. Linus likewise chose the license for the pragmatic reason it was a good license for the OS, not because he supported the GNU manifesto.
Thus this isn't about the Linux people splitting from the FSF; it's a separate project that happens to have been consulted as to whether it would like to use a new, allegedly better, variant of the license it chose.
Linux does use FSF tools but that doesn't make it a GNU project any more than this article will be an IBM project because it was typed on a PC, or a BT project because it used an ADSL line.
The Linux kernel being GPLv2 isn't a problem we can see for the future. It is a distinct work from the applications that run on it, just as the Windows kernel is from Windows applications. The more awkward corner cases will be LGPL and similar licenses where you want the benefits and flexibility. The FSF have indicated they understand that and will ensure it works out. The licenses are about having barriers to abuse, not barriers to use.
> be negative consequences for them, or do you think that life will
> just go on as before?
I'm not sure what will happen with the rest of the GPL licensed software world. It really is too early to say because the license is a draft at this point and various areas around patents and optional clauses remain open to correction and improvement.
Most GPL licensed code is not controlled by the FSF and probably has too many contributors to relicense. Stuff that is new or has a few owners might change license if the new license is good. However, given that most of the work on the FSF owned projects is done by non-FSF people, if the license is bad I imagine all the developers will continue the GPLv2 branch and the FSF will be left out in the cold. The FSF know this too, and that's why it takes time to build a new license and consensus.
It may well be the new license is mostly used with new code.
> What's the main problem you have with GPLv3?
For the kernel there are a few; the big one that is hard to fix is the DRM clause. Right now the GPLv2 covers things like DRM keys in generic language and it means the law can interpret that sanely. It's vague but flexible, which lawyers don't like of course. There isn't any case law, but out-of-court settlements support the fact that this is enforceable.
The GPLv3 variant is much stronger and it appears to cover things like keys to rented devices where the DRM logic is less clear.
The big one though for the kernel is not a legal matter or even a specifically GPLv3 matter. Many people contributed to the kernel under a set of understood terms. Not only would all those people have to agree to a change in those terms but those terms changing would prevent some of the existing users from continuing to use it in the manner they do now.
You can't really make an agreement like that and then change the rules on people who've contributed time, money and code to the Linux project. I support Linus' assertion that legal issues aside he doesn't have the moral right to change the rules this way.
> My question concerns the timing of the recent white paper: why was it
> released now, and not at the beginning of the GPLv3 consultation process
> when it might have been able to influence things?
The process is not over, and we still hope to influence things. We would not have written that letter otherwise. The main reason it was not done earlier is that we just did not think it was going to be a problem, as the kernel was not going to change licenses. But as we realized this was going to be a problem beyond just the kernel, affecting the whole community, we felt that we should at least voice our opinions.
Also, please note that the DRM issues have changed over time from being very broad (which was at least admirable), to being explicitly targeted at only the Linux kernel. Now the license is worded to try to stop the "tivoization" issue.
This is where a bootloader or bios determines if the crypto signature of the kernel is acceptable or not before it decides to run it or not. This means that only "approved" kernels that come from the company will run properly on the hardware.
Now this kind of restriction pretty much _only_ affects the kernel, not any other type of program. This is because only if you can control the kernel can you ensure that the system is "secure".
So it seems that the FSF is only targeting the Tivo issue, which we kernel developers have explicitly stated in public is an acceptable use of _our_ code. So they are now trying to tell another group (us) what we should do to our code.
As the FSF has made no contribution to the Linux kernel, and has nothing to do with it in general, we kernel developers are now a bit upset that someone else is trying to tell us that something we explicitly stated was an acceptable use of our code is suddenly bad and wrong.
> Given that the FSF is unlikely to throw away all the work it has done, or
> even modify it substantially/substantively, do you have any thoughts on
> what's going to happen?
I really have no idea, but I would hope that things change for the better. We are already hearing rumors from the people on the different GPLv3 committees that our statement has had an effect, but we will not know for sure until the next draft comes out.
Well gee. We're programmers and we spend our time programming, not swanning around at meetings talking about legal matters and playing politics. We find things like licensing to be rather a distraction, and dull. So most people largely ignored it all.
It was only later in the process when the thing started to take shape, when we saw where it was headed and when we began to hear the concerns of various affected parties that there was sufficient motivation to get involved.
In fact this points at a broad problem with the existing process: I'm sure that a large majority of the people who actually write this code haven't made their opinions felt to the FSF. Yet the FSF presumes to speak for them, and proposes to use their work as ammunition in the FSF's campaigns.
And why haven't these programmers made their opinions known? Some are busy. Many work for overlawyered companies and are afraid that they might be seen to be speaking for their companies. Some don't speak English very well. Almost all of them find it to be rather dull and a distraction.
For the kernel I'm pretty sure things will go on as they have before.
The problems are most likely for the projects under the GNU Project umbrella. The copyrights to those projects, such as GCC, Binutils, etc., are all assigned to the GNU Project. So the FSF could, and almost certainly will, make all of those projects use the GPL v3.
As an aside, I will note that originally the FSF used to say that they wanted copyright assigned to them "to make it easier to enforce the GPL in court for software projects under the GNU Project umbrella." But as is clear today, it's also a power thing, in that having all the copyrights assigned to them allows the FSF to choose the licensing of the code as they see fit since they are the copyright holder of the complete work.
At the point of a relicense to GPL v3 for these GNU Project source trees one of two things could happen. Either the developers are OK with this, even if to simply "grin and bear it" and things go on under GPL v3. Or, the developers are unhappy with this, and fork off a GPL v2 copy of the tree and do development there.
In the end, even though they've assigned their copyrights to the FSF, the developers do control the ultimate licensing of these GNU projects. If they don't like GPL v3 and work on the GPL v2 fork instead, the FSF is at a loss because, while they can mandate whatever they like, such mandates are useless if the developers don't want to contribute to the GPL v3 variant.
So being the ones who do the development work is actually a kind of power which permeates through all of the politics. If the political folks do something stupid, the developers can just take their talent and efforts elsewhere.
I'm more than familiar with this process, since I was part of the group that forked the GCC compiler project many years ago because the majority of the GCC developers found the head maintainer (Richard Kenner) impossible to work with. Although he was quite upset about it, there wasn't much that Richard Stallman and the FSF could do about it. In the end the fork became the "real GCC" under GNU Project umbrella once more.
So the opinion of the developers matters a lot, especially when it comes to licensing. It could get messy if a lot of these projects fork, but the GPL v3 isn't a done deal yet so the FSF still has time to fix things up and make it more palatable to people.
> Alone among those polled for their views on the v2 and v3 you choose
> 0 I don't really care at all
> why is this when everybody else seems to hold such extreme views on the
> subject? Do you think they're getting worked up over nothing?
First, I think the poll was pretty useless.
The poll asked what people think of the GPL v3 draft, which is by definition in draft state and therefore not ready for final consumption. Of course the GPL v3 still needs some fixing. So asking about actually using it in a major software project right now is totally pointless. What would have been more interesting would have been to ask what the developers think about the "core issues" rather than the specific implementation of those issues in the current GPL v3 draft.
For example, polling on what the kernel developers thought about "keying of the Linux kernel" in the way that Tivo does would have been much more interesting. In my opinion, I believe you would have seen about an even split down the middle on this one. But even the people who are against keying think that the DRM language in the GPL v3 meant to combat this is not done correctly.
Several kernel developers believe that GPL v2 already has enough language to make restrictions such as keying be not allowed.
Personally, I'm against keying and I'm quite unhappy with what Tivo did with the Linux kernel. This kind of keying sets a very bad precedent. For example, in the future vendors could get away with some questionable things using keying. Say a vendor sells a piece of hardware, and provides the GPL source to the kernel drivers of the new things in that piece of hardware. Then, they only allow kernel binaries signed with a special key to load. This makes the publishing of their drivers effectively useless. The vendor still controls everything and nobody gains from the code they've submitted. Nobody can recompile a kernel with their changes and actually test it on their hardware, since they have no way to produce a signed kernel that the device will actually allow to boot. So the value of this "contribution" is absolutely zero. This is, in my opinion, totally against the spirit of the GPL v2.
I would have been perfectly fine with Tivo using another OS for their product. All the world does not have to be Linux, and if Linux's license doesn't suit someone, they are free to not use it.
In most examples I've ever been shown where this kind of lockdown is supposedly "legitimate", the owner of the device is effectively making this lockdown decision. For example I'm OK with electronic voting machines used in real elections having their software locked down. The government owns those machines, and is well within their rights to lock down that hardware in order to ensure a proper and fair election to the public by preventing software tampering.
But if I purchase an electronic voting machine of my own, and it uses the Linux kernel, I very much want to tinker with it, build my own kernels, and try to poke holes in the device. How else could we validate that electronic voting machines are safe if the public has no way to test these claims out for themselves, in particular when such devices use open technologies such as the Linux kernel?
Commons are things that are held, well, in common, for the benefit of all. The traditional commons is common land, much of which still exists in England - Clapham Common, for example. But commons can be anything. For example, free software is a commons, as is open content. The air we breathe is clearly a commons, as are the world's oceans.
Less obvious, perhaps, is the commons of tranquillity. Like other commons, it can be destroyed for all by the selfish actions of a few. But what exactly is the state of this commons today? Here in England, we now know, thanks to a neat map of this commons put together by the Campaign to Protect Rural England.
This is actually quite useful because, as they say, if you can't measure it, you can't manage it: if you don't know where the commons is most threatened, you can't take action to protect it. Well done, CPRE. (Via BBC News.)
22 October 2006
21 October 2006
Innovate - the "journal of online education" - has an issue exploring
the potential of open source software and related trends to transform educational practice.
Nothing hugely new there for readers of this blog, but there are some articles with interesting case studies from the educational world. There's also a typically thoughtful and well-written piece by David Wiley, who invented the term "open content" back in 1998. You'll need to register (email address required), but it's worth that minor effort.
20 October 2006
There's a piece on C|net which I can only hope was written with the express intent of provoking a reaction, since its basic idea is so batty:
A European court last month agreed with a group of regional publishers in Belgium that accused Google of ripping off their content. The court ordered Google to remove text summaries of the newspapers' articles, along with Web links to the publishers' sites.
As world and dog have pointed out, what Google News does is provide free - yes, free - publicity for news sites, leading to free - yes, free - extra traffic, which can then be converted to what we in the trade call dosh. The idea that Google is somehow "ripping off" the poor old media conglomerates is risible. But luckily, they seem intent on slitting their own throats, so let 'em, says I.
More serious is the implicit assumption in the C|net piece that there is something sacred about copyrighted material. Maybe there would be, if copyright did what it was originally intended to do: to provide an incentive to the creator to create. But now that copyright typically runs for 50 or even 70 years after the creator's death, it's hard to see how new works are going to be conjured up except with a Ouija board.
Copyright has broken the original social compact, which was that people weren't allowed to copy a work for 14 years - yes, 14 - in return for being allowed to do what they liked with it afterwards. As copyright is extended time and time again, it is becoming impossible ever to access the content it covers: there is no quid for the quo.
So copyright has become the "rip-off", demanding without giving. If media companies really wanted to stop people using their materials, they should go back to a balanced copyright that gave to both parties. The current system is so inequitable that it is no wonder most people feel morally justified in ignoring it.
I've not really been following the IE7 saga, since it seems to be a case of too little too late. I was pleased to find my prejudices confirmed by the Grand Old Man of populist tech journalism, Walter Mossberg:
The new Internet Explorer is a solid upgrade, but it's disappointing that after five years, the best Microsoft could do was to mostly catch up to smaller competitors.
Looks like I was overly pessimistic about the Spamhaus case:
On 19 October 2006, United States District Court Judge Charles P. Kocoras, presiding over the e360Insight v. The Spamhaus Project matter in the Northern District of Illinois, issued an order denying e360Insight's ("e360") motion asking the Court to, among other things, suspend www.spamhaus.org. The Court explained that the relief e360 sought was too broad to be warranted under the circumstances. First, the Court noted that since there is no indication that ICANN or Tucows acted in concert with Spamhaus, the Court could not conclude that either party could be brought within the ambit of Federal Rule of Civil Procedure 65(d), which states that an order granting an injunction is "binding only upon the parties to the action, their officers, agents, servants, employees, and attorneys, and upon those persons in active concert or participation with them." Second, the Court stated that a suspension of www.spamhaus.org would cut off all lawful online activities of Spamhaus, not just those that are in contravention of the injunction the Court previously issued against Spamhaus.
Kudos to Kocoras for his intelligence, and to ICANN for not rolling over as I feared they would.
19 October 2006
...as in Second Life:
The Plone Foundation announced today the broadcasting of the Plone Conference 2006 into the virtual world Second Life. It is the first big open source conference being colocated inside a virtual world. The event will be held from Oct 25-27.
"We'll broadcast selected talks and tutorials each day into a virtual conference building inside Second Life. Residents who could not make it to the by-now sold-out conference can participate in virtual form. A back channel for their questions to the actual speaker will be provided, too"
And so the boundaries between RL (real life) and SL (Second Life) became just that tiny bit more friable....
It might seem strange to talk about Sun supporting OpenOffice.org - after all, it was the original donor of the code to the open source community. But what's changed is that it is now offering service plans for the software: this is news, because in the past it has only supported its own variant, StarSuite.
This is also important, because it provides a safety net to companies and government departments who want to use OpenOffice.org. In the past, they have been forced to opt for StarSuite if they wanted support; no longer.
Well done, my Sun.
The ODF Alliance has been going for a while now, but even so this list of 300+ members is a forceful reminder that this is a standard that is getting stronger day by day. (Via Erwin's StarOffice Tango.)
The complete works of Charles Darwin are now online. This is certainly an important moment in the evolution of academic knowledge, since it points the way to a future where everything will be accessible in this way - call it the Googleisation of academia.
18 October 2006
Citizendium, Larry Sanger's Wikipedia fork, is opening its doors, albeit in a very controlled sort of way, as a private alpha. At least the press release - characteristically lengthy - sketches in some of the details as to who is doing what with this interesting project. I'll be writing more about this in due course.
A few months ago, I asked whether lack of open access to avian 'flu data might hinder our ability to head off a pandemic; now it looks like lack of open access could lead to the destruction of civilisation as we know it. If that sounds a little far fetched, consider the facts.
The US is the largest single polluter in terms of carbon dioxide: according to the US Environmental Protection Agency, "In 1997, the United States emitted about one-fifth of total global greenhouse gases."
The EPA plays a key role in determining the US's environmental actions: "the Agency works to assess environmental conditions and to identify, understand, and solve current and future environmental problems; integrate the work of scientific partners such as nations, private sector organizations, academia and other agencies; and provide leadership in addressing emerging environmental issues and in advancing the science and technology of risk assessment and risk management."
To "assess environmental conditions and to identify, understand, and solve current and future environmental problems; integrate the work of scientific partners such as nations, private sector organizations, academia and other agencies" clearly requires information. Much of that information comes from scientific journals published around the world. Unfortunately, the EPA is in the process of cutting back on journal subscriptions:
The U.S. Environmental Protection Agency is sharply reducing the number of technical journals and environmental publications to which its employees will have online access, according to agency e-mails released today by Public Employees for Environmental Responsibility (PEER). This loss of online access compounds the effect of agency library closures, meaning that affected employees may not have access to either a hard copy or an electronic version of publications.
In addition to technical journals, EPA is also canceling its subscriptions to widely-read environmental news reports, such as Greenwire, The Clean Air Report and The Superfund Report, which summarize and synthesize breaking events and trends inside industry, government and academia. Greenwire, for example, recorded more than 125,000 hits from EPA staff last year.
As a result of these cuts, agency scientists and other technical specialists will no longer have ready access to materials that keep them abreast of developments within their fields. Moreover, enforcement staff, investigators and other professionals will have a harder time tracking new developments affecting their cases and projects.
So, we have the organisation whose job is to help determine the actions of the world's worst polluter cut off from much of the most recent and relevant research, in part because much of it is not open access.
No OA, no tomorrow, no comment. (Via Open Access News.)
Technocrat pointed me to this story on Village Voice, a title I used to read assiduously in my younger days. It's about how a network of planespotters have put together many of the pieces that go to make up the shameful jigsaw puzzle of the CIA's "torture taxi" operation, used for moving people around the world to be held and tortured without judicial oversight.
What's fascinating is the way that tiny, apparently meaningless contributions - a photo here, a Yahoo search of a plane number there - when put together, can help create something really big and important, just as open source projects pool the work of hundreds or thousands to create vast and astonishing achievements like GNU/Linux or Wikipedia.
Andy Updegrove has a short but justified paean to the wonder that is Unicode, one of the unsung heroes/heroines of the computer revolution. Apparently version 5.0 is now available. Don't all rush to buy a copy at once.
17 October 2006
This story from Cory Doctorow on Boing Boing about someone allegedly trying to copyright a fabric seems to be fading away, but its life has not been in vain: it's brought us this wonderful parting shot:
Thanks Cory, you really got us! We were really putting one over on everybody - and you totally busted us! Saving the world from evil fabric stores, you are, one post at a time...
Well, not if you look at what's on offer:
# Krita Becomes Usable for Professional Image Work
Krita and its maintainer Boudewijn Rempt won the aKademy Award for "Best Application" at this year's KDE conference in Dublin. With features such as magnetic selection, effect layers, colour model independence and full scriptability, it has risen to become what is probably the best free image editing program today.
# Lots of New Features in Kexi
Kexi, the desktop database application competing with MS Access, is the other application in KOffice that is already the best of its kind. Kexi has received over 270 improvements since KOffice 1.5. With this release, Kexi gains such features as the ability to handle images, compact the database, automatic datatype recognition and Kross scripting tools.
# KFormula Implements OpenDocument and MathML
The formula editor of KOffice now supports OpenDocument and MathML and uses it as its default file format. It also surpasses the equivalent component in OpenOffice.org, scoring 70% on the W3C MathML test suite compared to 22% for OpenOffice.org Formula. We see this as one example where the work to provide a very well-structured codebase of KOffice pays off to create a superior support for the existing standard.
KOffice is clearly storming away. I can't wait for the Windows port to introduce more people to the free software way....
MySQL's success is impressive, and provides a handy example of pervasive corporate open source that isn't Apache. Although I'd read about its new Enterprise offering earlier today, I must confess I hadn't picked up on the complementary Community product until I read this post by Matt Asay. It's a shrewd and necessary move that will doubtless be imitated by others.
16 October 2006
How can you not love a book whose author introduces it thus:
My latest book, Capitalism 3.0, is out this week. It’s about how to upgrade our economic operating system so that it protects the planet, shares income more equitably, and makes us happier, while preserving the strengths of capitalism as we know it. The key to my proposed upgrade is to rebuild the commons, that dwindling set of natural and social assets that benefit everyone.
In the spirit of enlivening the cultural commons, the book’s publisher, Berrett-Koehler, has agreed to an experiment. They are selling the book in the usual places — in bookstores and on-line — but they’re also allowing readers to download the book from this web site for free.
I've not read it yet, but will do: I'm sure it'll be worthwhile. Until then, I suggest everyone spread the word to reward the author and his enlightened publishers, and to fulfil the former's hopes - and help with that upgrade:
As the author, here’s what I hope will happen. I hope many of you will download and skim the book. If you’re intrigued, you’ll read the preface and first chapter either on the screen, or by printing just those pages. You’ll then decide you want to read the whole book, give a copy to a friend, or keep it on your bookshelf or coffee table. So you’ll go to your local bookstore, or to an on-line vendor, and buy the handy, long-lasting version, printed on acid-free paper.
I've written about crowdsourcing before, and this is an interesting application: writing a book called "Why Are You Here - Right Now" (YRUHRN).
Project YRUHRN was started with one idea in September of 2006. It all started with a HIT posted on Amazon's mTurk offering a penny for your answer to 'Why are You Here - Right Now?'. From that HIT, over 500 answers were given in a week's time.
We have taken those answers, and compiled them in a book that will speak to a part of everyone.
We will see if it is possible, through crowdsourcing and the power of work-at-home people, for a book to be written and published in just 30 days from idea to publishing.
Evidently it was, and the result can be downloaded for free from Lulu.com (for a while, at least).
One of the things that continues to amaze me about blogs is the quality of some of the writing. A case in point is this fantastic essay by Richard Poynder. It's an extremely thorough consideration of whether open access means that peer review is on the way out.
Here are a couple of ideas that were new to me:
In September, for instance, a group of UK academics keen to improve the way in which scientific research is evaluated launched a new OA journal called Philica.
Unlike both Nature and PLoS ONE, Philica has no editors, and papers are published immediately on submission — without even a cursory review process. Instead, the entire evaluation process takes place after publication, with reviews displayed at the end of each paper.
Philica is not the only new initiative to push the envelope that bit further. Another approach similar in spirit is that adopted by Naboj, which utilises what it calls a dynamical peer review system.
Modelled on the review system of Amazon, Naboj allows users to evaluate both the articles themselves, and the reviews of those articles. The theory is that with a sufficient number of users and reviewers, a convergence process will occur in which a better quality review system emerges.
And you're getting it all for free: true open access. I just hope you are grateful.
Here's an interesting Ars Technica story about Microsoft being forced to do the right thing - and benefiting from it - with its rival to PDF, called XPS:
Microsoft had previously indicated that its XPS technology would be licensed "royalty-free" to developers, and the company also promised a so-called "covenant not to sue" provision for businesses working on XPS print support, scanning technologies, and certain graphics display technologies.
However, at the behest of the EU, Microsoft is now taking matters a step further. A company spokesperson told Ars Technica that Microsoft "agreed to submit our new fixed-layout document format—the XML Paper Specification—to a standards-setting organization, and to revise the licensing terms on which the specification is made available to other software developers."
Microsoft is looking again at its license in order to make it compatible with open source licenses, which means that the "covenant not to sue" will likely be extended to cover any intellectual property dispute stemming from the simple use or incorporation of XPS. The end result is that using XPS may be considerably more attractive for developers now that the EU has apparently expressed concerns over the license.
The moral: open up, and you reap the benefits.
15 October 2006
Back in the 1990s, I used to write about VRML quite a lot. VRML - Virtual Reality Modelling Language - seemed like the future, but turned out not to have one, at least not in that form. As you may have noticed, it more or less disappeared, though I now realise where it went.
I also often wondered where the VRML pioneers went. One of them is Mark Pesce, whom I've just discovered through this post called "Trust, But Verify". It's of note for two reasons.
First, it's well written, and worth reading for that alone. But secondly, because it touches on what is becoming a key issue in the Web 2.0 world, that of trust. Trust - and reputation systems - lie at the heart of openness. It's a subject of particular interest to me, and I'll be writing more about it here and elsewhere in due course.
The ability of blogs to pick up on stories that the mainstream media miss or choose to ignore is by now well known; less remarked upon is the fluidity of the blogging world - the fact that a blog can comment on anything, even apparently far beyond its area of specialism.
A case in point is this post on Get Outdoors - "Everything you need to GetOutdoors". Hardly the place where you'd expect to find material headed "Chinese Troops Gun Down Tibetan Refugees". What's even more remarkable, though, is that this story, of international importance given China's continuing denial of human rights abuses in Tibet, is only now being picked up by the traditional outlets, who somehow overlooked it the first time around.
All power to the blogging elbow.
Update: The BBC has now picked up on the story, and is running a video showing the events. Interestingly, the clip was first shown on a small video sharing site in Romania - further proof that Web 2.0 is starting to trump MSM 1.0 these days.
13 October 2006
Here's one that completely passed me by: the European Union Public Licence. There's a very full discussion of why the EU is doing this, as well as a rather sceptical comment from the FSF on the subject. But probably the best place to go for a succinct discussion is this one from Matt Asay, which is where I came across the idea in the first place.
This post raises a good point: people's reluctance to set up a dual-boot GNU/Linux and Windows machine. Unfortunately, this reluctance is absolutely justified. I've set up dozens of them, and they nearly always go pear-shaped.
That's partly why I love live CDs: you get all the benefit of a dual-boot system, without the risk. Even better, you can keep swapping in different ones to produce various kinds of systems. (Via Digg.com.)
It's not much compared to the swathes of spectrum that have been auctioned off, but it's a start:
The FCC officially signed off on the plan to allow low-power wireless devices to operate in so-called "white spaces" in the television spectrum.
The latest cheery reading from Friends of the Earth puts a price-tag of around £11 trillion (that's £11,000,000,000,000, in case you were wondering) on the economic damage caused by runaway climate change, by the year 2100. An estimate, and probably an under-estimate.
And yet, we hear, the cost of implementing the Kyoto Protocol is "too high" for the US economy:
For America, complying with those mandates would have a negative economic impact, with layoffs of workers and price increases for consumers.
Right, a "negative economic impact": what, like to the tune of a few trillion pounds? I don't think so....
Here's eWeek all breathless:
If the plan is perfectly executed, Nicholas Negroponte's One Laptop Per Child project will deploy 100 million laptops in the first year. In one fell swoop, the nonprofit organization will create the largest computing monoculture in history.
Well, that depends how you define monoculture.
Yes, if you mean exactly the same machine; but definitely not, if you mean effectively the same environment. The honour of mega monoculture certainly belongs to Microsoft Windows, in all its later incarnations. Each has offered what is basically the same lush virtual mulch to several million crackers: the operating system, Internet Explorer and Outlook. What more do you need? As the unrelenting attacks based on just these elements show, you certainly don't need to have identical systems to succeed in sowing mayhem. (Via Techmeme.)
I mentioned previously that it's a sure sign that things are moving if rich and respectable people like accountants start warning about global warming; and so when the insurance companies start doing it too, we must really be getting somewhere.
Moreover, it is precisely these people - not all us right-on greenies - that will ultimately make Mr and Mrs on the Clapham Omnibus do something: not because they necessarily care, but because it will cost them far too much not to.
On the one hand, OpenOffice.org is a powerful and therefore complex program, and so benefits from a little bit of training. On the other hand, the OO.CBT interactive tutorial from Resolvo (free for home users), while quite well done, is written entirely in Flash....
Ah well, you'll just have to make up your own mind on this one. (Via OpenOffice.org Training, Tips and Ideas.)
I've always rather liked the term 'wetware', so I suppose I'm duty-bound to promote something calling itself OpenWetWare:
OpenWetWare is an effort to promote the sharing of information, know-how, and wisdom among researchers and groups who are working in biology & biological engineering. OWW provides a place for labs, individuals, and groups to organize their own information and collaborate with others easily and efficiently. In the process, we hope that OWW will not only lead to greater collaboration between member groups, but also provide a useful information portal to our colleagues, and ultimately the rest of the world.
In fact it's so cool, it offers its content under not one, but two open content licences: CC and GFDL. (Via Public Library of Science - Publishing blog.)
What do you get when you combine OpenSolaris, the GNU utilities, and Ubuntu? Nexenta -- a GNU-based open source operating system built on top of the OpenSolaris kernel and runtime.
Yes, fascinating, but why bother?
12 October 2006
Well, I still don't really know what's going on as far as patents in Europe are concerned. But the Foundation for a Free Information Infrastructure (FFII), an organisation whose judgement I generally trust in these matters, seems happy enough with the latest vote in the European Parliament:
"We're 80% happy with the result" comments Jonas Maebe, FFII board member. "The main unfortunate artefact left in the adopted resolution is the fact that it promotes accession of the EU to the European Patent Convention, which would delegate most patent-related responsibilities to the civil servants of the Commission and member states. Overall our main concerns are however addressed and we would like to congratulate the many MEPs and assistants who very worked very hard for this result."
User-generated sites are becoming increasingly accepted as a viable way of creating information and even money. But there's a huge problem facing any new entrant in this sector. For a site to draw in new contributors, it needs lots of readers; but to gain those readers, it needs good content, which means lots of good contributors.
A good example of this Catch-22 situation in practice is the new Helium: there's nothing wrong with the site, but equally there's nothing special about it either (though the use of peer review is interesting), so it's hard to see it gaining the momentum needed for it to succeed. (Via ContentBlogger.)
A nicely provocative post from OnTheCommons.org, which points out the destructive effect of money on the commons:
What is called "economics" today is the world as seen through the myopic and tendentious lens of money and price. If something is transacted through money it has reality; if not it does not exist. It makes no difference that trees provide shade and neighbors provide comfort; it makes no difference that they serve real needs. They are not sold for money and therefore they do not count. Therefore, the more "the economy" destroys these things – the more it displaces that which is free for commodities that we have to buy for money – the more the economy is growing and the better life is getting, or so we are told.
11 October 2006
Here's the EU and International Atomic Energy Agency trumpeting all sorts of stuff, including the fact that the former's Joint Research Centre:
has developed software which monitors a wide range of open access sources such as news articles, research papers, reports and satellite images.
The ever-perceptive Peter Suber comments:
I'd like to see the EU make the software public and open the source code. I'm assuming it works with a separable database of cues and sources relevant to nuclear non-proliferation, which could remain classified. The software was developed at public expense, has general utility, and could serve another urgent public purpose: accelerating scientific research. It wouldn't be the only text-mining application around, but I'm assuming that the IAEA wouldn't have chosen it unless it had some strengths missing from other packages. The public gains when new tools and access policies make public research more useful than it already is --and OA benefits when new tools give authors and publishers an extra incentive to make their work OA.
For old-timers such as myself, the Eudora email client has connotations of pure Web 1.0-ness. During the 1990s it was more or less the definitive piece of Windows software for sending email if you didn't want to besmirch your name by using Outlook, once that existed. Times have changed, of course, and Thunderbird has now taken over that role.
So the announcement that future versions of Eudora will not only be open source but based on Thunderbird, seems to close the circle nicely. (Via LWN.net.)
One of the ideas that I've been banging on about on this blog is the commonality of the commons - how entirely disparate areas like open content and the atmosphere have much in common. Of course, I didn't invent this meme, and there are plenty of others out there helping to push it. The latest I came across was Michael Geist, with a piece on this idea from a Canadian viewpoint.
In a comment to a previous post about Google's acquisition of YouTube, I muttered something about Google having lots of dosh and good lawyers to see off the inevitable lawsuits that will follow that move. Here's a rather more thoughtful take on that, ending with the following interesting idea:
So I think the YouTube acquisition may well represent a legal opportunity for Google (and the Internet industry generally), rather than a vulnerability. After all, litigation to define the copyright rules for new online services are inevitable -- better to choose your battles and plan for them, rather than fleeing the fight and letting some other company create bad precedents that will haunt you later.
I'm a big fan of Google's Writely online word processor - I now do most of my writing with it. I can't say the same about Spreadly, I mean the online spreadsheet, because it lacks basic features like charts (as far as I can tell). But Google are steadily making improvements - for example, by integrating the Writely and Spreadly file spaces.
10 October 2006
This looks eminently sensible from just about every angle:
Energy suppliers should make it easier for people to generate power for their own homes, the gas and electricity regulator, Ofgem, said today.
Why has it taken so long for the idea of distributed power generation to get going?
I'm a big fan of Flickr, even if I don't have much call to use it. Perhaps one reason for that is that it's a bit of a pain finding stuff: tags are only approximate at the best of times.
I think I might start using it some more thanks to FlickrStorm, a kind of search engine plus:
It works by looking for more than what you enter to find related and more relevant images... Be surprised!
When I gave it a whirl, I won't say I was deeply surprised, but maybe pleasantly so, on the basis of both the images it found, and the rather cool way it displayed them, with a scrollable set of thumbnails on the left that bring up the main photo on the right remarkably quickly. Worth taking a look. (Via OpenBusiness.)