09NTC plenary: Eben Moglen

[Holly Ross:]

Let’s go back to that word “community”. What I think is really interesting is not that we just talk to each other all the time, which we do, which is fantastic, but that this community has such broad representation, and that we all get to play pretty nicely together regardless of where we come from or what we’re trying to do. That spirit of sharing has been something that really drives the community, whether you are… So, you know, in this community we have Blackbaud and Convio, we have Microsoft and open source, we have all these players here in this space, and that’s something that I think contributes to our success.

The other thing that I think is really important about having all those different community members in one space is that we have the chance to have really good, healthy debates. Or as I like to say: “I don’t think it’s a good NTC until at least one fist fight has broken out in the hall.” Not really, though, OK, please. Ideological fist fight – fight with your brains. So, you know, I think that that is really important.

And so, we are really excited today to have another New Yorker from Columbia, Eben Moglen, here to speak today. And Eben is a professor at Columbia Law, he’s clerked for Thurgood Marshall, so you know that his rabble-rousing skills are good, and he is currently the executive director of the Software Freedom Law Center, and we’re going to have a fantastic discussion today about what “open” means for non-profits in the technology space.

I’m thinking that the Internet is still working, Johanna Bates? Confirm, deny? Internet’s working? Thank you! So, hopefully, today you really can submit some questions while we go through today’s plenary, and we’ll have time, about 20 minutes at the end, to get to that Q & A. So that’s the URL you need to go to; again, you need a Google Account to log in and do that, so share with your neighbors. And please join me in giving a very warm welcome to Eben Moglen.

[Applause]

[Eben Moglen:]

Thank you, it’s an honor to be here. I’m very grateful to the organizers for their invitation. I feel – more, I must admit, than usual in my travels – as though I’m among my own. I too run a sometimes struggling non-profit legal practice, and our practice is about the work of non-profit communities, some of them individuals and some of them groups. In the four and a half years that my colleagues and I have been practicing together we have formed more than sixteen non-profits, and we have a client list of about 30 or so non-profit organizations that we have assisted in one way or another. So we are servants of the community, and it’s a pleasure to be here this morning in the community in which I belong.

The most serious problems that confront humanity are about human beings and their intelligence, which oft goes agley, as Robert Burns told us, no matter how good the plans may be. And the most serious tool that we have to confront the problems made by human intelligence is human intelligence. We, collectively, are in the business of maximizing humanity’s ability to use its intelligence to make life better for people, and in doing that, the gravest difficulty that we confront is that all societies since the beginning of human sociality have thrown away most of the brains they had.

Let’s begin with a simple question. How many of the Einsteins that ever existed were allowed to learn physics?

One. Or two, maybe. And that’s the nature of the problem on which we all work, one way or another, in the service that we attempt to perform for humankind.

The primary difficulty of the 20th century was that it discovered extraordinarily efficient ways for people to work in regimented forms, but it made very little progress over where the 19th century left us with respect to the ability to educate every human mind. Among the reasons that 20th century civilization made so little progress – we do, you know, we still throw away almost all the brains – the reason we made so little progress is largely that we treated knowledge as a thing that could be owned, and therefore need be purchased. And no matter what we did to attempt to equalize ability to purchase, we didn’t equalize very much, and most of the children in the world are deprived of the real opportunity to learn – they can’t afford to.

The central problems of the human race therefore depend upon easing the ability of brains to feed – we must stop starving the intellect that gets us out of the messes we think our way into.

To do that, then, we begin, at the end of the 20th century, to imagine reversing the long and complicated relationship between the human race and the idea that knowledge is something that you own. We reverse that course by beginning, once again, to treat – unsparingly and without any degree of forgiveness for the alternative – we begin to treat knowledge as a thing that must be shared in order to be valuable.

Of course, we continue to exist in a world in which it is considered to be acceptable to treat knowledge as a thing that can be owned. The consequence is that there are people who will die because the knowledge of the molecule that will help them not to die is owned knowledge, and someone has secured, for a substantial portion of a human lifetime, the exclusive right to deploy that knowledge, which raises its price, decreases its availability, and condemns some people to extinction.

These are only some of the consequences of the belief that knowledge is a thing that can be owned. And we live now, all of us, and indeed much of the world – soon all of the world – we live in the midst of technology which makes it unnecessary even to discuss the conception of the ownership of knowledge, because it is possible, efficiently, to share.

In a world where everything’s a bitstream, where everything has zero marginal cost, where if you have one copy, you can make a billion copies at no additional expense, the ownership of knowledge is a moral problem. If we could feed everybody by cooking one breakfast and pressing a button, what would the case be, what would the argument be, for charging people more for food than everyone could pay? Of course, we can’t just cook one breakfast and press a button, but we can make one operating system and press a button. We can make a database and press a button. We can make a novel, a film, a poem, a symphony, a dance, or a design for survivable low-cost constructible housing, and press a button.

In other words, in the world that we have made, – the digital world – we have escaped one of the principal reasons that we threw away all the brains we threw away. And – as many of you, in the work you do, are acutely conscious – we have as many children with us now as there were human beings in the generations that preceded our own. All of them, put together. That means we are either about to throw away as many human brains as have ever been thrown away in the whole history of the human race, or we’re about to reverse the flow at the moment where it will do the most good.

This is the context within which we have begun to use technology ourselves in our own lives in a slightly different way. The more we use the technology in our own lives in a slightly different way, the more we bias our activity towards sharing rather than owning, – or even, the more we bias our activity towards sharing rather than doing business with those who claim to own – the more we are establishing the fundamental principle by which we will make a kind of social justice that will attack one of the root causes of human misery: the throwing away of all those brains that wanted to learn and couldn’t.

The free world has a research facility in Bangalore. It’s less than 30m² in extent, it exists in a part of town where 2,200 people live with one toilet, and are glad to have it. And the workers in that research facility are children, playing with computers. They are friends of mine. They write, and paint, and sing, and what they do is helping them to help humanity in ways that they can feel, and see, and touch.

For 70 generations, probably, their families have not been allowed to possess anything. Until a generation and a half ago, they were considered untouchable persons. Now they are merely the poorest of the poor. But not their brains.

I go to them with the same sense of enthusiasm about what I might learn from them that I get when I walk into the research facilities of the great IT companies. I know I’m going to see something neat before I go. But there I know that whatever I see comes with no non-disclosure agreement, and can be shared.

We are all now using what we all have made together. And we can use it more. In doing so, we are not merely making our own businesses cheaper to run, or even more efficient, more pleasant, more simple, more stable, we are also addressing a root issue in the injustices of people towards people, because we are reducing the political and economic might of knowledge that can be owned. This is the primary reason why all of us, regardless of the part of the human condition on which we work, can benefit, not only directly, but theoretically, ethically, politically, from the adoption of technology we made by sharing.

I didn’t come to advertise products, I didn’t come to speak on behalf of clients, we’re all in the same business together. I came to say that we can do what we need to do, every day, all the business that we need to do, all the telephone calls, and messaging, and planning, and delivery, we can do all of it in a way which is calculated to address the basic question of how we allow everyone to learn.

It’s also fun.

So, what are we really talking about doing?

Well, in the first place, we’re talking about teaching people that they don’t actually need software that somebody owns in order to do anything they want to do. There’s always an answer which allows us to get our work done efficiently, and helpfully, and cheaply, and in a way that we consider even a little bit stylish or elegant, without having to support the idea that knowledge is a thing you need to own. We can innovate without having to make the claim that innovation only happens if you deprive some people of its value in order to make it more precious to those who can afford it. And we can begin to do what we do so much in this society, which is to repurpose its wealth.

All of you know that there is hardware out there, at the very moment we are talking, which belongs to companies that are shutting down and people who are being fired. In my university – which tries, I think, to save money – when a computer is regarded as “scrapped”, you can’t even buy it at scrap price, even if it’s the one in your own office. The university considers it a waste of time and trouble to bother having a program to sell used hardware to the workers who use it – the only efficient thing to do is throw it away. And so it goes off into the e-waste stream somewhere, and undoubtedly, though we don’t mean to do it, some of it ends up somewhere in the world being disassembled by a child who does not know that some of it is poisonous.

In the system of owned knowledge more children will harm themselves by disassembling e-waste than will be given the opportunity to change the operating system. In our world, the reverse is true.

So, we repurpose. We repurpose material, we repurpose profit, we repurpose people’s willingness to work, we repurpose the instinct to assist. We are the sector of the economy that really understands what recycling is about. And if you take our ability to repurpose, and you add to it knowledge which is made by sharing, you create assets that would otherwise be wasted, and you drive those assets towards making it possible for everyone to learn. And in addition, of course, things work more smoothly, better, more elegantly and more cheaply – everybody wins. Or rather, everybody except a few. And from the few who have much to learn, much has been heard. But for us, it isn’t really difficult to choose. Everything points us in the same direction. Our principles, our objectives, our dreams, and our daily needs all coincide. They point us in the same direction: Could we make the very technology we use generate the energy which educates the world? Yes, we can. To borrow a phrase.

So, that’s where we live, now. Right on the cusp of that. You know yourselves in your own operations how close you are to being able to make that statement: “Everything I use is generating energy to make learning more easy, and more possible, and more equitable around the world.” You have only a few last lines to cut with the world of owned knowledge, and you will have attained not merely intellectual self-sufficiency, but the actual capacity to make knowledge easier for people all over the world to acquire, and you will have begun educating everyone and you will have eased the human tendency to throw away all the brains. It is, as we tell everybody else, a matter of acting locally and thinking globally – within our own space, just where we live. In enabling the people who do good to use technology more effectively for all the things that must be done, from raising the money to auditing the books, delivering the services, making the plans, everywhere we operate, we can be pushing the great rock uphill.

Unfortunately the alternative is to assist the other side. And so it isn’t the case that we can be neutral in this matter. We are either helping to make knowledge something which can only be shared, or we are helping to continue knowledge as a thing that can be owned. Those who must turn a profit must experience a difficult choice, but we shouldn’t have any conflict at all.

So, if that’s the background, if that’s the moral proposition, if that’s the reason for believing that in the very daily texture of what we do, we can be making yet another difference – and that among the most important differences of all – what are the problems we do need to think about?

The computer – though I have been using computers since I was twelve, though I have been using e-mail as a daily mode of communication since I was fourteen – the computer is a drag. Right? It breaks, becomes obsolete, needs to be fixed, has costs, clutters things up, uses electricity, makes entropy in the form of heat; wouldn’t it be nice if we just didn’t have to have any anymore? And one of the paths that we can go – one of the paths that we can take in our lives – is to use our intelligence to vaporize all the machines around us, and leave ourselves with the thinnest possible client and the thickest possible cloud. There can’t be anybody here who isn’t at least partly attracted by that – it would be so nice to get away from all the stuff one has to do, and let somebody else do it.

How should we think about that? How should we think about the fact that in order to ask me a question, you have to have enrolled yourself in the database of the world’s largest intelligence service? You can’t ask me a question today, unless you’ve decided to let Google know a great deal about you. Is that okay? If I were sitting out there right now, I wouldn’t be able to ask myself a question. [Laughter] And I’d sort of like to keep it that way. [Laughter]

Because, of course, that’s the other side, isn’t it? “How far do we want to share all knowledge, Eben? Isn’t it the case that there’s some we don’t want to share?” Yes, indeed there is. The line between the knowledge we want to share and the knowledge we want to keep to ourselves is a crucial line, but unfortunately not a straight one. I tried to suggest how one might go about answering it, a little bit, by starting where I started. The primary question I have about the knowledge that might be shared is: “Might Einstein use it to learn physics, if he were a child in Trivandrum?” That is to say, the knowledge that can best be shared is the knowledge that helps minds to grow. Knowledge that ought not be shared is knowledge which would give someone power or leverage over a mind that it might be unfair to exercise.

I appreciate Google products very much. They are very clever. They are made by very smart people, many of whom I know quite well, who work very hard and think about the ethics of what they do with great clarity and they talk about it, and I appreciate their discussion. But in the end, the wealth that they use to improve our lives comes from the production of advertising which distracts brains and makes it harder to learn. My browser doesn’t show me advertisements. I don’t like them, and I don’t want them there, I want to think about what I’m thinking about. Maybe your browser does that too.

But if everybody were taught how to use their browsers to remove the advertising, you’d have to do without the Google products, and we’d all have to do without Google. And maybe we’re just going to have to, in the end. Because otherwise we’re going to have to keep the knowledge about how to get rid of the ads in the web secret. And it isn’t a secret; it’s a thing an Einstein might want to know.

About us, however, what we bought yesterday, what we ate yesterday, how we felt about our partner yesterday, maybe not all the knowledge needs to be shared. It’s also not knowledge that needs to be owned, and you would particularly dislike it if it were owned by somebody other than you. Of course, that is what happens when you sign up to ask a question – some knowledge about you becomes knowledge owned by someone else. And they will sell it. Maybe the law will constrain a little bit how they sell it, maybe it won’t, maybe, like most laws, it can be evaded to the extent that is needed, and that’s not very hard.

The European Union has excellent data protection law, or at least it thinks it does, but lots of people in the European Union buy ringtones for their phones, and sometimes they buy a ringtone from a North American corporation. I know at least two North American ringtone sellers whose business model is: “If you click here to buy your ringtone, we’re entitled to do anything we want to do with any personal data of yours we can acquire; oops, all your data just moved from the European Union to North America and is now on sale, because you bought a ringtone.” The Federal Trade Commission could, I suppose, interfere with that business model. Maybe the new Federal Trade Commission will. But the rules about the protection of your data are so sketchy, so plastic, so easily evaded – and it isn’t just yours, of course – all being piped to the one Einstein who knows physics. Right? Or something. And he’s got a business model, and his business model is selling you. Of course, the world’s poor have less to worry about in that way, because they’re not worth selling quite so much in that story. They have less money to spend.

But maybe we ought to ask ourselves about advertising-supported services, whether they aren’t really just another form of sponsorship for the ownership of knowledge. Or rather, that is one model for building cloud-supplied services. Maybe we ought to be using something else. Maybe we ought to be thinking about how to free the cloud. I have some ideas about that and we’re working on them, but I’m sure you have better ones, and if you give them to us we’ll work on them, too. In other words, the principle of thinking about freedom in the architecture of the technology pays real benefits, avoids unintended consequences, creates ways of thinking ecologically about technology – in human as well as physical ecological terms – that would otherwise be difficult to retrofit. That we are already having difficulty retrofitting.

Law school, where I spend a lot of my time, is an awkward sort of place, because people compete there, harshly, to learn how to do jobs which are done by collaborating. I’ve been a solo practitioner much of my life, but one of the things I love about the Software Freedom Law Center, and one of the reasons that I made it, is that I don’t have to practice alone. And I never do. Just in the few minutes I was standing here this morning, as you were eating breakfast, I was checking some things with people in New York, because I don’t like to practice alone if I don’t have to. But my students compete harshly for the right to learn how to collaborate after they graduate, from people who buy their time cheap, and sell it dear, and for whom “teaching them how to collaborate” means “teaching them how to have their brains leveraged”. Fortunately that system is falling apart before our eyes, and as a teacher I am very glad to see it go.

But I mention it here, because for all of us, awareness has long since come to full heat, that without collaboration there is no success – that the purpose of the technology is to make us peers, to give us ways of communicating with one another on terms of equality, and helpfulness, and mutuality, and sharing. That’s what we use the technology for, at the first level of generality, because that’s all we need. People share time, people share money, people share skill, people share passion, and out of that we make a better world. We know that the technologies of collaboration are the technologies that in the end will do best for us.

Out there in the industry – where our colleagues who must make profits, as well as improvements, live – collaboration is not necessarily actually the goal. The architecture of technology in the past 20 years has largely been about the making of platforms rather than communities. You know what platforms are; platforms are sticky things – it’s difficult to fall off. Once you’re in, you can’t get out. Once you buy the music player, you have to buy the headphones from the same guy, ’cause the guy has put the controls for the music player in the headphones, just to keep you on the platform. He might even convince you that you can’t have music without going to him for it. He might even make a phone [Laughter] that you could change, but you’d have to get his permission first. And you couldn’t even share an improvement you had made without getting his permission to give somebody the improvement. Because every one of those moments of human cooperation is an opportunity for leverage of the platform.

The design of technology, in other words, at the very most basic level, assumes things about the nature of social life. If those assumptions are wrong for you, the technology is wrong for you – maybe subtly, maybe seriously, but unquestionably wrong for you. Because you need to collaborate, and the technologist needs the platform to be sticky. To be sure, we benefit now, very largely, from cooperation with businesses who have realized that a sticky platform isn’t necessarily their greatest help. The more elite the business, and the further it is from contact with ordinary human beings, the more likely it is to have begun to experience the benefits of sharing and collaborating in our mode. But at the end of the day, it is not so difficult to tell the difference between fundamental designs that are about platform and profit and fundamental designs that are about community, and sharing, and not throwing away brains.

If you think about the technologies you use, and recommend, and need, and evaluate, and consider, and you begin in your mind to categorize them as about sharing and community and collaboration, or about platforms and profit, you very readily see how easily the lumps can be differentiated, one from the other, and you can begin to think about which heaps you’re built on.

What we need to do, all of us, is to establish for ourselves a little internal bias. It doesn’t have to be complete – it needn’t be 100%. But it’s a bias – we fall, always, in the direction of technology of sharing, collaboration, and community. We fall, always, in the direction away from technology of platforms and profits. This is no surprise, and it’s not a provocative recommendation, in my judgement – it’s simply being who we are. We will experience great advantages, but we will not even live to see the greatest advantages we are creating by doing so. Because the real question that we are agitating at the bottom when we do that is: “Are more Einsteins learning physics now? Are more children able to learn now? Is knowledge less owned, more accessible to those who cannot pay, now?” And we will derive benefit beyond calculation from all those human beings whom we will enable to cooperate with us.

I’m not breaking any news to you when I mention that capitalism is having a little trouble lately. [Laughter] I’m not breaking any news to you when I say that the trouble it is having is the trouble that it has to have, because it has to tell people that they must pursue the benefit of ownership at the expense of the sustainability of others. And I’m not telling you a secret when I say that if you pursue individual benefit at the expense of others’ sustainability, in the end you will have a problem sustaining yourself. What is happening now to the technology of finance is not entirely unrelated to what happens to the technology of the ownership of knowledge in many other respects. What happens to capitalism in the United States or around the world is not entirely unrelated to what happens to the university, or what happens to the school, or what happens to the laboratory, or what happens to you and me.

Our wealth consists in what we share, not in what we possess exclusively. Rarely do we have any stake at all in keeping somebody else from going into our business, serving our communities, helping to deal with “our” questions.

We are almost never to be found on the side of exclusion, but we do business with exclusion constantly. And often on terms with which we would not be so comfortable, if the harm were more immediate. We deal with the owners of knowledge businesses with far more ease in our hearts than we deal with the owners of tobacco businesses or petroleum businesses. We think, when we think about sustainability, as many of us do, in different ways, that we can attain sustainability without discussing who owns knowledge, and I wonder if we’re right.

We are in the business of bringing people what will improve the civilization of human beings for everyone who lives there, yet the ownership of knowledge, which stands in the way of so much of human improvement, is something that we find ourselves constantly compromising about. I am not here to lecture about that. My income, too, depends upon the idea that knowledge is owned to some extent. But I like to think I’m paid to teach, not to hide the ball. I like to think it’s okay for me to say everything I know to anyone who needs it. I don’t close the door of the classroom. I don’t keep the reading off the net. I put the audio up as soon as I know it isn’t going to discourage people from coming to class. [Laughter] Because if I have anything of any value to present, and I’m never sure I do, I’m also never sure the person who needed it was in the room, and you aren’t either.

What we need to do, then, is to ask ourselves: “How much more sharing can we do?” without too much concern about whether it will hurt us, which it won’t. In our work, from day to day, as we begin to design technology we need – if we all put our hearts and minds to it, you know we could be done replacing Raiser’s Edge in six months – [Applause] if we did put our hearts and minds to it, we could be showing everybody we interact with who has a dead computer somewhere – which they all do, everybody we interact with does – we could be showing them that it isn’t waste, it’s useful material. We could be showing people that free telecommunication is possible. We could be showing people that you don’t actually go to jail, or lose your privacy, or anything else if you open your Wi-Fi network, if you know how. We could be providing bandwidth to people, just to prove that bandwidth is a thing that we can share, a lesson we are badly, badly going to need to teach the world unless you really, really want AT&T and Verizon to replace Chrysler and General Motors when they die, next week. [Laughter] Right? We have a lot of power.

We have a lot of power we always dreamed of having. We all wanted to change the world – every single one of us; that’s how we got here. And we can do it. The funny thing is that we can do it pretty easily from where we are. Just a little left to do. Not very much. Each of you knows how little it would take, if only they would let you. And each of you knows what you are stopping your people from doing, that they could do, if you would let them. We’re getting very near now, we’re getting very near. There are industries that are about to topple, and that won’t do us any harm at all.

But maybe we should pause a little before we decide to get rid of all the computers and turn them all over to somebody else. Because that will in fact make it harder for us to do this work. We have a contribution yet to make, and if we take the tools out of our hands and leave them all to be run by somebody else who has a platform to build, then we’ll be making a big sacrifice, for ourselves and for other people, that maybe we shouldn’t make. It’s cheap to store data, you know; that’s not an expensive thing anymore. Why give it to someone else to store? Why let somebody else read the e-mail? “Oh well, they’re not really reading the e-mail, you know? I mean, it’s only a computer reading the e-mail, only for the purpose of making an advertisement.” How do you feel if the National Security Agency is using computers to read all the e-mail just for the purpose of preventing terrorism? For me, that’s not an acceptable arrangement any more than it was 18 years ago when I went to work for Phil Zimmermann on PGP.

We need to ask ourselves whether we don’t at the moment have tools in our hands, in our offices, on our desks, for improving the world that we shouldn’t give up until we finish the job, and we don’t have much left to do. All we have to do is use the stuff that we can share. All we have to do is make it a little better, and share that too; each one teach one – we’ve all been here before. I don’t actually think that the very best thing that we can do right now is to turn ourselves over to the industry and let them handle it for us; I think the very best thing that we can do is replace the industry and prove that we can do it without breaking a sweat. That would be a lesson humanity could use.

We’ve come a very long way, all of you know it, it isn’t just the inflatable penguins, right? We’ve come a very long way. I walked in here at 7:30 and there was nobody here but the penguins, and I thought: “Yeah, well, that was then. [Laughing] This is now.” We’ve moved in.

We’ve moved in. It’s not just my old friend Mr. Stallman, it’s not just Linus, it’s not just the geeks who didn’t really know why they were making it except they had to – it just was a thing that had to be made. That’s like Einstein doing physics, right, you can’t stop people from thinking, you can only stop them from sharing and learning. It’s all of us now. We’re incredibly sophisticated.

When I went to work at the IBM Santa Teresa laboratory in 1979, making mainframe software, there were 300 programmers in a laboratory working three shifts around the clock, and there was less direct-access storage in that entire laboratory than there is in one seating section of this room. Machinery that cost millions of dollars to make at the end of the 1970s fits in your pocket now, and you’re buying it for 250 bucks. What the engineering did was to make it possible for us to change the planet more than the internal combustion engine did – and that engine has changed the planet quite a lot, even though it’s barely improved since the 1920s, really. And most of the improvements that we have made in the internal combustion engine came by putting a computer in it.

We possess tools of such extraordinary power, and we take them so extraordinarily for granted, that we’d really like just to get rid of them, ’cause they’re such a nuisance. And we thirst to turn them off so they won’t keep engaging our brains and we can have some peace and quiet for a change. But beware of that. Beware of that. Because we are, still, not like everybody in the world, and when everybody in the world is surrounded by those machines, it will make a whole lot of difference how we used them; whether we used them to make knowledge freer, or whether we used them to increase privilege at the expense of equality.

We’ve got important decisions to make, and we are the right people to make them. Because we’re the people who care about sharing. Because we live in the part of society, and the economy, which understands that as the basic rule. Because we rarely, if ever, get any benefit from not showing people how it’s done. We should use the technology that fits our civilization, ours. The one where we help one another. The one where we don’t exclude one another. The one where we are peers, not servers and clients, masters and slaves, something up and something down, something big and something little. We want, as much as possible, to model how it is that we can all think together, and we want to maximize inclusiveness, because we are the first beneficiaries of all those brains that we can save.

We are keeping dinner warm until the kids get home. And when they get here – all those hundreds of millions of them, whose lives we helped to improve because we made the technology that allowed them to learn – I for one will be very glad to let them tuck me in.

But I wouldn’t want to have to account for why we didn’t take the chances that we had now. I wouldn’t want to have to account for why it was that we made what looked like simple compromises at the time to save a little time, to save a little weight, to save one more head, in the count, when what we were really doing was deciding whether we were going to help to make the world of collaboration and sharing, or whether we were going to build ourselves a little extension on the platform.

We ought to help people make what we enjoy, which is the organizations that can serve. We ought to make that possible by laying down every kind of infrastructure for every kind of organizing without organizations. We ought to do it by making it more and more feasible

for people to learn and to act together. We need to move from how to present our content, which we have now learned, most of us, at least a little bit about, to how to help people communicate more effectively, more equally, without intermediaries, because it’s in the intermediation that the leverage grows, the platforms come to exist, and the bottlenecks on which profits can be made are institutionalized.

I thought, some while ago, that I had learned something new, and that I was very smart for learning it. I thought I learned that profit made bad technology, inherently. Because, I thought – “How creative I am,” I thought, while thinking it – they have to make a profit, they don’t have to make a good technology, so when there’s a choice, they make a profit, and the technology’s not so good. And I looked around the world and I thought: “Yeah, that must be right, ’cause look at all this stuff.” And then I found myself – for some other project entirely, in another side of my life – reading Rosa Luxemburg, and there in the collected papers of Rosa Luxemburg, in volume 5, was a little essay on capitalism and technology, and it said: “The profit motive makes lousy technology, because when you have to choose between making a profit and making good technology, you make the profit; and in a world of socialism the technology would advance at a rate that would make capitalism seem like child’s play.” And I thought: “Sharing is better than owning.” But they didn’t let her share, you know.

So, we’re not just the non-profit sector. We’re the real breeding house of the greatest of technologies. We’re the place where the research can really be done. We’re the place where we can actually attend to what we would do if we didn’t have to make money doing it, right? That’s what we do all the time, that’s every day. We don’t want to think of ourselves as where the great technology comes from; that would be pressure, and, moreover, who’s funding us for that? But we are. Because we’re the place where you measure it by whether it makes human life better, not by whether it makes a bigger profit. Right? That’s what we keep trying to tell people.

We should live our principles. “The world would be a better place,” we say, and we’re right. We say “We do this because it makes our lives better too,” and we’re right. We say “We do this because it can teach every person that one person can make a difference,” and we’re right. Most of us say “I wouldn’t do it any other way,” and we’re right again.

So let’s just do it. Let’s just make knowledge a thing we share instead of something somebody else owns. Let’s just pick up the tools we’ve already got. Let’s go and find some more lying under people’s beds, and around the corner, and in the waste can, and belonging to bankrupt banks and auto companies and other surplus sellers.

You do all buy your hardware on eBay, right? I needed a web server this month, because the wikis that run my courses get kind of heavy in traffic towards the end of the term, when everybody’s trying to do all their collaborating at the last minute, and I, you know, I needed some light heavy iron; a, you know, small low-end IBM server. I bought it from a guy in Mississippi on eBay; I paid $9.99 for it, [Laughter] 40 bucks shipping. [Laughter] 80 bucks for 2 gig of memory, moved the hard drives from one to the other, and pressed the “On” button. That was it.

Think you could do that with the proprietary guys? Think anybody who really wants to teach you how it’s done works for Intel? Not likely.

What we need to do is just show people how easy it is, right? We need to live it out and let them see it; we need to put glass walls around the kitchen so people can learn to cook, our way. We’re the research facility for how to do it our way.

You all know how. Now you’ve just got to teach some people.

See, we’ve been trying to do this for a thousand years. We’ve been trying to make it possible for everyone to become who she and he were meant to become. We’ve been trying to make it possible to reach what we were reaching for: From each according to his ability. They made it sound bad somehow, but it’s what we were trying to do.

We’ve been doing it a long time; people paid a heavy price for the little progress that they made. We get a chance to take a giant step; we’re like them, we’re still struggling for it, we’ve given our lives to it. The difference is, this time, we win.

Thank you very much.

[Applause]

[Holly Ross:]

Let me… I forgot mine. I forgot a handheld mike. [Laughing]

Thank you so much for the big ideas, and thank you guys for drinking enough coffee to keep up. So we’re going to have a seat and do the Q & A as well; I’m going to ask the first question back here though, since I forgot my mike.

And the top question here right now is: “How should the people who produce knowledge – whether designs, novels, or software – be paid, then?”

[Eben Moglen:]

Well, you know this is of course a question I hear a fair amount. And I think it comes from a confusion. And it’s an odd confusion, in this room, because it makes a distinction which maybe we don’t need to make, and introduces one that we shouldn’t have.

People get paid, voluntarily, because we love what they do.

I know that’s true; I pay people because I love what they do, and so do you. We pay – and would be happy to pay more, if we didn’t think that someone else was in the way – for things of beauty and utility all the time.

What we are losing is the ability to force people to pay. We are losing the coercive distribution system which says: “I won’t give this to you until you pay me for it first.”

We’re losing that because people can’t manage to make it work anymore;

that’s what it means to be a music company or a movie company in the present world; it’s the distributors who have a problem. Their model was: “We make a profit as distributors because you can only get this from us.”

And by forcing people to pay, right, they made their business model work.

We can’t force people anymore, but we can ask them.

And the reason that shouldn’t be a surprise in this room is: that’s how you do it. You ask people to pay. You tell them “It’s worth it. You love what we do, you care about it, you have passion. Pay us for it.”, and they do.

You shouldn’t be worried. This shouldn’t be your question, this isn’t your fight, you won it already. The people who are worried are the people whom people wouldn’t voluntarily pay, and that’s not you.

We’re going to have every bit as much creativity as we had before Edison; Edison was the guy who made it possible to put the thing in a can and sell it like a product in the store. There wasn’t any absence of music before there were recording companies. Musicians got paid poorly then and they get paid poorly now. [Laughter] The difference was: there were no recording companies stealing from them then, and there are now.

The issue of “How will people get paid?” is agitating our friends on the other side. But I don’t think it needs to agitate us so much. People pay for what they love. Make what they love; they’ll pay you for it.

I have an odd business model. I don’t charge clients. I charge people, voluntarily, who make a lot of money in technology, to provide lawyering services for people who don’t make any.

And I haven’t fired any lawyers yet. And I’m an unusual law firm in New York City because I haven’t.

[Holly Ross:]

So, I think what you’re saying there is that it’s not the product that people are… Well, it’s that we all now have access to distribution. And we see that, I think, particularly in the newspaper industry, right? People aren’t paying for the distribution channel anymore. It’s not that they don’t want the news, or the knowledge of the news; they just don’t want to get it via that distribution channel. They can get it all through, you know, blogs and other news sources now.

[Eben Moglen:]

Well, with the exception, of course, that I turn on the radio at five o’clock every day and National Public Radio is still there, making the finest electronic journalism in the United States, and they’re still being supported by people who voluntarily pay.

[Applause]

[Holly Ross:]

Yeah, I agree. I agree. But I think one of the questions that we have is that there is an assumption that, you know, if you go to a distribution channel, like the San Francisco Chronicle, the people who are reporting there are reporting better, because they’re paid for it. So what I want to ask you is: if we remove this distribution channel, if you can get your journalism anywhere, and we’re collaborating around newsmaking, is that actually making journalism better?

[Eben Moglen:]

Let’s imagine two things about journalism. I didn’t realize we were going to talk about journalism this morning, but let’s imagine two things about it. Let’s ask, first: Of all the great investigative reporting that newspapers ever paid for, how much did the publishers spike because it made a problem that they didn’t want to have? Right? The Washington Post didn’t want the Pentagon Papers, and The New York Times didn’t want Watergate. And those are the two they love to talk about, and yet 50% of them would have passed, each time.

Let’s ask another question: How many reporters, compared to how many people selling advertisements? And: Where was the love, anyway?

You know what’s going to happen to the raw material of journalism, right? We’re going to have RSS feed of everything useful out of every place useful, every firehouse, every police station, every city council meeting, we’re going to aggregate that data on the fly for you on your handheld, in your laptop, on your telephone in any way you care to want. You’re going to shape your information stream, and then what you’re going to want is somebody to tell you what it means.

Which means there’s going to be an embedded reporter in your community, with access to all the information you have, whose job it is to help you understand it.

And when I talk to journalism students – and I do – and they are frightened – and they are – and hostile to “free” – and they are becoming that, because the royalists who own the press are staging this grand royal funeral and asking us all to mourn – but when I talk to journalism students I say: “Well, what would you like? To live embedded in a community for the rest of your life and explain to people what’s going on there, and have them care for you and look after you and pay you? Or would you like to work for Rupert Murdoch?”

“Well, we’d like to work for Rupert Murdoch ’cause the pay is steady,” you know? “Could we really rely on a community to pay us adequately?” I don’t know, there are a lot of guys who had great white-collar careers wondering if they were going to get a General Motors pension after all. And working for Rupert Murdoch doesn’t really look all that good, you know, ’cause he’s Rupert Murdoch, [Laughter]

and he’s in charge.

So, I think that the answers are evolving out of the technologies of sharing; I do believe that. I think the way that journalism is going to change is by getting closer to the community, not further away; it’s another disintermediation going on, and the folks being disintermediated are just as ugly as the intermediaries usually are. The only thing is: they own the media and they make themselves look pretty. And they want us to be very sad because they’re dying. And I’m not.

[Holly Ross:]

Okay, so… [Laughing] [Laughter] [Applause] Let me ask you another question: Is profit evil?

[Eben Moglen:]

No, but it’s not hard to do evil if you’ve got your eye on profit at the expense of all the other things you should have your eye on.

[Holly Ross:]

Okay.

[Eben Moglen:]

It’s not e… You know, right?

No kind of overfocus is evil in itself, it’s the evil that we do because we weren’t looking that hurts so many people so much of the time.

[Holly Ross:]

All right, so one of the folks out in the audience asked: Through your examples of Google and Apple and others, you imply evil motives, which I think is the profit motive – that they’re only looking at profit at the exclusion of other things. How do you balance accessibility and collaboration with capitalism, then? Or, do you just think that capitalism is flawed? So, you have a bold…

[Eben Moglen:]

Well, this is… Look, I mean, the isms aren’t the problem.

Let’s assume all of these people are working in something called “social capitalism” or “capital socialism”, or whatever it is we live here, you know, – it’s a pretty socialist place, right, our government owns most of all the banks at the moment, and all of the defense establishment – but whatever it is that we live in, it has some problems, and all of us work on them all the time.

Apple and Google, you can distinguish one from another, but you can say that it isn’t profit that’s the problem, or money that’s the problem, it’s the love of money, right? Radix malorum est cupiditas. The problem of Apple is selfishness, and the selfishness is Steve’s. The problem of Google is not that. The problem of Google is that in order to do the work it’s going to do to make the money it’s going to make, it wants to know what’s inside the head of everybody on Earth. And that’s a perfectly okay thing for them to want, and it’s perfectly okay for us to wonder whether we ought to let them have it.

But where it tends to go wrong first is in the decision that you can’t share it, because your best way of making money from it is to own it, and to keep other people from it. That may be okay about real estate, but is it okay about knowledge? That may be okay about diamonds, but is it okay about culture? That may be all right when what it means is that the poor don’t have a Lexus, but is it okay when it means the poor don’t have physics?

And, so I think what we really need to do isn’t to concentrate on the question of good or evil or capitalism or socialism, I think what we ought to do is what we tend to do day to day in our work, which is to ask: “What does this mean to the people we care about?” And in particular: “What does this mean about the people we care about who have trouble taking care of themselves because the world isn’t organized to let them?”

[Holly Ross:]

So you feel like we can achieve all the things that you’re talking about whether we’re operating under capitalism, any kind of ism, it’s all possible for us?

[Eben Moglen:]

Yeah, I think I probably hinted at the possibility that capitalism might occasionally get in the way, as it occasionally helps out.

[Holly Ross:]

I did get that message.

[Eben Moglen:]

But the same could be said of sharing; it gets in the way too. All of us who do the work of collaboration are well aware of why it was that Oscar Wilde said that the problem of socialism is that it takes up too many evenings. [Laughter] Right? I mean, it’s hard to do the work we do by sharing and collaborating.

It’s a drag, every day I go in there and I think to myself: “God, if I just were the emperor of this place, everything would be perfect; I could take care of it in an instant, it’d be over by lunchtime.” Right? Which is the way profit tends to think, right? Just: “Everybody else get out of the way. All other businesses sicken and die, and when I have achieved monopoly everything will be great.” You see how well it worked. Right? I mean, we get the software we get from that; You know how beautiful it is. Right?

The right way, I think, to think about it is by asking: “What lies along the line of our major objectives? What is it that fits our intentions and our goals?” I could go and talk to other people who have different needs and different passions in the world and give reasons why they shouldn’t care about this as much as we all do. They have other proposals about how to spend their time; I don’t want to put them in jail, I don’t want to indict them, I don’t think they’re evil.

But for us the issue should be pretty clear.

[Holly Ross:]

You also talked a little bit about platforms; you see the platform being a place where… You create a platform because you want people to stay there, you don’t want them leaving your ecosystem, so they consume more. I think that there are some open source tools – software tools – that are increasingly looking a lot like platforms, particularly around CMSs; Joomla, Drupal, etc. – you know, they are more than a content management system at this point. Do you feel that you would rethink the platform model, in that context?

[Eben Moglen:]

Now, I have to say that I do try and follow a general rule of speaking carefully about clients. And Joomla, and Drupal, and Plone are clients, and so I’m going to speak a little carefully. But that will at least tell you that I certainly don’t think that the problem is with them.

[Holly Ross:]

It’s good to know you mince your words about something.

[Eben Moglen:]

[Laughter] I appreciate that, thank you very much. But I wouldn’t take Steve as a client even if he wanted; no, never mind. [Laughing]

Look. Content management systems feel to me like a mixed story. The good news about them is: they’re content management systems, and the bad news about them is: they manage content. Right? I mean, it all depends on how people arrange things. You can use a content management system to destroy people’s freedom to communicate, and most corporate communications are organized around that proposition. You can also use those frameworks to make possible things that would be very difficult, if not impossible, any other way.

What we’re not troubled by, in that world, is a deliberate lock-in meant to sell the product. If people need something to happen in those content management systems, to increase their flexibility, to allow them more portability, to make it easier for them to interoperate with other things they also want to use, it will get done. And the more people who have the itch, the more efficiently it will be scratched.

If you read Mini-Microsoft, the blog where Microsoft workers complain, you will see, often, a statement of the form: “You know, my partner and I, we work together in department such-and-such, and there was a serious bug in Windows 98 and we fixed it, but it didn’t meet the feature cutoff for Windows 2000, and it’s still unfixed in XP.” I’m not even going to talk about the other ones, right? The business lost interest. It transferred the guys out of the department. It never fixed the bug. Was it important? Sure it was important. Did it get done? No. The business rationale was for something else.

There are of course defects in software – there are defects in the software we make, both defects in execution and defects in design – but there’s no defect in intention. The intention was to make a thing that people could share to solve their problems. If it’s got a problem with it, the only problem is it isn’t good enough yet.

I’m not worried that the systems that totalize our environment for us that we can understand and change and modify and share are going to control us. And I have immense respect for what it is that the content management systems have allowed people to do by way of making the web a place where their work can get done.

[Holly Ross:]

So, we have a new word for knowledge, in the last decade: “Intellectual property”. If you could rename that, and rethink how it works, what would that look like?

[Eben Moglen:]

Free speech.

I tried, for a long time… [Applause]

[Holly Ross:]

Okay. [Laughing]

[Eben Moglen:]

I tried for a long time to work out in a technical way a replacement for the system of copyright, made entirely out of principles of free speech. The goal was to take every usable and positive phenomenon of copyright and reexpress it using only free speech principles. I began with two: The right of authorship – that is to say, the right to put your name on a thing you made – and the right of anonymity – that is, the right to take your name off a thing which has been modified to the point at which you no longer consider it yours. And I wanted, in the usual sort of principle of elegance, not to add a third principle until I had to.

And about 95% of all the good that copyright ever does can be, in my judgement, dealt with by rules about “You have a right to put your name on” and “You have a right to have your name taken off”.

Those are what lawyers call liability rules – that is to say, if somebody gets in the way of your doing what you have a right to do you have a claim for damages. They’re not property rules, they’re not based on ownership, they’re not based on excluding people from a thing.

The power to exclude is the essence of the difficulty. And to take the intellectual property system – which uses the word “property” not really as an actuality, but as a metaphor – to take the word “property”, which is about exclusion, and add it in, is really to say that what you have is a machine which works because some people can’t have it.

To reexpress all of that in terms of free speech, to talk about everybody’s right to it, will not actually change the outcomes as much as it will change the philosophy, and you can understand why the goal is not to change the philosophy in some quarters.

[Holly Ross:]

You don’t have a Google Account.

[Eben Moglen:]

As it happens.

[Holly Ross:]

As it happens. [Laughing] One of the issues with… We want to create change, in how we think about how our data is protected, who gets to see our data, you know, all that kind of stuff. One way of making sure your data’s protected is just opting out of the system. But does opting out help put the pressure on Google to change the way that they behave? And how…

[Eben Moglen:]

No. No, I don’t think they need me, actually.

[Holly Ross:]

Yeah, I’m pretty sure they’re doing okay. But, [Laughing] how then, if opting out isn’t the key, what is the key to helping create those user agreements that are amenable to our needs, that may be customizable to our personal situations? How are we going to create that change?

[Eben Moglen:]

Well, we’re going to do two things, I suspect. One of which is the outgrowth of the thing I was saying about the server I bought.

Hardware’s cheap in the world, and it’s going to stay cheap in the world for a decade now.

We’re going to assemble a free cloud. We’re going to put it in a place where energy is cheap. We’re going to put it in a place where education levels are high. We’re going to put a bunch of hardware which we can assemble at very low cost the same way our friends in Mountain View assemble theirs, and what we’re going to do is we’re going to run free software there. And we’re going to offer services there in a way which is going to allow people to know that they are sharing, rather than owning, in the cloud, and that the cloud is improving itself on the basis of the sharing.

The other thing we’re going to do is we’re going to finish the encryption revolution – the one that the other guys out there can’t seem to finish, so that they lose every National Health Service patient in England, or a tape full of every pension account in the Air Force, or whatever it is. And we’re going to make it safe, in a minimalist sense, to keep our data on other people’s computers – a thing we don’t do now because it isn’t safe.

I’m certainly not going to put client information in the cloud, not the cloud architected the way it is now. I wouldn’t run my law practice’s mail through Google. I mean, I can’t; in my judgement, I’m not preserving privilege at all if I am letting other people’s computers read the data generated in the course of my talking to my clients.

We are going to have to address some issues of social responsibility not being addressed by industrial operations. And I don’t mean by this to say that Google is careless – Google is not careless. Google thinks very carefully, and wants to do the right thing. But it’s not okay to just say “Well, Google wants to do the right thing”, we have to do the right thing ourselves, as a community.

I think that’s coming. I think the global economic situation is helping it to arrive. I am doing some work at the moment which is specifically related to it.

We have some international development projects that we need to do. When they are over, there will suddenly be assets there of great value that we can all use together, which will much improve our situation.

This will seem surprising to people; “How could he be talking about hundreds of billions of dollars’ worth of assets emerging out of nowhere?”. And yet, of course, the software did, and is already hundreds of billions of dollars’ worth of assets, and we all made it by sharing, and our rich corporate friends are busy helping to make money with it.

[Holly Ross:]

But how are we going to get them to change the user policy?

[Eben Moglen:]

Oh, we’re not. They’re not going to change the user policy, not for us, we don’t have enough clout. We’re going to make them afraid. We’re going to say “Your business doesn’t have to exist.”. We’re not going to negotiate with them from a position of “We supplicantly wish that you would make your user policies better” – they won’t.

They won’t. And so, we’ll replace them. And then they’ll change.

[Holly Ross:]

So, we’re going to take down Google? That’s the only answer?

[Eben Moglen:]

If you put Adblock in your browser, it won’t be hard, right?

[Holly Ross:]

There’s no way that Google would ever change their user licensing?

[Eben Moglen:]

Yes, they’ll change their user licensing a little, but they’ve got shareholders, remember? They can’t do what you need, they have to do what their shareholders need. And they’re right to do that, because they’re Google. It’s not their fault that they’re Google, it’s their glory that they’re Google. But it’s my glory that Adblock Plus is in my browser. And if Adblock Plus is in your browser, then Google has to think about changing.

[Holly Ross:]

Okay, one more question here from the crowd: How do we protect ourselves from our surrounding software controlled by others – e.g. utilities?

Just putting it out there, that’s what it said.

[Eben Moglen:]

And it’s a good question.

In 2004, in Berlin, I gave a little talk about the course of freedom, and I said: “Free software, free hardware, free culture, and free bandwidth, and in that order.”, and I think probably that’s still correct. Free software: mostly done, with a little left. Free hardware? Well, that, in 2004, was about DRM and lockdown. And you notice that things have improved. I said in Berlin in 2004 that the war for free hardware was going to be short, sharp, and decisive, and I think that’s right – DRM is dead (don’t tell Steve).

So, now we are in two harder places: Free culture; the Murdochs of the world are staging, as I say, a grand funeral designed to keep us weeping so hard that we won’t finish the free culture job, and after that are the utilities, that’s correct. It’s the infrastructure, it’s the bandwidth.

Once again, think globally, act locally. Want to fix the utility problem? Switch to VoIP. Asterisk is a client of the Software Freedom Law Center for a reason, because when Verizon comes to kill it I want to be there, standing in the way. [Applause] Right? But if you are disappearing from the telecommunications network of the world, you are making your point to the utilities suppliers.

Everywhere you can substitute a Wi-Fi thing for a 3G thing, you are making your point to the utilities suppliers. The public internet is our resource; if we allow the network operators to begin replacing the public net with end-to-end proprietary networks, we are giving up leverage which we must have for the reason that the questioner shrewdly points out.

Our decisions will count, not because it is our money that they need, but because we would be setting a terrible example to the world if we prove that they are unnecessary, except as wholesalers. And they are unnecessary, except as wholesalers; all we want to buy is packet movement, we don’t want to buy anything else. You can’t charge us for bottled water; we’ll just use what comes out of the tap, thank you. [Applause]

[Holly Ross:]

Well, I want to congratulate you for being the most pessimistic optimist I have ever met. [Laughing] I really enjoyed today; thank you so much for helping to elevate our discussion around not just what work we do, but how we do it. And thanks again, Eben, I really appreciate it.

[Eben Moglen:]

Thank you Holly, it was a pleasure. [Applause]