The Ongoing War on Computing; Legacy Players Trying to Control the Uncontrollable

[Source: TechDirt, by Mike Masnick, January 4, 2012]

I don't think I've ever had so many people recommend I watch the same thing as have pointed me to Cory Doctorow's brilliant talk at the Chaos Communication Congress in Berlin last week. You can watch the 55-minute presentation below... or, if you're a speed reader, you can check out the fantastic transcript put together by Joshua Wise, which I'll be quoting from:

The crux of his argument is pretty straightforward. All of these attempts to "crack down" on copyright infringement online - things like DRM, rootkits, three strikes laws, SOPA and more - are really just forms of attack on general purpose computing. That's because computers that can run any program screw up the kind of gatekeeper control some industries are used to, and create a litany of problems for those industries:

 
By 1996, it became clear to everyone in the halls of power that there was something important about to happen. We were about to have an information economy, whatever the hell that was. They assumed it meant an economy where we bought and sold information. Now, information technology makes things efficient, so imagine the markets that an information economy would have. You could buy a book for a day, you could sell the right to watch the movie for one Euro, and then you could rent out the pause button at one penny per second. You could sell movies for one price in one country, and another price in another, and so on, and so on; the fantasies of those days were a little like a boring science fiction adaptation of the Old Testament book of Numbers, a kind of tedious enumeration of every permutation of things people do with information and the ways we could charge them for it.
 
[[355.5]] But none of this would be possible unless we could control how people use their computers and the files we transfer to them. After all, it was well and good to talk about selling someone the 24-hour right to a video, or the right to move music onto an iPod, but not the right to move music from the iPod onto another device, but how the hell could you do that once you'd given them the file? In order to do that, to make this work, you needed to figure out how to stop computers from running certain programs and inspecting certain files and processes. For example, you could encrypt the file, and then require the user to run a program that only unlocked the file under certain circumstances.
 
[[395.8]] But as they say on the Internet, "now you have two problems." You also, now, have to stop the user from saving the file while it's in the clear, and you have to stop the user from figuring out where the unlocking program stores its keys, because if the user finds the keys, she'll just decrypt the file and throw away that stupid player app.
 
[[416.6]] And now you have three problems [audience laughs], because now you have to stop the users who figure out how to render the file in the clear from sharing it with other users, and now you've got four! problems, because now you have to stop the users who figure out how to extract secrets from unlocking programs from telling other users how to do it too, and now you've got five! problems, because now you have to stop users who figure out how to extract secrets from unlocking programs from telling other users what the secrets were!
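
(A quick editorial aside to make that bind concrete: here's a toy sketch, in Python, of the shape Doctorow is describing. It's mine, not his, and every name in it - the embedded key, the rental check, the fake cipher - is a hypothetical stand-in for what a real DRM scheme does. The structural point is the one in the quote: for playback to happen at all, the unlocking key and the decrypted bytes both have to be on the user's machine, where the user can get at them.)

```python
# Toy illustration only - not Doctorow's code and not a real DRM scheme.
# A "protected" track is encrypted, and an approved player holds the key
# plus a policy check (here, a 24-hour rental). Because the player runs on
# the user's machine, the key and the decrypted bytes are on that machine
# too - which is where problems two through five come from.

import hashlib
import time
from itertools import cycle

SECRET_KEY = b"key-shipped-inside-the-player"  # hypothetical embedded key


def _toy_cipher(data: bytes, key: bytes) -> bytes:
    """Reversible stand-in for a real cipher (XOR against a hashed key)."""
    stream = hashlib.sha256(key).digest()
    return bytes(b ^ k for b, k in zip(data, cycle(stream)))


def package(track: bytes) -> bytes:
    """What the store does: lock the file to the approved player."""
    return _toy_cipher(track, SECRET_KEY)


def approved_player(locked: bytes, license_expiry: float) -> None:
    """Renders the track only while the rental window is open."""
    if time.time() > license_expiry:
        raise PermissionError("rental expired")
    plaintext = _toy_cipher(locked, SECRET_KEY)  # decrypt in memory...
    print(f"playing {len(plaintext)} bytes")     # ...and render it


def rip(locked: bytes) -> bytes:
    """What a user who has extracted the key can do, policy checks and all."""
    return _toy_cipher(locked, SECRET_KEY)


locked = package(b"pretend this is an audio track")
approved_player(locked, license_expiry=time.time() + 24 * 3600)
print(rip(locked))  # the plaintext, saved, with the player app thrown away
```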
 
From there he goes on to put together a fantastic explanation of how a confusion over analogies, rather than (perhaps) outright cluelessness (or evilness), explains why bad copyright laws keep getting passed:
 
It's not that regulators don't understand information technology, because it should be possible to be a non-expert and still make a good law! M.P.s and Congressmen and so on are elected to represent districts and people, not disciplines and issues. We don't have a Member of Parliament for biochemistry, and we don't have a Senator from the great state of urban planning, and we don't have an M.E.P. from child welfare. (But perhaps we should.) And yet those people who are experts in policy and politics, not technical disciplines, nevertheless, often do manage to pass good rules that make sense, and that's because government relies on heuristics - rules of thumb about how to balance expert input from different sides of an issue.
 
[[686.3]] But information technology confounds these heuristics - it kicks the crap out of them - in one important way, and this is it. One important test of whether or not a regulation is fit for a purpose is first, of course, whether it will work, but second of all, whether or not in the course of doing its work, it will have lots of effects on everything else. If I wanted Congress to write, or Parliament to write, or the E.U. to regulate a wheel, it's unlikely I'd succeed. If I turned up and said "well, everyone knows that wheels are good and right, but have you noticed that every single bank robber has four wheels on his car when he drives away from the bank robbery? Can't we do something about this?", the answer would of course be "no." Because we don't know how to make a wheel that is still generally useful for legitimate wheel applications but useless to bad guys. And we can all see that the general benefits of wheels are so profound that we'd be foolish to risk them in a foolish errand to stop bank robberies by changing wheels. Even if there were an epidemic of bank robberies, even if society were on the verge of collapse thanks to bank robberies, no one would think that wheels were the right place to start solving our problems.
 
[[762.0]] But. If I were to show up in that same body to say that I had absolute proof that hands-free phones were making cars dangerous, and I said, "I would like you to pass a law that says it's illegal to put a hands-free phone in a car," the regulator might say "Yeah, I'd take your point, we'd do that." And we might disagree about whether or not this is a good idea, or whether or not my evidence made sense, but very few of us would say "well, once you take the hands-free phones out of the car, they stop being cars." We understand that we can keep cars cars even if we remove features from them. Cars are special purpose, at least in comparison to wheels, and all that the addition of a hands-free phone does is add one more feature to an already-specialized technology. In fact, there's that heuristic that we can apply here - special-purpose technologies are complex. And you can remove features from them without doing fundamental disfiguring violence to their underlying utility.
 
[[816.5]] This rule of thumb serves regulators well, by and large, but it is rendered null and void by the general-purpose computer and the general-purpose network - the PC and the Internet. Because if you think of computer software as a feature, that is a computer with spreadsheets running on it has a spreadsheet feature, and one that's running World of Warcraft has an MMORPG feature, then this heuristic leads you to think that you could reasonably say, "make me a computer that doesn't run spreadsheets", and that it would be no more of an attack on computing than "make me a car without a hands-free phone" is an attack on cars. And if you think of protocols and sites as features of the network, then saying "fix the Internet so that it doesn't run BitTorrent," or "fix the Internet so that thepiratebay.org no longer resolves," then it sounds a lot like "change the sound of busy signals," or "take that pizzeria on the corner off the phone network," and not like an attack on the fundamental principles of internetworking.
 
The end result, then, is that any attempt to pass these kinds of laws results not in building a task-specific computing system or application, but in deliberately crippling a general purpose machine - and that's kind of crazy for all sorts of reasons. Basically, it means having to put spyware everywhere:
 
[[1090.5]] Because we don't know how to build the general purpose computer that is capable of running any program we can compile except for some program that we don't like, or that we prohibit by law, or that loses us money. The closest approximation that we have to this is a computer with spyware - a computer on which remote parties set policies without the computer user's knowledge, over the objection of the computer's owner. And so it is that digital rights management always converges on malware.
 
[[1118.9]] There was, of course, this famous incident, a kind of gift to people who have this hypothesis, in which Sony loaded covert rootkit installers on 6 million audio CDs, which secretly executed programs that watched for attempts to read the sound files on CDs, and terminated them, and which also hid the rootkit's existence by causing the kernel to lie about which processes were running, and which files were present on the drive. But it's not the only example; just recently, Nintendo shipped the 3DS, which opportunistically updates its firmware, and does an integrity check to make sure that you haven't altered the old firmware in any way, and if it detects signs of tampering, it bricks itself.
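
(As an aside, the mechanism there is worth spelling out. Reports on the Sony incident described a rootkit that hid any file or process whose name began with "$sys$" by interposing on the system's enumeration calls. The sketch below is a deliberately simplified, hypothetical illustration of that general trick - the process names are made up - showing how a hooked listing function makes the system "lie" to every tool that asks.)

```python
# Toy illustration of rootkit-style hiding - not the actual Sony/XCP code.
# The idea: hook the routine that enumerates processes (or files) and drop
# entries matching a magic prefix, so nothing that relies on that routine
# ever sees them.

HIDE_PREFIX = "$sys$"  # the prefix reportedly used by the Sony rootkit


def real_process_list() -> list[str]:
    """Stand-in for the kernel's actual process table (names are made up)."""
    return ["explorer.exe", "$sys$drmserver.exe", "winamp.exe", "$sys$watcher.exe"]


def hooked_process_list() -> list[str]:
    """What Task Manager sees once the enumeration call has been hooked."""
    return [p for p in real_process_list() if not p.startswith(HIDE_PREFIX)]


print(hooked_process_list())  # ['explorer.exe', 'winamp.exe']
```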
 
[[1158.8]] Human rights activists have raised alarms over UEFI, the new PC bootloader, which restricts your computer so it runs signed operating systems, noting that repressive governments will likely withhold signatures from OSes unless they have covert surveillance operations.
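
(Another illustrative aside: the worry is about a boot chain that only hands control to operating system images signed with keys the vendor shipped. Real UEFI Secure Boot uses X.509 certificates and public-key signatures; the sketch below is a hypothetical, dependency-free stand-in that uses an HMAC, just to show the shape of the policy.)

```python
# Toy illustration of a signed-boot policy - not UEFI's actual interface.
# Firmware verifies the OS image against keys it shipped with; the machine's
# owner has no say in what the allowlist contains.

import hashlib
import hmac

VENDOR_KEYS = {b"approved-os-vendor-key"}  # hypothetical keys burned into firmware


def sign(image: bytes, key: bytes) -> bytes:
    return hmac.new(key, image, hashlib.sha256).digest()


def firmware_boot(image: bytes, signature: bytes) -> None:
    for key in VENDOR_KEYS:
        if hmac.compare_digest(sign(image, key), signature):
            print("booting signed OS")
            return
    raise SystemExit("unsigned OS: refusing to boot")


blessed = b"vendor-approved OS image"
firmware_boot(blessed, sign(blessed, b"approved-os-vendor-key"))  # boots

homebuilt = b"an OS you compiled yourself"
firmware_boot(homebuilt, sign(homebuilt, b"my-own-key"))  # refuses to boot
```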
 
[[1175.5]] And on the network side, attempts to make a network that can't be used for copyright infringement always converge with the surveillance measures that we know from repressive governments. So, SOPA, the U.S. Stop Online Piracy Act, bans tools like DNSSEC because they can be used to defeat DNS blocking measures. And it blocks tools like Tor, because they can be used to circumvent IP blocking measures. In fact, the proponents of SOPA, the Motion Picture Association of America, circulated a memo, citing research that SOPA would probably work, because it uses the same measures as are used in Syria, China, and Uzbekistan, and they argued that these measures are effective in those countries, and so they would work in America, too!
 
[audience laughs and applauds] Don't applaud me, applaud the MPAA!
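
As an aside on the DNSSEC point above, it helps to sketch the mechanics of the collision. DNSSEC exists so a client can check that a DNS answer really came from the zone's owner; a resolver that is ordered to make a blacklisted domain stop resolving can only omit or substitute the answer, and a substituted answer no longer verifies. The sketch below is a hypothetical, heavily simplified stand-in (real DNSSEC uses public-key signatures over record sets, not a shared HMAC key), but it shows the shape of the problem:

```python
# Toy illustration of why DNS blocking and DNSSEC-style validation collide.
# Not real DNSSEC: a shared-key HMAC stands in for the zone's signatures.

import hashlib
import hmac

ZONE_KEY = b"zone-signing-key"  # hypothetical; real zones publish public keys


def signed_answer(name: str, ip: str) -> tuple[str, str, bytes]:
    """What the zone's authoritative servers hand out: answer plus signature."""
    sig = hmac.new(ZONE_KEY, f"{name}={ip}".encode(), hashlib.sha256).digest()
    return name, ip, sig


def validate(name: str, ip: str, sig: bytes) -> bool:
    """What a validating client checks before trusting the answer."""
    expected = hmac.new(ZONE_KEY, f"{name}={ip}".encode(), hashlib.sha256).digest()
    return hmac.compare_digest(expected, sig)


name, ip, sig = signed_answer("thepiratebay.org", "192.0.2.1")
print(validate(name, ip, sig))         # True: the honest answer verifies
print(validate(name, "0.0.0.0", sig))  # False: a blocked/substituted answer fails
```

A resolver under a blocking order has to hand back answers that fail validation, or drop validation altogether, which is why mandated blocking and DNSSEC can't coexist cleanly.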
 
But his point is much bigger than copyright. It's that the copyright fight is merely the canary in the coal mine for this kind of attack on general purpose computing, which will show up in all sorts of other arenas as well. And those fights may be much bigger and more difficult than the copyright fight:
 
And it doesn't take a science fiction writer to understand why regulators might be nervous about the user-modifiable firmware on self-driving cars, or limiting interoperability for aviation controllers, or the kind of thing you could do with bio-scale assemblers and sequencers. Imagine what will happen the day that Monsanto determines that it's really...really...important to make sure that computers can't execute programs that cause specialized peripherals to output organisms that eat their lunch...literally. Regardless of whether you think these are real problems or merely hysterical fears, they are nevertheless the province of lobbies and interest groups that are far more influential than Hollywood and big content are on their best days, and every one of them will arrive at the same place - "can't you just make us a general purpose computer that runs all the programs, except the ones that scare and anger us? Can't you just make us an Internet that transmits any message over any protocol between any two points, unless it upsets us?"
 
[[1576.3]] And personally, I can see that there will be programs that run on general purpose computers and peripherals that will even freak me out. So I can believe that people who advocate for limiting general purpose computers will find receptive audiences for their positions. But just as we saw with the copyright wars, banning certain instructions, or protocols, or messages, will be wholly ineffective as a means of prevention and remedy; and as we saw in the copyright wars, all attempts at controlling PCs will converge on rootkits; all attempts at controlling the Internet will converge on surveillance and censorship, which is why all this stuff matters. Because we've spent the last 10+ years as a body sending our best players out to fight what we thought was the final boss at the end of the game, but it turns out it's just been the mini-boss at the end of the level, and the stakes are only going to get higher.
 
And this is an important fight. It's why each of the moves to fight back against attempts to censor and break computing systems is so important. Because the next round of fights is going to be bigger and more difficult. And while these efforts will simply never succeed in actually killing off the idea of the all-purpose general computer (you don't put that kind of revelation back in Pandora's box), the amount of collateral damage that can (and almost certainly will) be caused in the interim is significant and worrisome.
 
His point (and presentation) are fantastic, and kind of a flip side to something that I've discussed in the past. When people ask me why I talk about the music industry so much, I often note that it's the leading indicator for the type of disruption that's going to hit every single industry, even many that believe they're totally immune to this. My hope was that we could extract the good lessons from what's happening in the music industry - the fact that the industry has grown tremendously, that a massive amount of new content is being produced, and that amazing new business models mean that many more people can make money from music today than ever before - and look to apply some of those lessons to other industries before they freak out.
 
But Cory's speech, while perhaps the pessimistic flip side of that coin, highlights the key attack vector where all of these fights against disruption will be fought. They'll be attacks on the idea of general purpose computing. And, if we're hoping to ward off the worst of the worst, we can't just talk about the facts and data and success stories, but also need to be prepared to explain and educate about the nature of a general purpose computer, and the massive (and dangerous) unintended consequences from seeking to hold back general computing power to stop "apps we don't like."