Matthias Urlichs <[email protected]> writes:

> I'm not disputing any of that.  *Of course* we should write our rules and
> laws to benefit humans / humanity, not robots or AIs or corporate
> profiteering or what-have-you.
> All I'm saying is that the idea "a human can examine a lot of
> copyrighted stuff and then produce non-copyrighted output but a computer
> cannot" might still hold some water today, but the bucket is leaky and
> getting leakier every couple of months, if not weeks.

This is why I'm trying to plug the holes in the bucket as best I can, in
the small part of the world that I can affect, at least until someone
deploys a better legal system than copyright for protecting human
interests.

I know I am trying to occupy an awkward middle position.  I participate in
free software because I don't like the way copyright (among other tools,
including simple secrecy) is abused in software to prevent people from
modifying their own tools, and then within that free software community
I'm arguing in favor of some uses of copyright to protect artists from
exploitation, even if that restricts some things a user may wish to do
with their work.  This is going to sound contradictory to purists in both
camps.  But I think this is one of the cases where there are real
competing interests that need to be balanced, not simply dismissed by
declaring one of the interests superior.

I am in part making an argument in favor of Chesterton's fence [1], and
those arguments are never very popular (and, I admit, are also often
misused).  But I do think it's worth opposing the ethos of "move fast and
break things."

[1] https://en.wikipedia.org/wiki/G._K._Chesterton#Chesterton's_fence

We do not have to simply accept the direction society appears to be going
in.  We can try to change it, and we can label it, add nuance, and
selectively decline to participate in the portions of it that we find
harmful.  That's how the free software movement started, in considerably
more hostile terrain than we face today, and we still made such
significant gains that we fundamentally changed the entire software
industry.
I think there is a very strong gravitational well in technical communities
that pulls people towards the idea that if something is possible, it will
happen anyway, and therefore we may as well embrace it because there's no
way to stop it.  But this is just not true.  Societies have outlawed all
sorts of things, and thereby significantly reduced their frequency,
because they were harmful to people.  Human ingenuity is not a god; we do
not owe it passive obedience.  We can weigh new developments against our
morals and ethical judgments and find some of them wanting.

The ironic part is that this makes me sound like some sort of
conservative, when I am probably on the radical left side of most of the
folks here.  But I want to argue for changing things thoughtfully and
arguing seriously about a sense of shared ethics, not just assuming we
have to accept what other people decide to do.

-- 
Russ Allbery ([email protected])             <https://www.eyrie.org/~eagle/>