AI: a fork in the road for open source
(octet-stream.net)
from thomask@lemmy.sdf.org to programming@programming.dev on 23 Apr 11:41
https://lemmy.sdf.org/post/33190792
Some thoughts/predictions about how open source developers will be forced to choose their path with GenAI.
Full disclaimer: my own post, sharing for discussion and to find out if anyone has any brilliant ideas what else could be done. It looks like self-posts are okay here but let me know if I’m wrong about that.
#programming
threaded - newest
Nothing in the core free software principles - namely, the four freedoms - actually concerns the development process and tools used - or copyright. It’s all about what you can do with the software.
The GPL is more of a “hack” that “perverts” copyright to enforce free software principles - because that was the tool available, not because the people who wrote it really liked intellectual property.
This is a good point. My assumption here was that FS advocates would be more opposed than permissive licensors to a technology that incorporates their code into software denying end users the fundamental freedoms. But yes, you could imagine an FS advocate who is quite happy to use the tech themselves and churn out code with the GPL attached.
The fact is, currently, AI can’t write good code. I’m sure that at some point in the future it will - but we’re not there yet, and probably have some years still.
Imagine at some point in the future, where an AI can program any piece of software you want for you, and do it well. At that point, the value of code itself will be minimal. If you keep your code proprietary, I’ll just get the AI to re-implement the functionality anew and publish it.
Therefore, all code will be permissive open source. There would be no point in keeping anything proprietary, and also no point in applying copyleft. But at this point the copyleft “hack” would simply be unnecessary, so permissive open source would be just as good.
Until then, me not using AI doesn’t in any way prevent others from training AI on my code. So I just don’t see training on my code as a valid reason to avoid it. I don’t use AI currently - but that’s for entirely pragmatic reasons: I’m not yet happy with the code it generates.
The answer is 2.
Snake oil salesmen always encourage the public to bet against the experts, with predictable results.
Someday ethically sourced AI can be used responsibly by trustworthy coders.
But the key is choosing to collaborate with trustworthy coders.
Yep. It does feel like developers like me, who find it deeply disturbing and problematic for our profession and society, are going to become increasingly rare. Fewer and fewer people are going to understand how anything actually works.
I think nobody understands exactly how anything works, but enough of us understand our own little corner of tech to make new things and keep the older things going. I’ve been coding for decades, and proudly state I understand about 1% of what I do. This is higher than most.
AI will make these little gardens of knowledge smaller for most, and yet again we, as the human species, will forever rely on another layer of tech.
Do you think there’s a way for this to scale to larger projects like Servo? Or will it only work for a few people collaborating?
I just write in a language few people really know well. Not that I expect AI to do a great job of basing anything off my code.
Emphasis added by me.
Thing is, it's not black and white most of the time - usually a developer is using Gen AI as an assistant in some capacity. There are a wide range of ways to do that with really big differences in how firmly their hand remains on the wheel of where things are going. Only in the most extreme "vibe coding" scenario would it be fair to characterize the code as "written by AI".
There comes a point on the spectrum of dependency on AI where quality suffers and developer capacity-building is stunted. Where that point lies is a more productive question than a binary yes or no to all AI.
As you said, it’s out of the box/bag. The thing I’ll push for is open sourcing all code. Being able to copy opensource code and hide it in proprietary code is to me the biggest problem. Were everything opensource, I doubt anybody would bat an eye. “You copied my code and put it out there publicly, free of charge? Good. Do it again”.
Personally, I license everything as restrictively as possible for companies and would love an enforceable open-source license that figures out how to make companies contribute back or pay for use of the code.
Anti Commercial-AI license