I recently saw a YouTube video about a service built to change the license of open source software:
- One agent reads the original code and gathers a specification
- Another agent, without access to the original code, creates equivalent software
In theory this should allow someone to take any open source software and change its license.
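A minimal sketch of what that pipeline presumably looks like, with stand-in names (spec_agent, reimplementation_agent, and the stub bodies are my own labels and guesses, not anything from the video). The key structural point is that the second stage never receives the original code, only the derived spec:

```python
# Hypothetical sketch of the two-agent "clean room" pipeline described above.
# The agent functions are stubs standing in for whatever LLM calls the real
# service makes; nothing here reflects the actual product.

def spec_agent(source_code: str) -> str:
    """Agent 1: reads the original code and writes a behavioral spec (stub)."""
    return f"Spec derived from {len(source_code.splitlines())} lines of original code."

def reimplementation_agent(spec: str) -> str:
    """Agent 2: sees only the spec, never the original code, and writes new code (stub)."""
    return f"# New implementation generated from:\n# {spec}\n"

def relicense(source_code: str) -> str:
    # The original source is only ever passed to the spec agent.
    spec = spec_agent(source_code)
    return reimplementation_agent(spec)

if __name__ == "__main__":
    print(relicense("def add(a, b):\n    return a + b\n"))
```

Whether that separation would actually hold up as a clean-room defense when both agents are run by the same party is, as far as I can tell, exactly the open question.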
For a large portion of open source this likely isn't an issue, since nobody may care about any particular small project, but for larger projects I wonder what sort of impact this could have, in particular on any open source software whose authors make a living from donations or public support.
Has anyone read about, or thought of, a way to prevent one's code from having its license changed this way?


An interesting argument would be to require the training data to be disclosed, to prove the model was never exposed to the original source it's ripping off.
It might help set a precedent that would make this sort of thing less attractive.
I believe there have been lawsuits that have already shown these models stole, and can reproduce verbatim, copyrighted material, yet there have been little to no real consequences for the AI companies. So if they can get away with that against companies that actually have the means to mount a strong lawsuit, the chances of an individual open source author defending their code are slim (very slim, in my opinion).