@evan I'm not sure how it could not be derivative. If the training data hadn't slurped up all the repos on GitHub, the model would not be able to generate something that looks an awful lot like a repo on GitHub.
If I trained a model only on public-domain literary works and then asked it to produce a for loop written in Zig, no amount of prompting could make that happen.
On what grounds would that not be derivative?