@ireneista Especially because you need to also fork the whole governance model around it.
@xgranade @ireneista "do you have five million dollars of disposable income to fund an alternative to the PSF" is a good place to start, if you want to frame it as a "hostile fork" situation. the only solution is to get involved in the messy process of politics and governance and try to figure out a way to negotiate a durable peace
-
@xgranade @ireneista unless you do have $5MM++ in which case, uh, cool, very happy for you
-
@glyph @ireneista One of those domino memes that starts with Calibre cutting a new release and topples into "Cassandra Granade runs for PSF Board."
I just seriously do not want to. But I agree, getting into the messy politics is the only way forward with Python in particular.
-
@cap_ybarra @xgranade @sparks they do not take fash money, but they seem to be happily using a machine that is intrinsically inseparable from fash values anyway...
-
@theorangetheme @xgranade I don't want to sell CPython's review process and test suite short here, nor the high quality of the work that Serhiy and Gregory do on the core. I don't subscribe to the theory that it's automatically bad work on technical merit because of the tools.
But it *does* carry the taint of corporate influence, exposure to financial instability, and ethical/aesthetic unpleasantness, and I find that very regrettable.
-
@theorangetheme I do agree with @xgranade that it's a leading indicator, especially if the scope of use grows…
-
@theorangetheme @xgranade and possibly a compromise of the Code of Conduct, if Anthropic drops their commitment to not building weapons and turns Claude into another genocide machine at DOD's behest, as it seems they would like to/are being pressured to do.
(that ship has probably sailed on account of e.g. using GitHub in the first place)
-
@ireneista If there's any monolithic overly centralized dependency that makes sense to take on, it's the language itself.
It would have been nice if alternative implementations like PyPy, IronPython, and Jython could have taken off, but extension modules are just too important to leave out.
-
This is bad. This is very, very bad.
I'm not trying to pick on Python here; I picked it because Python is something I'm actively using, and so I have a vested interest in the project *not* being AI-vulnerable.
But it's not good, chat. It's very far from good, in fact.
[edited to add: see two addendums below, they're important context]
@xgranade you shouldn't really be judging the code authors, but rather the maintainers. writing code is usually easier than reviewing it.
-
@theorangetheme The use within CPython is limited to two users, one of whom is a member of the Python organization.
Running the search on the entire organization returns a few more: https://github.com/search?q=org%3Apython+%22Co-authored-by%3A+Claude%22&type=commits
It's worth noting that not all commits by these users disclose use of AI; either they didn't use AI, or they used it in a way that isn't disclosed.
For the entire organization, there were 4 disclosed uses of Claude within the last 7 days. The organization has 141 members, and 3,374 contributors to CPython alone. If these commits are the only AI-generated code, then the usage would be insignificant. However, I don't see anything requiring contributors to disclose use of AI, so it's hard to know the true number, which is likely far higher.
Even if Python doesn't ban use of AI, I think they should at least require usage to be disclosed, so AI-generated PRs can be subject to extra review and so we have a clearer picture. I personally would completely ban AI-generated code - and have from my projects - but given its prevalence in the tech industry, such a ban would be both unlikely and unpopular. I fail to see why anyone would object to mandatory disclosure, though I'm sure some would.
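For anyone who wants to reproduce the organization-wide count programmatically rather than through the web UI, here's a minimal sketch. It assumes GitHub's commit-search API endpoint (`/search/commits`) accepts the same query syntax as the web search linked above; the function name is mine:

```python
# Hedged sketch: build the API equivalent of the web search linked above.
# Assumption: GitHub's /search/commits endpoint takes the same query syntax
# as the web UI; fetch the resulting URL and read its "total_count" field.
import urllib.parse


def claude_commit_search_url(org: str) -> str:
    """URL for commits in `org` whose message credits Claude as co-author."""
    query = f'org:{org} "Co-authored-by: Claude"'
    return "https://api.github.com/search/commits?q=" + urllib.parse.quote(query)


print(claude_commit_search_url("python"))
```

Note this only counts *disclosed* uses, for exactly the reason above: nothing forces a contributor to include the trailer.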
-
@SnoopJ @theorangetheme Anthropic already works with Palantir. But by the time we get to friend-of-a-friend kind of logic...
-
@MissingClara I agree maintaining is the more difficult part, but introducing wildly unethical and flawed tooling into the authorship stage is a problem, and a major one at that.
-
@xgranade Huh, checked exactly that a few hours ago and didn't see the warning. Recent commit?
-
@astraluma A bit complex, but the rough summary would be a bit of column A, a bit of column B?
I'm definitely displeased, no question. I'm also afraid of what it touched, but not to the degree that I'm abandoning Python wholesale nor encouraging others to do so.
As mentioned, I'm not trying to pick on Python in particular, so much as that it's an example that's near and dear to my heart?
-
@xgranade @theorangetheme yea, you run into "no ethical consumption" awful fast in software
-
@xgranade @ireneista Keep an eye on MicroPython; https://pyscript.net supports it as a backend, where it has the obvious benefit of, well, being small
-
@astraluma Searching through commits directly at the command line, @SnoopJ found a list:
https://hachyderm.io/@SnoopJ/116133154453198084
I found a few more by searching PR discussions, some mention Claude as the author but don't include that in Git metadata.
That said, the warning banner has been appearing for some folks and not others, I have no idea why.
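The command-line search mentioned above can be done with a plain `git log` grep over commit messages. A sketch, shown against a throwaway demo repository so it's self-contained; inside a real clone of python/cpython you'd only need the final command:

```shell
# Hedged sketch: find commits whose message carries the Claude co-authorship
# trailer. The demo repo below just makes the command runnable anywhere;
# in a real clone of python/cpython, run only the final `git log` line.
demo=$(mktemp -d) && cd "$demo" && git init -q
git -c user.name=demo -c user.email=demo@example.com \
    commit -q --allow-empty -m 'demo: sample commit

Co-authored-by: Claude <noreply@anthropic.com>'

# The actual search: grep commit messages for the trailer.
git log --grep='Co-authored-by: Claude' --date=short --format='%h %ad %an %s'
```

As noted above, this misses cases where Claude is only mentioned in the PR discussion and never makes it into the Git metadata.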
-
@ireneista @xgranade voting eligibility is easy, just join at https://psfmember.org, and either pay dues or self-certify as a contributing member, and register your interest in voting when they send out the yearly email
-
@xgranade I also dislike it, but the cat's out of the bag; even if it weren't allowed, people would still be using it, just without revealing it