There is approximately no chance that this is a real effect, and if it is, there is a 0% chance it is intentional. LLMs don't understand vulnerabilities well enough to make subtle mistakes deliberately, so any intentional sabotage would be far more obvious than whatever "somewhat more apt to result in low-quality code" means.