We're not drowning in NFTs. AI slop might well decline if the bubble bursts and a) AI is more expensive, b) the execs move on to the next corporate fad.

alice wrote: ↑Tue Feb 03, 2026 2:39 pm
> Alternatively: "Recycled and recirculated shit is still shit, in whatever quantity". Just as the Internet is a part of life for increasingly many people, so too will AI-generated slop be. And that's the main problem, *not* that people with more money than sense are trying to automate and replace human creativity (which won't succeed, I think we're pretty much agreed).
Developers, told to use AI or be fired, report productivity gains. But as one study reports, "While developers did spend less time on boilerplate code generation and API searches, code-quality regressions and subsequent rework frequently offset the headline gains."
As I've said before, if code is so repetitive and yet arcane that an LLM helps produce it, there's an architectural problem. Why do you need millions of lines of code that are barely different from the millions of lines you already have? Probably you should refactor so that similar situations use the same code, controlled by a parameters file or something. Why is getting libraries to talk to each other so complicated? Probably their APIs are needlessly complex. (A programmer writing an API is already an expert in their own system and doesn't even see the complexity; they also don't care how hard using it is, since that's not their job.)
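A toy sketch of the kind of refactor I mean (all names invented for illustration): instead of three near-identical export functions whose bodies an LLM would happily regenerate over and over, the differences move into a table and one function reads from it:

```python
# Hypothetical example: three copy-pasted functions (export_csv, export_tsv,
# export_psv) collapsed into one function driven by a parameter table.
# After the refactor, the differences live in data, not in duplicated code.
EXPORT_FORMATS = {
    "csv": {"delimiter": ","},
    "tsv": {"delimiter": "\t"},
    "psv": {"delimiter": "|"},
}

def export(rows, fmt):
    """Serialize rows of fields using the settings for the named format."""
    delim = EXPORT_FORMATS[fmt]["delimiter"]
    return "\n".join(delim.join(str(field) for field in row) for row in rows)

rows = [["id", "name"], [1, "alice"], [2, "bob"]]
print(export(rows, "psv"))  # one code path, N formats
```

Adding a fourth format is now one line of configuration rather than another pasted function, which is exactly the "repetitive yet arcane" code that made the LLM look useful in the first place.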
The principle probably applies elsewhere as well. LLMs are really good at writing corporate boilerplate: that is, bullshit documents required by someone but not intended to be read by human beings. (Human beings that matter to the writer, that is.) They're also really good at writing papers for you so you can pass college courses without doing work. And writing spam. If it's not worth doing, an LLM can do it!