WeepingElf wrote: Fri Sep 08, 2023 2:31 pm
These concerns are nothing new. 200 years ago, craftspeople were worried about factories. Later, painters were worried about photography, office clerks were worried about typewriters and calculators, and musicians were worried about gramophones. And so on. And we still have craftspeople, office clerks and musicians. They have lost some jobs, but they are still around and have their place in our society.
This is the same argument as saying "War is nothing new. 200 years ago people were worried about cannons. Later people were worried about machine guns, and even later, airplanes. And so on. Nuclear weapons are no problem, nuclear war won't be anything we can't handle."
Or "Human-caused environmental problems are nothing new. We've had mass extinctions, desalinization, deforestation, devastation of entire biomes and fishing stocks. Yet there are more humans than ever! Burning off our ecosphere with ever-increasing carbon emissions is no different."
I'm not saying that AI
is an existential risk. But a facile dismissal of possible risks, and of the historical problems with automation, doesn't help. Technological changes can get bigger and bigger, and "we always handled it before" doesn't mean we always will.
Automation, and other forms of increased productivity,
are disruptive-- and have not infrequently produced famine and revolt. It's hard to read about the
Highland Clearances and shrug them off as a benign improvement. It was great for the landlords, not so great for the people who died, lost their livelihood, or were forced to emigrate.
Automation
can be benign in the long run, if new and better jobs are created at the same time. It's not automatic that this happens, and in the US, the conditions for it happening have been systematically undermined. Jobs are replaced or improved only if there's a social safety net, good public education, unions, and restrictions on how much of the productivity gains the wealthy can keep for themselves.
I'd also point out that AI is not really like many of these earlier innovations, not because robots are going to destroy us, but because executives are going to destroy entire industries based on misplaced hype. There are already CEOs who think they can fire their entire customer service department, or all their writers or actors, or all their reporters. This isn't even productivity improvement; it's disruption with almost no positive gain. But destroying an industry doesn't necessarily mean that it can be easily reconstituted once people realize it was a mistake. (The news industry, for instance, has been decimated-- not just by AI, it's a long process-- and it's not likely to be coming back in the useful form it once had.)