But that merely kicks the can down the road. Now we must figure out how the computer analyzing the AI works and why it reached a particular interpretation. The more powerful AI becomes, the more powerful any computer analyzing it must become simply to keep pace. What happens when AI achieves the equivalent of 100 billion neurons?
But no other tool has this property. Telephones transmit your speech verbatim, washing machines follow predictable cycles, and even non-AI software does only what you specify. If you tell your SCA to apply French sound changes to Latin and it applies Welsh changes instead, you have lost control of the process. The job of a sound change applier is to automate the deterministic process of phonological change on each word, not to choose sound changes of its own accord. It does what Tolkien did by hand, only faster.

Yes, that's precisely why they are useful.
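The determinism described above is easy to make concrete. Below is a minimal sketch of a sound change applier: an ordered list of rewrite rules applied to each word, so the same input always yields the same output. The rules themselves are toy illustrations loosely inspired by Latin-to-Romance developments, not an accurate account of any language's history.

```python
import re

def apply_changes(word, rules):
    """Apply an ordered list of (pattern, replacement) regex rules to a word.

    Deterministic by construction: the same word and the same rule list
    always produce the same result, with no choices made by the program.
    """
    for pattern, replacement in rules:
        word = re.sub(pattern, replacement, word)
    return word

# Toy rules for illustration only (simplified, not historically complete).
RULES = [
    (r"m$", ""),                          # loss of final -m:  "lupum" -> "lupu"
    (r"u$", "o"),                         # final -u lowers:   "lupu"  -> "lupo"
    (r"(?<=[aeiou])p(?=[aeiou])", "b"),   # intervocalic voicing: "lupo" -> "lubo"
]

print(apply_changes("lupum", RULES))  # -> lubo
```

Rule order matters: because the rules fire in sequence, lowering final -u before deleting final -m would give a different result, which is exactly the kind of controlled, inspectable behavior the argument attributes to non-AI tools.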
When someone with no literary talent asks an unfathomably complex machine to produce a novel whose contents they cannot even predict, that is supplication, not mastery. It will only become more true as AI grows more powerful in the coming decades. The legal owners of AI have little understanding of how it works, and still less direct control over it. They merely hold legal rights to the profits it generates. You could ask Sam Altman why ChatGPT responded to your prompt with those particular characters and themes, or those specific words in that order, and he would have no idea.

"Supplicators"? The whole point here is that responsibility must be assigned correctly: to the human owners, not to their lifeless scraps of silicon.
For that matter, one must remember that humans are themselves mere scraps of carbon, following the same laws of physics as silicon-based computers. What, really, is the difference between scraps of silicon and scraps of carbon that makes the former inherently lifeless?