bradrn wrote: Sat Sep 09, 2023 8:32 pm
In today’s money, I believe that would be around one trillion USD.
That's more money than I would have expected. Not sure it paid off though.
Or you can just read Hofstadter’s Gödel, Escher, Bach, which talks very excitedly about systems capable of reasoning about blocks stacked on top of each other, and predicts that any system intelligent enough to play chess will be intelligent enough to refuse to do so. It’s a great book, but totally missed the mark when it came to AI-- just like most efforts of that era.
I read it long ago-- I was an AI nerd. Hofstadter was like the Wired magazine of the 1970s, endlessly enthusiastic about any movement toward AI but not (so far as I can see) expecting it would happen any time soon. I don't think anyone thought that projects at that time "were AI", just that eventually they might lead to it.
I think this is too cynical. On the one hand, people genuinely got excited about expert systems. On the other hand, I think there is definitely a sense in which LLMs have ‘understanding’ of the world, though of course it depends on how you define ‘understanding’.
And I think even people who should know better get fooled, just because LLMs can "talk". It's amazing what a huge corpus of inputs can do, but it's also pitifully easy to get an LLM to show that it really has no internal understanding. We are simply not used to interacting with things that sound smart but aren't.
I'm sure we've talked about this before-- it is possible to analyze an LLM, if you have access to it, and see what connections it's making. I think talk of understanding is premature until that work has been done.
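For what it's worth, the tools for that kind of inspection already exist. Here's a minimal sketch of what "seeing the connections" can mean in practice-- it assumes the Hugging Face transformers library, PyTorch, and the small GPT-2 model, all of which are just stand-ins for whatever model one actually has access to:

```python
# Minimal sketch: reading attention patterns out of an open LLM.
# Assumes the Hugging Face `transformers` library and the small
# GPT-2 model as stand-ins for whatever model you can get at.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

inputs = tokenizer("The cat sat on the mat because it was tired",
                   return_tensors="pt")
with torch.no_grad():
    # output_attentions=True makes the model return its attention
    # weights: one tensor per layer, shaped (batch, heads, seq, seq).
    outputs = model(**inputs, output_attentions=True)

tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
last_layer = outputs.attentions[-1][0]   # final layer: (heads, seq, seq)
avg = last_layer.mean(dim=0)             # average across attention heads

# For each token, show which earlier token it attends to most.
for i, tok in enumerate(tokens):
    j = int(avg[i, : i + 1].argmax())
    print(f"{tok:>10} -> {tokens[j]}")
```

Attention maps are the shallow end of this kind of analysis-- probing classifiers and activation patching go much deeper-- but the point is that the internals are inspectable in principle, not a sealed black box.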
LLMs are pretty much expert systems in reverse: expert systems were good at real-world knowledge (carefully fed to them) but bad at language, while LLMs are good at language but shaky on real-world knowledge.
I do feel that LLMs are a huge step toward AI-- a hundred times more than earlier attempts. And I suspect that human brains are more like LLMs than we really like to think. On the other hand, AI people have a long, long history of overestimating their progress, and assuming that the next step is a couple of years away.