malloc wrote: ↑Sat Jun 07, 2025 6:29 pm
Torco wrote: ↑Sat May 31, 2025 3:50 pmThey indeed don't know a lot about it. then again, no one is trying to make a human-like mind -that i know of- so it's even less likely that they will do it.
Except that multiple massive corporations have specifically put forth
artificial general intelligence as one of their goals. You can dispute the feasibility of this goal but you cannot claim that nobody is trying to replicate the abilities of the human mind.
but AGI is still not a person. just some nebulous concept of having a model that's capable of adequately performing functions that are very broad in scope, about as broad as a person's, or something. you fall again and again into this cognitive trap of "people can do X, Y, Z. software is doing X, and they're working on Y and Z >>> therefore, they're making software people". but no, it's not software people, any more than the petrol engine is a metal horse. the engine is enough of a metal horse that it dramatically decreased the horse population, but it's not enough of a metal horse that you can understand it to any significant degree by analogy with the horse.
put even more simply, to do what a horse can do is not the same as to be a horse, or a superior-horse. to do what a mind can do is not to be a mind, or a superior-mind. function is not structure. ability is not essence.
malloc wrote: ↑Sat Jun 07, 2025 6:29 pm
Except that multiple massive corporations have specifically put forth
artificial general intelligence as one of their goals. You can dispute the feasibility of this goal but you cannot claim that nobody is trying to replicate the abilities of the human mind.
see here you're closer to the truth: people are trying to replicate concrete abilities of the human mind, and for some of them succeeding! but functional replication is not replication itself: i have a hot plate next to me [nice little bit of tech] which replicates some functions of a wood fire, e.g. heating up my morning milk. it is not, however, a wood fire.
keenir wrote: ↑Sat Jun 07, 2025 8:27 pmMalloc, people have been trying to replicate the human mind since the days of Byron, if not Aristotle.
in fairness, this is going too far [lmao am i the centrist here?]. they've been trying since aristotle, sure, but they weren't getting much success at the functional part until relatively recently. AGI is *possible*, if not stricto sensu, then in a sensu stricto enough to have the powers that be decide we're all not necessary. it's not silly to worry about the negative effects of these algorithms: what's silly is to think about them in the particular way malloc does.