Travis B. wrote: ↑Wed Apr 29, 2026 2:00 pm
I've come to the conclusion that I'm a dualist -- not in the typical Cartesian 'mind-body' sense, but in the sense that I believe that while the physical substrate of reality is 'real' and the realm of ideas and abstractions must be encoded in the physical substrate in one form or another, ideas and abstractions have their own identities and rules independent of the physical substrate, whose encoding need not have any consistent nature in the first place, and these ideas and abstractions are made 'effectively real' based on their interaction with the physical substrate (and in the case of things such as math, their interaction with the physical substrate may be very fundamental -- the rules of physics are intrinsically based on and inseparable from math, even though math itself is 'unreal' in the sense that mathematical ideas can be completely divorced from physical reality [...].
I don't know what philosophical terms should apply here, and I'm not 100% sure I follow you, but this strikes me as the worldview that a programmer has to have, to make sense of their own work.
I think what you're saying is that reductionism doesn't always work. You can reduce the theory of gases to statistical mechanics of gas particles, but you can't reduce human cognition to neurochemistry, or computer programming to electronics.
There are some respectable reasons to say that reductionism doesn't always work in practice. The math may be intractable: that's why all of chemistry can't simply be replaced by relativistic quantum field theory. There's chaos theory, also gumming up the math. And quantum effects mean that strict determinism is false and certain things are inherently unpredictable.
A computer is a deterministic machine whose every bit of function can be explained by the electronics: voltage levels, flows of current, relationships between its billions of transistors. Not only do programmers not work at this level, but it is strictly irrelevant to their work. Even if you work at the assembly level, you're still working at a much higher level of abstraction: numbers, registers, and basic operations. Moreover, that level can be implemented on very different architectures, and higher-level languages make this an explicit goal.
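A minimal Python sketch of that layering, just as an illustration: the function below is defined purely in terms of numbers and an operation. Nothing in it refers to voltages or transistors, and the same source runs unchanged on x86, ARM, or anything else with a Python implementation. Even the "lower level" it compiles to is an abstract machine (Python bytecode), still many layers above the electronics.

```python
import dis

def add(a: int, b: int) -> int:
    """Addition defined at the level of numbers, not electronics."""
    return a + b

# Peek one abstraction level down: the bytecode speaks of stack slots
# and operations, not transistors or voltages.
dis.dis(add)

print(add(2, 3))  # 5, on any hardware that runs Python
```

The point isn't the code itself but where the explanation lives: what `add` does is stated, and is only statable, in arithmetic terms.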
Plus, computers are designed to work in the world -- or at least with one part of it, the user. But of course they can also analyze pictures and sound, operate a 3-D printer or a robot, and so on. Again, we explain all of this at an abstract level, not an electronic one.
Or to put it more bluntly, if you ask a computer to detect something in an image, the answer will be whether that thing is in the image; if you ask it to add two numbers, the answer is their sum. It'd be ridiculous to say that the answer is inherent in the machine's electronics, rather than out there in the world, or as a mathematical truth, respectively.