zompist wrote: ↑Fri Feb 27, 2026 5:34 am
bradrn wrote: ↑Fri Feb 27, 2026 4:58 am
zompist wrote: ↑Thu Feb 26, 2026 6:48 pm
Now it's easier to see why the charts differ: the bin sizes are different. E.g. the two bins on either side of the orange line, where most of the light comes in on the wavelength chart, take up over half of the frequency chart.
This is precisely his point! There are (at least) four different ways of measuring spectra, with none of them intrinsically ‘best’, and with a nonlinear relationship between them. It is misleading to cherry-pick the single plot which happens to match up with the sensitivity of the eye, while ignoring the others. Indeed, as he argues at the end, it’s just as easy to argue that the most ‘relevant’ distribution is that of photon count over frequency, which gives you a maximum in the infrared.
I don't really understand the difference between the first and second set of charts. Though charts 1 and 3 are really not that different.
It’s a bit subtle… basically, a single photon at a longer wavelength has less energy than a single photon at a shorter wavelength, so counting by ‘amount of energy’ gives you different results than counting by ‘number of photons’. Sometimes the one is relevant, sometimes the other is. Photoreceptor proteins interact with individual photons, so arguably it’s the latter which is more relevant here.
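To put some rough numbers on that (my own back-of-the-envelope, not from the article): the energy of a single photon is E = hc/λ, so a photon at twice the wavelength carries half the energy, and the same radiant power at longer wavelengths corresponds to proportionally more photons.

# Energy of a single photon, E = h*c/lambda, at a few illustrative wavelengths.
# (A quick check of my own, not taken from the article.)
h, c = 6.626e-34, 2.998e8          # Planck constant (J s), speed of light (m/s)
for lam_nm in (400, 800, 1600):
    E = h * c / (lam_nm * 1e-9)    # photon energy in joules
    print("%4d nm photon: %.2e J = %.2f eV" % (lam_nm, E, E / 1.602e-19))
# ~3.10 eV at 400 nm, ~1.55 eV at 800 nm, ~0.78 eV at 1600 nm; one watt of
# 1600 nm light therefore delivers four times as many photons per second as
# one watt of 400 nm light.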
I said it twice, but just in case: if you look at the top two charts, the vast majority of the sun's radiation is either in the visible light range, or in the near infrared. (There is a lot of infrared, this is just part of it.) Far from disproving the traditional view, he's just added one mystery: why our eyes don't extend into the infrared.
Yes, and that is precisely the ‘common misconception’ which he addresses — that the sensitivity of the eye is somehow uniquely aligned with the distribution of solar radiation. If this were really the most important mechanism at work in the evolution of our eyes, they should extend well into the infrared, but they don’t, so that ‘explanation’ must be missing something very important — that being the ‘mystery’ you mention.
(The original paper he cites (Sofer & Lynch) puts this more bluntly: ‘The eye does not appear to be optimized for detection of the available sunlight, including the surprisingly large amount of infrared radiation in the environment’.)
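For concreteness, here is a rough numerical sketch (my own, treating the sun as a ~5778 K blackbody rather than using the real solar spectrum) of where the peak lands under each of the four conventions: energy or photon count, per unit wavelength or per unit frequency.

# Planck's law for a ~5778 K blackbody, tabulated per wavelength and per
# frequency, in energy terms and in photon-count terms. The four peaks land
# in four different places. (A sketch of my own, not from the article.)
import numpy as np

h, c, k = 6.626e-34, 2.998e8, 1.381e-23   # Planck, speed of light, Boltzmann (SI)
T = 5778.0                                # approximate solar effective temperature, K

lam = np.linspace(100e-9, 4000e-9, 200_000)          # wavelengths, 0.1 to 4 microns
nu  = np.linspace(c / 4000e-9, c / 100e-9, 200_000)  # the matching frequency range

B_lam = (2*h*c**2 / lam**5) / np.expm1(h*c / (lam*k*T))   # energy per unit wavelength
B_nu  = (2*h*nu**3 / c**2)  / np.expm1(h*nu / (k*T))      # energy per unit frequency
N_lam = B_lam / (h*c / lam)                               # photons per unit wavelength
N_nu  = B_nu / (h*nu)                                     # photons per unit frequency

print("energy  / wavelength peaks at %4.0f nm" % (lam[B_lam.argmax()] * 1e9))
print("photons / wavelength peak  at %4.0f nm" % (lam[N_lam.argmax()] * 1e9))
print("energy  / frequency  peaks at %4.0f nm" % (c / nu[B_nu.argmax()] * 1e9))
print("photons / frequency  peak  at %4.0f nm" % (c / nu[N_nu.argmax()] * 1e9))
# roughly 500 nm (green), 635 nm (red), 880 nm (near IR), and 1560 nm (IR):
# the same source, four different 'peaks', depending entirely on how you bin it.

The last of those is the ‘photon count over frequency’ maximum in the infrared mentioned above.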
There seem to be several reasons-- this guy suggests that a cone cell that would detect infrared would be "unstable". Some Googling suggests other reasons, such as water absorbing infrared, and the signal getting swamped by the body's own heat. Animals that do sense infrared, such as snakes, use an entirely different organ, so I'm happy saying that it's a physical limitation of cone cells, at least.
Sofer & Lynch agree that it’s strongly related to the transmission of water. They also go into more detail about the ‘unstable’ molecules — to summarise, the longer your wavelength, the bigger your molecule needs to be to interact with the photon, and such big photosensitive molecules tend to either react with water or fall apart at body temperature.