Why can a regular infrared camera not show temperature (thermography)?



This is a common source of confusion, because both thermographic cameras and "normal" cameras with some IR capability are often called IR cameras.

The typical normal video camera with IR capability has a standard solid-state sensor designed for capturing visible light, which relies on the photoelectric effect to convert incoming photons into electric charge that is subsequently measured. Visible light spans roughly 300-800 nm, but the sensor technology typically remains responsive up to 1000 nm or more. Since the eye is not sensitive to the energy in the 800-1000 nm band, an IR cut filter is normally inserted in cameras so that the resulting photo looks similar to what the eye sees.
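
To make that cutoff concrete, here is a minimal sketch, assuming a silicon sensor: a photon can only excite an electron across the silicon bandgap (about 1.12 eV) if it carries at least that much energy, which is roughly why ordinary sensors fade out a little past 1000 nm and cannot see thermal wavelengths at all. The bandgap value and physical constants are standard textbook numbers, not figures from this answer.

```python
# Compare photon energies at various wavelengths with the silicon bandgap.
H = 6.626e-34              # Planck constant, J*s
C = 2.998e8                # speed of light, m/s
EV = 1.602e-19             # joules per electron-volt
SILICON_BANDGAP_EV = 1.12  # approximate silicon bandgap at room temperature

def photon_energy_ev(wavelength_nm: float) -> float:
    """Energy of a single photon at the given wavelength, in eV."""
    return H * C / (wavelength_nm * 1e-9) / EV

for wavelength in (500, 800, 1000, 1100, 8000):
    energy = photon_energy_ev(wavelength)
    verdict = "above" if energy > SILICON_BANDGAP_EV else "below"
    print(f"{wavelength:>5} nm -> {energy:5.2f} eV ({verdict} the silicon bandgap)")
```

A 500 nm photon carries about 2.5 eV, comfortably above the bandgap, while an 8000 nm thermal photon carries only about 0.16 eV and simply cannot be detected this way.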

But if you remove the IR cut filter, you can get some "night vision" capability by bathing the scene with light in the 850-950 nm range, which is invisible to the eye.

On the other hand, thermal radiation from everyday objects peaks at a much longer wavelength, typically 8000 nm or more, and is much more difficult to work with in a direct photon-to-charge process. So the typical thermal camera uses a completely different and more mundane physical process: it is essentially an array of tiny thermometers.
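
To see where that "8000 nm or longer" figure comes from, here is a minimal sketch using Wien's displacement law; the Wien constant is a standard value, and the example temperatures are my own illustrative choices, not numbers from this answer.

```python
# Wien's displacement law: the blackbody emission peak scales as b / T.
WIEN_B_NM_K = 2.898e6  # Wien displacement constant in nm*K

def peak_wavelength_nm(temperature_k: float) -> float:
    """Wavelength (in nm) at which a blackbody at this temperature emits most strongly."""
    return WIEN_B_NM_K / temperature_k

for label, temp_k in (("skin (~305 K)", 305.0),
                      ("room-temperature wall (~295 K)", 295.0),
                      ("boiling water (~373 K)", 373.0)):
    print(f"{label}: emission peaks near {peak_wavelength_nm(temp_k):.0f} nm")
```

Everything near room temperature peaks around 8000-10000 nm, an order of magnitude beyond what a silicon sensor can register.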

These are nothing more than a grid of small metal squares that are heated by the incoming thermal radiation; their temperature can be read out because their resistance changes with temperature (they are called micro-bolometers).
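
As a rough sketch of that readout idea, the snippet below models a single pixel whose resistance varies linearly with temperature and inverts that relation to recover the temperature. The resistance, reference temperature, and temperature coefficient are made-up illustrative values, and the linear model is a simplification of what real micro-bolometers do.

```python
# One hypothetical micro-bolometer pixel: absorbed thermal radiation warms the
# element, its resistance shifts, and we invert R = R_ref * (1 + a * (T - T_ref)).
R_REFERENCE_OHM = 100_000.0  # illustrative element resistance at the reference temperature
T_REFERENCE_C = 25.0         # illustrative reference temperature, degrees Celsius
TCR_PER_C = 0.004            # illustrative temperature coefficient of resistance (per degree C)

def temperature_from_resistance(measured_ohm: float) -> float:
    """Estimate the element temperature from its measured resistance."""
    return T_REFERENCE_C + (measured_ohm / R_REFERENCE_OHM - 1.0) / TCR_PER_C

print(temperature_from_resistance(100_000.0))  # 25.0 C: no extra heating
print(temperature_from_resistance(100_400.0))  # 26.0 C: a warmer scene heated the pixel slightly
```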

So, very different physical processes are used, and the wavelengths involved differ by roughly an order of magnitude.

Thermal cameras also need optics that can refract these longer wavelengths; the lenses are often made of germanium, for example, and are opaque to visible light.