
How are IR thermal-vision devices able to present us with "invisible" imagery in such a way that we can actually interpret it?

What I'm really trying to wrap my head around is:

  1. How a sensor can "see" colours that I can't. (I understand it's probably like trying to explain colour to somebody born without sight).

  2. How it can "pitch-shift" to a visible wavelength/frequency. (I understand that terminology is probably incorrect when talking about light as opposed to sound, but you get the idea).

  3. Not to mention, in some situations, it seems to have this uncanny ability to "see through" solid, opaque materials & detect objects obscured by obstacles etc. (Like military & DEA aerial surveillance, for example).

[Example thermal IR images: aerial surveillance shots; macro shots of a fusebox, cable connectivity, and plumbing; a motorist seen from the air]

  • Light that we can see is part of the EM spectrum. Infrared is light that we can't naturally see... are you following this? Is it really a big deal that we recolour thermal images to suit our eyes' ability to see only normal colours? Commented Oct 30, 2015 at 17:53
  • Moreover, thermal images don't have colors, as they rely on a very narrow wavelength spectrum, so the only thing we can get from them is intensity; i.e., they are naturally monochromatic. The postprocessing just assigns a non-monochromatic palette mapped to the intensity values, so the images appear colorful. Commented Oct 30, 2015 at 19:13
  • @EugeneSh.: Actually, thermal imagers cover a much wider spectrum than that of visible light. The problem is that the sensor (typically a bolometer array) can't distinguish wavelength, so we have to display the resulting images as either monochrome or false-color images, as shown above. Commented Oct 31, 2015 at 4:14
  • Cheers @DaveTweed. What I'm really trying to wrap my head around is (firstly) how a sensor can "see" colours that I can't. (I understand it's probably like trying to explain colour to somebody born without sight). And (secondly) how it can "pitch-shift" to a visible wavelength/frequency. (I understand that terminology is probably incorrect when talking about light as opposed to sound, but you get the idea). Commented Oct 31, 2015 at 22:36
  • Did you know that a pigeon can detect the earth's magnetic field and use it to find its way home? There are snakes that can "see" infrared. The fact that you cannot see infrared (a heat picture) or sense the earth's magnetic field doesn't mean other creatures/devices can't. The pitch-shift is just a calculation in a computer; it's not magic. Commented Oct 31, 2015 at 23:04

2 Answers


What I'm really trying to wrap my head around is (firstly) how a sensor can "see" colours that I can't. (I understand it's probably like trying to explain colour to somebody born without sight).

The sensor is just converting some physical measurement — in this case, the temperatures of a whole lot of little thermal sensors — into a 2-dimensional array of data. The thermal sensors (microbolometers) get their temperature from the fact that a point on some distant object is focused on each one by a lens that is transparent to long-wave IR.

And (secondly) how it can "pitch-shift" to a visible wavelength/frequency. (I understand that terminology is probably incorrect when talking about light as opposed to sound, but you get the idea.)

The sensor isn't doing that at all. Any 2-D array of data can be represented visually by mapping colors (or grayscale values) to the numbers. It is the display device (e.g., an LCD) that produces the visible light that you see.
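That number-to-color mapping can be sketched in a few lines. Here is a minimal, hypothetical example in Python: a made-up 4x4 array of raw sensor readings is normalized and pushed through a simple two-color false-color palette (the values and the palette are invented for illustration; real imagers use richer palettes like "ironbow"):

```python
import numpy as np

# Hypothetical 4x4 "sensor" readout: raw intensities in arbitrary units.
raw = np.array([
    [20.1, 20.3, 35.0, 20.2],
    [20.0, 36.5, 37.1, 20.4],
    [20.2, 36.8, 36.2, 20.1],
    [20.3, 20.2, 20.5, 20.0],
])

# Normalize to 0..1 over this frame's own min/max (a simple auto-gain).
norm = (raw - raw.min()) / (raw.max() - raw.min())

def to_rgb(v):
    # Tiny false-color palette: blend from blue (cold) to yellow (hot).
    # The choice of palette is purely for human viewing.
    return (int(255 * v), int(255 * v), int(255 * (1 - v)))

rgb_image = [[to_rgb(v) for v in row] for row in norm]
```

The coldest pixel comes out pure blue and the hottest pure yellow; everything in between is a blend, which is exactly the "arbitrary recoloring" the answer describes.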

  • Do you have more information? Maybe I ought to have asked specific questions about microbolometers (thanks) instead. Commented Nov 1, 2015 at 0:16

Our retinas have cone cells that are sized and designed to absorb particular frequencies of light: roughly red, green, and blue. Those frequencies do not include infrared.

Electronic photosensors are designed to pick up certain frequencies of light and convert them into electrical signals. Designers choose photosensors that respond to the frequencies they want the device to detect (in this case, infrared). When a group of photons of the right frequency bombards the photosensor, an electrical signal is generated and processed by a computer. For humans to see it, the computer sends the information to a display, and the conversion it applies can be arbitrary. For example, low-intensity infrared bombardment produces a low voltage or current, which the computer might render as purple; high-intensity bombardment might be rendered as yellow. It's nothing more than an intensity-number-to-color conversion.

It is the same thing as a normal camera, except that the photodetector is no longer a visible-light sensor but an infrared sensor. Since there is no "color" in infrared, we make up the conversion so that the result is easier for humans to visualize.
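The low-maps-to-purple, high-maps-to-yellow conversion described above can be written as a simple linear blend. A minimal sketch, assuming made-up 8-bit intensity limits (the endpoint colors and range are illustrative, not from any real device):

```python
# Endpoint colors for the made-up palette: purple for low intensity,
# yellow for high intensity.
PURPLE = (128, 0, 128)
YELLOW = (255, 255, 0)

def intensity_to_color(intensity, lo=0, hi=255):
    """Map a raw sensor intensity onto a two-color palette.

    The intensity is clamped to [lo, hi], scaled to 0..1, and each
    RGB channel is linearly interpolated between the two endpoints.
    """
    t = (min(max(intensity, lo), hi) - lo) / (hi - lo)
    return tuple(round(p + t * (y - p)) for p, y in zip(PURPLE, YELLOW))
```

Applying this per pixel to the sensor's 2-D intensity array yields the colorful image, even though the sensor itself never measured any color at all.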
