>Whenever shooting a subject with a mixture of visible and infrared light, it becomes readily apparent that infrared light focuses differently from visible light. For many subjects, this can mean having to choose between crisp visible contours and an odd pink glow, or blurred edges with some unusual pink features inside. Some things never look sharp no matter where you move the focus.
The extent of this effect is very lens dependent. It also occurs between different colours of visible light, depending on how well the lens design accounts for it. Optically, the term is "chromatic aberration" - lens designers try to account for it in the visible spectrum with optical design and lens coatings, and modern designs are generally extremely well corrected there. _Usually_ designers aren't worried about correctly handling convergence into IR and UV, so how well a design focuses those to the same point as the visible spectrum is hit or miss. There are specialist lenses out there designed specifically for wide-spectrum apochromatism, but they tend to be special purpose and very expensive - especially if they handle UV.
The author mentions it at the bottom of the post as something they're interested in trying out, but I've found it very fun to play with dual-bandpass filters - they pass part of the visible spectrum plus IR, which creates some interesting options in editing for visual display. There's an example in this set I shot with different filters - https://www.reddit.com/r/infraredphotography/comments/1dnki0...
Superachromat is the term you're looking for for lenses corrected throughout IR to UV. They're not actually that expensive if you know where to look; I got my Zeiss 250mm superachromat for about $500 from a Japanese seller. Works a treat for full-spectrum film work, especially on a system that lets you swap between film backs - if not for the woeful cost of film these days.
> it becomes readily apparent that infrared light focuses differently from visible light.
On old-school manual-focus lenses you'll note a small dot (often red, from when colors were used to indicate f-stops) to the left of the focus indication line.
https://commons.wikimedia.org/wiki/File:AiS_Nikkor_85mm-2.0_...
On more modern lenses, it's simply a dot. https://www.mir.com.my/rb/photography/companies/nikon/nikkor...
This was the offset for IR photography. You'd focus normally, note the focus distance, and then line up that distance with the red dot to apply the IR offset.
---
UV photography was often done with different glass, since the glass used in most lenses does an OK job of filtering out UV light.
The 105mm UV lens for example - https://www.mir.com.my/rb/photography//hardwares/speciallens...
It's enough of an oddball lens - one that others rarely make - that it keeps getting special runs.
https://www.nikon.com/business/industrial-lenses/lineup/uv/
Coastal Optics did a run of the lens too - https://diglloyd.com/prem/s/DAP/Coastal60f4/Coastal60f4.html...
---
One of the photographers from days of old I've stumbled across who did UV nature photography (what do bees see?): http://www.naturfotograf.com/uvstart.html
I like the idea of using the IR, where it shows a greater degree of contrast, to act as a "contrast mask" on the visible light image.
I'm thinking of the beautiful cloud detail in the one IR shot where the visible light photo had lost all of that. Seems like with some compositing (sort of like HDR) you could pull in the best of both worlds.
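Something like this, in rough Python (the filenames are placeholders, and it assumes the two frames are already aligned):

    # Hypothetical sketch: transfer IR luminance structure into the
    # visible image. Assumes pre-aligned frames; filenames are made up.
    import numpy as np
    from PIL import Image

    vis = np.asarray(Image.open("visible.png").convert("RGB"), np.float64) / 255
    ir = np.asarray(Image.open("ir.png").convert("L"), np.float64) / 255

    mix = 0.5  # how strongly to pull toward the IR luminance; to taste
    vis_lum = vis.mean(axis=2, keepdims=True)
    target = (1 - mix) * vis_lum + mix * ir[..., None]

    # Rescale each pixel's RGB so its luminance moves toward the target,
    # keeping the visible image's hue and saturation roughly intact.
    out = np.clip(vis * (target / np.maximum(vis_lum, 1e-6)), 0, 1)
    Image.fromarray((out * 255).astype(np.uint8)).save("composite.png")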
For contrast specifically, you can get much of that effect with a red filter. Just removing the majority blue/green components already changes the lighting massively. Can add a polarising filter too for an even more extreme effect on the sky.
E.g. this photo (looks quite HDR'd but it's not, it's barely edited): https://rjones.photos/gallery/photo/20251207-img9638
This is really cool -- pedantically, I've always thought "full spectrum" is a bit misleading in the traditional photographic sense: IR + visible light + UV != full spectrum. I'd love to see post-processed imagery of everyday life through an extended view of broader EM energy (similar to astrophotography)... like, what does a city scene look like with x-rays and microwaves included?
Side note: have always loved this image https://imgur.com/NZjWfWT of rainbows with UV and IR visible.
By this measure, there is no "full spectrum" photography ever.
You'd obviously have to use false-color, as most modern astronomy pictures do (even the ones that use visible tend to pump the saturation UP!).
However, the amount of light from the sun drops off exponentially away from the peak at green-blue (yellow-green, after atmospheric filtering) - there's a back-of-envelope check of this below. You'd also have to really fake the dynamic range a lot to get it to look any different from IR+Vis+NUV. (If there were 0.001% as much x-ray light as there is, say, red light, DNA could only exist in the lightless depths of the ocean.)
So, it would look like an IR+Vis photo (light falls off pretty fast in the UV, too), except the ones you've seen oversell the IR.
So it would look like a Vis-light photo, with slightly shinier objects in it.
Sorry.
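For the curious, the promised back-of-envelope check, using Planck's law for a 5800 K blackbody (a crude stand-in for sunlight, ignoring atmospheric absorption):

    # Spectral radiance of a ~5800 K blackbody at a few wavelengths.
    import numpy as np

    h, c, k, T = 6.626e-34, 2.998e8, 1.381e-23, 5800.0

    def planck(lam):
        """Spectral radiance B(lambda, T) in W / (sr * m^3)."""
        return (2 * h * c**2 / lam**5) / np.expm1(h * c / (lam * k * T))

    for name, nm in [("x-ray (10 nm)", 10), ("NUV (350 nm)", 350),
                     ("green (500 nm)", 500), ("red (650 nm)", 650),
                     ("NIR (1000 nm)", 1000)]:
        print(f"{name:>15}: {planck(nm * 1e-9):.3e}")
    # The 10 nm value comes out around 1e-84 versus roughly 1e13 for
    # red: thermal x-ray emission from the photosphere is negligible.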
I like distinguishing "light" (physical world) from "color" (species-specific biology). Sunbeam blue light is already less intense than NIR-I, but human bio juices the blue. Most humans are bright-light trichromats and low-light monochromats. Rod sensitivity is 3 orders of magnitude up, with single-ish photon sensitivity. Some amphibians have an extra rod type, for low-light bichromaticity. Some deep-sea fish are bright mono and dark lotschromats (12+ rod opsins). So why not imagine seeing the world with a triple (or more) of short-wavelength super-rods, a few orders of magnitude more sensitive still, with whatever curves seem fun? Perhaps curves naturally selected for by "makes intriguing images of the world for social media"...
This is super cool, I want one now
Looks like grainydays on YouTube can finally stop chugging flaming hot Mountain Dew every day in the hopes that Kodak will bring back Aerochrome.
https://www.youtube.com/watch?v=v5KBQd_DkQw
One thing I've wondered about is IR fluorescence photography.
I've seen some examples in document forensics where a page that looks blank (or at least has the ink unrecognizably smudged) because of water exposure is completely legible in an infrared photo taken under UV illumination.
I suspect there must be a hidden world only visible in IR and UV (and long-wave IR, e.g. "thermal").
I've been considering whether it would be possible - and perhaps quite elegant - to use an XY-scanner to raster-scan the end of an optical fiber across the image, disperse the light through a prism, and then capture the resulting spectrum with a CCD line sensor.
With that setup, each readout of the line sensor would record the full spectral content of the light at that scanned position - one wavelength bin per pixel - in a single acquisition.
You could probably use just an X-scanner, and instead of a CCD line sensor, use a regular 2D image sensor if you used a "1 pixel wide" slit aperture to crop the image perpendicularly to the direction that the prism disperses the light. So instead of a single pixel being dispersed, you disperse a line.
You would reduce the time required by the square root of the number of pixels you want (assuming a square image).
(This is what we do in momentum-resolved electron energy loss spectroscopy. In that situation we have electromagnetic lenses that focus the electrons that have been dispersed, so we don't have as bad a chromatic aberration problem as the other response mentions).
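A toy numpy model of that geometry, with a fake capture function standing in for the real hardware:

    # Toy model of the slit/pushbroom idea. Everything here is a
    # placeholder: at each X position the 2D sensor sees one spatial
    # line (Y) dispersed across wavelength by the prism.
    import numpy as np

    NX, NY, NLAMBDA = 64, 64, 31  # scan steps, slit pixels, spectral bins

    def capture_frame(x):
        """Stand-in for one sensor readout at scan position x."""
        rng = np.random.default_rng(x)
        return rng.random((NY, NLAMBDA))

    # One frame per scan position instead of one per pixel: for a
    # square image this is the sqrt(N) speedup mentioned above.
    cube = np.stack([capture_frame(x) for x in range(NX)], axis=0)
    print(cube.shape)  # (64, 64, 31): a full spectrum at every (x, y)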
I would love to see e.g. a butterfly image with a slider that I could drag to choose the wavelength shown!!
> I would love to see e.g. a butterfly image with a slider that I could drag to choose the wavelength shown!!
Here[1] are some 31-band hyperspectral images of butterflies. Scipy/numpy/pillow can unpack the .mat files into normal images. Then perhaps vibecode a slider (rough sketch below), or just browse the band images?
[1] http://www.ok.sc.e.titech.ac.jp/res/MSI/MSIdata31.html (includes 8 butterfly 31-band hyperspectral visible-light images). These butterflies are also in their VIS-SNIR dataset, and others.
I knew of the site from having explored "First-tier physical-sciences graduate students are often deeply confused about color. Color is commonly taught, starting in K... very very poorly. So can we create K-3 interactive content centered around spectra, and give an actionable understanding of color?"
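The slider sketch, assuming the .mat files are pre-v7.3 (so scipy.io.loadmat can read them) and each holds a single (height, width, 31) array - check both assumptions, plus the filename, against the actual download:

    # Rough band-browser: matplotlib slider over a hyperspectral cube.
    import numpy as np
    import scipy.io
    import matplotlib.pyplot as plt
    from matplotlib.widgets import Slider

    mat = scipy.io.loadmat("butterfly.mat")  # hypothetical filename
    key = [k for k in mat if not k.startswith("__")][0]
    cube = np.asarray(mat[key], dtype=np.float64)
    cube /= cube.max()  # normalise for display

    fig, ax = plt.subplots()
    fig.subplots_adjust(bottom=0.2)
    img = ax.imshow(cube[:, :, 0], cmap="gray", vmin=0, vmax=1)
    sax = fig.add_axes([0.2, 0.05, 0.6, 0.04])
    slider = Slider(sax, "band", 0, cube.shape[2] - 1, valinit=0, valstep=1)

    def update(val):
        img.set_data(cube[:, :, int(val)])
        fig.canvas.draw_idle()

    slider.on_changed(update)
    plt.show()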
Very nice idea! That makes it much easier!
A problem for multispectral imagery (even within visible RGB) is that the wavelengths of light are different, so the lens cannot be in focus for the whole spectrum at once. I have tested this with a few of my SLR lenses: if you have the blue channel perfectly in focus, red isn't just a little out of focus, it's noticeably far out. (A quick way to measure this is sketched below.)
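If you want to put numbers on it, variance of the Laplacian is a crude per-channel sharpness score - a sketch with OpenCV, hypothetical filename:

    # Score each colour channel's sharpness in the same shot.
    import cv2

    img = cv2.imread("focus_test.png")  # shot of a detailed test target
    for name, channel in zip("BGR", cv2.split(img)):
        score = cv2.Laplacian(channel, cv2.CV_64F).var()
        print(f"channel {name}: sharpness {score:.1f}")
    # Refocus, reshoot, and compare: whichever channel you focused on
    # should score clearly highest, the others noticeably lower.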
This is called chromatic aberration, for those who are intrigued.
Given that regular phone cameras have sensors that detect RGB, I wonder if one could get noticeably improved image sharpness with three camera lenses (each over a single-color sensor) next to one another laterally, with a color filter for R, G, and B respectively, so that the camera could focus perfectly for each wavelength.
The next issue would be the perspective distortion (parallax from the offset lenses) in the merged image.
There are lenses out there designed for apochromatic performance across the UV-Vis-IR band, but they tend to be really pricey.
The Coastal Optical 60mm is a frequently cited one. UV in particular is challenging, because glass that works well in the visible range can transmit UV quite poorly. Quartz is better, but drives up the cost a lot and comes with other tradeoffs.
I've had this problem as well, but it's just due to optical properties of the lens and extremely consistent from image to image, so you can calibrate and correct for it as long as you focus each wavelength and collect data separately.
I don't think you can properly calibrate for it unless you also move the camera to compensate for focus breathing, and I'm not sure even that would fully account for it. That said, these things are only really noticeable when pixel peeping.
Focus breathing can be compensated for. The "breathing" only changes the effective focal length, not the location of the camera, so you can map the pixels to where they should be and interpolate (bilinear/bicubic) appropriately.
Shoot a checkerboard at both wavelengths, each focused properly, and then compute the mapping (rough sketch below).
If you're shooting macro stuff then maybe you are changing the effective location of the camera slightly depending on the exact mechanics of the lens and whether the aperture slides with the focusing, but the couple of mm shift in camera location won't matter for landscapes.
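The checkerboard mapping, sketched with OpenCV - the filenames and the 9x6 inner-corner pattern are assumptions. For pure focus breathing the fitted homography should come out close to a uniform scale about the image centre:

    # Fit a mapping between the IR and visible frames and warp with it.
    import cv2

    PATTERN = (9, 6)
    CRIT = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 0.001)

    def corners(path):
        gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        ok, pts = cv2.findChessboardCorners(gray, PATTERN)
        assert ok, f"checkerboard not found in {path}"
        return cv2.cornerSubPix(gray, pts, (11, 11), (-1, -1), CRIT)

    # One properly focused checkerboard shot per wavelength.
    H, _ = cv2.findHomography(corners("checker_ir.png"),
                              corners("checker_vis.png"))

    # Warp any IR frame of the real scene into the visible frame.
    ir = cv2.imread("scene_ir.png", cv2.IMREAD_GRAYSCALE)
    aligned = cv2.warpPerspective(ir, H, ir.shape[::-1],
                                  flags=cv2.INTER_CUBIC)
    cv2.imwrite("scene_ir_aligned.png", aligned)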
Alternatively, use cine lenses which are engineered not to breathe, but they are typically more expensive for that reason.
Related project: I shot a lot of landscapes in Iceland using a thermal (long wave IR) camera to show geologic phenomena in action. This involved stitching together a lot of (narrow FOV) thermal images and overlaying it on top of visible camera images for context.
https://petapixel.com/2019/07/13/shooting-high-res-thermal-p...
Very cool!