Imaging question

Why are images taken with different colors one at a time?

That’s totally standard in scientific astronomical imaging. A consumer camera that takes an RGB image in one shot has a tiny mosaic of R, G, and B filters covering neighbouring pixels (a Bayer pattern), and the camera software then reassembles (demosaics) those into a 3-color image.
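To give a sense of how that reassembly works, here’s a minimal sketch (assuming an RGGB Bayer layout and crude 2x2 binning; real camera demosaicking is fancier than this):

```python
import numpy as np

def demosaic_rggb(mosaic):
    """Turn a one-filter-per-pixel RGGB mosaic into an RGB image
    by averaging each 2x2 cell (a crude stand-in for real demosaicking)."""
    r  = mosaic[0::2, 0::2]                    # top-left pixel of each 2x2 cell: red filter
    g1 = mosaic[0::2, 1::2]                    # top-right: green
    g2 = mosaic[1::2, 0::2]                    # bottom-left: green
    b  = mosaic[1::2, 1::2]                    # bottom-right: blue
    return np.dstack([r, (g1 + g2) / 2.0, b])  # half-resolution 3-color image

# Fake 4x4 mosaic: each pixel only records light through the one filter above it.
mosaic = np.arange(16, dtype=float).reshape(4, 4)
print(demosaic_rggb(mosaic).shape)   # (2, 2, 3)
```

Note the cost: you trade away resolution, since each output color pixel is built from a 2x2 patch of detector pixels.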
There are lots of reasons not to do that for science cameras. First, not everyone wants the same set of filters – it all depends on the science you’re doing. Many programs use fewer than three filters, some use 6 or more, and there are also specialty filters (narrow-band, or very broad filters used for asteroid hunting). Second, these cameras have 100% active detector area, so an overlaid filter pattern would have to match the pixels perfectly or else you’d get smearing – not good!!
Another reason is that we took quite different exposure times, under different sky conditions, in different filters. The z band (which becomes ‘red’ in these color images) can be observed while the moon is up, but we typically took much longer exposures (like 120-250 seconds); the g and r bands (‘blue’ and ‘green’ here) require moon-down, dark-sky conditions, where we took exposures as short as 40 seconds.
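So the color images you see are assembled afterwards from the separate per-filter exposures, something along these lines (a sketch using astropy’s asinh stretch from Lupton et al. 2004; the images and parameters here are made up, and the real survey pipeline’s scaling is more involved):

```python
import numpy as np
from astropy.visualization import make_lupton_rgb

# Stand-ins for calibrated, sky-subtracted, aligned stacks of the
# separate z, r, and g exposures (random noise just for illustration).
rng = np.random.default_rng(0)
z_img = rng.normal(1.0, 0.1, (100, 100))   # z band -> red channel
r_img = rng.normal(0.8, 0.1, (100, 100))   # r band -> green channel
g_img = rng.normal(0.6, 0.1, (100, 100))   # g band -> blue channel

# The asinh stretch keeps both faint and bright features visible.
rgb = make_lupton_rgb(z_img, r_img, g_img, stretch=0.5, Q=8)
print(rgb.shape, rgb.dtype)   # (100, 100, 3) uint8
```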
cheers,
–dustin