Hubble Space Telescope can "only see the universe in shades of gray," according to NASA's Goddard Space Flight Center. Learn how the imagery is processed into amazing color views of the cosmos.

Credit: NASA's Goddard Space Flight Center
Miranda Chabot: Lead Producer
Miranda Chabot: Writer
Miranda Chabot: Narrator
Paul Morris: Support

Music & Sound:
Music Credit: “A Woven Narrative,” Matthew James Jude Anderson [PRS], Ninja Tune Production Music, Universal Production Music

Transcript
00:00 As a cosmic photographer, NASA's Hubble Space Telescope has taken over a million snapshots documenting the universe.
00:08 These images illustrate, explain, and inspire us with their grandeur, but may not match what we'd see with our own eyes.
00:17 That's because Hubble sees light beyond our sensitivity.
00:21 Our eyes only sense a small fraction of the universe's light.
00:25 This tiny band of wavelengths, called the visible spectrum, holds every color in the rainbow.
00:31 Light outside that span, with longer or shorter wavelengths, is invisible to our eyes.
00:37 But those invisible wavelengths can tell us so much more about the universe.
00:42 Hubble houses six scientific instruments that observe at different wavelengths.
00:47 Together, they expand our vision into infrared and ultraviolet light.
00:52 That doesn't mean Hubble can show us never-before-seen colors.
00:57 In fact, the telescope can only see the universe in shades of gray.
01:01 Seeing in black and white allows Hubble to detect subtle differences in the light's intensity.
01:07 If one wavelength is brighter than another, that tells us something about the science of that object.
01:13 But because color helps humans interpret what we see,
01:16 NASA specialists work to process and colorize publicly available Hubble data into more accessible images.
01:23 When Hubble snaps a photo, it puts a filter in front of its detector, allowing specific wavelengths to pass through.
01:30 Broadband filters let in a wide range of light.
01:34 Narrowband filters are more selective, isolating light from individual elements like hydrogen, oxygen, and sulfur.
01:42 Hubble observes the same object multiple times, using different filters.
01:47 Image processors then assign those images a color based on their filtered wavelength.
01:52 The longest wavelength becomes red, medium becomes green, and the shortest blue, corresponding to the light sensors in our eyes.
02:00 Combining them gives us a color image, showcasing characteristics we can't make out in black and white.
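The channel assignment described above can be sketched as a toy example (not the actual Hubble pipeline, which works with calibrated FITS data): normalize each grayscale exposure, then stack the longest-wavelength image into red, the middle into green, and the shortest into blue. The arrays here are tiny stand-ins for real filtered observations.

```python
import numpy as np

def compose_rgb(long_wl, mid_wl, short_wl):
    """Map three grayscale exposures to RGB channels:
    longest wavelength -> red, middle -> green, shortest -> blue."""
    def normalize(img):
        img = img.astype(float)
        lo, hi = img.min(), img.max()
        return (img - lo) / (hi - lo) if hi > lo else np.zeros_like(img)
    return np.dstack([normalize(long_wl), normalize(mid_wl), normalize(short_wl)])

# Toy 2x2 "exposures" standing in for three filtered observations.
red_filter   = np.array([[10, 20], [30, 40]])
green_filter = np.array([[ 5, 15], [25, 35]])
blue_filter  = np.array([[ 0, 10], [20, 30]])

rgb = compose_rgb(red_filter, green_filter, blue_filter)
print(rgb.shape)  # (2, 2, 3)
```

Each pixel now carries three intensities, so differences between the filtered exposures show up as hue rather than subtle shifts in gray.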
02:05 Adding color reveals the underlying science in every image.
02:10 It's like translating words into another language, making sure no information is lost.
02:16 Some words have an exact counterpart. The meaning remains the same when you swap them.
02:21 Hubble's true-color photos are like that.
02:24 They are a direct translation, using broad filters in wavelengths we can see.
02:29 Other words can't be translated directly.
02:32 When we use narrowband filters or peer outside the visible spectrum, it's like translating words with no one-word replacement.
02:40 It can still be done, but it requires more work.
02:44 Narrowband images highlight the concentration of important elements.
02:49 Infrared images are like heat maps, helping us spot newborn stars in dark, dusty clouds, and peer further back in time and space.
02:58 In ultraviolet, we uncover active aurorae on Jupiter, and learn how young, massive stars develop.
03:05 Image processors also clean up artifacts, signatures in an image that aren't produced by the observed target.
03:13 As sensors age, some pixels become imperfect, returning too much electrical charge or not enough.
03:20 Artifacts can leave behind odd shapes, or return images in a different color.
03:25 These effects can be calibrated and removed.
03:29 Other artifacts come from the dynamic environment of space.
03:34 Even the best photographers get photobombed.
03:37 In Hubble's case, the culprits are asteroids, spacecraft or debris trails, and high-energy particles called cosmic rays.
03:45 By combining and aligning multiple observations, image processors can identify them,
03:50 and piece together an artifact-free image.
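One common way to reject single-frame artifacts like cosmic rays (a simplified stand-in for the methods used in practice) is a pixel-wise median across aligned exposures: a cosmic-ray hit lands in only one frame, so the median ignores it. A minimal sketch:

```python
import numpy as np

# Three aligned exposures of the same scene; the middle frame has a
# cosmic-ray hit (an anomalously bright pixel) at position (1, 1).
exposures = np.array([
    [[100, 100], [100, 100]],
    [[100, 100], [100, 9999]],   # cosmic ray in this frame only
    [[100, 100], [100, 100]],
], dtype=float)

# The pixel-wise median across exposures rejects the one-frame outlier.
clean = np.median(exposures, axis=0)
print(clean)  # every pixel recovers the true value, 100.0
```

The same stacking step also builds signal: real pixels agree across frames and survive, while anything that appears in only one exposure is discarded.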
03:54 Without processing, many Hubble images would be divided down the middle.
03:59 This line, called a chip gap, is the tiny space between some camera sensors.
04:04 Hubble moves slightly with each observation, allowing image processors to fill the gap and replace faulty pixels.
04:12 This process is called dithering.
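Dither gap-filling can be illustrated with a toy one-dimensional example (the real pipeline resamples full 2-D frames): because the telescope shifts slightly between exposures, the chip gap covers a different patch of sky in each frame, and averaging while ignoring the missing samples fills every gap from another exposure.

```python
import numpy as np

# Two dithered exposures of the same 1-D strip of sky; NaN marks the
# chip gap, which lands on a different column after the small shift.
sky = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
exp_a = sky.copy(); exp_a[2] = np.nan   # gap over column 2
exp_b = sky.copy(); exp_b[3] = np.nan   # gap over column 3

# Averaging while skipping NaNs fills each gap from the other exposure.
filled = np.nanmean(np.vstack([exp_a, exp_b]), axis=0)
print(filled)  # [1. 2. 3. 4. 5.]
```

The same trick replaces faulty pixels: a dead pixel sits on different sky in each dithered frame, so a good sample is always available somewhere in the stack.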
04:14 And because there's no natural up or down in space, processors decide how to rotate and frame the image.
04:22 It's a time-consuming procedure.
04:25 Simple images take about a week, while large mosaics stitched together from many observations can take a month to process.
04:35 Hubble images may not be what we'd see firsthand.
04:38 Instead, they are tools for understanding science at a glance, shedding light on otherwise invisible views of our universe.
