Hubble Space Telescope can "only see the universe in shades of grey," according to NASA's Goddard Space Flight Center. Learn how the imagery is processed into amazing color views of the cosmos.

Credit: NASA's Goddard Space Flight Center
Miranda Chabot: Lead Producer
Miranda Chabot: Writer
Miranda Chabot: Narrator
Paul Morris: Support

Music & Sound:
Music Credit: “A Woven Narrative,” Matthew James Jude Anderson [PRS], Ninja Tune Production Music, Universal Production Music

Transcript
00:00 As a cosmic photographer, NASA's Hubble Space Telescope has taken over a million snapshots documenting the universe.
00:08 These images illustrate, explain, and inspire us with their grandeur, but may not match what we'd see with our own eyes.
00:17 That's because Hubble sees light beyond our sensitivity.
00:21 Our eyes only sense a small fraction of the universe's light.
00:26 This tiny band of wavelengths, called the visible spectrum, holds every color in the rainbow.
00:32 Light outside that span, with longer or shorter wavelengths, is invisible to our eyes.
00:38 But those invisible wavelengths can tell us so much more about the universe.
00:43 Hubble houses six scientific instruments that observe at different wavelengths.
00:47 Together, they expand our vision into infrared and ultraviolet light.
00:52 That doesn't mean Hubble can show us never-before-seen colors.
00:56 In fact, the telescope can only see the universe in shades of gray.
01:01 Seeing in black and white allows Hubble to detect subtle differences in the light's intensity.
01:07 If one wavelength is brighter than another, that tells us something about the science of that object.
01:13 But because color helps humans interpret what we see,
01:17 NASA specialists work to process and colorize publicly available Hubble data into more accessible images.
01:24 When Hubble snaps a photo, it puts a filter in front of its detector, allowing specific wavelengths to pass through.
01:31 Broadband filters let in a wide range of light.
01:35 Narrowband filters are more selective, isolating light from individual elements like hydrogen, oxygen, and sulfur.
01:43 Hubble observes the same object multiple times, using different filters.
01:47 Image processors then assign those images a color based on their filtered wavelength.
01:52 The longest wavelength becomes red, medium becomes green, and the shortest blue, corresponding to the light sensors in our eyes.
02:00 Combining them gives us a color image, showcasing characteristics we can't make out in black and white.
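The chromatic ordering described here (longest wavelength to red, middle to green, shortest to blue) can be sketched in a few lines. This is an illustrative simplification, not Hubble's actual pipeline; the function name and min-max normalization are assumptions for the example.

```python
import numpy as np

def combine_filters(long_wl, mid_wl, short_wl):
    """Assign three grayscale filter exposures to RGB channels by
    wavelength order (longest -> red, shortest -> blue) and stack them.
    Inputs are 2-D float arrays of pixel intensities."""
    def normalize(img):
        # Scale each exposure to 0..1 so no single filter dominates.
        lo, hi = img.min(), img.max()
        return (img - lo) / (hi - lo) if hi > lo else np.zeros_like(img)

    # Stack as (height, width, 3): red, green, blue.
    return np.dstack([normalize(long_wl), normalize(mid_wl), normalize(short_wl)])

# Tiny synthetic example: three 2x2 "exposures" through different filters.
r = np.array([[0.0, 1.0], [2.0, 3.0]])
g = np.array([[3.0, 2.0], [1.0, 0.0]])
b = np.array([[1.0, 1.0], [1.0, 1.0]])
rgb = combine_filters(r, g, b)
print(rgb.shape)  # (2, 2, 3)
```

Real processing also involves stretching, white balancing, and aesthetic judgment, but the core idea is exactly this per-channel assignment.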
02:07 Adding color reveals the underlying science in every image.
02:12 It's like translating words into another language, making sure no information is lost.
02:18 Some words have an exact counterpart. The meaning remains the same when you swap them.
02:24 Hubble's true color photos are like that.
02:26 They are a direct translation, using broad filters in wavelengths we can see.
02:32 Other words can't be translated directly.
02:35 When we use narrowband filters or peer outside the visible spectrum, it's like translating words with no one-word replacement.
02:43 It can still be done, but it requires more work.
02:47 Narrowband images highlight the concentration of important elements.
02:52 Infrared images are like heat maps, helping us spot newborn stars in dark, dusty clouds, and peer further back in time and space.
03:02 Using ultraviolet, we uncover active aurorae on Jupiter, and learn how young, massive stars develop.
03:10 Image processors also clean up artifacts, signatures in an image that aren't produced by the observed target.
03:17 As sensors age, some pixels become imperfect, returning too much electrical charge or not enough.
03:24 Artifacts can leave behind odd shapes, or return images without any true black.
03:30 These effects can be calibrated and removed.
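One way to picture that calibration step: flag pixels that return excess charge in a calibration exposure, then replace them with the median of their neighbors. A hypothetical sketch only; real pipelines use full calibration reference files, not a single dark frame and a fixed threshold.

```python
import numpy as np

def repair_hot_pixels(image, dark_frame, threshold=5.0):
    """Replace pixels flagged as 'hot' in a dark-frame calibration
    exposure with the median of their 3x3 neighborhood."""
    hot = dark_frame > threshold      # pixels returning too much charge
    repaired = image.copy()
    h, w = image.shape
    for y, x in zip(*np.nonzero(hot)):
        # Take the surrounding patch, excluding other hot pixels.
        ys = slice(max(0, y - 1), min(h, y + 2))
        xs = slice(max(0, x - 1), min(w, x + 2))
        patch, mask = image[ys, xs], hot[ys, xs]
        good = patch[~mask]
        if good.size:
            repaired[y, x] = np.median(good)
    return repaired
```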
03:34 Other artifacts come from the dynamic environment of space.
03:38 Even the best photographers get photobombed.
03:41 In Hubble's case, the culprits are asteroids, spacecraft or debris trails, and high-energy particles called cosmic rays.
03:49 By combining and aligning multiple observations, image processors can identify them and piece together an artifact-free image.
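The rejection idea is simple to demonstrate: a cosmic ray or trail hits a given pixel in only one exposure, so a per-pixel median across aligned exposures discards it. A minimal sketch, assuming the frames are already aligned; Hubble's actual combination algorithms are more sophisticated.

```python
import numpy as np

def median_stack(aligned_exposures):
    """Combine aligned exposures of the same field with a per-pixel
    median, rejecting transient artifacts like cosmic-ray hits."""
    return np.median(np.stack(aligned_exposures), axis=0)

# Three aligned 2x2 frames; one has a cosmic-ray spike at (0, 0).
frames = [np.full((2, 2), 10.0) for _ in range(3)]
frames[1][0, 0] = 5000.0   # cosmic ray in a single exposure only
clean = median_stack(frames)
print(clean[0, 0])  # 10.0 -- the spike is rejected
```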
03:58 Without processing, many Hubble images would be divided down the middle.
04:03 This line, called a chip gap, is the tiny space between some camera sensors.
04:08 Hubble moves slightly with each observation, allowing image processors to fill the gap and replace faulty pixels.
04:15 This process is called dithering.
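A toy version of dithering: mark the chip-gap pixels as missing in one exposure, then fill them from a second exposure taken with the telescope offset by a known amount. This integer-pixel sketch (using a wrap-around `np.roll` shift) is an assumption for illustration; real processing resamples at sub-pixel accuracy.

```python
import numpy as np

def fill_gap(primary, offset_exposure, dy, dx):
    """Fill NaN pixels (e.g. the chip gap) in `primary` with values
    from a second exposure taken with a (dy, dx)-pixel pointing offset.
    Shifting the offset exposure back re-aligns the two frames."""
    shifted = np.roll(offset_exposure, (dy, dx), axis=(0, 1))
    filled = primary.copy()
    gap = np.isnan(filled)          # missing pixels: the chip gap
    filled[gap] = shifted[gap]      # fill from the re-aligned exposure
    return filled
```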
04:18 And because there's no natural up or down in space, processors decide how to rotate and frame the image.
04:27 It's a time-consuming procedure.
04:29 Simple images take about a week, while large mosaics stitched together from many observations can take a month to process.
04:39 Hubble images may not be what we'd see firsthand.
04:43 Instead, they are tools for understanding science at a glance, shedding light on otherwise invisible views of our universe.
04:52 (upbeat music)
