Washington: Astronomers using data from NASA's Fermi Gamma-ray Space
Telescope have made the most accurate measurement of starlight in the universe
and used it to establish the total amount of light from all the stars that have
ever shone, accomplishing a primary mission goal.
"The optical and
ultraviolet light from stars continues to travel throughout the universe even
after the stars cease to shine, and this creates a fossil radiation field we
can explore using gamma rays from distant sources," said lead scientist
Marco Ajello, a postdoctoral researcher at the Kavli Institute for Particle
Astrophysics and Cosmology at Stanford University in California and the Space
Sciences Laboratory at the University of California at Berkeley.
Gamma rays are the
most energetic form of light. Since Fermi's launch in 2008, its Large Area
Telescope (LAT) has scanned the entire sky in high-energy gamma rays every three
hours, creating the most detailed map of the universe ever known at these
energies.
The sum of all
starlight in the cosmos is known to astronomers as the extragalactic background
light (EBL). To gamma rays, the EBL functions as a kind of cosmic fog. Ajello
and his team investigated the EBL by studying gamma rays from 150 blazars, or
galaxies powered by supermassive black holes, that were strongly detected at
energies greater than 3 billion electron volts (3 GeV), or more than a billion
times the energy of visible light.
As matter falls
toward a galaxy's supermassive black hole, some of it is accelerated outward at
almost the speed of light in jets pointed in opposite directions. When one of
the jets happens to be aimed in the direction of Earth, the galaxy appears
especially bright and is classified as a blazar.
Gamma rays produced
in blazar jets travel across billions of light-years to Earth. During their
journey, the gamma rays pass through an increasing fog of visible and
ultraviolet light emitted by stars that formed throughout the history of the
universe.
Occasionally, a gamma
ray collides with a photon of starlight, and the two transform into a pair of
particles -- an electron and its antimatter counterpart, a positron. Once this
occurs, the gamma ray is lost. In effect, the process dampens the gamma-ray
signal in much the same way that fog dims a distant lighthouse.
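For readers who want the numbers behind that collision, here is a minimal
Python sketch of the threshold kinematics. It assumes a head-on encounter, in
which case pair production is possible only when the product of the two photon
energies exceeds the square of the electron's rest energy; the example energies
are chosen purely for illustration.

    # A minimal sketch of the pair-production threshold, assuming a head-on
    # collision between a gamma ray and a background photon. The two photons
    # can create an electron-positron pair only if
    # E_gamma * E_background >= (m_e * c^2)^2, where m_e * c^2 = 511,000 eV.

    M_E_C2_EV = 0.511e6  # electron rest energy in electron volts

    def threshold_background_energy_ev(e_gamma_ev):
        """Minimum background-photon energy (eV) able to pair-produce with
        a gamma ray of energy e_gamma_ev in a head-on collision."""
        return M_E_C2_EV ** 2 / e_gamma_ev

    for e_gev in (3, 25, 100):
        e_bg = threshold_background_energy_ev(e_gev * 1e9)
        print(f"{e_gev:>3} GeV gamma ray: needs a photon of >= {e_bg:.1f} eV")
    # A 25 GeV gamma ray is absorbed by photons of roughly 10 eV and up --
    # the ultraviolet light that dominates the fog for these blazars.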
From studies of
nearby blazars, scientists have determined how many gamma rays should be
emitted at different energies. More distant blazars show fewer gamma rays at
higher energies -- especially above 25 GeV -- as a result of absorption by the
cosmic fog.
The farthest blazars
are missing most of their higher-energy gamma rays.
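The lighthouse analogy can be made quantitative with a toy calculation. The
sketch below assumes the standard exponential form for absorption, in which
the observed flux equals the intrinsic flux scaled by exp(-tau); the
optical-depth values are invented placeholders, not the team's measurements.

    import math

    # Illustrative sketch of how the fog dims a blazar's spectrum: the
    # optical depth tau grows with gamma-ray energy and with distance, so
    # less and less of the emitted flux survives the trip to Earth.

    def observed_fraction(tau):
        """Fraction of emitted gamma rays that reach the observer."""
        return math.exp(-tau)

    # A hypothetical distant blazar: thicker fog at higher energies.
    for energy_gev, tau in [(3, 0.05), (25, 0.7), (100, 3.0)]:
        print(f"{energy_gev:>3} GeV: {observed_fraction(tau):.0%} arrives")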
The researchers then
determined the average gamma-ray attenuation across three distance ranges
between 9.6 billion years ago and today.
From this
measurement, the scientists were able to estimate the fog's thickness. To
account for the observations, the average stellar density in the cosmos must be
about 1.4 stars per 100 billion cubic light-years, which means the average
distance between stars in the universe is about 4,150 light-years.
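The spacing figure follows from the density figure by straightforward
arithmetic, as this short sketch shows: invert the density to get the volume
occupied per star, then take the cube root of that volume.

    # Arithmetic behind the quoted numbers: 1.4 stars per 100 billion cubic
    # light-years means one star per ~7.1e10 cubic light-years, and the cube
    # root of that volume gives the typical spacing between stars.

    stars_per_cubic_ly = 1.4 / 100e9            # quoted average density
    volume_per_star = 1.0 / stars_per_cubic_ly  # ~7.1e10 ly^3 per star
    mean_separation_ly = volume_per_star ** (1.0 / 3.0)
    print(f"volume per star: {volume_per_star:.2e} cubic light-years")
    print(f"mean separation: {mean_separation_ly:,.0f} light-years")  # ~4,150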
A paper describing
the findings was published Thursday in Science Express.
"The Fermi
result opens up the exciting possibility of constraining the earliest period of
cosmic star formation, thus setting the stage for NASA's James Webb Space
Telescope," said Volker Bromm, an astronomer at the University of Texas,
Austin, who commented on the findings. "In simple terms, Fermi is
providing us with a shadow image of the first stars, whereas Webb will directly
detect them."
Measuring the
extragalactic background light was one of the primary mission goals for Fermi.
"We're very
excited about the prospect of extending this measurement even farther,"
said Julie McEnery, the mission's project scientist at NASA's Goddard Space
Flight Center in Greenbelt, Md.