Metallicity evolution refers to the gradual increase in the abundance of heavy elements, known collectively in astronomy as metals, within galaxies over cosmic time. In the aftermath of the Big Bang, the universe consisted almost entirely of hydrogen and helium, with only trace amounts of lithium. Heavier elements such as carbon, oxygen, silicon, and iron did not yet exist; they were created later inside stars through nuclear fusion and during explosive events such as supernovae.
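As a brief aside, metallicity is usually quantified in one of two standard ways: as the mass fraction Z of all elements heavier than helium, or on a logarithmic scale relative to the Sun. The conventions below are standard; the solar value quoted is approximate.

```latex
% Metallicity as a mass fraction, with X + Y + Z = 1 for the
% hydrogen, helium, and metal mass fractions. The Sun has
% roughly Z_\odot \approx 0.014.
Z = \frac{M_{\mathrm{metals}}}{M_{\mathrm{gas}}}

% Logarithmic abundance relative to solar, e.g. for iron:
[\mathrm{Fe}/\mathrm{H}] =
  \log_{10}\!\left(\frac{N_{\mathrm{Fe}}}{N_{\mathrm{H}}}\right)_{\star}
  - \log_{10}\!\left(\frac{N_{\mathrm{Fe}}}{N_{\mathrm{H}}}\right)_{\odot}
```

On this scale, [Fe/H] = 0 means solar iron abundance and [Fe/H] = -1 means one tenth of it, which is why very old stars are described as having large negative [Fe/H].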
As the first generation of stars formed, they fused hydrogen and helium into heavier elements. When the most massive of these stars ended their lives in supernova explosions, they ejected the newly formed heavy elements into the surrounding interstellar medium. This enriched gas was later incorporated into new stars, so each successive stellar generation contained more heavy elements than the one before. Over billions of years, this cycle of star birth, death, and enrichment steadily raised the average metallicity of galaxies.
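This cycle can be illustrated with the classic "closed-box" model of chemical evolution, in which a galaxy's gas turns into stars and each generation returns a fixed yield p of new metals to the gas. Under the instantaneous-recycling approximation, the gas metallicity follows Z = p ln(1/mu), where mu is the remaining gas fraction. The sketch below is an illustration with an assumed yield, not a fit to any real galaxy:

```python
import numpy as np

# Closed-box chemical evolution under instantaneous recycling.
# YIELD_P is an assumed, illustrative value; real effective yields
# vary from galaxy to galaxy.
YIELD_P = 0.01  # mass of new metals ejected per unit mass locked in stars

def closed_box_metallicity(gas_fraction):
    """Gas-phase metal mass fraction Z for a remaining gas fraction mu.

    Analytic closed-box result: Z = p * ln(1 / mu).
    """
    mu = np.asarray(gas_fraction, dtype=float)
    return YIELD_P * np.log(1.0 / mu)

# As gas is consumed (mu falls from 1 toward 0), metallicity rises,
# mirroring the generational enrichment described above.
for mu in (0.9, 0.5, 0.2, 0.1):
    print(f"gas fraction {mu:.1f} -> Z = {closed_box_metallicity(mu):.4f}")
```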
The metallicity of a galaxy depends on its history of star formation and gas accretion. Galaxies with rapid star formation enrich quickly, while smaller or isolated galaxies can remain metal-poor because their shallow gravitational potentials let galactic winds carry metals away. Massive galaxies, by contrast, retain their metals and reach higher metallicities. Metallicity in turn affects how stars form: metals help gas cool and collapse, influencing the masses and types of stars that emerge.
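The effect of winds can be added to the same toy model as a "leaky box": if outflows eject gas at the current metallicity at a rate eta times the star-formation rate (eta is the mass-loading factor), the standard result is that winds simply reduce the effective yield, Z = (p / (1 + eta)) ln(1 / mu). The comparison below uses hypothetical values of eta to contrast a wind-free massive galaxy with a wind-dominated dwarf:

```python
import numpy as np

YIELD_P = 0.01  # assumed metal yield, as in the closed-box sketch above

def leaky_box_metallicity(gas_fraction, eta):
    """Gas metallicity when winds eject eta units of enriched gas
    per unit of mass locked into stars (mass-loading factor eta).

    Analytic leaky-box result: Z = (p / (1 + eta)) * ln(1 / mu),
    i.e. outflows reduce the effective yield by a factor (1 + eta).
    """
    mu = np.asarray(gas_fraction, dtype=float)
    return (YIELD_P / (1.0 + eta)) * np.log(1.0 / mu)

# A dwarf galaxy with strong winds (eta ~ 5, a hypothetical value)
# stays far more metal-poor than a massive galaxy that retains its
# gas (eta ~ 0), even after consuming the same fraction of its gas.
for eta in (0.0, 1.0, 5.0):
    z = leaky_box_metallicity(0.2, eta)
    print(f"eta = {eta:.0f}: Z = {z:.4f}")
```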
Observations show a clear metallicity evolution across cosmic time: distant galaxies, seen as they were billions of years ago, are generally metal-poor, while galaxies in the present-day universe are more metal-rich. Metallicity also varies within galaxies, usually increasing toward the center, where star formation has been most intense. Understanding metallicity evolution helps astronomers trace the chemical history of the universe and learn how galaxies like the Milky Way matured over time.
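Within a disk galaxy, the radial trend is often summarized as a linear gradient in the gas-phase oxygen abundance, 12 + log10(O/H), in units of dex per kiloparsec. The snippet below evaluates such a profile; the central value and slope are assumptions, loosely typical of Milky Way-like spirals rather than measurements of any specific galaxy:

```python
# Illustrative radial abundance profile for a disk galaxy.
# CENTRAL_OH and GRADIENT are assumed values for the sketch.
CENTRAL_OH = 9.0   # 12 + log10(O/H) at the galactic center
GRADIENT = -0.05   # dex per kiloparsec (negative: metal-rich center)

def oxygen_abundance(radius_kpc):
    """Gas-phase oxygen abundance 12 + log10(O/H) at a given radius."""
    return CENTRAL_OH + GRADIENT * radius_kpc

# Abundance falls from the metal-rich center toward the outskirts;
# for reference, the solar value is about 8.69.
for r in (0.0, 4.0, 8.0, 16.0):
    print(f"R = {r:4.1f} kpc -> 12 + log(O/H) = {oxygen_abundance(r):.2f}")
```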