02 June 2012

Snowball Earth - the Faint Young Sun Paradox

Here’s a detective story for you, and it doesn’t involve a murder.

Astronomers studying stars like our Sun at various stages of their lives have realized that as a main-sequence star evolves, its inner core becomes denser and the rate of hydrogen-to-helium fusion increases. In other words, our own Sun must have grown steadily brighter during its first 5 billion years of existence.

Careful studies of stellar evolution have even placed some numbers on this: the Sun's energy output 2 billion years ago is inferred to have been about 70% - 85% of what it is today. That would not be enough to warm the Earth above the freezing point of water. The Earth 2 billion years ago should have been a frozen ice-ball, like Mars is today (Mars, being more distant from the Sun than the Earth, is correspondingly colder).
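The "frozen ice-ball" claim can be checked with a back-of-envelope calculation using the standard planetary equilibrium-temperature formula, T = [S(1-A)/(4σ)]^(1/4). The solar constant, albedo, and the assumption that albedo was the same then as now are all simplifications on my part, not values from the geological record:

```python
# Equilibrium (effective radiating) temperature of the Earth for a Sun
# at some fraction of its modern output, ignoring greenhouse warming.
# Assumed inputs: modern solar constant 1361 W/m^2, Bond albedo 0.3.

SIGMA = 5.67e-8      # Stefan-Boltzmann constant, W m^-2 K^-4
S_MODERN = 1361.0    # modern solar constant at 1 AU, W m^-2
ALBEDO = 0.3         # modern Bond albedo (assumed unchanged in the past)

def equilibrium_temp(solar_fraction, albedo=ALBEDO):
    """Effective temperature in kelvin for a given fraction of
    today's solar output, with no greenhouse effect included."""
    absorbed = solar_fraction * S_MODERN * (1.0 - albedo) / 4.0
    return (absorbed / SIGMA) ** 0.25

for frac in (1.0, 0.85, 0.70):
    print(f"{frac:.0%} of modern Sun -> {equilibrium_temp(frac):.0f} K")
```

This gives roughly 255 K today and roughly 233 K at 70% solar output - a drop of about 22 kelvin. Note that even the modern value sits below freezing: it is the greenhouse effect that lifts the real surface to about 288 K, which is why a fainter Sun makes the atmosphere's role so critical.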

There is a problem with this conclusion, however: it doesn’t agree with ancient evidence gleaned from geology. There are sedimentary rocks in South Africa with ripple-marks and mud-cracks. These rocks are derived from volcanic ash - and are therefore easily dated at about 2 billion years old. Other rocks dated at 2.7 billion years ago show fossilized rain-drop imprints. I have personally handled ripple marks and pillow-lavas (lava that is fast-quenched in water) dated at about 1.7 billion years ago in southern Venezuela. Ancient stromatolites - clumps and mats built by blue-green algae - have been found in rocks over 3 billion years old in Australia.

The evidence is everywhere: the atmosphere may have been different, but there was liquid water on the Earth’s surface as far back as we can test.

What gives?  The arguments used to explain this so-called “Faint Young Sun Paradox” fall into three main groups:

- The young Earth may still have had a lot of residual heat left over from potential energy accumulated during the accretion process. However, the surface of the Earth would have equilibrated quickly with energy received from the Sun, and the existence of solid cratons back at least 3.4 billion years ago argues for a solid crust. Energy released from the Earth’s interior has actually ramped up with the onset of mantle convection and plate tectonics, now thought to have started about 2.5 billion years ago.

- The Earth’s atmosphere retained heat more efficiently than it does now - for instance, by containing more greenhouse gases like carbon dioxide and methane. Sufficient nitrogen can also enhance the greenhouse effect through a phenomenon called nitrogen (pressure) broadening, in which abundant nitrogen widens the absorption lines of the other greenhouse gases. There are a few questionable gas inclusions in ancient rocks, but scientists argue over how pristine the gases in these inclusions actually are - or whether they have diffused (either into the rock or out) over time.

- The Earth’s albedo, or reflectance, was lower in the past. A lower albedo could have been due to less continental area (more dark, absorbing ocean), or perhaps to a lack of biologically induced cloud condensation nuclei. How would you ever obtain evidence for something like cloud cover 2.5 billion years ago, however?
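The albedo idea can be bounded with a little algebra. If absorbed flux is S·f·(1-A_old) for a Sun at fraction f of its modern output, then matching today's absorbed flux of S·(1-A_modern) fixes the albedo the early Earth would have needed. The modern Bond albedo of 0.3 is my assumed value, and the calculation ignores every feedback:

```python
# How dark would the early Earth need to be for a fainter Sun to deliver
# the same absorbed flux as today? Solve f * (1 - A_old) = (1 - A_modern)
# for A_old. A negative answer means no albedo could compensate.

A_MODERN = 0.3  # assumed modern Bond albedo

def required_albedo(solar_fraction):
    """Albedo that keeps absorbed sunlight equal to today's, given a
    Sun at `solar_fraction` of its modern output."""
    return 1.0 - (1.0 - A_MODERN) / solar_fraction

for frac in (0.85, 0.80, 0.70):
    print(f"{frac:.0%} Sun needs albedo <= {required_albedo(frac):.2f}")
```

The striking result: at 85% solar output an albedo of about 0.18 would do, but at 70% the Earth would need an albedo of essentially zero - a perfectly black planet. Lower albedo alone cannot rescue the faintest end of the estimates.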

There are other suggested explanations out there. One is the modulating effect of a stronger Solar Wind in Archean times (i.e., greater than 2.5 billion years ago). Another is that due to orbital mechanics and tidal effects, the Earth’s orbit was once closer to the Sun.

This last explanation is treated skeptically by most astronomers because of some bad science propagated in several books by Immanuel Velikovsky a generation ago. The Earth-Moon distance varies depending on where the Moon is in its orbit, but lunar laser ranging experiments show that, on average, the Moon is receding from the Earth at a rate just under 4 centimeters per year. This happens because tidal friction - in the oceans and in the deformation of the Earth’s crust - transfers energy from the Earth’s rotation into the Moon’s orbit, with much of it dissipated as heat. It is a logical step to infer that the Earth’s orbit around the Sun could expand over time for the same reason.
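How much closer would the Earth have needed to be for the orbital explanation to work on its own? Insolation scales as L/d², so matching today's flux from a Sun at fraction f of its modern output requires an orbit of sqrt(f) astronomical units. A minimal sketch of this scaling argument (my simplification, not a result from the orbital-mechanics literature):

```python
# Inverse-square compensation: at what orbital radius (in AU) does a
# Sun at fraction f of its modern output deliver today's flux at Earth?

import math

def compensating_distance_au(solar_fraction):
    """Orbital radius in AU where flux from a fainter Sun equals the
    modern flux at 1 AU. Follows from flux ~ L / d**2."""
    return math.sqrt(solar_fraction)

for frac in (0.85, 0.70):
    print(f"{frac:.0%} Sun -> orbit of {compensating_distance_au(frac):.3f} AU")
```

That works out to roughly 0.92 AU for an 85% Sun and roughly 0.84 AU for a 70% Sun - the latter being a migration of some 24 million kilometers, which is a lot to ask of tidal effects alone.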

There is a major problem with all these theories: with time, the evidence for anything becomes increasingly fragmentary, increasingly suspect. It’s like a Cold Case murder - only 2.5 billion years cold.

Scientists are clever folk, however - and they keep thinking, keep looking for other ideas. Recently some of them have gone back to the fossil imprints of ancient rain-drops in volcanic ash, and have conducted comparison experiments to estimate the density of the Earth’s ancient atmosphere. There are many variables to deal with, however, including how large the rain-drops could have grown, and how much moisture was in the volcanic ash. Careful calibration has at least allowed scientists to put a range on the density of the ancient Earth’s atmosphere: between 50% and 105% of what it is today. This immediately calls into question the greenhouse gas argument.
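The physics behind the raindrop trick is simple: a falling drop reaches a terminal velocity set by the balance of gravity and air drag, v = sqrt(2mg / (ρ_air·Cd·A)), so denser air means slower impacts and smaller imprints. Here is an illustrative sketch; the drop size and drag coefficient are my assumed round numbers, not values from the actual study:

```python
# Terminal velocity of a raindrop versus air density, using the drag
# balance v = sqrt(2*m*g / (rho_air * Cd * A)). Drop radius and drag
# coefficient below are illustrative assumptions.

import math

G = 9.81           # gravitational acceleration, m/s^2
CD = 0.5           # assumed drag coefficient for a roughly spherical drop
RHO_WATER = 1000.0 # density of water, kg/m^3
RHO_AIR_NOW = 1.2  # modern sea-level air density, kg/m^3

def terminal_velocity(drop_radius_m, rho_air):
    """Terminal fall speed (m/s) of a spherical water drop."""
    mass = RHO_WATER * (4.0 / 3.0) * math.pi * drop_radius_m ** 3
    area = math.pi * drop_radius_m ** 2
    return math.sqrt(2.0 * mass * G / (rho_air * CD * area))

r = 0.0015  # a 3 mm diameter drop, near the large end of common raindrops
for density_fraction in (0.5, 1.0, 1.05):
    v = terminal_velocity(r, density_fraction * RHO_AIR_NOW)
    print(f"air at {density_fraction:.0%} of modern density -> {v:.1f} m/s")
```

A 3 mm drop hits at about 8 m/s in today's air but over 11 m/s in air half as dense - a difference large enough to leave measurably different imprints, which is what makes the fossil craters a usable barometer at all.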

We also know from other geological evidence that the Earth’s atmosphere began to fill with freed-up oxygen around 2.5 billion years ago. Rounded pyrite grains found in ancient South African sandstones - grains that could not have survived in the presence of oxygen - are one piece of evidence for this. The Great Oxygenation Event came at the expense of methane and carbon dioxide, which biological processes were already starting to sequester in the form of carbon accumulating in the bottoms of ancient swamps.

You recently drove your car to the grocery store using gasoline - some of that sequestered carbon. That same trip thus released more of a greenhouse gas back into the Earth’s atmosphere.

And so the Earth grows hotter and hotter...