
[Phys-L] some climate concepts, numbers, and references



Here are some useful concepts and numbers. Tracking them
down was more laborious than I expected it to be.

1) Concept: The radiative forcing, i.e. the incremental amount
of energy per unit time per unit area, grows only /logarithmically/
with the amount of CO2 in the atmosphere. That's because the
atmosphere is already more-or-less optically thick at the
relevant wavelengths, so each additional increment of CO2 blocks
proportionally less of the outgoing radiation.

If it were linear instead of logarithmic, our goose would
already be cooked.

This is why people talk about the number of "doublings"
of the CO2 level.

2) More particularly, they talk about the number of
doublings relative to the pre-industrial level.

Most practitioners take the pre-industrial level to
be 280 ppm. There are a few papers, mostly on the
NASA site, that take it to be 275 ppm. The difference
is sufficiently small that I'm not going to worry
about it.

As a corollary, people talk about the 2xCO2 level
i.e. double the pre-industrial level, namely 560 ppm.
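
To make the logarithm concrete, here is a minimal sketch in
Python. It uses the simplified fit deltaF = 5.35 ln(C/C0) W/m^2
from Myhre et al. (1998); the 5.35 coefficient is imported from
that paper, not from anything above. Note how it reproduces the
3.7 W/m^2 per doubling quoted in item 4 below.

  import math

  def forcing(C, C0=280.0):
      # Radiative forcing in W/m^2 relative to the pre-industrial
      # level C0, per the logarithmic fit of Myhre et al. (1998).
      # C and C0 are CO2 concentrations in ppm (v/v).
      return 5.35 * math.log(C / C0)

  # Each doubling adds the same fixed increment, about 3.7 W/m^2:
  for C in (280, 560, 1120):
      print(C, "ppm -->", round(forcing(C), 2), "W/m^2")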

3) In the climate business, the concentrations are measured
in terms of v/v (not w/w) i.e. volume per unit volume
(not weight per unit weight). For a gas, v/v is the same
as the mole fraction.

4) The canonical 2xCO2 level i.e. 560 ppm corresponds
to adding a "forcing term" of 3.7 W/m^2 to the
global energy-balance equation.

http://www.pik-potsdam.de/~stefan/Publications/Book_chapters/Rahmstorf_Zedillo_2008.pdf

That is a humongous number. As a point of reference,
the solar constant is about 1360 W/m^2. That's the
incoming energy flux at the top of the atmosphere,
at normal incidence.

As a corollary, if you wanted to tolerate the CO2 in
the atmosphere (and in the ocean!) and only wanted
to balance the energy equation, you could set up
a reflector. Alas, it would need to have a humongous
area.
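
To put a rough number on "humongous", here is a back-of-the-envelope
sketch. The solar constant of 1360 W/m^2 and the planetary albedo
of 0.3 are standard round values assumed here, not taken from the
references above.

  import math

  R_earth  = 6.371e6                     # meters
  surface  = 4 * math.pi * R_earth**2    # total surface area, m^2
  delta_F  = 3.7                         # W/m^2, the 2xCO2 forcing
  solar    = 1360.0                      # W/m^2 at normal incidence
  albedo   = 0.3                         # fraction of blocked sunlight that
                                         # would have been reflected anyway

  total_power    = delta_F * surface     # watts to be cancelled
  per_m2_blocked = solar * (1 - albedo)  # net absorbed sunlight per m^2 of shade
  area = total_power / per_m2_blocked
  print("reflector area: %.2e m^2 = %.1f million km^2" % (area, area / 1e12))

That comes out to roughly two million square kilometers, on the
order of the area of Mexico.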

5) Current CO2 level: For several months in mid-2014,
the CO2 concentration was just over 400 ppm. There's a
± 5 ppm seasonal variation superimposed on that.
We're in the trough right now, namely 395 ppm, but
400 is a good number if you average over the year.
It keeps going up.

https://scripps.ucsd.edu/programs/keelingcurve/
http://climate.nasa.gov/400ppmquotes/
https://scripps.ucsd.edu/programs/keelingcurve/wp-content/plugins/sio-bluemoon/graphs/mlo_one_year.png
http://scrippsco2.ucsd.edu/images/stories/home/mlo_front_plot.png

6) One gigaton of carbon /in the atmosphere/ (carried as CO2)
corresponds to about 0.5 ppm of CO2. Equivalently, one
gigaton of CO2 corresponds to only about 0.13 ppm, since a
CO2 molecule weighs 44/12 times as much as its carbon atom.
This conversion doesn't depend on anything except the total
mass (and mean molar mass) of the atmosphere.

http://spacemath.gsfc.nasa.gov/Calculus/6Page51.pdf
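
Here is where the 0.5 comes from, assuming only the standard
value of 5.15e18 kg for the total mass of the atmosphere:

  M_atm = 5.15e18    # kg, total mass of the atmosphere
  m_air = 28.97e-3   # kg/mol, mean molar mass of air
  m_C   = 12.0e-3    # kg/mol, carbon
  m_CO2 = 44.0e-3    # kg/mol, CO2

  moles_per_ppm = 1e-6 * M_atm / m_air   # moles of gas in 1 ppm (v/v)
  print("1 ppm = %.2f Gt carbon" % (moles_per_ppm * m_C   / 1e12))
  print("1 ppm = %.2f Gt CO2"    % (moles_per_ppm * m_CO2 / 1e12))

This prints 2.13 Gt carbon and 7.82 Gt CO2 per ppm; inverting,
1 GtC is about 0.47 ppm, which rounds to the 0.5 quoted above.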

7) One gigaton of CO2 /released into the environment/
produces only about half a gigaton of CO2 in the
atmosphere. Most of the other half partitions into
the ocean fairly quickly. OTOH I have zero confidence
that this ratio will remain unchanged over time. The
solubility of CO2 decreases strongly as pH goes down
and/or temperature goes up.

There's also some monkey business with deforestation
in the southern hemisphere and reforestation in the
northern hemisphere. For back-of-the-envelope calculations
these can be neglected /for the moment/ but I have zero
confidence that they will remain negligible. For one
thing, there is a lot of frozen peat in the northern
tundra, and if that thaws and rots, it will release
huge amounts of CO2.

8) The exciting number is the /sensitivity/ i.e. the
response of the global temperature to a change in the
energy input rate. It is conventionally quoted as the
equilibrium temperature rise per doubling of CO2, i.e.
per 3.7 W/m^2 of forcing.

The key concept here is that this number is defined
so as to include the effects of all negative and
positive feedback loops. These are only rather
poorly understood, which means there is a lot of
uncertainty attached to the sensitivity number.

Quoting Holden:

We estimate climate sensitivity as likely (66% confidence) to lie in
the range 2.6–4.4 °C, with a peak probability at 3.6 °C.

http://www.mucm.ac.uk/Pages/Downloads/Other_Papers_Reports/RW%20clim%20dym%20probabilistic%20calibration%20GENIE%201%20rcvd%20082009.pdf
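
If you want something to plug into the expectation-value machinery
below, one crude move is to approximate Holden's numbers by a
Gaussian. This is my sketch, not Holden's actual posterior, and it
ignores the skew (the 3.6 peak is not at the midpoint of the range):

  from scipy.stats import norm

  lo, hi, conf = 2.6, 4.4, 0.66   # Holden's "likely" range
  mu = (lo + hi) / 2              # 3.5 degC midpoint (vs the 3.6 peak)
  z  = norm.ppf(0.5 + conf / 2)   # half-width of a 66% interval, in sigmas
  sigma = (hi - lo) / 2 / z
  print("Gaussian approximation: mu = %.2f degC, sigma = %.2f degC"
        % (mu, sigma))

This gives mu = 3.5 and sigma = 0.94 or so.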

9) Let's talk about the cost of the problem.

For some purposes, to a first approximation, we might
be interested in the /expectation value/ of the cost,
i.e. the weighted average of the possible costs, weighted
according to the probability of incurring each cost.

To a better approximation, we should compute the net
present value (NPV) by discounting the costs according
to /when/ they are incurred.
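
As a reminder of how the discounting works, here is a minimal
sketch. The cost stream and the discount rates are made-up
placeholders, chosen only to show the mechanics:

  def npv(costs, rate):
      # Net present value of (year, cost) pairs, discounted
      # back to year zero at the given annual rate.
      return sum(c / (1 + rate) ** t for t, c in costs)

  # Hypothetical: one unit of cost 10, 50, and 100 years out.
  stream = [(10, 1.0), (50, 1.0), (100, 1.0)]
  for r in (0.01, 0.03, 0.07):
      print("rate %.0f%%: NPV = %.2f" % (100 * r, npv(stream, r)))

Note how strongly the answer depends on the assumed rate;
far-future costs nearly vanish at the higher rates.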

10) Let us examine the process of calculating the aforementioned
expectation value.

Very hypothetically and temporarily, consider the case
where the sensitivity numbers were narrowly clustered
around the peak value, and the cost was a smoothly
varying function of temperature. Then we could treat
the sensitivity-distribution as a delta function. The
only thing that would matter would be the center of the
sensitivity distribution.

Non-hypothetically, we have a rather broad distribution of
sensitivity numbers. Furthermore, the cost of dealing with
the problem is a very strong function of temperature. It is
/at least/ exponential. At some point it becomes even worse
than that; it becomes an ultra-steep step function, i.e. a
life-or-death function, literally survival-or-extinction.

Now consider what happens when you multiply a Gaussian by an
exponential ... in this case the Gaussian probability
distribution by the exponential cost model, as the expectation
value requires. The resulting integrand is just a shifted
(and rescaled) Gaussian.

Therefore the upper end of the sensitivity distribution
contributes to the expectation value more than the lower
end. This is not some Chicken-Little over-reaction; it
is not even an opinion. It is required by the mathematics.
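
To see the shift explicitly: completing the square moves the peak
of the integrand from mu up to mu + k*sigma^2, where k is the
exponential cost slope. A numerical check, using the Gaussian
approximation from item 8 and a made-up slope k = 1 per degC:

  import numpy as np

  mu, sigma = 3.5, 0.94    # Gaussian approximation from item 8
  k = 1.0                  # assumed cost slope: cost ~ exp(k*s)

  s = np.linspace(0, 8, 8001)
  pdf = np.exp(-(s - mu)**2 / (2 * sigma**2))   # unnormalized Gaussian
  integrand = pdf * np.exp(k * s)               # probability times cost

  print("peak of the pdf:          %.2f" % s[pdf.argmax()])
  print("peak of the integrand:    %.2f" % s[integrand.argmax()])
  print("predicted mu + k*sigma^2: %.2f" % (mu + k * sigma**2))

The integrand peaks near 4.4 rather than 3.5, i.e. the tail is
doing the work, which is the point of this item.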

11) A temperature rise of 2 °C is considered "dangerous".
Anything more than that is considered "extremely dangerous".

http://rsta.royalsocietypublishing.org/content/369/1934/20.full

The analysis suggests that despite high-level statements to the
contrary, there is now little to no chance of maintaining the global
mean surface temperature at or below 2°C. Moreover, the impacts
associated with 2°C have been revised upwards, sufficiently so that
2°C now more appropriately represents the threshold between
‘dangerous’ and ‘extremely dangerous’ climate change.

That's from a book chapter; the larger book is indexed at:
http://rsta.royalsocietypublishing.org/content/369/1934.toc

12) Anybody who doesn't like these numbers is cordially invited
to come up with other numbers, and to explain why the other
numbers are better.

13) Current climate science is directed toward building new
models that describe /regional/ rather than global changes.
All regions will be affected, but not all in the same way.

Similarly, there's more to the story than average temperature.
We need to model regional precipitation. We need to model
regional /extremes/ in temperature and precipitation, so as
to predict the chance of droughts, floods, crop failures,
et cetera.

====================
Putting all the numbers together, consider what would happen
if we stopped all man-made CO2 emissions tomorrow: We have
already emitted enough CO2 to push the temperature rise well
past 2 °C (using the weighted expected value) ... i.e. well
into "extremely dangerous" territory.

Next, consider the effect of burning an additional 504
gigatons of carbon:

504 GtC released into the environment
--> 252 GtC remaining in the atmosphere (item 7)
--> 126 ppm on top of what we have now (item 6)
--> 526 ppm total
--> 0.91 doublings relative to pre-industrial 280 ppm
--> 4.00 °C temperature rise,
(using the weighted expected value)

That's so far beyond what is considered "extremely dangerous"
that I don't even have a name for it.

It could be less than that, or it could be more, but that
seems like a reasonable number for planning purposes.
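
For anyone who wants to audit the arrow chain above, here is the
arithmetic, using the same round numbers as the rest of this note
plus an assumed cost-weighted 4.4 degC per doubling (the value
implied by items 8 and 10 taken together):

  import math

  released_GtC = 504.0
  airborne_GtC = released_GtC / 2     # item 7: about half stays airborne
  added_ppm    = airborne_GtC * 0.5   # item 6: about 0.5 ppm per GtC
  total_ppm    = 400.0 + added_ppm    # item 5: current level
  doublings    = math.log2(total_ppm / 280.0)
  rise         = 4.4 * doublings      # assumed cost-weighted degC per doubling

  print("%.0f ppm, %.2f doublings, %.1f degC rise"
        % (total_ppm, doublings, rise))

This reproduces the 526 ppm, 0.91 doublings, and 4.0 degC quoted
above.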