2018-05-28

What is aerosol radiative forcing?

There is a lot of research about the climate impact of aerosols. One of the fundamental measures of the climate impact is the "radiative forcing" associated with aerosols. It's not obvious what exactly aerosol radiative forcing is, however, so here we begin our examination of this question.

The IPCC AR4 provides a nearly useless description: [LINK]

We can discern two important facets of aerosol radiative forcing from that description:
  1. It is measured based on top-of-atmosphere (TOA) radiative fluxes. 
  2. It includes the impact of aerosol on clouds. 
It is also useful to look at other parts of AR4, where better text describes aerosol effects. I started at the link above because when I search for "aerosol radiative forcing" that is one of the top hits I get. That's an unfortunate hit because the text surrounding that small section is much more informative.  

The first thing that can be clarified is that #1 above is part of the definition of radiative forcing. As far as IPCC reports go, radiative forcing is the impact that a forcing agent has on the net TOA fluxes. The concept is useful because it is derived from the basic physics of conservation of energy and thermodynamics. In equilibrium the net TOA flux is zero (averaged over a year, or many years). When a forcing agent is applied to the system, such as anthropogenic aerosol, the energetic consequence may be a change in that TOA balance (i.e., a radiative forcing), and a TOA imbalance causes the system to respond. We deduce that if the forcing is negative the system will cool to achieve a new balance, but if the forcing is positive (i.e., more energy is entering the system than leaving) the system will warm to achieve a new balance. Aerosols typically fall into the negative forcing category, and so cause a cooling, but the story is not really so simple.
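That forcing-response logic can be sketched as a toy zero-dimensional energy balance. This is only an illustration: the climate sensitivity parameter (0.8 K per W m-2) and the CO2-doubling forcing (3.7 W m-2) are commonly cited round numbers that I'm assuming here, not values taken from the IPCC text discussed above.

```python
# Toy forcing-response relation: equilibrium warming dT = lambda * F,
# where F is the TOA radiative forcing (W m-2) and lambda is the climate
# sensitivity parameter (K per W m-2). Both numbers are assumed round
# values, for illustration only.
def equilibrium_warming(forcing_wm2, lambda_k_per_wm2=0.8):
    """Equilibrium temperature change for a given TOA forcing."""
    return lambda_k_per_wm2 * forcing_wm2

print(equilibrium_warming(3.7))   # positive forcing (e.g., CO2 doubling) -> warming
print(equilibrium_warming(-1.0))  # negative (aerosol-like) forcing -> cooling
```

The sign convention is the whole point: a negative TOA forcing produces a negative equilibrium temperature change, which is why aerosols are usually described as a cooling influence.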

In particular, it is helpful to split aerosol effects into two pieces:
  1. direct effects of aerosol particles on radiative transfer through the atmosphere (scattering and absorption) (aka, aerosol-radiation interaction, ari)
  2. indirect effects of aerosol that change the radiative properties of clouds, or change the lifetime of clouds (aka, aerosol-cloud interaction, aci)
The IPCC AR5 [LINK] includes a lot of treatment of aerosol radiative forcing. Since it's newer, perhaps we should focus there for some clarity on this issue. An important distinction is drawn in AR5 between radiative forcing (RF) and effective radiative forcing (ERF). RF is just what we were describing, namely the change in the TOA net flux (allowing adjustment of the stratosphere), while ERF allows the troposphere to also adjust to the forcing agent. Establishing ERF is tricky because it still does not allow the global average surface temperature to adjust; the idea is that ERF includes tropospheric "rapid adjustments," while RF only allows for the rapid stratospheric adjustment. Confused yet?

We will return to this distinction in another post. For now, we need to consider that both direct and indirect effects have an RF but also an ERF. This complicates the picture because it further muddies the water with respect to how we describe how aerosols affect the climate system. Mostly AR5 seems to deal with RF for direct aerosol effects and ERF for indirect effects. For now, though, let's return to our basic question of what aerosol radiative forcing is.

Based on IPCC AR4 and AR5, along with a lot of literature reviewed therein, and also my own literature review that spans from the 1980s to today, the easiest way to express the meaning of aerosol radiative forcing is:
Aerosol radiative forcing is the change in TOA radiative fluxes between the preindustrial period and the present day. The aerosol radiative forcing can be divided into direct effects, in which aerosol affects radiative transfer, and indirect effects, in which aerosol interacts with clouds. 
The estimate of the total direct aerosol radiative forcing is around -0.35 (-0.85 to +0.15) W m-2. Including indirect effects means switching to the ERF concept, which we will examine in another post, but the AR5 bottom line is that the total aerosol effect is a negative forcing of about -1 W m-2, give or take about 1 W m-2.

What I want to point out before closing is that I described RF in the beginning as fundamental, but the definition that I've just provided seems far from fundamental. When we use this definition of aerosol radiative forcing, we need to define what pre-industrial means and what present day means. We know intuitively what both are supposed to mean, but quantitatively this is ambiguous. Particularly troublesome is that we do not have adequate observations from pre-industrial times to really know what the aerosol concentrations or emissions were. This provides an irreducible uncertainty for aerosol radiative forcing using this definition.  We will revisit some of these concepts in future posts, and we will return to the difficulties associated with this definition of aerosol radiative forcing.

2012-12-29

Joseph Romm raves about Reagan, balks at Barack: Figures of speech make and break communication


I have recently read Joseph Romm's new book, Language Intelligence,
which is really a brief review of rhetoric. It introduces modern readers to
the age-old topic of eloquent language intended to persuade
audiences. Romm uses just a few prime examples for each of the several
topics covered, from the ancient Greek greats to medieval masters who
wrote the King James Bible to modern practitioners such as Lady
Gaga. The point is to expose the principles of rhetorical discourse,
such as the various forms of repetition, irony, metaphor, and
seduction, and provide readers with some of the tools necessary to
build an effective argument as well as to erect a wall to defend
against the constant bombardment by advertisers, politicians, and
other persuaders.

The lessons are clear and well illustrated by examples. Especially
useful are the examples from recent political figures such as both
George Bushes, Bill Clinton, Barack Obama, and Mitt Romney. Several
Republican strategists are pointed out for their cunning use of
rhetorical devices (Luntz and Rove, especially). Scientists (climate
scientists, especially) are singled out for their clumsy attempts to
communicate, usually avoiding rhetorical figures of speech. The
use of the figures being discussed occasionally becomes too blatant,
often in the final paragraphs of sections, but it is pleasing as a
reader to see such employment as sections close because it reinforces the
lesson. I am convinced that this brief introduction should be standard
reading for college students across disciplines, and those in the
sciences should pay careful attention to the lessons and employ more
intelligent language when describing their own work. Older readers
might pick up some new tricks, too, if they choose to read the book.

2012-08-28

American Meteorological Society Statement on Climate Change

Posting two days in a row!?!?

I just wanted to draw attention to the updated statement on climate change from the American Meteorological Society. Here's the link: [LINK]. It is just a 7 page statement that goes through the following sections:

  • Background
  • How is climate changing?
  • Why is climate changing?
  • How can climate change be projected into the future?
  • How is the climate expected to change in the future?
  • Final remarks
There is nothing surprising in the statement. The AMS supports the scientific consensus that the Earth's surface and lower atmosphere are warming due to the accumulation of greenhouse gases in the atmosphere from combustion of fossil fuels and deforestation. Overall, it is well-written and straightforward, and I recommend taking a look at it no matter what your background is. My guess is that everyone will get the gist, and if you've got any background in climate science then you'll pick up on some of the details. I'd quibble over some of the word choices here and there, but the substance is fine. Maybe they overemphasize climate models in the future section, because many of the points they make there are based not solely on model projections, but also on observations and basic theory. Anyway, go take a look.

2012-08-27

Smart meters and dumb people

As a regular listener of Coast to Coast AM, I have been aware of a conspiracy theory involving the transition from old-timey analog utility meters to internet-connected smart meters. Smart meters allow 2-way communication between a house's utility meter and the utility company. The idea is to monitor electricity use in real time (or near real time), which can allow more nimble management of the electric grid: getting electricity where it is needed when it is needed, and allowing better management of electricity generation. Both proponents and opponents cite the potential for tiered pricing, such as raising prices during peak energy use times. While some say this will help incentivize energy conservation, others say the tiered pricing could hurt lower income households disproportionately.

The conspiracists, however, are not worried about low-income households or energy conservation. There are really two flavors of the smart meter conspiracy. First is an irrational fear of technology that manifests as a concern about the radiation from smart meters being a health hazard. Yes, really [example]. This is not a legitimate concern, as the radiation levels are even below those of cell phones, and it is unlikely that many residents will spend significant time with their heads against their electricity meter. The second version of the conspiracy is rooted in a deep distrust of government and an overly aggressive view of privacy. These are the people, like the ones cited in this Grist post [LINK] and the accompanying AP news article [LINK] about the smart meter opposition in Texas, who believe that the smart meters are ... well, let's just boil it down, they think that the smart meters allow the government to spy on them at home [great example, go ahead and browse this crazy site, I'll wait]. There might be some actual privacy issues with smart meters (which that example kind of hits, but then veers into crazy), such as the potential for utilities to synthesize usage data and sell the information to interested parties (who want to target their marketing efforts). This probably isn't much of a concern at this point, as it is unclear that utilities are savvy enough to profitably undertake such an analysis. Really, this comes down to some far-right-wing ideas that get mixed up by fear mongers into ridiculous conspiracy theories, encapsulated by this quote from the above cited blog: "This is all part of the radical green agenda that is being forced down the throats of people all over the world."

Maybe I should just list a couple of points that I think are relevant (in no particular order):

  1. Energy conservation is a good idea, and represents one of the "stabilization wedges" that we talk about as currently available solutions to global warming. [LINK]
  2. The radiation associated with electronics is not harmful.  [LINK]
  3. One of the criticisms that the opponents of smart meters seem to bring up often is that the FCC does not have strong enough restrictions on radiation [example]. Yet, as pointed out in Grist and the AP story above, these people tend to be on the far-right/libertarian/tea-party fringe of the political spectrum, meaning that philosophically they are opposed to government regulation (in favor of letting the "market" work out the appropriate solutions). This is completely inconsistent. I don't think this is an argument against the smart meter opposition, just a point that I wish would be discussed.
  4. The possibility of utilities selling the information aside, there seems to be a general fear of a degradation of privacy with smart meters, but I don't think there is any evidence that any personal information could be or is being collected by these meters.
  5. The transition to smart meters is being driven by the "market" as utilities try to reduce costs and maximize efficiency. This is a direct descendant of the deregulation of utilities in the USA, which right-leaning folks should be applauding (if they were being consistent with their purported economic philosophy).
  6. Because the utilities are deregulated, this information would be flowing to the utilities, and not the government. That means that there must be an extra layer of conspiracy in order to bring the government into the picture. Each additional layer of conspiracy makes the theory less and less plausible.
I think I can leave it there.

Note that in the links above that are cited as examples, I have used the "rel=nofollow" attribute, which prevents search engines from following the links and improving those sites' search rankings. I decided to do that because those sites, while entertaining, do not present a useful view of the issue. The other links are normal, and represent appropriate source material. 

2012-07-20

Making mds and mdworker processes stop sucking CPU

I usually don't post computing tips, but I just learned how to get rid of a problem that has been bothering me off and on for over a year.

The problem: On multiple Macs running OSX 10.6 and/or 10.7, sometimes a large amount of CPU power (and also system memory) is being used by a process called mds. The problem persists over many hours, and on laptops causes the fans to spin and battery to drain.

Some relevant information: mds and its friend mdworker are part of Apple's Spotlight software. Basically they trawl your system looking for changes and recording them in an index. That index is used by Spotlight to help you find things like that email from four years ago describing how to do some arcane thing that you couldn't possibly remember, except that it has the phrase "don't cry at this point". Anyway...

The fix: I've tried a number of things over the many months of having this problem. At home, I removed external drives from the Spotlight index and that fixed the problem. To do that, open System Preferences, go to Spotlight, click on the Privacy tab, and add the external drives to the list.

On my work computers I've done the same thing, but still have the problem sometimes. It has been especially bad on my new MacBook Air with no external drives connected.

Doing some searching, I found this very useful bit: LINK.

The crucial thing that post points out is that backup software that works in the background can cause files to be constantly changing. Constantly changing files need to be re-indexed in the view of Spotlight/mds.

For some reason, I'm not allowed to back up however I want at work (e.g., to a Time Machine drive that could be sitting right here in my office). Instead, my IT department installs some software that supposedly backs up my system. Going to /System/Library/Application\ Support and finding the name of that backup software (in this case it is Symantec) and adding that directory to the list of non-indexed places solves the problem. This is equivalent to the case in that link that uses some other backup software, and I bet that other backup systems also trigger the same behavior.

REPEAT: Exclude your backup software's /System/Library/Application\ Support directory from Spotlight indexing in System Preferences -> Spotlight -> Privacy and mds CPU usage should drop to 1% or less.

Sigh of relief.

2012-05-01

Wind turbines and warming

The Christian Science Monitor [LINK] does a decent job in smacking the media coverage of a new Nature Climate Change paper by Zhou et al. [LINK]. It is no surprise to see another example of the media making hay of a superficially surprising study. Most of the articles apparently get the story mostly right, but the headline writers once again have their heads up their asses... but actually not, as their job is to get people to read the headline and then click to get to the article in order for the ads on that page to load. One particularly horrible story comes from, wait for it, FoxNews.com; the story is titled "Wind farms are warming the earth, researchers say" and it is written poorly by Eric Niiler (of Discovery News?). I'm clearly not going to post a link to that story, but just say that it does not present a very fair assessment of the paper (which I did link to above and which I did actually read).

The science of the study is nothing terribly exciting, but could provide observational evidence for the scale and magnitude of a simple mixing effect. The idea is that big wind turbines in Texas are mixing air in their vicinity, and at nighttime that means mixing air in two distinct layers. On clear nights, as the ground cools, cool air settles in contact with the ground. Cool air is more dense than warm air, so this situation creates a stable vertical structure of colder, denser air below warmer, lighter air. Temperature usually decreases with height, so this reversed, stable configuration is called an inversion. Just to throw some more jargon at you, this very common situation is a nocturnal stable boundary layer. The study finds trends during 2003-2011 in both daytime and nighttime surface temperature, but the larger and more convincing trend is the nighttime trend, especially during summer. 

These stable layers are usually quite thin, maybe 100 meters deep. Just above the inversion, the air retains the heat from the daytime (sometimes called the residual layer). There is also a slightly less common meteorological phenomenon called a nocturnal low-level jet that is a layer of relatively fast wind that can form in or above this residual layer. (see my skillfully drawn cartoon)

If you want to put wind turbines up, one attractive feature would be thin nighttime inversions and frequent low level jets. As it happens, Texas has a lot of places like that. Note, however, that you would not want very strong winds on the top of the turbine compared to the bottom because strong torque is structurally undesirable. 

Let's imagine a turbine spinning in a low level jet above a stable layer (as in my awesome cartoon). The motion of the blades creates a turbulent wake just downstream, which mixes approximately isotropically. Let's say this means that the air is mixed up from the surface to a little above the turbine height. The result of this mixing is the same as if you stir a cup that has some warm liquid on top and cold liquid on bottom: the temperature is mixed and the resulting temperature is the mass-weighted average of the warm fluid and the cool fluid. This is exactly what the paper finds in western Texas, though they don't include a calculation of the efficiency of the mixing because their satellite data does not include profiles of temperature with height. The authors take the data in close proximity to the turbines, average it, and subtract the average from the data that is farther away from turbines. In the residual, they find this warming trend. This means that very close to the turbines, there is a nighttime warming trend, which the authors attribute to the mixing by the turbines. 
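The stirred-cup analogy is easy to make concrete. The layer temperatures and relative masses below are invented for illustration; they are not numbers from the Zhou et al. paper.

```python
# Mass-weighted average temperature of fully mixed air layers, as in the
# stirred-cup analogy: cold stable air near the surface, warmer residual
# air aloft. All numbers are hypothetical.
def mixed_temperature(temps_c, masses):
    """Return the mass-weighted mean temperature after complete mixing."""
    return sum(t * m for t, m in zip(temps_c, masses)) / sum(masses)

# A cool, shallow stable layer below a warmer, deeper residual layer:
t_mixed = mixed_temperature(temps_c=[10.0, 15.0], masses=[1.0, 2.0])
print(round(t_mixed, 1))  # 13.3 -> the surface air warms, the air aloft cools
```

The surface ends up warmer than it was before mixing, which is the sense of the nighttime warming signal the paper reports near the turbines.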

The consequences of this finding are not some kind of cautionary tale about trying to use renewable energy sources. In terms of climate change, this means almost nothing. First, as is pointed out in even that FoxNews.com article, these turbines produce 10,000 megawatts of electricity (that's enough for about three million American homes); that is a substantial amount of CO2 not being emitted to the atmosphere. Second, the effects being reported are small. Third, but most important, the warming effects of the turbines are confined to the immediate vicinity of the turbines themselves. The entire study area in western Texas is about 100 km across, and the turbine-related warming is enhanced in only a fraction of this already small area. Related to this, the effects are not additive. Once a region has enough turbines to create this effect, that's it; the "trend" cannot continue into the future because there won't be more turbines added to the area. (In fact, as turbine design gets more efficient, you'd expect to see the trend reversed eventually because there would be less downstream turbulence.)
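As a sanity check on that "three million homes" figure, here is some back-of-envelope arithmetic. The capacity factor and average household load are my own assumed round numbers, not figures from the article.

```python
# Back-of-envelope check on "10,000 MW serves ~3 million homes".
# Assumed values (not from the article): a typical wind capacity factor
# of 0.35 and an average US household load of ~1.2 kW (~10,500 kWh/yr).
capacity_w = 10_000e6        # 10,000 MW nameplate wind capacity
capacity_factor = 0.35       # assumed fraction of nameplate actually generated
avg_home_load_w = 1_200      # assumed average household draw in watts
homes_served = capacity_w * capacity_factor / avg_home_load_w
print(round(homes_served / 1e6, 1))  # 2.9 (million homes)
```

Under those assumptions the claim comes out at roughly three million homes, so the number quoted in the coverage is at least self-consistent.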

Finally, some of the blow-hards (ha ha, get it?) are saying that this warming could be bad for crops. It is plausible that crops are being grown within the small regions that could be affected by this nighttime mixing. If these people were so worried about warming affecting crops, I think they'd note that the local impact in western Texas is about the same amount of warming as the ENTIRE GLOBE has seen from greenhouse gases. A nearby weather station [LINK] shows a pretty clear warming trend (note there must be a site change or instrument change around 1959) and I'd guess that the overall 20th Century warming for western Texas is greater than the wind farm effect. The 1979-2005 trend looks like it is about the same size in summer [LINK], and note that is average temperature, not nighttime temperature which is the notable value for these wind farms. I'm not sure what 0.5C warming at nighttime does to crops, but I'm guessing that the greenhouse warming is more important (even within this study area) than the wind farm "warming." 


LINKS:

http://www.csmonitor.com/Science/2012/0430/Don-t-believe-the-headlines.-Wind-farms-do-not-cause-global-warming


http://www.wunderground.com/climate/local.html?id=42572266000&var=TAVG

2012-04-24

ENVISAT likely dead in the sky

More bad satellite news: ENVISAT seems to have stopped phoning home. ENVISAT is a giant satellite managed by the European Space Agency, and has been collecting a ton of environmental data since 2002. It has 10 different instruments, and has been used extensively in climate research over the past decade.

Sometime on 8 April, ENVISAT stopped communicating with controllers at the ESA. There has been a concerted effort to determine the cause of the failure and re-establish communication with the satellite. This effort is impressive, with the French satellite Pleiades even being turned away from Earth to try to capture images of ENVISAT to determine the state of the solar panels. Other observations of the satellite are also being used to make sure that ENVISAT is -- and is staying -- in a stable orbit. There is some hope that a connection can be reestablished if ENVISAT has gone into -- or could be cajoled into switching to -- "safe mode." Otherwise, it is likely that there has been a catastrophic failure of the main computer or power source on board, and there's no hope of recovery. 

ENVISAT's estimated lifetime was just 5 years, so again over-engineering paid off (like the Mars rovers, SeaWiFS, etc). Unfortunately, the Sentinel satellites that are planned to replace ENVISAT have not been put into orbit yet, so there is likely to be a gap in the data record for some of the quantities that ENVISAT monitors. These even include CO2; the launch failure of the Orbiting Carbon Observatory (NASA) in 2009 left just ENVISAT and a Japanese satellite with capabilities to monitor CO2 from space. Now there is just the Japanese satellite, the Greenhouse Gases Observing Satellite. Apparently there is a lot of controversy and arguing about how to pay for the Sentinels, making the ESA's commitment to climate monitoring just as shaky as NASA and NOAA's in the USA.

SOURCES:
Geoff Brumfiel, Nature 484, 423–424, doi:10.1038/484423a

ESA [link]

2012-03-02

What's the difference between the Maunder Minimum and the late twentieth century?

Somehow I became distracted by an online "discussion" in which someone was confused about the connections between solar variability and climate variability. The result of that distraction was that I made a plot, shown here, that compares the Maunder Minimum (1645 - 1715 CE) to the last 60ish years (1945-present). Below is the text I wrote to present this figure, along with data sources and citations. 


Let's compare the Maunder Minimum with the second half of the 20th Century. Since there isn't really that much data, we are limited to a few quantities. I have chosen two. One is the total solar irradiance (TSI), which is the energy received by the Earth from the sun (per time per area, expressed in Watts per square meter). This data comes from a reconstruction by Wang et al. (2005) as modified by Kopp & Lean (2011), and was obtained at the link below. This data is very similar to other reconstructions (Lean et al. 2004, 2000, 1995), but is basically shifted downward by about 5 W/m2 as a recalibration. The updated data covers 1610 to 2011.

The second quantity we can investigate is the northern hemisphere temperature anomaly (degrees C). There are no direct observations (of sufficient quality) reaching back to the Maunder Minimum, so we will use the temperature reconstruction of Moberg et al. (2005). That data actually stops at 1979, but we can visually extend it to near present day by including the observed northern hemisphere temperature anomaly from Smith et al. (2008), which reaches to January 2012. For the Smith et al. data, I chose to use the land plus ocean values.

There is an issue with baselines: the Moberg et al. data are anomalies with respect to the 1961-1990 average temperature, while the Smith et al. data are with respect to the 1901-2000 average temperature (which is cooler than the shorter, later average). I have not adjusted for this offset, and the result is that the pink dots in the figure should actually all be slightly upward compared to the red line. The agreement over the overlapping interval is quite good, but would be a little worse if the shift were included. There are other reconstructions available, and I have no reason to choose Moberg's over the others except that it is perhaps the most recent.

The TSI and temperature reconstruction data are both annual, and I've made sure the data are centered within each year. The Smith et al. data are monthly (which is why I need to center the annual records). 
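The baseline bookkeeping is simple enough to sketch in a few lines of code. The reference-period means below are hypothetical round numbers purely for illustration; in practice you would compute them from the actual Moberg and Smith et al. series.

```python
# Re-express anomalies relative to a different reference period:
# new_anomaly = T - new_ref = old_anomaly - (new_ref - old_ref).
# The reference-period means used below are made up, for illustration only.
def rebaseline(anomalies, old_ref_mean, new_ref_mean):
    """Shift anomalies from one baseline period's mean to another's."""
    shift = new_ref_mean - old_ref_mean
    return [a - shift for a in anomalies]

# Pretend the 1901-2000 mean were 13.8 C and the warmer 1961-1990 mean 14.0 C;
# then a +0.5 C anomaly on the first baseline becomes +0.3 C on the second:
shifted = rebaseline([0.5], old_ref_mean=13.8, new_ref_mean=14.0)
print(round(shifted[0], 2))  # 0.3
```

The point is just that switching to a warmer reference period shifts every anomaly by the same constant, which is why the unadjusted overlap in the figure is only slightly off.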

Finally, just for completeness, I define the Maunder Minimum using the definition on the Wikipedia page, which is 1645 to 1715. This is in line with the TSI data. For the more recent climate, I chose to simply shift 300 years into the future. So the plots cover 1945 to 2015, but the data sets stop at whatever their last times are. The Maunder Minimum times and data are shown in blue, while the 20th/21st Century data are shown in red and pink with the time labeled along the top of each panel. 

To try to head off any misinterpretation of this plot, let me simply state what it shows. During the Maunder Minimum, there was reduced solar activity and the average northern hemisphere temperature was about 0.6 C cooler than the 1961-1990 average value. During the period covering 1945 to the 1960s/1970s, the northern hemisphere temperatures were relatively flat and near their 20th Century average. Over those years, the sun was active, showing "normal" sunspot and TSI 11-year cycles. From some time in the 1970s until present, shown in the pink circles, the northern hemisphere average temperature shows a clear upward trend. All the monthly average values after 1996 are warmer than the 20th Century average. During this rapid and clear warming, the solar activity was typical, and quite similar to the 1945-1975 period, showing clear quasi-periodic activity. There appears to be little connection between variation in solar activity and variation in northern hemisphere average temperature during the late 20th Century. 

REFERENCES

Kopp & Lean (Geophysical Research Letters, 38, L01706, doi:10.1029/2010GL045777, 2011)

Lean, J., 2004: Solar Irradiance Reconstruction. IGBP PAGES/World Data Center for Paleoclimatology. Data Contribution Series # 2004-035. NOAA/NGDC Paleoclimatology Program, Boulder CO, USA.

Lean, J. 2000: Evolution of the Sun's Spectral Irradiance Since the Maunder Minimum. Geophysical Research Letters, Vol. 27, No. 16, pp. 2425-2428, Aug. 15, 2000. 

Lean, J., J. Beer, and R. Bradley. 1995: Reconstruction of Solar Irradiance Since 1610: Implications for Climate Change. Geophysical Research Letters, v.22, No. 23, pp 3195-3198, December 1, 1995.

Moberg, A., D.M. Sonechkin, K. Holmgren, N.M. Datsenko and W. Karlén. 2005: Highly variable Northern Hemisphere temperatures reconstructed from low-and high-resolution proxy data. Nature, Vol. 433, No. 7026, pp. 613-617, 10 February 2005. 

Wang, Lean, and Sheeley (The Astrophysical Journal, 625:522-538, 2005 May 20)

DATA SOURCES
TSI: http://lasp.colorado.edu/sorce/data/tsi_data.htm
TSI: http://www1.ncdc.noaa.gov/pub/data/paleo/climate_forcing/solar_variability/lean2000_irradiance.txt
NH T Reconstruction: http://www.ncdc.noaa.gov/paleo/pubs/moberg2005/moberg2005.html
NH T Observations: http://www.ncdc.noaa.gov/cmb-faq/anomalies.php

2012-01-29

The WSJ op-ed page again

The Wall Street Journal has published an op-ed by 16 "scientists" claiming that global warming isn't a big deal, and probably stopped many years ago. They do call for funding satellite and ground-based observations of the climate system, which is generous of them. I read the piece, which is full of half-truths and less. It can be found under the title "No Need to Panic about Global Warming" on (or around) 27 January 2012.

Notably, the WSJ rejected a similar letter by 255 members of the National Academy of Sciences. The main difference between the two submissions was, oh, everything. The NAS members call out the poor coverage of global warming and its impacts in the media. Peter Gleick covered this in an article published on Forbes [LINK].

I noticed at least two things about the list of 16 "scientists" on the WSJ piece. First, most of them have little if any professional experience studying the climate system. Second, there is a preponderance of old, white men on the list. I'm not the only one who noticed, Ben Nolan also did, and he's working through the list to see who these people are, including how old and how white they might be [LINK]. It comes as no surprise that many on the list are connected financially to the fossil fuel industry and/or to conservative think tanks (that are likely funded by the fossil fuel industry). I hope Mr. Nolan follows through and completes the list, as I think the rest are just as oily as those he's tracked down so far.

2011-12-16

Ralph Hall, chair of the House of Representatives science committee

I have not read such incoherent drivel from such a highly placed official since George W. Bush was in the White House. Please read this story and accompanying interview with Ralph Hall; it is a shocking view into the world of American politics. It is a world where people say things like "I don't have any proof of that. But I don't believe 'em. I still want to listen to 'em and believe what I believe I ought to believe." [LINK]

2011-10-06

Sherwood's dredging up history

Steve Sherwood has an outstanding piece in Physics Today that compares climate "denialism" to historical precedent. In particular, he compares today's climate deniers to those who denied general relativity and heliocentrism in the past. The three stories are interwoven beautifully, definitely worth the read. Here's the link: LINK

2011-10-04

The American ‘allergy’ to global warming: Why?

"Climate change has already provoked debate in a U.S. presidential campaign barely begun. An Associated Press journalist draws on decades of climate reporting to offer a retrospective and analysis on global warming and the undying urge to deny." -- Editor's Note regarding this piece by Charles J. Hanley, The American ‘allergy’ to global warming: Why? [link].

Nice, concise overview of the state of climate science and "denialism" in the United States.

2011-09-16

Start pre-gaming, only two years to the next IPCC report

The last IPCC report on the physical science of climate change, called the Fourth Assessment Report (AR4), was published in 2007. Since that time, plans for the next assessment (AR5) have been underway.

I think I posted before about the announcement of the chapter outlines and the lead authors [link]. The groundwork has been laid for some time, with the lineup of authors finalized earlier this year [pdf]. A lot of the effort falls upon Thomas Stocker [link], the Co-Chair of "Working Group 1," the job previously held by Susan Solomon [link] for AR4. Not only that, but there have already been two meetings of the lead authors, the most recent being in July in Brest, France.

If they are able to keep on schedule, then the AR5 will be published in two years: in September 2013.

The first order draft will be done and reviewed in the next few months. As you know, the assessment is basically a gigantic review paper of climate science. As such, it relies on published results for the basis of the review. The last day to have papers submitted to journals and be considered for inclusion in the AR5 is 31 July 2012, so if you want to get something into the next IPCC report, you have about 10 months to get it done and submitted. It also needs to be accepted by 15 March 2013, giving you plenty of time to deal with those pesky reviewers.


2011-09-06

Remote Sensing shakeup

The editor-in-chief of the open-access journal Remote Sensing has resigned. An editorial in the journal explains the situation. The short version is that this journal is the one that let that Spencer & Braswell paper slip through peer review; that publication has fatal flaws throughout its assumptions and analysis. The paper was quickly and brutally criticized in the climate-blogosphere, but exaggerated and praised in the right-wing, and much of the mainstream, media. The journal was criticized for letting such a poor paper get published. The editor-in-chief, Wolfgang Wagner, has now had a chance to review the criticisms and the paper and the review process. He has come to the conclusion that the paper should not have been published. The reason seems to come down to poor selection of reviewers by the editor that was in charge of that paper. The reviews came back with little criticism, to which the authors responded, and that basically tied the editor's hands and the paper had to be accepted.

From my point of view, the failing was the editorial board's misunderstanding of the subject matter and clear mishandling of the review process. From the tone and content of the paper, it was clearly a contrarian point of view and had the flavor of ideological bias. I suspect that the reviewers were selected from the list of suggested reviewers supplied by the authors. In cases of controversial content, there needs to be at least one critical reviewer selected from the community involved, and I'm pretty sure none of the reviewers in this case were in the mainstream of the climate change community.

I'm glad that Wolfgang Wagner has stepped down. I don't know that he did anything wrong except let the other editors make some bad decisions, but this move does help to show that the journal doesn't want to build a reputation for publishing garbage, contrarian papers. Hopefully this will be a small blemish on the journal's reputation which won't mar the whole thing.  It will be interesting to see if the next editor-in-chief conducts a review of the evidence and officially retracts the paper (it is unclear whether this journal has provisions for that, but they should consider it).

The original paper can be found from the journal's web page. The editorial is worth a read, and can be found in the current issue [LINK]. On a related note, a new GRL paper by Andy Dessler destroys the Spencer & Braswell argument in under 4 pages [LINK].

UPDATE: Lots of coverage of this story on the climate blogs. One worth seeing is Mooney's post on DeSmogBlog, which rehashes some similar scandals involving climate, intelligent design, and autism. [LINK]

2011-08-31

CLOUD experiments

The first results from the CLOUD experiments have been getting a lot of media attention. The focus of the attention is the Nature paper that was published this week: [news][paper].

The goal of this project is to determine whether cosmic rays have a significant impact on clouds.

Let's boil this down a little. This project is a laboratory experiment at CERN. It is a cloud chamber, basically an isolated volume of air that is precisely controlled for temperature and pressure. They put very pure air into the chamber, add a little background water, and some gases like ozone, sulfuric acid, and ammonia. The chamber is heavily instrumented to look for nucleation, which just means that they try to keep track of particle formation that occurs as the vapors interact and possibly start condensing. They can do this in neutral conditions (like a classical cloud chamber), or they can shine a pion beam into the chamber. That beam is adjusted to mimic cosmic ray bombardment. The goal is to see if cosmic rays produce ions that enhance the formation of particles, which could then go on to become the seeds for cloud droplets.

The answer seems to be that shining that beam into the chamber does produce more particles. This actually isn't a surprise, as far as I can tell. One important point is that the nucleation rates, that is, the rates of particle formation, measured in the chamber are smaller than rates observed in the atmosphere unless the temperature is quite low. This means that it is unlikely that ionization by cosmic rays near the Earth's surface is a major source of particle formation. Certainly there is some particle formation, but it is likely to be a small source of the total number of particles. This result may change when they start adding organic molecules to the chamber, but that is future work.

There is better coverage on RealClimate: link.

There is hubbub about this result because there is a crackpot theory that galactic cosmic rays are a major control of climate because of their impact on cloud formation. There are major flaws with this theory. My own take is that cosmic rays probably do produce some of the particles in the atmosphere that go on to become cloud condensation nuclei, but there are many paths to becoming cloud condensation nuclei, and there are lots and lots of these particles around. In fact, I seriously doubt that cloud formation is frequently limited by the availability of these aerosol particles.

I've been thinking about this in terms of observed cloud properties. The number of cloud droplets is connected to the number of aerosol particles available: over land, where there are lots more aerosol particles, there tend to be more, smaller droplets in clouds, while over the remote ocean the clouds are made of fewer, larger droplets. In very polluted conditions, we can observe changes in cloud properties that follow that same trend. I think the downfall of the cosmic ray theory of cloud formation comes from the fact that out in the middle of the ocean there are still tons of aerosol particles. While many of those particles may come from vapors nucleated under cosmic ray influence, there is no evidence of a shortage of other aerosol sources, so if the intensity of cosmic ray bombardment were to change, it seems likely that other sources of aerosol would fill whatever tiny void that change would make.

Besides this basic criticism (which amounts to the originators of the theory simply having a bit of a myopic view of cloud formation), there is also a clear lack of evidence for cosmic ray intensity modulating cloud/climate. The RealClimate piece covers that. Finally, there is the link to climate change, for which there is absolutely no evidence.

So my summary would be something like: This research presents experimental results that suggest that ionization by cosmic ray-like effects can impact nucleation rates in conditions similar to the Earth's atmosphere. The role of such nucleation enhancement in the Earth's atmosphere remains unclear, especially given that the impact seems most pronounced in conditions that are outside the atmospheric boundary layer. This is a nice contribution to basic aerosol research, which should help to constrain models of aerosol formation. The impacts on cloud formation and the Earth's climate cannot be assessed with the data collected so far.


The authors are only slightly overselling their results, which is typical for authors of Nature papers. The lead author's comments can be heard in the embedded YouTube clip. The media coverage, and especially the climate change denier blogosphere, is lighting up like this experiment proves something controversial. It does not.

2011-08-08

Reactions to the Spencer paper

It's a very bad paper. That is the short story. I haven't thought through all my criticisms of it yet (maybe I'll write something here eventually). In the meantime, I like this overview piece at Climate Central by Michael D. Lemonick [LINK]. I especially enjoyed the comment that, "... it's not that NASA data are blowing a hole in anything. It's that Spencer's interpretation of NASA data are blowing... something, somewhere."

For those of you looking to actually read the paper, it is in a journal called Remote Sensing, and it is open access. You can find it by looking up doi:10.3390/rs3081603. Let me reiterate that this is a bad paper, with many incorrect statements, assumptions, and lines of reasoning. It isn't worth your time reading this paper when you could better spend it reading an informative one about climate sensitivity... oh, I don't know, maybe doi:10.1175/2008JCLI1995.1

2011-07-26

A peer-review recipe

Eos is on a hot streak, at least compared to the usual Eos status quo. There's another useful article in the 12 July issue. It is an overview, or maybe a primer, about peer reviewing [LINK]. It is written by K.A. Nicholas and W. Gordon. I'm not sure what they do; Nicholas is at Lund U. in Sweden, and Gordon is at UT Austin. The article is a bit prescriptive for my taste, but I think it will be helpful reading for graduate students and others who are doing their first reviews. It's a good reminder for the rest of us about how we should think about some aspects of doing reviews.

One difference between my reviews and the ones outlined by the article is in how to treat fatally flawed manuscripts and majorly flawed ones. The article suggests identifying these errors and being done with the review. In my experience, there is a good chance that the manuscript will still get "accepted subject to major revisions," so this may be the only chance to really convey to the authors what they need to improve. (Otherwise, there's the risk of multiple rounds of reviews, which no one wants.) To that end, I always provide a full review, even if I am recommending rejection of the paper. I always hope that the editor will appreciate the effort, and also that the authors will get more constructive criticism to help improve a new version of the manuscript, or at least have a comprehensive set of comments to address in a revised version that will come back for another review.

On a side note, this is also helpful, at least in principle, in the "dialog" between the climate science community and the climate change denying community. There is a lot of mistrust of the peer-review process among "skeptics" (note they are not skeptical, really). The article in Eos is a simple explanation of how peer-reviews are written, and I think almost everyone will agree that the description is roughly what reviews look like. While sometimes reviews are more or less helpful, the thought behind them is basically the one described by the article. They are meant as constructive criticism, pointing out strengths and weaknesses of work, judging the contribution to the field, and recommending whether publication is reasonable or not. This is how I write reviews, and it is similar to reviews I receive.

In climate science, reviews are usually anonymous, though there's always the option of signing a review so that the authors know who you are. The decision about how the submitted manuscript proceeds comes from the editor, not directly from the reviewers, so even if a paper receives an unfair review, the editor can take corrective action. When that doesn't happen, the authors can appeal to the editor, pointing out where the review has gone off track. The usual course of action would be for the editor to find another reviewer, who can then corroborate the criticism or back up the author.

There really isn't anything secretive or mysterious about the process. There are obvious measures to avoid conflicts of interest, and procedures for interaction among the participants. So while there are interesting arguments about how to improve peer review (blinded reviews, e.g.), I think the current system is quite transparent and attempts to be fair to all involved. I hope that critics of climate science will read this article (and David Schultz's book, LINK) before they attack the peer review process in general.

2011-07-19

News Corp behind Climategate?

Was Rupert Murdoch's right-wing-leaning News Corp behind the illegal hacking of the University of East Anglia's Climatic Research Unit? There's no direct evidence at this time, but Joe Romm has put it out there in today's Climate Progress [LINK]. While Romm's connections seem tenuous, there is no doubt that News Corp's news outlets made great use of the stolen emails, and no one would be surprised if the suspicion were to be found true. Either way, I think Romm's real point is that there have been so many investigations of the scientists tangled up in "climategate," but the actual crime remains unsolved. Who broke into UEA's computer system, selectively stole and then distributed emails from CRU? The motive is almost certainly political/ideological, and whoever did it had some competence with "hacking," and some savvy about how to distribute the stolen information to the media... It would certainly bring closure to the whole affair if the perpetrators were brought to justice.

2011-06-24

Comparing volcanic CO2 emissions to human-made emissions

There's a terrific article by T. Gerlach in last week's issue of AGU's EOS newspaper [PDF]. Gerlach is apparently a volcanologist, not a climate scientist. The article compares estimates of CO2 emission by volcanic sources to human-made emissions. There is a common misconception (I'm not sure how common) about the relative contributions of volcanoes and humans to the emission of CO2.

So, which do you think is the larger source of CO2, global volcanoes or human-made sources (including land-use change, cement production, transport, energy, etc.)?


The answer is that humans produce (as of 2010) about 135 times more CO2 than all the volcanoes in the world. The numbers are something like humans emitting about 35 billion tons per year versus volcanoes emitting 260 million tons per year. There doesn't appear to be any tricky business going on, and all kinds of volcanoes (including underwater volcanoes) are included. In fact, Gerlach goes on to make lots of interesting comparisons: one good one is that human activities produce the same amount of CO2 as the entire annual global volcanic source about every 2.7 days.

Gerlach also accounts for the explosive volcanic eruptions like Mount St. Helens and Pinatubo, which are called paroxysms. These can suddenly emit a lot of CO2, but they are rare. While these paroxysms are happening, their emission rate might be as large as all of humanity's. The catch is that they only last a few hours, and wind up contributing very little to the global CO2 emissions.

Really nailing the coffin shut, Gerlach also considers what it would mean for volcanism if the volcanic emission rate did match the human-made CO2 emissions of about 35 gigatons per year. It sounds like it would mean that we would need one (or more) supereruptions per year. Supereruptions are huge volcanic eruptions, like when Yellowstone blew up 2 million years ago, and they are rare even for geologists, happening only every 100,000-200,000 years or so.
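The arithmetic behind these comparisons is simple enough to sanity-check yourself. Here's a minimal sketch using the figures quoted above (about 35 Gt/yr anthropogenic, 0.26 Gt/yr volcanic); the ~0.05 Gt figure for the CO2 released by the 1991 Pinatubo paroxysm is an assumed value taken from Gerlach's article, not from this post:

```python
# Back-of-the-envelope check of the volcanic vs. anthropogenic CO2 comparisons.
# Figures: ~35 Gt/yr anthropogenic CO2 (2010), ~0.26 Gt/yr global volcanic CO2
# (Gerlach's preferred estimate), and ~0.05 Gt CO2 for the 1991 Pinatubo
# paroxysm (assumed value from Gerlach's article).

HOURS_PER_YEAR = 365 * 24

human = 35.0      # Gt CO2 per year, anthropogenic
volcanic = 0.26   # Gt CO2 per year, all volcanoes
pinatubo = 0.05   # Gt CO2, total for the eruption

# Ratio of anthropogenic to volcanic emissions:
ratio = human / volcanic
print(f"Anthropogenic / volcanic ratio: {ratio:.0f}x")  # ~135x

# Time for humanity to emit the annual global volcanic output:
hours_volcanic = volcanic / human * HOURS_PER_YEAR
print(f"Annual volcanic output matched every {hours_volcanic / 24:.1f} days")  # ~2.7 days

# Time for humanity to emit a Pinatubo-sized pulse of CO2:
hours_pinatubo = pinatubo / human * HOURS_PER_YEAR
print(f"A Pinatubo's worth of CO2 emitted every {hours_pinatubo:.1f} hours")  # ~12.5 hours
```

The numbers hang together: the ~135x ratio falls straight out of 35/0.26, and even a paroxysm's CO2 pulse amounts to only about half a day of anthropogenic emissions.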

The only weakness of the article is that Gerlach doesn't address the obvious climate change denial argument that humans are not emitting as much CO2 as claimed. There are hints of how to deal with this, however, as Gerlach mentions that the total volcanic emission is equal to that of about two dozen large coal-fired power plants. The volcanic number is so small that we could discount huge amounts of the human-made emissions and still have more human-made emissions than volcanic emissions by more than an order of magnitude. So I think this article is a great reference for coming to terms with this comparison, and the vast difference between volcanic and human-made emissions leaves no real room for dispute.

Also see another Gerlach article at EARTH Magazine [LINK].