2009-01-23

Cloud computing and computing clouds

More and more I'm frustrated with the cyber-infrastructure of climate science. It seems to be on the verge of crisis, in a Kuhnian sense. Everyone has individual solutions for how to do large computations, manage very large data sets, and collaborate between institutions. For example, due to limited resources, I just had to move some simulation output from a remote server to a local, external hard drive. One simulation (not a big one) generated some 50GB of output that I don't really want to throw away. Retrieving this data took hours, and sending it from the remote site to my desk took several more. It's crazy, inefficient, and isolating.

There needs to be a better way. We need to harness the power inherent in "cloud computing" and the latest technology for using simple, intuitive web interfaces for accessing remote data (e.g., MobileMe, Google applications, etc.) and apply them to scientific computation, data storage, and data analysis.

We have seen small steps in these directions from projects like SETI@home and climateprediction.net, among others. I have also just read an article from Nature [LINK] saying that Amazon (see update below) and Google have both started down these roads, as has the NSF with something called DataNet. However, as the article notes, there are serious challenges, not just in terms of technology but also dealing with access, cost, and fairness. These can be touchy issues, especially in fields where the rate of work can vary greatly among different research groups.

I'll also just complain that even besides dealing with sharing and storing data, the ever-growing size of data sets in the Earth sciences, and particularly in the climate sciences, demands new tools for analyzing and visualizing the data. I've seen some projects that seek to deal with the emerging issues, but the progress of these new tools seems to be lagging significantly behind the growing data sets. As a concrete example, take the analysis of output from the NICAM, a global cloud-system-resolving model. This is a model that has points every 7km over the entire surface of the earth. A good many variables are defined on vertical levels, say about 50 of them. It is conceivable that you'd be interested in examining global fields every hour for several years. On a typical desktop, loading a single 3-dimensional field for ONE hour would require all (or more) of the available memory, making even simple operations on the field pretty slow and serious number crunching basically impossible. This isn't going to be a special case for long, either. A new generation of cutting-edge models will have similar resolution, and as they start producing actual simulations (i.e., ones from which scientific results are desired), analysis tools need to be available to do the job. Right now, I don't have any such tools. Those that do exist need to be made available and usable, and soon.
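To put numbers on that, here is a rough back-of-envelope sketch in Python (the grid count, 50 levels, and single precision are my assumptions for illustration, not the actual NICAM output layout):

# Rough size of one global 3-D field at ~7 km horizontal spacing.
earth_surface_m2 = 5.1e14        # Earth's surface area
grid_spacing_m = 7.0e3           # ~7 km between points
columns = earth_surface_m2 / grid_spacing_m**2   # ~1e7 columns
levels = 50
bytes_per_value = 4              # single-precision floats

field_bytes = columns * levels * bytes_per_value
print("one 3-D field: %.1f GB" % (field_bytes / 2**30))             # ~2 GB
print("hourly for a year: %.1f TB" % (field_bytes * 8760 / 2**40))  # ~17 TB

So a single snapshot of a single variable is already about 2 GB, which is most of the memory on a typical desktop, and a year of hourly output for that one variable runs to tens of terabytes.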

UPDATE:
I have been looking into these vague notions a bit more. Amazon has a side company called Amazon Web Services that sells cloud computing (computation, storage, database queries, etc.). The service seems to leverage the fact that Amazon has a ton of computational power and storage just sitting around, so they sell their downtime to companies that need more cyberinfrastructure than they can afford. It's a pay-as-you-go system, and you only pay for the compute power/time and storage that you actually use. It seems very interesting. Of course, the problem is adapting this kind of system to the scientific community. It would be nice, for example, if the same kind of system were available from an NSF computing center, and you could access data interactively using a web browser, or submit large simulations from a web browser that then run in the cloud with results going to the online storage facility. Of course, the other problem is that "science" doesn't have a giant existing distributed computing environment with plenty of downtime, and there's not a lot of incentive to set one up (i.e., the NSF isn't that altruistic). These are just thoughts to chew on.
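Just to make the storage half of this concrete, here is a minimal sketch of pushing one piece of model output to Amazon's S3 storage with the boto library (the bucket and file names are hypothetical, and this obviously isn't tied to any NSF service; it just shows how little code the "results going to online storage" step takes):

import boto

# Credentials are read from the environment or the boto config file.
conn = boto.connect_s3()
bucket = conn.create_bucket('my-simulation-output')   # hypothetical bucket name

# Upload one output file from a (hypothetical) model run.
key = bucket.new_key('run01/history_2009-01-23.nc')
key.set_contents_from_filename('history_2009-01-23.nc')
print("uploaded " + key.name)

You pay per gigabyte-month of storage and per gigabyte transferred, which is the pay-as-you-go part.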

2009-01-19

Climate blog smackdown

So for those who follow the climate blogs (by which I mean the blogs in which climate change deniers' arguments are routinely dismantled with high school math and physics), Tamino has just posted one of the best smackdowns of a denier analysis I've seen in a while. Tamino writes the "Open Mind" blog, and the entry is from 19 January [LINK]. Basically, a denier says that the increase in atmospheric CO2 is not anthropogenic but natural, and presents a time series analysis of the Mauna Loa CO2 data using carbon isotopes. It probably sounds good if you aren't being critical about it, but in about three lines of algebra, Tamino proves that the analysis is exactly wrong, and that the "result" is dictated by the terrible way the analysis was done. Smack. Fail. Totally worth reading.

2008-10-12

A geoengineering teaser

So, I just read the "News and Views" piece in Nature Geoscience (Vol. 1, p. 644, 2008; doi:10.1038/ngeo326) called "Climate change: Cool spray" by Heike Langenberg. I can't spend the time to really get into it right now, but I certainly will in the short term. The article simply reports some ideas presented by John Latham in two papers in the Philosophical Transactions of the Royal Society (Phil. Trans. R. Soc. A, doi:10.1098/rsta.2008.0137, 2008, and Phil. Trans. R. Soc. A, doi:10.1098/rsta.2008.0136, 2008). According to Langenberg, Latham proposes injecting sea salt into the marine boundary layer as a low-cost geoengineering fix for anthropogenic global warming. This takes advantage of the Twomey effect, whereby additional condensation sites lead to smaller cloud droplets, which reflect a higher fraction of incident light (a cooling effect).

Okay, well, I haven't read the papers yet, but I will. And since this is one of those areas where I know something, I should be able to address some of the issues that this plan raises. As has been pointed out regarding other geoengineering ideas, this one is fundamentally a shortwave effect, while the global warming is a longwave one. That means that the crux of the plan relies on reducing the total energy in the climate system, probably by using thin ribbons of clouds over subtropical oceans. Meanwhile, outside of those regions, the same sunshine comes in, the same CO2 is sitting in the atmosphere, and presumably there's still enhanced water vapor. The effect of those clouds is to change the surface temperature beneath them, creating gradients in surface air temperature and sea surface temperature. This perturbs the low-level circulation. The low-level circulation moves some energy around, but the majority of the north-south energy transport in the climate system is accomplished by storms grabbing energy from the low latitudes and moving it poleward. So depending on the placement and extent of this cloud shield, the direct effects on the low-level wind field and the indirect effects on the storm tracks will significantly alter the naive expectation that reflecting more light back to space will simply offset human-induced warming.

But more on the details in a future post.

2008-09-26

Great animation about climate change

Wake Up, Freak Out - then Get a Grip from Leo Murray.


Wake Up, Freak Out - then Get a Grip from Leo Murray on Vimeo.

I saw this posted on Grist, and thought it was pretty cool.

I'm not a fan of the "tipping point" argument, actually. It's not that I don't believe there are tipping points; it's that I don't know how to quantify the risk they pose. We don't know where the tipping points are, but we do know about the mechanisms of adverse amplifying feedbacks in the climate system that could introduce them. I'm interested in those physical processes, and in trying to quantitatively understand their impact on the climate system.

2008-08-26

Turbulent times at the science-public interface

I read a fair number of climate-related blogs. One of these days I'll try to get some kind of "blog roll" going on the side of the page. In the meantime, there's a small box over there that has items I'd like to share, not always science related, but things I found interesting. Anyway, I sometimes get frustrated with these blogs because it seems like most of their effort goes to addressing/attacking/debunking/explaining the climate change skeptics/sceptics/deniers/inactivists. This seems less and less useful to me, for a couple of reasons. First, there just aren't that many deniers out there who can make waves in the media with anything close to a reasoned argument. Second, just going point by point through their "arguments" to show why they are wrong strikes me as a weak form of rhetoric. Actually, this is a major problem with the Democratic Party, too, as they seem to just respond to the outlandish attacks of the Republicans.

I think this topic has been addressed in excruciating detail in the climate blogosphere recently... I should have links to Mooney and others here, but I'm just riffing today. There were a lot of posts about whether or not to directly address the deniers, or whether it was counterproductive. Today I think it is largely counterproductive.

Not to say that these idiotic arguments shouldn't be ripped apart (they should), but we don't need 50 blogs all ripping apart one obscure denialist's claims, as that really just gives them much more "credibility." Take, for example, the person who, as far as I've seen, works only in the blogosphere and goes only by the name "lucia." She's making all kinds of noise, and because places as high profile as realclimate and deltoid spend time on her, I think she's attracted quite a following. This despite the fact that she is unable to present a reasoned argument that actually answers a question (a recent post by Grumbine demonstrated this).

This is just a bit of a rant, I guess. My point is that I'm now feeling more strongly that refutations of these arguments should simply be compiled in some climate science wiki, where everyone who feels compelled to weigh in on a denialist claim can add their part. The blogs themselves would be much more interesting if they would start explaining the science of climate change (or the impacts of climate change) without the obligatory strawman provided by denialist claims.

From the point of view of marketing climate science and the general findings of the field (e.g., the IPCC report), there need to be stronger statements of the things we know, worded without our normal scientific, passive, conservative language. I've recently seen more of this, even in the recent report on severe weather and climate change, but communication to the public needs to be better. Yes, it probably has to be dumbed down, meaning less nuance and fewer caveats, which as scientists we don't like. However, the public is bombarded with too much information, and the average American is undereducated in basic science, so we have to develop a language that communicates that we are certain humans are causing climate change, and that doing so carries severe risks. And this has to be done without being condescending. Again, this is exactly what the Democrats need to do. We need our own versions of those lists of talking points that the conservative think tanks cook up and distribute. It's sad, and I don't like it, but is there any other way to do it? The ideal way would be to ground all Americans firmly in rational thought and basic science, but that is long-term and probably unrealistic. *Sigh*

Haven't other fields gone through all this? What about AIDS research? Tobacco causing cancer? Evolution, obviously. Stem-cell research... Why can't we start talking amongst ourselves about dealing with the public and the press, without getting bogged down in the details of our own fields? Climate science needs to look to these examples, get the people who have experience, and ask for advice.

[UPDATE: A related post today from Grumbine, discussion versus debate.]

2008-08-13

Endangered Species Act to be undermined by new rules

This is outrageous. The Bush administration is trying to introduce some kind of executive rule that would allow federal agencies to conveniently ignore the impact of projects on endangered species. Not only that, but the new rule would apparently ban federal agencies from assessing the greenhouse gas emissions from projects. You need to go read the story; there's one at Yahoo!. There are some additional reactions at gristmill.

I suppose that some might believe this argument that the current system is bloated and that federal agencies really can do their own assessments. However, the idea of independent review is so fundamental to science, and is so successful, that to just throw it aside for an intrinsically flawed self-evaluation system seems ludicrous to me.

We'll see how this plays out. Whether my paranoid, leftist brain is overpowering my rational, pragmatic one will come clear in the coming weeks and months.

2008-08-08

Newsflash? No. (again)

Well, in place of posting some hasty response to a story or new paper, I thought I'd point out that there's way more climate-related press these days than even a year ago. It's too much to ingest, much less respond to. The press is saturated with climate, global warming, deniers/skeptics/inactivists, and all manner of good and bad reporting. Nothing new for some fields, though I think the climate science community is still learning to deal with so much attention. Maybe that is a good topic for a post in the near future... why do climate scientists STILL get abused by both the mainstream press and the alternative press (on both sides of the issue)?

Anyway, here are a couple of recent stories.

First, the US government might believe in human-caused global climate change. LINK Didn't this happen years ago? Geez.

Apparently the army isn't waiting to see what the higher-ups say, because they're going green. LINK This is actually a pretty big story, as the army is a gigantic consumer of energy.

And a little good news: there are a lot more gorillas in the world than previously believed. LINK

A late addition, I just saw this news story about Plattner et al. who find that the "pipeline" or the "committed climate change" might be much longer than had been thought. LINK

2008-06-19

Amplification of Cretaceous Warmth by Biological Cloud Feedbacks -- Kump and Pollard 320 (5873): 195 -- Science

A correction: I have now read the Kump & Pollard paper [Amplification of Cretaceous Warmth by Biological Cloud Feedbacks -- Kump and Pollard 320 (5873): 195 -- Science], which I discussed a little yesterday.

I looked at it again briefly, and the main things to add are (1) their experiment is what I thought: they artificially change the effective droplet radius. This is fine, as it is artificially prescribed anyway. And (2) they provide a reference for the assertion that DMS is the primary non-anthropogenic source of CCN. That reference is actually a perspective in Science by Andreae (DOI: 10.1126/science.1136529). I've just read that piece, which is interesting but, I feel, deeply flawed. Rather than pick it apart, let me just say that my primary concern is that it does not give a broad overview of the current measurements, but cherry-picks a few studies that may or may not be designed well enough to get at the points being made. In the end, the firm conclusion is that pre-human aerosol distributions, specifically of CCN-sized aerosol, are very poorly understood, and that no good way to estimate the concentrations has been devised yet. Good points, and that is all I have to say about that article.

More on CCN later though, as this is a good area to explore!

2008-06-18

biology in climate models

Just read a quick blurb in Nature [link] that kind of rankled me. The writer, David Beerling, worries about the fidelity of biological processes in climate models. Fair enough. He cites interesting work by Kump & Pollard suggesting that a deficiency in climate modeling of the Cretaceous period might be (partly) due to lower numbers of cloud condensation nuclei: heat-stressed phytoplankton produce less dimethylsulphide, leaving optically thinner, less reflective clouds.

In the last paragraph, Beerling writes that the results are unsatisfying because "the effects of heat on biological aerosol emissions need to be better described in their model for it to generate really solid conclusions."

I hardly know where to start. I think the comment is fair to an extent, but perhaps misguided.

Starting from the Kump & Pollard paper, which I admit I haven't read yet, I am not convinced that there's much evidence for this biological effect. If they've artificially changed the aerosol concentration or the CCN concentration, then it's almost a foregone conclusion that there'll be big effects in the simulation. That's one of those parameters which, in most large-scale models, is not well constrained and is set to help make a reasonable current climate. That probably means that it could be adjusted to make a "reasonable" past climate, too, but knowing what that means is a different story. The second issue is whether DMS is really such an important source of CCN. I know it is a source, but does stressing phytoplankton really have so much influence on the mean cloud field?

For that matter, this effect would mostly influence regions of low, stratiform cloud. Other regions are probably not that influenced by modest changes in CCN concentration, as the low clouds there are mostly convective anyway (and I'm guessing sea salt would be their main source of CCN -- I could be wrong). So the result depends on the way these clouds are parameterized as well as on the assumptions about the biological processes influencing CCN. Sounds shaky. I'll look at that paper and post an update. If I can find anything, I'll also post something about the sources of CCN, and whether DMS is really that important.

As for adding biology to climate models, I'm very hesitant about the issue. There are potentially important climatic feedbacks. And if we could construct models for the biological interactions, that would help with long-term climate simulations, especially future climate change and paleoclimate simulations. It'd be great. We're at the very nascent stages, though. Current-generation climate models incorporate some kind of land model that has simple biology, and usually isn't interactive (meaning the plants don't respond to changing climatic conditions). The ocean usually has no biological model, but there are some marine ecosystem models that exist and are being tested for the next generation of climate models. The complexity ranges dramatically from very simple to quite complex, but there is still much debate about the results from models incorporating these ecosystem components.

This is a slippery slope, though. Adding ecosystem models seems like a great idea, but if they are true models with prognostic equations, it usually means more expensive simulations (more computer time, more storage, and more human time to analyze the output). And where do we draw the line? Phytoplankton respond not just to temperature and salinity, which are the state variables in ocean models, but also to light availability (which varies with depth) and to micro- and macronutrients like nitrogen and phosphorus. Should we include nutrient models, or prescribe the nutrient distributions? Well, if we try to prescribe them, then there will be biologists who (rightfully) say that the models are missing the feedback between changes in nutrients and biology, which propagates back to the rest of the climate model. If we do somehow include nutrient models (perhaps by making the biological production a simple diagnostic from available nutrients), then geologists and geographers and social scientists will argue that the natural and anthropogenic sources and sinks of nutrients are not properly represented.

For example, factories and power plants emit a lot of sulphur, but there's a lot of variability among factories and power plants, as well as seasonal and daily cycles in their emissions. Do we need to simulate these cycles to properly represent the emissions to the atmosphere? If so, that would mean that for climate change simulations we'd have to model the changing energy needs of populations, since that will impact the amount of emissions from the power plants. Meanwhile, current climate models do not properly represent processes like wave breaking at the ocean surface, which is probably one of the main sources of CCN over the ocean; when will we get to add this? For that matter, different kinds of aerosol have totally different properties as CCN, and what happens chemically to these things in cloud droplets can have an influence on future cloud evolution, so should our microphysical models incorporate the chemistry of individual chemical species within cloud droplets? How do we do that?

My point is not that we should not add more complexity to climate models, only that we don't know how to do it yet. Even if we stick to the ocean, ice, atmosphere, and land surface, we know we're missing huge chunks of well-understood physics. To venture into the chemical and biological world, much less the anthropological, seems hasty at this point. I think we'll do it; I mean, it is being done, but we have to remember that climate models are tools for understanding the natural world. They won't provide solid answers to questions without much thought; climate models cannot be run as black boxes with the results taken as truth, no matter how much we add to them.

2008-05-06

Impact of warming on insects and other ectotherms

There has been quite a lot of press coverage of a new paper in PNAS (doi:10.1073/pnas.0709472105) by Deutsch et al. about how global warming might affect land-based invertebrates. The paper is very short and easy to understand, so I recommend it. It's actually a simple idea, based on empirically derived "fitness curves" for different organisms. As I read it, the story comes down to the fact that tropical temperatures don't vary much over the course of the annual cycle, while temperatures at higher latitudes do. This has affected the organisms that live in these different climate regimes; tropical organisms have come to "expect" a small temperature variance, and don't do well when the temperature changes more than normal. Organisms from places with distinct seasons are more amenable to temperature swings. This has been derived empirically as these fitness curves, which are broad for extra-tropical organisms and narrow for tropical ones. An interesting characteristic is that the maximum fitness comes at a temperature optimum, followed by a precipitous decline. So Deutsch and co-authors followed up on this, defining a "warming tolerance" and a "thermal safety margin," which measure how close an organism lives to its maximum temperature tolerance and to its optimal temperature, respectively. Then they apply a warming scenario to see what happens, and it turns out that tropical insects (the data they used) get pushed really hard compared with midlatitude insects. This is somewhat surprising, since the tropics don't warm as much as higher latitudes, but because the tropical organisms already live so close to their maximum temperature, they are heavily stressed by the more moderate warming. They extrapolate to a global scale of impact on insects, and then to three other groups (lizards, frogs, and turtles). This part of the paper does not seem surprising after the initial analysis, though the extension to lizards, frogs, and turtles helps deliver the message.
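To make those two quantities concrete, here is a minimal sketch of the bookkeeping (the definitions follow the description above; the temperatures are invented for illustration and are not taken from the paper):

# Warming tolerance and thermal safety margin, as described above:
#   warming tolerance     = CTmax - T_habitat  (distance to the maximum tolerated temperature)
#   thermal safety margin = Topt  - T_habitat  (distance to the fitness optimum)

def warming_tolerance(ct_max, t_hab):
    return ct_max - t_hab

def thermal_safety_margin(t_opt, t_hab):
    return t_opt - t_hab

species = {
    #                   CTmax, Topt, mean habitat T (degrees C, hypothetical values)
    "tropical insect":  (36.0, 29.0, 27.0),
    "temperate insect": (38.0, 24.0, 12.0),
}

for name, (ct_max, t_opt, t_hab) in species.items():
    print(name,
          "WT = %.1f C" % warming_tolerance(ct_max, t_hab),
          "TSM = %.1f C" % thermal_safety_margin(t_opt, t_hab))

Even though the tropics warm less, the tropical species' small margins mean that the same few degrees of warming push it past its optimum (and toward its maximum) first.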

We often hear about the impact of climate change on biology, ecosystems, etc., but we don't often see such a concise and simple, yet far-reaching and quantitative analysis of "impact." This paper stands out to me because of these traits, and gives an excellent example to use when discussing the importance of climate change beyond temperature, precipitation, etc.

2008-04-22

Another example of what should have been done already

Today I read an article posted on Science Daily (Aerodynamic Truck Trailer Cuts Fuel And Emissions By Up To 15 Percent) that made me kind of upset. The story is about a new, more aerodynamic sideskirt design for truck trailers. The design reduces drag and increases fuel efficiency by about 10%. That is great, right, so why am I so mad about it? Well, because this very small improvement could have been made years ago with little effort, but corporate inertia has kept this kind of innovation from being properly implemented. EVEN MORE EGREGIOUS, though, is that there have been radically different designs for trucks for decades that could improve fuel efficiency by 25% WITHOUT CHANGING THE ENGINE. The design that I'm aware of is by Luigi Colani, whom I've only become aware of recently by watching "Future Car" on the Discovery Channel. In the 1970s, Colani came up with a radically more aerodynamic truck design, which apparently sat on his shelf unimplemented for a couple of decades. In 2001, he introduced a new design, this one 50% more efficient than conventional trucks, but still no one is building it or even stealing some of his ideas. Why? It doesn't make sense.

While I was trying to remember Colani's name, I found two examples of a "more efficient truck." The first was from just a couple of months ago: Navistar has introduced a new model called LoneStar, which is supposed to be 5-15% more fuel efficient than traditional trucks. The second was from 1995, when the US DOT gave an award to Kenworth for their T600A, which had been produced since 1985. The award was described by Barry Langridge, Kenworth's general manager: "It literally changed the face of the trucking industry forever by creating a new generation of fuel efficient trucks which have saved billions of gallons of fuel. The 70,000 T600s built since '85, when compared to non-aerodynamic conventional models, will save an estimated 1.25 billion gallons during their useful lives." As far as I can tell, these trucks are still getting 6-8 miles per gallon.

I also read a similar post at the "Our Future" blog, which sent me to the Colani site and echoes my lack of enthusiasm for current truck design.

2008-04-21

recent deaths

I just wanted to note a rash of recent deaths of prominent scientists.

Arthur C. Clarke, writer and futurist, died at 90 years old on 19 March 2008. NYTimes Wikipedia

John A. Wheeler, physicist, died at 96 years old on 13 April 2008. NYTimes Wikipedia

Edward N. Lorenz, meteorologist and "discoverer" of chaos, died at 90 years old on 16 April 2008. NYTimes Wikipedia


A few other notable deaths:
David Gale, UC Berkeley mathematician, died at 87 on 7 March 2008.
Frederick Seitz, physicist, died at 96 on 2 March 2008.
Astronaut G. David Low died at 52 on 15 March 2008.

Two very different actors also recently died: Charlton Heston and Paul Scofield.

2008-04-15

Science Debate 2008 - May in Oregon!?!

Obama has backed out of the debate scheduled for next week, and McCain and Clinton were non-committal. So organizers are trying to schedule the Science Debate for May in Oregon. [LINK].

Please visit the official Science Debate 2008 web site and find a way to support this important cause. Basic research funding, ethical stands on scientific/technological issues, and policy decisions that should be informed by scientific findings all need to be discussed in an open and fair forum, and the candidates must be expected to be knowledgeable, thoughtful, and articulate about how they will deal with science and technology in the next 4-8 years.

UPDATE: If you happen to be reading this and have any doubt that we need a president who cares deeply about science and technology, go read ScienceNOW's April Fool's Day joke: Bush to Science: "Let's be friends". This is an official website under the American Association for the Advancement of Science! Can you imagine this kind of attitude from the mainstream scientific organizations with ANY OTHER ADMINISTRATION?

Brian Greene brings science to the masses?

I just saw a blurb in Science about the World Science Festival, which is apparently an attempt by Brian Greene (The Elegant Universe) and others to popularize science on a large scale. It seems like a good idea, and there look to be some really fun events. However, I do wonder about the effectiveness of making science a cultural event in the middle of New York City, where there is an abundance of both cultural events and people interested in science. It would be nice to see such an effort in a less "sophisticated" place, even if it is a big city like Houston, Denver, or the Twin Cities: places that could reach large audiences that don't have such easy access to cultural and scientifically interesting events. Just a thought.

2008-03-22

Leveraging the Amazon Kindle, some ideas

This is off-topic, but a couple of simple ideas related to the Amazon Kindle just struck me. I'm sure you know, but just in case, go look at the Kindle; it's an interesting device that uses "E-Ink" to make its display look a lot like an actual printed page. It connects wirelessly to a cell phone network to download books from Amazon, or I think it can be connected and synced with a library on a computer, or something like that. The important thing is that it's a small, lightweight device that can (apparently) successfully mimic the experience of reading a page from a physical book, magazine, or newspaper. People have responded pretty well to it, and today I noticed that Amazon has had supply issues because of the large demand. Great, but I'm not trying to advertise this device, which I've never even held in my hands; I'm trying to make the case that such a device exists and that it isn't horrible. Many people complain about reading text from monitors of any kind (TVs, computers, iPods/iPhones, etc.), and that is why they always print things and carry around books and papers and such. The early success of the Kindle shows there's a chance these people could be satisfied with a similar device, at least to some extent. This leads me to some obvious observations about the potential for such devices.

Go to a high school, or better yet, just go across the street from a high school when classes let out and students start pouring out of the school. Yes, you will probably be put on some kind of list with the local law enforcement, but do it anyway. You will see that kids are loaded with gigantic backpacks, hanging improperly from their shoulders. This was an issue when I was in school, and I've seen it recently enough that I know it still is. It isn't just teenage garbage that fills those packs; there are a bunch of heavy textbooks. In my day, I often had to leave some books at school or at home because I couldn't fit all my books in my bag every day; there are even injuries due to heavy bags, which is just stupid. Many times, students take home textbooks just for a few pages, either of reading or homework assignments. Wouldn't it be convenient if they carried one "book" with them that had all their course material at the ready? A Kindle-like device would be perfect for this.

An added bonus is that if publishers work with the content provider (e.g., Amazon), then new editions of books, or supplements, or any number of extras could be provided to under-funded schools and students at minimal cost. Once the burden of putting ink to paper (incurring costs related to ink, paper, layout of presses, binding, packing, shipping, etc.) is removed, the cost of materials drops dramatically. At higher education levels, this is even better: small, esoteric books are really expensive, and authors don't make much money from them anyway, so this could take the cost of such books to the floor, and college students, grad students, and professional researchers could actually afford to buy the books that are most useful to them (do you hear the frustration?). If you don't know what I'm writing about, go look at technical books in just about any field; for example, Cloud Dynamics by Houze, a graduate/researcher-level book that is really useful to a relatively small number of people, costs $80+ on Amazon. If we could just download it to our digital library, the cost SHOULD be dramatically reduced while still preserving the publisher's profit and the royalties that Houze gets (which I bet aren't much).

It would also be possible to self-publish under this model, so instructors could upload their class notes to some service which would make the download available to students. This might hurt copy stores, who make a lot of money by selling over-priced readers to students, but would be great for instructors and students.

Similarly, scholarly journals could use this distribution model very easily. They already have online subscription models, by which most people now go to a web site, find the article of interest and download the pdf of the article, which is then read on-screen or printed. This would be a natural extension of that model, and would reduce printing costs and wasted paper.

What needs to be done to make a Kindle-like device actually work for these educational and scholarly purposes? Very little. One potential gotcha is that the current E-Ink technology used in the Kindle does not allow color. This presents serious limitations compared to having a regular pdf or a printed book. This is especially true for high school and lower-level textbooks, which regularly rely on "creative" layout and color to "draw the student in." There is electronic paper with color already developed, however, so this might not be a deal-breaker (example).

The other technology-related speed bump is speed. One of the criticisms of the Kindle, and E-Ink in general, is that it is slow to render pages. I imagine this would be worse with graphics-heavy, color-intensive pages, so quick flipping back and forth would be a problem. And as we know, when doing homework or research, there are often periods of intense page flipping, searching for some specific passage or re-reading something that didn't register the first time. There are myriad potential solutions, from having a little screen space for "saving" passages or efficient built-in searching, to more dramatic changes in E-Ink.

Other problems include the design of the device, which would probably require different models catering to grade school, high school, college, and professionals. No problem. Then there's the potential problem of getting publishers to embrace this model. They're apparently on board with the Kindle (Amazon offered 80,000+ titles when it was released), but it might be more difficult for small publishing houses to deal with such a dramatic change in distribution. I don't know enough about publishing to really have a good guess, but we know the music and movie industries have basically freaked out and tried to avoid moving into a digital distribution framework. And then there is the cost of these devices, and how to widely distribute them to schools already strapped for cash. You can imagine the nightmare scenarios.

So in the end, I don't think any of this is new, and I'm sure it has all been discussed much more elsewhere, but I'm suddenly enamored with the idea of carrying a single "book" wherever I go, having all the books I need, and being able to buy technical books for a fraction of the current cost. Maybe this is an idea that a company like Amazon could actually pursue, too, since they probably have enough sway to get a school district to do a test program by providing Kindles and the textbooks needed (except that it doesn't address the color problem yet). It would also help to raise a generation of people who don't "need" to have paper in front of them to read.

UPDATES:

Here are some ideas to improve the Kindle, though I think there's a conflict between some of these suggestions and the E-Ink interface. That E-Ink technology is not as versatile as a lot of bloggers think it is.

Ouch! Here's Scoble on the Kindle, and he's pretty unhappy about the lack of design. Watch the video, I think these are good points that need to be addressed in this commercial form of the product, but also for some future educational applications.

Another overview of the Kindle, with some of the same criticisms. Down in the comments, someone suggests Kindle Textbooks as a good option for this technology, so at least someone else has thought of this.

Here's a post that thinks Kindle has a chance at textbooks.

There's a thread on the Amazon site about college texts on Kindle. There seems to be support for this, but the publishers aren't yet on board. I really don't think it'll work unless you can buy a textbook for, say, $25, as opposed to the paper versions for $50-150. The idea of automatic updates to new editions (maybe with some sanity limits) is so sweet for the consumer that it'd sell Kindles alone. The publishers lose out, though, except I bet they could increase their profit margins to make up the difference.

2008-02-18

I found the included video from a lecture by Naomi Oreskes on Deltoid. You'll remember Oreskes from her Science article a while back in which she showed that there is strong scientific consensus that global warming is human-induced. In this lecture, she presents a very brief history of the science of global warming, doing an excellent job of going back to the very roots, and making the important point that scientists have predicted global warming for at least 50 years. A related point is that as time has marched on, the predictions have gotten more detailed, and they have held up so far. I especially think back to the 1988 Hansen paper, which showed projections of climate change from numerical simulations that have now turned out to be a conservative estimate of the warming.

In the second half of her lecture, Oreskes discusses the "denial of global warming." This goes back to that now familiar, but surprisingly recent, poll showing that most Americans still think there is scientific debate about whether global warming is human-induced (versus a "natural cycle" or such). Oreskes asks why this is, when scientists, as she has just shown, really reached consensus about global warming in the 70s/80s and about the cause of the warming in the mid to late 1980s. She traces the origins to the Marshall Institute, and to a tactic she calls the "tobacco strategy." She traces the history of the Marshall Institute to its roots as a PR campaign to defend Reagan's Star Wars program: ultimately a conservative, anti-communist group. She follows the progression, and discusses Fred Singer and others, who through the past two decades have argued against scientific findings essentially to stop government regulation (and thus "creeping communism"). It's a very interesting presentation, clear and objective, and I think it shows very well how the "tobacco strategy" has effectively misguided the American public through deliberate manipulation of mass media outlets.

2008-02-07

Environmental Research Letters - Best of 2007

An e-mail I received today:


Environmental Research Letters (ERL) has just released the Best of 2007, a mixture of Perspectives and Letters that best represent the high quality and breadth of the contributions that were published last year in ERL, as chosen by the Editorial Board, guest editors and publishing team.

This special collection includes contributions to invited focus issues on Environmental Health and Justice, Northern Hemisphere High Latitude Climate Change, Tropical Deforestation, and Global Impacts of Particulate Matter Air Pollution, as well as an editorial from ERL's Editor-in-Chief, Professor Daniel M Kammen.

To read the ERL Best of 2007, visit http://herald.iop.org/ERL_Bestof2007/m261/crk//link/1319 where you can access the online table of contents or download the full pdf version of this very special collection.

2008-01-22

Sex bias in peer review

I think this is an important issue, not just in terms of sex bias, but that the whole peer-review process could probably be improved.

This study (http://dx.doi.org/10.1016/j.tree.2007.07.008) by Amber Budden at U. Toronto suggests that female authors are more successful under a double-blind peer review process than under the more conventional single-blind review.

Just to be clear, double-blind means that the author doesn't know who the reviewers are and the reviewers do not see the names or affiliations of the authors. Single-blind means the author submits the paper and the journal/editor finds reviewers (the author doesn't know who they are), but the reviewers see the authors' names. Almost every scientific journal uses this single-blind approach, for no good reason.

From where I sit, there seems to be no good reason to maintain this single-blind review process. Not only does it possibly discriminate against women, but I think there are likely many more negative effects. The most obvious one is that "prestigious" scientists, those who might have published a lot or have contributed seminal work in a field, seem more likely to get through the process less critically. This is in part because they are good scientists, of course, but it can also be because there really are not enough reviewers to go around, and much more junior scientists (sometimes grad students) end up reviewing papers. It is intimidating as an inexperienced scientist to be critical of work by someone you know/respect/fear/want-to-work-with/etc. Also, many sub-disciplines are populated by a fairly small number of experts, who end up being asked to review each other's papers all the time. This can go either way: people are likely to be extra critical of rivals and less critical of friends.

Recently, there has been some open discussion of the review process (e.g., DOI: 10.1126/science.319.5859.32b and DOI: 10.1126/science.319.5859.32c), which is good. However, I haven't noticed any large-scale call for double-blind review. This is amazing, since double-blind studies are a foundation of modern science. I honestly can't think of a single reason that every journal should not immediately switch to double-blind reviews.

2007-11-26

Watch a YouTube video

This guy on YouTube has spelled out a very nice approach to how "skeptics" should look at the possibility of a changing climate.

2007-10-10

blog action day

If time permits, which is a big if right now, I will try to participate in blog action day.

Bloggers Unite - Blog Action Day

2007-10-04

Shipping Lanes

I've been sitting on the idea for this post for almost a week, but haven't had a chance to work it up. Since it doesn't look like I'm going to get to do it the way I originally wanted, I'm giving in and just going for the gusto. Maybe (yeah right) I'll come back and round out the rough edges later, but for now I want to get the basic ideas out there.

Ship tracks are the contrails of the sea. Perhaps more accurately, ship tracks are to the marine atmospheric boundary layer what contrails are to the upper troposphere. They are lines of what we will call clouds that form behind a ship. They are the focus of a recent article that I found very interesting. A news summary can be found on the Science (LINK) web site, while the paper appears in GRL. For a good picture of ship tracks, NASA's MODIS is a good resource.

The idea in the paper is to establish the radiative forcing associated with ship tracks on the global scale. This hasn't been done before using observations because ship tracks are very low, very small clouds that cover a tiny fraction of Earth's surface area. However, they are common, as the paper points out, in several regions, notably off the coast of Africa and in the North Pacific. These are, somewhat coincidentally (but not really), the same regions where we think about extensive stratocumulus decks.

Schreier et al. use one year of satellite imagery, from the ENVISAT-AATSR, and go through a straightforward but intensive process of identifying ship tracks and then estimating their radiative forcing. The bottom line is that in some regions the radiative impact of ship tracks, let's call it the local radiative effect, can be a non-trivial -0.05 W/m2, but on the global scale the effect is minuscule at -0.4 to -0.6 mW/m2 (plus or minus 40%). Note that the global value is in milliwatts, so it is 100 times smaller than the largest regional radiative effect (-0.05 W/m2 = -50 mW/m2). The negative sign arises because ship tracks are very low clouds that are very white (i.e., reflective), so when they appear they provide a more reflective surface for sunlight to bounce off, which to first order reduces the amount of energy in the climate system (because most of the reflected light goes back out to space) and cools the climate. This is familiar if you've been exposed to cloud "feedback" ideas, in which more low cloud cover increases the albedo of Earth and cools it. In fact, this is a terrific example of that effect, but we'll come to that shortly. It is also good to note that the radiative forcing associated with a doubling of atmospheric carbon dioxide is about 4 W/m2, which is itself a small signal in the total radiative budget (with 1365 W/m2 of incoming sunlight at the top of the atmosphere, distributed over a day (divide by 4) and an albedo of about 0.3, you're talking in the neighborhood of 240 W/m2 of sunlight being absorbed by the climate system, and all of global warming comes down to 4 W/m2, give or take!).
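If you want to check that parenthetical arithmetic yourself, here is the back-of-envelope version (standard textbook numbers, nothing specific to Schreier et al.):

solar_constant = 1365.0     # W/m2 arriving at the top of the atmosphere
planetary_albedo = 0.30     # fraction reflected straight back to space

# Spread over the whole sphere (day and night, all latitudes): divide by 4.
incoming_global_mean = solar_constant / 4.0                  # ~341 W/m2
absorbed = incoming_global_mean * (1.0 - planetary_albedo)   # ~240 W/m2

co2_doubling_forcing = 4.0     # W/m2, the usual rule-of-thumb value
ship_track_forcing = -0.5e-3   # W/m2, mid-range of the global estimate above

print("absorbed sunlight: %.0f W/m2" % absorbed)
print("2xCO2 forcing is about %.1f%% of that" % (100 * co2_doubling_forcing / absorbed))
print("ship tracks are about 1/%.0f of the 2xCO2 forcing" %
      (co2_doubling_forcing / abs(ship_track_forcing)))

The 2xCO2 forcing works out to less than 2% of the absorbed sunlight, and the global ship-track forcing is roughly one eight-thousandth of the 2xCO2 forcing.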

Okay, so before I sign off and leave you totally confused, I wanted to point out a couple of interesting things about ship tracks that aren't necessarily in the article. First of all, it is helpful to remember why ship tracks form. The ships are steaming ahead, burning fairly dirty fuel to get where they are going, and the exhaust goes right out into the atmosphere. This exhaust contains particulate matter as well as precursors for particles, so the ship is basically making a trail of particles behind it. These particles act as nucleation sites for water, forming small cloud droplets. Because the ships spew out so much stuff, there are enough nucleation sites available to grow lots of droplets and form these linear clouds. Why don't the clouds form anyway if there's that much water in the atmosphere already? Well, a couple of reasons. One is that the relative humidity isn't quite 100% in fair-weather conditions, but even if it were, water doesn't like to condense unless there are surfaces to condense onto (supplied by the particles). At a relative humidity of about 80%, there just aren't enough suitable particles floating around the clean maritime boundary layer to let the water condense into clouds. The ships provide the extra nucleation sites necessary, and make it even easier by supplying the boundary layer with hygroscopic particles, meaning the particles effectively decrease the saturation specific humidity (http://en.wikipedia.org/wiki/Hygroscopic). That just means that the particles are very efficient at turning water vapor into liquid water. So a ship goes by, spews out water-loving particles, water condenses on those particles forming droplets, and a big collection of droplets is a cloud. Fine, what else?

So okay, the ships go by and make lines of clouds, but we now know (or strongly suspect), based on Schreier et al., that the global effect of these clouds is negligible and the local effect is also pretty small. Can we be done with it then? Not quite. These clouds are a great example of the Twomey effect, which is an old idea now: for a fixed amount of cloud water, increasing the number of particles in the air makes the cloud droplets smaller, and clouds made up of smaller droplets are brighter (i.e., more reflective). Coakley et al. (1987) presented ship tracks as such an example, showing with satellite data that the reflectivity of ship tracks is higher than that of the surrounding low-level cloud cover. This is exactly what leads to the radiative forcing that has now been estimated by Schreier et al. The important thing to recognize here is that the Coakley et al. study is essentially a proof of concept, showing that pollution can impact atmospheric radiative transfer. They definitely did not say ships were impacting global climate.
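The heart of the Twomey argument fits in a few lines. The sketch below strings together three standard approximations (effective radius scaling as N to the -1/3 power at fixed liquid water path, optical depth from liquid water path and droplet size, and a crude two-stream albedo formula); all of the numbers are illustrative, not taken from Coakley et al. or Schreier et al.:

rho_water = 1000.0   # kg/m3
lwp = 0.05           # liquid water path in kg/m2, a thin marine stratocumulus (illustrative)

def cloud_albedo(n_per_cc, r_e_ref_um=12.0, n_ref=50.0):
    # Same cloud water shared among more droplets -> smaller effective radius.
    r_e_um = r_e_ref_um * (n_ref / n_per_cc) ** (1.0 / 3.0)
    # Optical depth of a cloud with this liquid water path and droplet size.
    tau = 3.0 * lwp / (2.0 * rho_water * r_e_um * 1.0e-6)
    # Rough two-stream albedo for a non-absorbing (visible-wavelength) cloud.
    albedo = tau / (tau + 7.7)
    return r_e_um, tau, albedo

for n in (50.0, 150.0, 450.0):   # clean marine air vs. increasingly polluted plumes (made up)
    r_e, tau, a = cloud_albedo(n)
    print("N = %3.0f /cc   r_e = %4.1f um   tau = %4.1f   albedo = %.2f" % (n, r_e, tau, a))

With these made-up numbers, tripling the droplet concentration brightens the cloud from an albedo of about 0.45 to about 0.54 without adding any water at all, which is the whole point of the Twomey effect.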

There is a related effect, sometimes called the Albrecht effect, which takes into account the change in cloud fraction associated with changes in particles in the atmosphere. It is presented by Albrecht (1989), and is also a pretty simple idea. When extra particles are put into the atmospheric boundary layer, they form droplets and brighter clouds, as discussed above. Smaller droplets can also change the formation of raindrops, or more precisely in the case of shallow maritime clouds, drizzle drops. The change is to reduce the precipitation efficiency, which increases the liquid water in the cloud layer and can lead to an increase in the fractional cloudiness. The important point here is that increased particle concentration in the marine atmospheric boundary layer could not only make brighter clouds, but could actually increase the overall cloudiness. This would amplify the effects discussed by Coakley et al. because there would now be a larger area covered by brighter clouds. The Albrecht study makes use of ship tracks only in the sense of the Coakley et al. study, and only suggests that changes in precipitation could account for the sustained difference between ship tracks and the stratiform cloud in which they are embedded. This is supported to some extent by aircraft observations.
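The precipitation-suppression half of that argument can also be illustrated with a bulk "autoconversion" formula of the sort used in cloud parameterizations; here I use the Khairoutdinov & Kogan (2000) fit, which is much more recent than Albrecht's paper and is only meant to show the direction and rough strength of the effect (the cloud water amount and droplet numbers are made up):

# Rate at which cloud water is converted to drizzle, from the
# Khairoutdinov & Kogan (2000) fit: dq_r/dt = 1350 * q_c**2.47 * N_c**(-1.79),
# with cloud water q_c in kg/kg and droplet number N_c in cm^-3.
def autoconversion(q_c, n_c):
    return 1350.0 * q_c**2.47 * n_c**(-1.79)   # kg/kg per second

q_c = 0.5e-3   # kg/kg of cloud water, an illustrative stratocumulus value

clean = autoconversion(q_c, 50.0)       # clean marine boundary layer
polluted = autoconversion(q_c, 150.0)   # inside a ship plume
print("clean:    %.2e kg/kg/s" % clean)
print("polluted: %.2e kg/kg/s" % polluted)
print("tripling the droplet number cuts drizzle production by a factor of %.0f" % (clean / polluted))

Less drizzle means the cloud holds onto its water longer, which is Albrecht's route from more particles to more (and longer-lived) cloud.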

And finally, since we're covering so many bases, there's another effect that should be mentioned. Pincus and Baker (1994) present a study that extends the Albrecht study in that it accounts for the change in the thickness of clouds in the presence of varying particle concentration. They use a model of a cloudy boundary layer and account for changes in absorption and precipitation with cloud thickness and droplet number, respectively. This effect is not quite as "obvious" as the other indirect effects, but the bottom line is that more droplets can make thicker clouds with a higher albedo, which is thus another facet of this negative feedback associated with changes in atmospheric aerosol (particles). They note, however, that you'd expect to see ship tracks extend higher than surrounding clouds, which at that time was not observed. I'm not sure where this effect really stands, but it is interesting to consider.

So these are the indirect effects of aerosol on climate. We came a long way in this post, from a recent study showing that the globally averaged radiative forcing due to ship tracks is small all the way through aerosol effects on cloud albedo, precipitation processes, and horizontal and vertical cloud distribution. Well done. There are a lot more details that could have been added, and tons more studies. These will be left for future posts, though. I've included some references below for those of you who want to follow up.

References

Schreier, M., H. Mannstein, V. Eyring, and H. Bovensmann (2007), Global ship track distribution and radiative forcing from 1 year of AATSR data, Geophys. Res. Lett., 34, L17814, doi:10.1029/2007GL030664. (LINK)

Coakley, J. A., Jr., R. L. Bernstein, and P. A. Durkee (1987), Effect of ship-stack effluents on cloud reflectivity, Science, 237(4818), 1020-1022, doi:10.1126/science.237.4818.1020.

Albrecht, B. A. (1989), Aerosols, cloud microphysics, and fractional cloudiness, Science, 245(4923), 1227-1230, doi:10.1126/science.245.4923.1227.

Pincus, R., and M. B. Baker (1994), Effect of precipitation on the albedo susceptibility of clouds in the marine boundary layer, Nature, 372, 250-252, doi:10.1038/372250a0.

2007-09-11

The Arctic and its role in the climate change discourse

I spend most of my time thinking about clouds in the tropics and subtropics, but lately there's been a lot of mainstream coverage of the Arctic and how it relates to climate change. I've posted about Arctic issues before, of course, but today I not only want to highlight a little of the coverage that I've noticed lately, but also warn you, gentle reader, that this is really just going to be one of myriad posts, articles, stories, and sundry coverage of the Arctic over the coming 2-3 years (and probably beyond). Why? Because of the "International Polar Year," which is a big enough deal to have its own domain: ipy.org. It is, as the name implies, an international effort to better understand the Earth system near the poles. From their web site:

IPY, organized through the International Council for Science (ICSU) and the World Meteorological Organization (WMO), is actually the fourth polar year, following those in 1882-3, 1932-3, and 1957-8. In order to have full and equal coverage of both the Arctic and the Antarctic, IPY 2007-8 covers two full annual cycles from March 2007 to March 2009 and will involve over 200 projects, with thousands of scientists from over 60 nations examining a wide range of physical, biological and social research topics. It is also an unprecedented opportunity to demonstrate, follow, and get involved with, cutting edge science in real-time.

Don't fool yourself either; this is not a group of environmental activists who are out trying to prove something; this is a concentrated period of study of the Arctic and Antarctic by the people who do that work anyway. It should lead to some great collaborations and syntheses of datasets that couldn't previously be compared or combined in meaningful ways.

So that is the future; what is going on now?

Well, just over the past few days I've read a few interesting tidbits about the Arctic, which people seem to enjoy discussing more than the Antarctic (but more on that later). One of the poster children for climate change awareness is the polar bear, which relies on big pieces of sea-ice floating around near other pieces of sea-ice. The bears hang out on the ice, get hungry, dive in after fish or seals, and come up onto more ice. Apparently they aren't so well adapted to feeding on land, plus there isn't as much food available on land for them. Anyway, a quick article from the BBC, which came to me via ClimateArk, reports on a study that suggests two-thirds (2/3) of the polar bear population will be gone by the middle of the century [LINK]. That's 30-50 years from now, if you're keeping score at home. Why are the bears going to disappear? Because the ice is going away. So what does that mean for a species that relies on ice rafts as hunting platforms? It means that the bears are going to starve and drown. That is a fact. There is already evidence that some populations of polar bears are losing weight, and it probably isn't in preparation for beach season (Regehr et al. 2006; also Roberson 2005 (news), Obbard et al. 2006).

This leads directly into topic number two: sea-ice. This is one of the reasons it's more interesting to talk about the Arctic than the Antarctic, actually. Think about the globe and picture the poles; the south pole is covered by a landmass (Antarctica) which is actually pretty large, extending far from the pole before giving way to the Southern Ocean. The fact that it is land, combined with the fact that it is surrounded by a continuous ring of ocean, makes climate change near that pole more difficult to understand: the ice in the middle of Antarctica isn't melting. And the sea-ice there is much more seasonal (for the most part, though don't forget the Larsen B ice shelf!) than in the Arctic (we're painting with a broad brush here). The Arctic is just an ocean, really, one that would provide easy passage among North America, Europe, and Asia (remember the Northwest Passage [news]), except that it has historically been blocked up by sea-ice. Lately this isn't so true [news, Randy Boswell].

The opening of the Northwest Passage is due to summertime melting of sea-ice, as discussed in Randy Boswell's very nice piece above. There has always been a lot of seasonal sea-ice around the Arctic. During the winter there is little to no sunshine available to deliver energy to warm the surface or melt ice, so as temperatures drop, ice forms, and it stays there until summer when the sun comes out. So that happens every year, and is perfectly normal and expected. However, what has happened over the past few years is a tremendous summertime melting, and just about every year now we hear about sea-ice extent and sea-ice area reaching record lows. One of the problems with this is that there is a potential feedback: the "permanent" sea-ice (the ice that does not melt during the summer) is being reduced each year, so during the winter the ice that grows is thin, leading to quick melting in the summer, which exposes more permanent sea-ice to warm water and sunshine, leading to more loss and a diminished base amount of ice going into the winter. This most recent report suggests that the speed of this cycle might have been underestimated, and now some experts (yes, they are experts in Arctic sea-ice) say that an ice-free Arctic (in the late summer) could exist by 2030 (Serreze et al. 2007; Stroeve et al. 2007), which is right around the corner. This bodes ill for the polar bears.

Finally on this subject, it is interesting to compare the startling observed sea-ice melt of the last few years with our best comprehensive climate models (Stroeve et al. 2007; Overland & Wang 2007). Some of the current-generation models do sort of okay, and basically all of them show a strong downward trend in the Arctic, but none of them captures the magnitude of the observed trend. Let me reiterate that these models don't know anything about the observations; they are physical models of the climate system forced by atmospheric composition (carbon dioxide) and sunshine, so this isn't a matter of poor data assimilation or statistical techniques or a poor model (in the sense of statistical modeling). This is a dramatic underestimation of the impact of climate change on a region of the world known to be prone to positive feedbacks. What this means is that our "uncertainty" about the future of climate change goes in both directions. Climate change deniers like to point out problems with the models that they think lead to unlikely warming, but here we have a beautiful example of the models underestimating what is actually happening. Perhaps the models are too conservative? Not really, I just wanted to be provocative for a moment. My interpretation is that we need to improve the physics in the models, and probably spend more effort getting atmosphere-ocean-ice interactions right than this round of climate models did. That is a rant I'll save for later though, as this post is stretching the average blog reader's patience.


Some references:

Regehr, E. V., S. C. Amstrup, and I. Stirling, 2006: Polar bear population status in the southern Beaufort Sea. U.S. Geological Survey Open-File Report 2006-1337, 20 pp. [PDF]

Obbard, M. E., M. R. L. Cattet, T. Moody, L. R. Walton, D. Potter, J. Inglis, and C. Chenier, 2006: Temporal trends in the body condition of southern Hudson Bay polar bears. Climate Change Research Information Note, Issue 3, Ontario Ministry of Natural Resources, 8 pp. [PDF, see also MNR SIT]

Serreze, M. C., M. M. Holland, and J. Stroeve, 2007: Perspectives on the Arctic's shrinking sea-ice cover. Science, 315(5818), 1533-1536, doi:10.1126/science.1139426. [pdf]

Stroeve, J., M. M. Holland, W. Meier, T. Scambos, and M. Serreze, 2007: Arctic sea ice decline: Faster than forecast. Geophys. Res. Lett., 34, L09501, doi:10.1029/2007GL029703.

Overland, J. E., and M. Wang, 2007: Future regional Arctic sea ice declines. Geophys. Res. Lett., 34, L17705, doi:10.1029/2007GL030808. [pdf]

2007-09-04

hurricanes again

It has been far too long since my last post... the casual blogger's constant lament. In my own defense, a lot has happened in the past few months, not the least of which is that I finished my PhD program and moved to Fort Collins, Colorado as a postdoc. I am now affiliated with both UCLA and CSU via the CMMAP project. Of course, anything I say on this blog has nothing to do with those institutions, and could still be wrong even though I am now officially an "expert."

Now to what I was going to write....


After a rather slow start, the Atlantic hurricane season is really getting going now. Early this morning Hurricane Felix came ashore along the Mosquito Coast in Central America as a powerful category 5 hurricane. This is the second category 5 storm to make landfall in the past 3 weeks (following Dean), and apparently it is the first time two category 5 storms have made landfall in the same season. It is also worth noting that only about 31 category 5 storms have been recorded in the Atlantic since 1928. Of course, reliable observations were not available until the 1960s; there have been 18 category 5 storms since 1966. Eight of those have occurred over the past five years (2003-2007)!

The big storms are not the only story though. There is a lot of tropical activity already, including three tropical storms (Barry, Chantal, and Erin) and numerous disturbances that haven't developed. There is currently an area off the Florida coast that is probably going to develop into a tropical storm over the next few days (although there is significant wind shear). There's also a region out in the central Atlantic that could still develop, basically following the same path as Dean and Felix. These all originate as "easterly waves" coming off the west coast of Africa, and it is starting to look like it's going to be a very active season; don't be surprised to see another category 5 by the end of the month. If that does happen, it will only be the second year with more than two category 5 storms in the Atlantic.

By the way, yes there are tropical cyclones in the Pacific too! Henriette is the third East Pacific hurricane of the year (Cosme, Flossie), and although it is only a category 1 right now (possibly 2 by landfall), it is bringing substantial rain to the west coast of Mexico. In the western Pacific there have already been 7 typhoons this season and 2 tropical storms.

2007-04-30

ye olde iron fertilizing effect

So apparently people now think they can make money by throwing iron into the ocean... YES, throwing iron into the ocean.

Here's the (actually very good) story on NYTimes.com: [The Energy Challenge: Recruiting Plankton to Fight Global Warming]

The basic idea is that plankton reproduce like mad when the conditions are right, and in large swaths of the ocean the conditions are right. Except there isn't enough iron. So, when you dump some iron on those areas, plankton bloom, creating regions of increased biological activity. The upside to this, according to some, is that the plankton use carbon from the ocean to make their little calcium carbonate exoskeletons, which, when the critters die, can sink to the bottom of the ocean. This means that carbon is removed from the atmosphere-ocean system... it is sequestered, like an OJ juror. So now at least two companies, so cleverly named Planktos and Climos, think they can get governments (or companies working under cap and trade systems) to pay them to go throw some iron into the ocean.

I do not reject this idea outright. There are clearly some good ideas here, but we have to be careful. Here are a few of my primary concerns.

First, I'm worried that these plankton species will produce a lot of methane waste, possibly negating any decrease in atmospheric CO2 that they might be responsible for. There is similar concern with nitrous oxide, apparently.

Second, the amount of carbon actually deposited might be less than has been thought recently. This is actually in this week's Science [LINK].

Third, as these operations scale up, will they account for their own carbon emissions? Boats are notoriously bad for emissions, and there's going to have to be a lot of boating involved. Also, where is this iron coming from, and how much energy (i.e., carbon) is going into collecting and transporting it?

Finally, there are possible feedbacks that could negate any good this will do. One is the old DMS-cloud condensation nuclei story, in which more biological activity produces more aerosol (via dimethylsulfide, DMS), which provides nucleation sites for cloud droplets, making more cloud. The effect could be to shade the surface, reduce SST, and thus reduce biological productivity, leaving a rusty sea surface instead of a nice, healthy green one. I don't know if this is feasible, but things like this always seem to come up.

Also consider the amount of carbon dioxide that needs to be removed from the atmosphere. I've just come from a talk that reminded me of this. Carbon dioxide, when frozen, has roughly the same density as water (a bit denser, actually). That means that a ton of CO2 is about one cubic meter (the size of a coffee table). You're now talking about removing billions of tons of carbon dioxide annually, which is an enormous mass, many cubic kilometers of frozen CO2. The ocean is a big place, but we've got to be careful about how and where such deposits are made, or they'd just be mixed back up to the atmosphere. There are just so many potential pitfalls that it is hard to imagine a successful implementation. But as I said at the beginning, I'm willing to keep an open mind on the subject, and would be happy to see a successful strategy.
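
Just to show the arithmetic with round numbers (the 30-billion-ton figure is an assumed round value, roughly today's global emissions, not something from the talk):

# Back-of-the-envelope volume of frozen CO2, using round numbers.
annual_removal_t = 30e9        # tonnes of CO2 per year, assumed round figure
density_t_per_m3 = 1.0         # tonnes per cubic meter, the water-like approximation used above

volume_m3 = annual_removal_t / density_t_per_m3
volume_km3 = volume_m3 / 1e9   # 1 km^3 = 1e9 m^3
print(f"{volume_km3:.0f} cubic kilometers of frozen CO2 per year")
# Roughly 30 km^3 every year: a mountain of dry ice that has to go somewhere it will stay.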

2007-04-27

US Army getting into supercomputers

Here's a quick story that seems like it is important. I will refrain from any interpretation or speculation here.

Army funds supercomputing center [LINK]

2007-02-04

IPCC AR4 - Summary for Policymakers

So by now you know, I hope, that the Intergovernmental Panel on Climate Change has issued its Fourth Assessment Report of Working Group I (i.e., the "physical scientists"). Okay, actually, it has only released an abstraction of the AR4 called the Summary for Policymakers. The full report will be available in a couple of months. For now, get the summary from the UCAR site [PDF].

There are a lot of issues about this report and the IPCC in general that I'm tempted to start spouting about. Instead, I'm going to let those thoughts roll around a little more, and perhaps wait for the full report. For today, I just want to point out some key things about the summary, along with some cautionary words.

The report is written by climate scientists (so is the summary). All the writers, both lead authors and contributing authors, work on a voluntary basis. I think the idea, at least for the North American and European scientists, is that the IPCC is an important way for scientists to interface with policymakers and the general public, and that working on the report is an important outreach activity. The IPCC as an entity is organized under the auspices of the World Meteorological Organization (WMO) and the United Nations Environment Programme (UNEP), and as such it is open to members of those organizations. That means that a lot of scientists are eligible to contribute to the report, but it also means that there are a lot of governments that have vested interests in what the report ultimately says.

Do governments influence the content of the report? Well, from what I've heard ('round the proverbial water cooler), there is very little direct interaction with world governments. Drafts of the reports are sent out to tons and tons of people, including governments, and notes are sent back. Those notes have to be addressed individually, but I'm sure that most of them are insignificant and are basically ignored. The indirect influence is probably more important. The scientists writing the report are aware of the political/societal implications, and try to protect themselves by explicitly avoiding prescriptive suggestions; they don't say what to do about climate change, they just assess the scientific evidence to evaluate the extent of climate change and projections of future change. The indirect influence of governments and economics makes the authors, in my opinion, even more conservative in their language than scientists normally are. I think the report generally does not embrace the more extreme projections and predictions, which is to its credit, but that is a caveat when reading the report or the summary. That is, some of the key points are probably more conservative than individual scientists would suggest.

Of course, there is also pressure to put new and important results into the report. This pressure comes less from external sources and more from the drive to show how much we've learned since the last report. There are a few points in the summary that I was surprised to see, not because they aren't important, but because I think there have to be many, many caveats. The two examples that stick out are (1) that tropical cyclones are getting more intense with global warming and (2) that patterns of precipitation will change in the future, with specific patterns emerging as robust. I'll blog more about both of these in the future, but here I'll just say that I look forward to reading the specifics in the full report. I generally believe the first claim, while the second one seems extremely poorly constrained by climate models.

Even with these opposing pressures, the results of the summary are largely unsurprising and in line with the Third Assessment Report (2001). The lower range of climate sensitivity has inched up a bit from 1.5C to 2.0C. That basically has had to move up as we've seen more and more warming over the past decade. They are also making the upper range of possible sensitivity more hazy by mentioning some projections of greater than 5C or so, even though they don't necessarily incorporate those very sensitive projections in the non-analysis that goes into writing the report. (I'll also talk about what I mean by non-analysis in a future post.)

So, yes, global warming is happening. Oh, and yes, it is because of humans emitting carbon dioxide into the atmosphere.

2007-01-09

Newsflash? No.

Well, according to NOAA's National Climatic Data Center, 2006 was even warmer than 2005. That means, if you haven't been keeping score, that 2006 was the warmest year on record (for the USA). I didn't find an actual ranking for the global mean temperature, but I think we can safely assume 2006 was in the top 5 (if not the top 2). The press release blames (rightly, for a change) ENSO (which is El Nino) for getting 2006 into the top seat. It turns out December was really hot, mostly because there weren't very many storms crossing the country. My suspicion is that this El Nino will continue to make this winter much warmer than average, and 2007 will beat 1998 and 2006 as the warmest year in the past 1000+ years.

LINK TO NCDC

2006-12-12

meltdown

I don't have time to really think too hard about this story, but it is making the rounds, so I'll at least acknowledge it. Some NCAR simulations now predict essentially no late-summer ice in the Arctic by 2040. See the story at BBC [LINK] or at Nature [LINK]. The actual paper is in Geophysical Research Letters [doi:10.1029/2006GL028024].

What does this mean? Well, I haven't had a chance to look closely at the paper, but I have some first impressions. I did get a sneak-peek of these results this summer, and at the same time was introduced to some of the details of the sea-ice model used in CCSM (NCAR's climate model), so maybe I'll be able to say something halfway meaningful. The paper itself does not predict an ice-free Arctic in 2040, so let's just get that out of the way. This paper is really about the possibility of abrupt decreases in sea-ice in a changing climate, and the current generation of climate models suggests a real possibility of large reductions in perennial ice coverage in the first half of the 21st century. The main focus is a set of CCSM simulations using one of the emissions scenarios from the IPCC. They also take a look at some of the results from other IPCC models. The CCSM always has what the authors call "abrupt reductions" in Arctic ice, and several of the other models also show large reductions.

I am willing to accept these results, but I think some skepticism still has to be exercised. First off, this is a GRL paper, which is a journal of short, usually preliminary, work focusing on "sexy" results. The peer-review process for GRL is sometimes thought to be a little lax, and sometimes the quality of the work is questionable. That does not seem to be an issue for this paper; the CCSM is a respected climate model, the authors are top-notch climate scientists, and this work is presented well. That said, this is not the last word on this project; I'm sure that the authors are doing more detailed work and are planning a longer, more careful analysis for another journal (e.g., Journal of Climate, Climate Dynamics). The best thing that analysis could do would be to better quantify what "abrupt changes" really are, and the physical processes that trigger them, which is a big open question in this paper. They say the abrupt changes are driven by thermodynamics, but don't really present evidence of this; I assume they mean that wind patterns/ocean currents are changing to just move ice out of the Arctic, but it is not explained. The other thing to keep in mind is that even in the current generation of climate models, the sea-ice models are fairly crude. I don't mean that in a bad way; the people working on these models are doing the best they can. Ice processes are quite complicated, and to properly model sea-ice, much like "properly" modeling clouds, the simulations need to be run at much higher resolution. That kind of resolution is too expensive right now, and even if the resources were there, it would be a tough sell to dedicate them to the sea-ice component rather than to better atmospheric and oceanic components. This particular climate model is known to be fairly sensitive, and when it gets knocked out of equilibrium, the sea-ice is one of the things known to respond fairly erratically. So while I think the CCSM, and several other high-end climate models, can get a lot of important changes correct, we still can't trust the details of these fully coupled simulations. My interpretation is then something like this: in the near future (50 years), it is likely that rapid reductions in perennial Arctic sea-ice will be observed, associated with (but not well-correlated with) increasing atmospheric greenhouse gases.

2006-12-07

Would I qualify?

A small story, ultimately of no consequence, suggesting the "need" for a Nobel Prize for the Environment: LINK.

Funny as it may sound, I don't think it would be a very good idea to add such a prize. I would like to see more earth scientists honored for their contributions to understanding physics, chemistry, and dynamical systems, but the Nobel prizes are so high profile, and climate change such a charged issue, that I think such a prize would be politicized immediately. That would sully the award, because there would always be questions about why people get the prize. Not only that, but it would be difficult to separate scientific achievement in understanding the environment from conservation of the environment, which is more social science or economics or politics or who knows what. If conservation groups tended to be awarded the prize, then earth scientists would be even less likely to be honored, since they wouldn't get the environment prize and they'd usually be excluded from the physics or chemistry prizes.

There is also the issue of the maturity of the fields of meteorology, oceanography, climate dynamics, environmental sciences, and such. While those of us in the field could come up with a list of deserving people, it would be difficult after only a few years to say with confidence that a person (or group) has made a lasting positive contribution. This comes up in the other science prizes when people complain that awardees get the award decades after the work, but the defense is that it takes that long to figure out what work needs to be honored. We don't really have very many decades of work to choose from (of course, excluding the early pioneers like Bjerknes, Charney, Richardson, Ekman, von Neumann, and many other dead folks).

2006-12-05

Down under, where the carbon is.

Today's little tidbit is a short news story about a Journal of Climate paper [LINK]. The paper is about a climate simulation that includes an ocean (and presumably a carbon cycle model). The bottom line is that they think they have a credible southern hemisphere atmospheric circulation, which then drives a realistic Antarctic circumpolar current. If you have never done so, go get a globe and look at it from the "bottom," so you are looking right at the south pole; notice that there is a ring of water around Antarctica. That's the only place where that happens, and it makes a big difference to the world's climate. Anyway, they say that as the winds around Antarctica move south, they change the uptake of carbon dioxide into the ocean, which partially offsets the climate change associated with the anthropogenic greenhouse effect. That's good! Unfortunately, it also accelerates sea level rise (by pumping heat into the ocean, raising the temperature faster) and ocean acidification (which might feed back onto the carbon cycle if critters start dying off). So there you go. Maybe I'll try to tap Nikki for more nuanced insight, since this is closely related to her work.

2006-11-17

mid-term updates

The dearth of posts here for the last month should be taken as an indication of progress and stress here at the home base. It is difficult to save the world one simulated cloud at a time, but it will be worth it. The blog will likely continue to suffer in coming months, but I will try to put up interesting tidbits on a weekly-ish basis.

Today's tidbit is about supercomputers. What do I know about supercomputers? Not too much, but sometimes I use them. Okay, sometimes I use little chunks of them (anywhere from 8 to 128 processors right now, maybe more in the near future). However, people who do know about high-performance computing are abuzz about the new rankings of the top 500 supercomputers [LINK]. The IBM machine at Lawrence Livermore National Laboratory is destroying its competition, running at an impressive 280.6 teraflops. That is 280.6 trillion operations per second, where an operation is basically adding or multiplying some numbers. A nice desktop computer can usually crank out about one billion operations per second (1 gigaflops), which is 280,600 times slower than the BlueGene/L at LLNL. The next fastest machine is at Sandia National Laboratories, which runs a Cray (called "Red Storm") that gets 101.4 teraflops. That seems like nothing in comparison, but it is only the second system to break the 100 TFLOPS barrier.

For comparison, the Earth Simulator in Japan (made from NEC parts, 5120 processors) is now ranked 14th at about 35 TFLOPS. That facility is still considered an amazing feat, and the atmospheric simulations coming from it are still astounding people in the atmospheric sciences [EXAMPLE]. NCAR's newest machine (one I definitely do not have access to) is "blueice," an IBM machine running 1696 processors at 10.5 TFLOPS... I think this machine is getting expanded very soon, too. They also have a BlueGene to play with that is ranked 144, using 2048 processors, and IBM machines (1600 and 608 processors) at numbers 193 and 213.

Why does any of this matter? Well, for one thing, we are inching closer and closer to the ultimate goal. Also, we are on the brink of "peta-scale" computing, which is probably going to change the way computational science gets done. We'll be able to do simulations much faster with much finer resolutions, which will produce incredible amounts of data. It will be a challenge over the next few years to develop ways to deal with all that data. It will require different software approaches as well as new hardware. With standard desktop technology of today, the file I/O (that is, just reading the data from the hard disk) is far too slow to handle the amount of information that we're going to be generating. Crunching the numbers and then visualizing the model output is going to be tremendously difficult without an incredible amount of support from computer-savvy folks who can help the scientists. The technology is coming, money is already being spent, projects are being planned, so now is the time to start thinking about how to deal with the output.
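
To make the I/O point concrete, here is a rough estimate with invented but plausible numbers (a 10-terabyte output set and a single desktop disk sustaining about 60 MB/s; neither figure comes from a real benchmark):

# How long just to READ one big simulation's output from a desktop disk? (illustrative numbers)
dataset_tb = 10            # assumed size of the model output, in terabytes
disk_rate_mb_s = 60.0      # assumed sustained read speed of a single desktop disk, MB/s

seconds = dataset_tb * 1e6 / disk_rate_mb_s    # terabytes -> megabytes, divided by MB/s
print(f"{seconds / 3600:.0f} hours just to read the data once")
# Roughly two days of pure disk reading before any analysis or visualization happens.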

2006-10-03

The Ozone Hole is confusing

I've been seeing small news items over the past week or two saying that this year's Antarctic ozone hole has matched the previous record, and that the amount of ozone is the lowest ever [e.g., LINK]. While this is interesting and important news, I'm wondering if it might confuse people. After all, a lot of people, way more than you'd think, believe there is a direct link between anthropogenic global warming and the ozone hole. Not just laypeople on the street either; smart and usually-informed people think this. Climate scientists everywhere are constantly being forced to correct people at cocktail parties and other social events. "No, the ozone hole is due to chemicals called CFCs in the upper atmosphere; global warming has to do with burning fossil fuels."

After the gigantic ozone hole of 2000, the size has actually decreased, leading most to believe that the Montreal Protocol of 1987 was a smashing success, and that the hole would disappear in 50 years or so. Actually, that is still what most people in the know are thinking.

So what's with the new big ozone hole? Well, it may have something to do with global warming. Sigh.

Basically, every winter (in Antarctica the winter is during June-July-August) it gets really, really, really cold around and over Antarctica. Because of the geography of the southern hemisphere, there are incredibly strong winds that essentially circle around the continent of Antarctica. Cold air basically gets trapped inside this huge vortex, and has nothing better to do than get even colder, all winter. During this deep freeze, polar stratospheric clouds (PSCs) can form; on their surfaces, chlorine gets converted into molecular chlorine (Cl2), and the stronger and colder the vortex, the more PSCs and the more Cl2 can form.

When the spring comes, sunlight is added to the equation. Sunlight easily breaks the molecular chlorine into atomic chlorine. The atomic chlorine (Cl) quickly reacts in a chain of events that destroys ozone; it's a catalytic reaction, so a single chlorine atom can tear apart many ozone molecules. This is why the ozone hole appears suddenly in September, when the sun finally shines on the pole.
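
For the chemically inclined, here is a minimal sketch of the catalytic cycle, the standard textbook version (in the polar spring the bookkeeping is a bit different, with a ClO-dimer pathway doing most of the damage because free oxygen atoms are scarce, but the spirit is the same):

\begin{align*}
\mathrm{Cl_2} + h\nu &\rightarrow 2\,\mathrm{Cl} \\
\mathrm{Cl} + \mathrm{O_3} &\rightarrow \mathrm{ClO} + \mathrm{O_2} \\
\mathrm{ClO} + \mathrm{O} &\rightarrow \mathrm{Cl} + \mathrm{O_2} \\
\text{net:}\qquad \mathrm{O_3} + \mathrm{O} &\rightarrow 2\,\mathrm{O_2}
\end{align*}

The chlorine atom comes out the other end intact, which is what "catalytic" means here; it just goes around again.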

How is this related to global warming? Well, the same course of events happens year in and year out, but there is variability, of course. Because of the international agreement to eliminate the use of CFCs, the amount of CFCs decreases every year. (As a side note, CFCs get absorbed by the upper ocean, and are used as a great passive tracer to study ocean motion.) Even with that decrease, how cold the winter gets is still quite variable. The colder the winter, the stronger the polar vortex, and the more PSCs can form and condition the stratosphere for ozone depletion. It is possible that the large-scale circulation pattern of the southern hemisphere could adjust to make the polar night colder even as the global surface temperature rises. A paper from 2000 explores some of these issues of synergy between stratospheric ozone depletion and greenhouse gas warming (Hartmann et al. 2000, PNAS). This is ongoing research, as the question of how the circulation will adjust to a warmer world is hard to answer, but my feeling is that more and more people seem to think that the change might favor these extremely cold winters with a strong polar vortex and favorable conditions for ozone depletion.

By the way, Cambridge has a nice ozone hole web site: LINK

2006-09-25

Slate misunderstands wine AND global warming

Last Friday, Slate.com posted an article by Joel Waldfogel called "Go North, Young Grapes: The effect of global warming on the world's vineyards." I was excited to see it, since I'm really interested in both global warming and wine. However, after reading it, I find several major deficiencies, some of which are obvious errors in understanding what global warming is and how plants, specifically grape vines, work.

The article reports on something called a "working paper" by Orley Ashenfelter and Karl Storchmann, who I think are economists. The paper is called "Using a Hedonic Model of Solar Radiation to Assess the Economic Effect of Climate Change: The Case of Mosel Valley Vineyards," written for the National Bureau of Economic Research, Inc., whatever that is. I have looked at this paper, which you can find through the RePEc (Research Papers in Economics) web site [LINK,pdf], and so I will let Joel Waldfogel off the hook for a time while we discuss the paper itself.

Section 2B of the paper states, "it is apparent that total solar radiation is highly dependent on the amount, kind and density of clouds, and varies with time and place. For the sake of simplicity engineers often calculate the so-called extraterrestrial radiation, that is, the radiation that would be available if there were no atmosphere (Duffie and Beckman, 1991)." What this means to me is that they don't want to account for variations in the atmosphere (weather and such, you know, that's not important), so they are going to use what I would call solar insolation. However, that varies only with latitude and time of year, and they are calculating it at the ground, so they ignore the atmosphere but take into account the slope of the ground. Okay, well, I'll tell you why that is a poor assumption shortly.
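
For reference, "extraterrestrial radiation" in this sense is just the standard astronomical sunshine formula, which is exactly why it knows nothing about weather (the notation is mine; their sloped-surface version presumably just tilts the geometry toward the vineyard's aspect):

Q = S_0 \left( \frac{\bar{d}}{d} \right)^{2} \cos\theta_z,
\qquad
\cos\theta_z = \sin\phi \,\sin\delta + \cos\phi \,\cos\delta \,\cos h

Here S_0 (about 1360 W m^-2) is the solar constant, d the Earth-Sun distance, \phi the latitude, \delta the solar declination, and h the hour angle: latitude, date, and time of day, and nothing else.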

Let me now quote from section 2D:
"D. Other Factors that Affect Vineyard Sites Gladstones (1992) provides a detailed analysis of several other factors that make specific geographic sites more or less suitable for the production of high quality grapes. Important factors include those that reduce diurnal (night-day) temperature differences. Nearness to a body of water and, especially, soil type are important determinants of diurnal fluctuations..."
Hold on to this for our discussion below.

Before going on to the analysis, the authors discuss the data for vineyard prices, and how they take into account non-south-facing slopes, altitude, and soil characteristics. There are gross assumptions built into these choices, which I will ignore here. However, let's just say that vineyards don't necessarily suffer from being farther away from bodies of water, despite the authors' assumption. One aspect worth mentioning here is that the authors assume vineyards far from large bodies of water will be hurt because they lack the smaller diurnal temperature fluctuations; as I understand it, grapes do extremely well in conditions with large diurnal variation... the hot days and cool nights of California's Napa, Sonoma, and Mendocino counties come to mind.

So how do they do global warming without an atmosphere? Well, they don't. They do a very simple energy flux calculation using blackbody radiation, albedo (reflectivity) and an "emissivity." Fair, except that instead of actually considering something resembling an emissivity, the authors choose to assume that the energy emitted from the surface is half of that emitted from the atmosphere. Crude to say the least, especially when it would have been easy to do much better. So they are sort of taking account of the greenhouse effect, since they'll get temperatures that are way too cold if they don't. They then plug in a temperature change associated with global warming, and get the amount of "radiation energy" that must be associated with that change, and they continue to assert this is "solar radiation" (actually in their figure they say "positive net radiation" which is correct).
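
For contrast, the usual minimal way to "do global warming" with almost no atmosphere is the single-layer grey-atmosphere model (a textbook sketch, not necessarily what the authors did): the surface absorbs sunlight plus the downward emission of one atmospheric layer with emissivity \varepsilon, and that layer absorbs a fraction \varepsilon of what the surface emits.

\begin{align*}
\text{surface:}\quad & \sigma T_s^4 = \tfrac{1}{4} S_0 (1-\alpha) + \varepsilon \,\sigma T_a^4 \\
\text{atmosphere:}\quad & 2 \,\varepsilon \,\sigma T_a^4 = \varepsilon \,\sigma T_s^4 \\
\Rightarrow\quad & \sigma T_s^4 = \frac{\tfrac{1}{4} S_0 (1-\alpha)}{1 - \varepsilon/2}
\end{align*}

Raising \varepsilon (more greenhouse gases) warms the surface without changing the incoming sunlight at all, which is exactly the distinction the paper blurs when it relabels the extra infrared as "solar radiation."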

Here's the thing. They set up their model using solar insolation, or atmosphere-free radiative flux at the surface, but then they try to apply a climate change that relies on a crude assumption about the atmosphere. This is inconsistent. They could have done better, but let us accept it. A greater problem is that they are making a model based on how agriculture should use incident solar radiation, which is visible light. Yes, there is a connection between sunshine and temperature, but plants are highly dependent on the actual sunshine for photosynthesis, not temperature alone. This is a complex biological relationship the authors fail to take into account.

They mention that there are other factors that affect vineyards, as quoted above. A critical one is the night-day temperature variation. The model punishes vineyards for having a larger diurnal variation, including an assumption that higher-altitude vineyards are farther from water, must have larger day-night temperature variations, and therefore suffer more from "global warming." I'm just not sure why they do that, as I've learned that wine grapes do better with large diurnal cycles, and wines made from mountainside vineyards are among the most prized/collected wines in the world. This is actually going to be important too, because in global warming scenarios the diurnal variation is often affected more than the actual maximum temperature. That is because the effect is in the infrared, not the visible light, so after the sun goes down the surface can't cool as efficiently, because the warmer atmosphere radiates more energy back down. That means minimum temperatures get higher, and they change more than daytime maximum temperatures, which reduces the diurnal variation. The authors ignore this fact.

The major deficiency of the paper is the assumption that plants will thrive under warmer conditions based on energy input arguments. While it is true that there will be a larger energy flux into the surface under global warming, this energy will be in the infrared, which does not necessarily benefit plants. In marginal growing areas where occasional freezing conditions damage crops during the growing season, increases in daily minimum temperatures might reduce the occurrence of these freezes, but the increased energy flux will not increase photosynthetic activity. Vineyards will not benefit directly from global warming by absorbing more radiant energy.

A more appropriate hypothesis to test is whether changes in growing season length might affect vineyards. Since "spring" will start earlier, plants might respond by starting their growth cycle earlier. Autumn-like temperatures will come slightly later, so the growing season may be extended further. In the case of vineyards, this might allow grapes to ripen more, which increases the sugar content of the berries and increases the alcohol content of the wine. More importantly, different grape varieties might benefit from a longer growing season, so areas that now only grow grapes with a short "hang time" might be able to expand to longer "hang time" varieties. Regions that don't currently have a long enough growing season to properly ripen grapes might get a boost and obtain growing seasons long enough to do so (I'm thinking especially of parts of Oregon and Washington).

Existing vineyards are unlikely to be directly harmed by global warming, especially in established regions with strong control on growing practices (e.g., Bordeaux, Burgundy). It is possible that the nature of the wine will change, as warmer days and nights might change the sugar levels of grapes, or various other aspects of the fruit. It is also possible that changes in rainfall patterns will significantly alter agricultural practices, and severe droughts and floods putting more vintages in jeopardy is a distinct threat in the future.

2006-09-15

Sun spots only predict hemlines

This week's Nature has a short review article about the effect of variations in the Sun's luminosity on Earth's climate. In fact, most of the article is about trying to understand the Sun's luminosity and the solar physics at work. In the end, I think the important thing to glean is that there is a well-known 11-year sunspot cycle, and sunspots are cooler than the rest of the solar surface. However, when there are lots of sunspots, the sun is actually a bit brighter than normal because of faculae and the "magnetic network" of bright thermal "leaks" that let more energy escape the solar surface. All the evidence points to variations in luminosity (brightness or energy flux) being due almost entirely to magnetic field variations. Not so surprising perhaps. More surprising is that as hard as people try to find secular variability in the luminosity, it doesn't seem to change much. Even less surprising is that the variations that are observed, and inferred from proxies, should have a minimal influence on Earth's climate. This, despite global warming denialists always talking about "solar variability" as if it were a well-known, well-understood phenomenon.
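
A quick back-of-the-envelope number on why the observed variability "should have a minimal influence" (my own rough estimate using standard textbook values, not figures from the Nature piece):

# Rough radiative forcing from the 11-year solar cycle (textbook-ish numbers, assumed).
delta_tsi = 1.3        # W/m^2 peak-to-trough swing in total solar irradiance (~0.1% of ~1361 W/m^2)
albedo = 0.3           # planetary albedo

forcing = delta_tsi * (1 - albedo) / 4.0   # spread over the sphere, minus what is reflected
co2_doubling = 3.7                         # W/m^2, canonical forcing for doubled CO2

print(f"solar-cycle forcing: ~{forcing:.2f} W/m^2")
print(f"fraction of a CO2 doubling: ~{forcing / co2_doubling:.0%}")
# Roughly 0.2 W/m^2, about 6% of a CO2 doubling, and it oscillates up and down
# instead of steadily accumulating.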

Here's something that hardly ever gets said out loud: climate scientists know at least as much about climate as solar physicists know about the sun. There, I said it. The two fields are covered in very different ways in the popular press, though. Why? My little theory goes like this: people (the general public, policymakers, media) can associate solar physics with astrophysics, which is like physics, which they (usually) didn't understand when they took it in high school/college; climate science, on the other hand, is not like physics (to them), and maybe it is more like meteorology, which is like the weather report, which is always wrong (right? Actually, no, but that is the perception). So there is this tendency to not believe the "climate scientists" or "climatologists" (an even worse term) when they publish a new result, and this skepticism is amplified because there are so often controversial policy consequences/implications that bring out more vocal opposition and "fair and balanced" treatment in the media. Contrast that with findings about the sun or stars or astronomy in general, which are mostly covered as amazing and important new scientific facts (unless it has to do with defining planets!). So that sort of sums up my pet theory.