About Jeff Masters
Cat 6 lead authors: WU cofounder Dr. Jeff Masters (right), who flew with the NOAA Hurricane Hunters 1986-1990, and WU meteorologist Bob Henson, @bhensonweather
By: JeffMasters, 14:21 GMT on February 27, 2009
Hurricanes, earthquakes, and tornadoes get the attention-grabbing headlines when a natural disaster kills people in the U.S. Yet heat waves, cold winter weather, severe thunderstorm winds, and flooding all killed more people in the U.S. between 1970 and 2004, according to a December 2008 article by Kevin Borden and Susan Cutter of the University of South Carolina. Tornadoes and lightning tied for fifth place, and hurricanes and earthquakes tied for eighth place. However, had the study extended one more year into 2005, the roughly 1800 deaths from Hurricane Katrina would have vaulted hurricanes into third place, behind heat wave deaths and cold weather deaths. The study also showed that people living in rural areas were more likely to die from a natural disaster than those living in cities.
Figure 1. U.S. deaths due to natural hazards between 1970 and 2004 showed that weather associated with extremes of hot and cold weather, along with severe thunderstorm winds (the "Severe Weather" category), killed the most people. Image credit: Spatial patterns of natural hazards mortality in the United States, International Journal of Health Geographics. Authors: Kevin Borden and Susan Cutter of the University of South Carolina.
The authors used the Spatial Hazard Event and Loss Database for the United States (SHELDUS), available at http://www.sheldus.org. This database provides hazard loss information (economic losses and casualties) from 1960-2005 for eighteen different hazard types, and is primarily based on data from the NOAA/National Climatic Data Center publication, "Storm Data". The numbers carry high uncertainty, and the authors conclude: "There is considerable debate about which natural hazard is the most 'deadly'. According to our results, the answer is heat. But this finding could be changed depending on the data source, or how hazards within a data source are grouped."
Figure 2. U.S. deaths due to natural hazards for the 10- and 30-year period ending in 2007, according to the National Oceanic and Atmospheric Administration (NOAA). Image credit: NOAA.
To illustrate, a 2008 study by Thacker et al., "Overview of deaths associated with natural events, United States, 1979-2004", concluded that cold deaths were twice as common as heat deaths in the U.S. However, the authors noted that the 1995 Chicago heat wave, which killed between 600 and 700 people by some estimates, was not properly represented in the database used in their study; that database attributed only 50 deaths in the entire state of Illinois to heat in 1995. The authors conclude that their database "under-reports the actual number of deaths due to severe heat".
Another example: NOAA plots annual natural hazard deaths from the same source ("Storm Data") as the first study I mentioned. Its statistics for the ten-year period ending in 2007 show a much different picture (Figure 2): heat deaths are a far more dominant source of mortality than cold and winter storm deaths, by a factor of 3.5. The take-home message from all this is that heat- and cold-related extreme weather are probably the deadliest weather hazards in the U.S., but we really don't know the proportion of people killed by each. One can easily cherry-pick the study of one's choice to show a desired result.
How global warming might affect heat and cold-related deaths
If the globe continues to warm this century, as predicted by the Intergovernmental Panel on Climate Change (IPCC), heat-related deaths will increase and cold-related deaths will decrease (duh!). Unfortunately, that's about the most intelligent thing one can say about the matter. The 2007 IPCC report, in its section on heat- and cold-related mortality, does not attempt to estimate the numbers, saying, "Additional research is needed to understand how the balance of heat-related and cold-related mortality could change under different socio-economic scenarios and climate projections."
This high uncertainty in future heat- and cold-related deaths does not stop advocates on either side of the global warming issue from cherry-picking results from selected studies to support a particular point of view. For example, opinion columnist George Will stated in a recent Newsweek column: "In Europe, cold kills more than seven times as many as heat does. Worldwide, moderate warming will, on balance, save more lives than it will cost--by a 9-to-1 ratio in China and India. So, if substantially cutting carbon dioxide reverses warming, that will mean a large net loss of life globally." Will bases his arguments on Danish statistician Bjørn Lomborg's controversial 2007 book, "Cool It: The Skeptical Environmentalist's Guide to Global Warming." However, as pointed out by Danish biologist Kåre Fog, who has assembled a large web site dedicated to cataloging the errors in Lomborg's books, the huge number of excess deaths attributed to cold by Will and Lomborg arises in large part because the death rate naturally rises in the winter: "Old and seriously sick people have less vitality in the dark season. It is too bold to say that the excess deaths during the dark part of the year are 'deaths due to excess cold'. There is no evidence that a warmer climate will alter the seasonal variation. These people would soon die in any case, even if winters became warmer. Indeed, cold and warm climates, like Finland and Greece, have approximately the same seasonal variation in mortality." The IPCC underscores this problem, stating: "projections of cold-related deaths, and the potential for decreasing their numbers due to warmer winters, can be overestimated unless they take into account the effects of influenza and season".
Heat wave deaths are subject to a degree of uncertainty as well. It is a somewhat subjective call whether an elderly person who dies during a heat wave died primarily as a result of the heat, or of a pre-existing heart or respiratory condition. Complicating the diagnosis is the fact that air pollution is at its worst during heat waves, and can also be blamed as the cause of death in some cases. Different studies use different criteria for classifying deaths as due to heat, pollution, or pre-existing medical conditions during a heat wave, leading to widely varying estimates of mortality. For example, the European heat wave of 2003 is blamed for 35,000, 52,000, or 70,000 deaths, depending upon the source. You're more likely to hear the higher 70,000 figure quoted by advocates of doing something about global warming, and the 35,000 figure quoted by those opposed.
The three U.S. studies discussed above show the ratio of cold deaths to heat deaths ranging from 2:1 to 1:3, very different from the 7:1 and 9:1 figures quoted by Will and Lomborg for Europe, India, and China. I don't trust any of these numbers, since heat and cold mortality statistics are highly uncertain and easy to cherry-pick to show a desired result. It is rather unproductive to argue about how many people die due to heat and cold in the current climate or in a future one. Excess heat deaths due to climate change deserve less attention than the potential for deaths due to reduced crop yields from increased heat and drought, regional collapses of the oceanic food chain from the steady acidification of the oceans, and the wars these conditions might trigger.
For more information
For those interested, Kåre Fog also presents a list of the errors in Al Gore's book and movie, An Inconvenient Truth, and a comparison of error counts between Al Gore and Bjørn Lomborg. Lomborg has assembled a short reply to skeptical questions responding to some of Fog's criticisms, but does not answer Fog's criticism on cold deaths vs. heat deaths. Suffice it to say, one should be wary of trusting climate change information from either source, from opinion columnists, or from politicians. Blogs can also be a questionable source of climate change information, though I think wunderground Climate Change blogger Dr. Ricky Rood is one of the most knowledgeable and unbiased climate change experts in the world. Though imperfect, the best source of climate change information is the U.N. Intergovernmental Panel on Climate Change (IPCC). The level of scientific collaboration and peer review that went into its reports is one of the most remarkable achievements in the history of science, and the IPCC was fully deserving of the Nobel Prize awarded to it last year. Blogs and books like Lomborg's and Gore's have not gone through peer review by scientific experts on climate change, and will have far more errors, biases, and distortions of the truth than the IPCC reports.
Updated: 19:48 GMT on August 16, 2011
By: JeffMasters, 14:17 GMT on February 25, 2009
The $789 billion stimulus package passed by Congress and signed into law on February 13 gives some $21.5 billion for scientific research and development across all agencies, according to the American Association for the Advancement of Science. "The stimulus package is a singular event in the history of science funding," said John Marburger, former presidential science adviser and head of the Office of Science and Technology Policy under George W. Bush. Indeed, the coming year will be a good year to be a graduate student. The extra $3 billion given to the National Science Foundation will go to fund a wide variety of scientific research at universities.
According to an article in last week's Nature magazine, here is a breakdown of who gets what among the government's scientific agencies:
National Science Foundation
Stimulus: $3 billion
2008 budget: $6.1 billion
Highlights: $2.5 billion will go towards external research grants, including $300 million for instrumentation. A separate allowance of $400 million will go to construction of major facilities.
Department of Energy
Stimulus: About $40 billion
2008 budget: $23.9 billion
Highlights: Includes $11 billion for the electric grid, $5 billion for weatherproofing homes, $3.4 billion for fossil energy R&D and $2 billion for battery research. The Office of Science, which funds basic research, receives $1.6 billion. A separate $400 million will kick-start the Advanced Research Projects Agency-Energy.
National Aeronautics and Space Administration (NASA)
Stimulus: $1 billion
2008 budget: $17.2 billion
Highlights: $400 million for science. The joint House-Senate Conference report specifies that "Funding is included herein to accelerate the development of the tier 1 set of Earth science climate research missions recommended by the National Academies Decadal Survey and to increase the agency's supercomputing capabilities". Another $400 million could be spent on rocket development to shrink a "gap" in human spaceflight capability caused by retirement of the space shuttle.
The bill also specifies, "The conference agreement includes $50,000,000 for cross agency support. In allocating these funds, NASA shall give its highest priority to restore NASA-owned facilities damaged from hurricanes and other natural disasters occurring during calendar year 2008."
It is uncertain whether NASA will try to replace the Orbiting Carbon Observatory (OCO) satellite, which crashed into the ocean near Antarctica yesterday when the satellite failed to separate from its booster rocket. The Orbiting Carbon Observatory would have been a big help in determining how CO2 cycles between the atmosphere, ocean, and biosphere. Fortunately, the Japan Aerospace Exploration Agency (JAXA) successfully launched a related satellite, the Greenhouse Gases Observing Satellite IBUKI (GOSAT), on January 23. GOSAT focuses primarily on carbon dioxide and methane sources.
National Oceanic and Atmospheric Administration (NOAA)
Stimulus: $830 million
2008 budget: $3.9 billion
The joint House-Senate Conference report specifies that "$600,000,000 should be spent for construction and repair of NOAA facilities, ships and equipment, to improve weather forecasting and to support satellite development. Of the amounts provided, $170,000,000 shall address critical gaps in climate modeling and establish climate data records for continuing research into the cause, effects and ways to mitigate climate change."
National Institutes of Health
Stimulus: $10 billion
2008 budget: $29.6 billion
National Institute of Standards and Technology
Stimulus: $580 million
2008 budget: $737 million
The money must be spent quickly
Agencies have 60 days to present spending plans to the White House. The money must be spent quickly, with most of the spending required to be completed by September 30, 2010. There is no money earmarked for hurricane science, but the $600 million in NOAA's slice of the pie to help support satellite development could go towards a new QuikSCAT satellite, which would be a big help for marine forecasts and our ability to detect developing tropical storms. The main area where hurricane science could use some stimulus money is basic research into the hurricane intensification problem. I hope NOAA, NASA, and NSF see fit to spend part of the windfall on the people and computers needed to tackle this vital need. I'll have more on the subject of hurricane research progress and needs next week, when I'll be blogging from the 63rd Interdepartmental Hurricane Conference in Tampa, Florida.
You can find the full text of the stimulus bill at the White House web site.
Updated: 22:53 GMT on October 21, 2011
By: JeffMasters, 14:58 GMT on February 23, 2009
Earth recorded its 7th warmest January on record, according to statistics released by the National Climatic Data Center. The most notable extreme temperatures were recorded in southern Australia January 28-31, when the hottest weather since 1939 occurred. January 2009 Northern Hemisphere sea ice extent is unknown, according to the National Snow and Ice Data Center: a sensor error caused underestimation of the ice coverage during the month, and a correction needs to be applied to the data. At worst, January 2009 had the 6th lowest Arctic ice extent on record; the record January low was set in 2006. The sensor error does not affect months prior to January 2009, and does not affect the record lows observed in September 2008 and 2007.
A dry January with average temperatures for the U.S.
For the contiguous U.S., January temperatures were near average; it was the 59th warmest January in the 114-year record, according to the National Climatic Data Center. The month was very dry, ranking as the 5th driest January on record. Only ten (preliminary) tornado reports were logged by NOAA's Storm Prediction Center in January, making it the quietest January for tornadoes since 2004, when only three tornadoes were recorded. U.S. records set in January 2009 (courtesy of http://extremeweatherguide.com/updates.asp):
Waterloo, IA: All-time coldest temperature record tied on 1/16, -34°F
Maine: All-time coldest temperature -50°F at Big Black River
At the end of January, 21% of the contiguous United States was in moderate-to-exceptional drought. This is an increase from the 19% figure at the end of December.
Figure 1. Forecast El Niño/La Niña conditions for a number of computer models. El Niño conditions are forecast when the SST anomaly in the Niño 3.4 region goes above 0.5°C (upper red line). La Niña conditions are forecast when the SST anomaly in the Niño 3.4 region goes below -0.5°C (lower red line). Nearly all of the computer models are forecasting neutral conditions during the 2009 hurricane season (August-September-October, ASO). Image credit: Columbia University's IRI.
La Niña conditions continue
La Niña conditions continued in the eastern Pacific Ocean in January, and NOAA's Climate Prediction Center has issued a La Niña Advisory. They define La Niña conditions as occurring when the 1-month mean temperature anomaly in the equatorial eastern Pacific (the area 5°N - 5°S, 120°W - 170°W, also called the "Niño 3.4 region") cools below -0.5°C and is expected to persist for three consecutive months. In addition, the atmospheric response typically associated with a La Niña must be observed over the equatorial Pacific Ocean. Sea surface temperatures were 1.0°C below average in the Niño 3.4 region during January, a strengthening from the -0.73°C anomaly observed in December. However, it appears that La Niña has peaked, as ocean temperatures in the Niño 3.4 region have warmed since late January. Many El Niño forecast models predict a continuation of La Niña conditions through May of 2009. Despite the unusually late start to this La Niña, expected impacts during Spring 2009 include above-average precipitation over Indonesia and below-average precipitation over the central and eastern equatorial Pacific. For the contiguous United States, potential impacts include above-average precipitation in the Ohio and Tennessee Valleys and below-average precipitation across the South, particularly in the southwestern and southeastern states. Other potential impacts include below-average temperatures in the Pacific Northwest and above-average temperatures across much of the southern United States.
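The persistence criterion described above can be sketched in a few lines of Python. This is only an illustration of the threshold logic, not NOAA's operational code (which uses overlapping three-month means); the function name and the November value in the example are my own assumptions.

```python
def la_nina_active(monthly_anomalies, threshold=-0.5, run_length=3):
    """Return True if the Nino 3.4 SST anomaly has stayed at or below
    `threshold` (deg C) for at least `run_length` consecutive months,
    ending with the most recent month in the list."""
    run = 0
    for anomaly in monthly_anomalies:
        # Extend the cold streak, or reset it when the anomaly warms
        run = run + 1 if anomaly <= threshold else 0
    return run >= run_length

# The -0.73 (December) and -1.0 (January) anomalies from the text, plus a
# hypothetical November value to complete a three-month run:
print(la_nina_active([-0.6, -0.73, -1.0]))  # True
```

A single warm month in the middle of the sequence resets the streak, which is why a brief cold dip does not qualify as a La Niña episode under this logic.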
Updated: 22:25 GMT on October 21, 2011
By: JeffMasters, 19:09 GMT on February 18, 2009
The top U.S. weather story of 2008 was undoubtedly Hurricane Ike. The National Hurricane Center has released its summary of Ike, and here are some of the highlights:
Ike did $19.3 billion in damage to the U.S.--fourth costliest hurricane on record, behind Katrina, Andrew, and Wilma.
Ike did an additional $4.7 billion in damage after it became extratropical. Hurricane-force wind gusts were reported in Cincinnati, and 2.6 million people lost power in Ohio. The $2.2 billion in damage to Ohio rivaled the 1974 Xenia tornado as that state's costliest natural disaster ever. Ike's remnants also caused Kentucky's most widespread power outage in history (600,000 customers). (However, the 2009 ice storm in Kentucky surpassed this total!)
Figure 1. Ike's tremendous storm surge wiped most of the Bolivar Peninsula north of Galveston clean. Image credit: National Weather Service, Houston/Galveston Office.
Figure 2. Standard 20 foot high utility pole on the Bolivar Peninsula, with debris caught about 18 feet high. The pole stands near the intersection of Highways 87 and 124, near High Island, and is about 2 feet above sea level. The combined action of the storm surge and waves on top of the surge (wave run-up) deposited the debris at the top of this pole. Image credit: Ted Eubanks.
Ike produced a 15-20 foot high storm surge along the east side of Galveston Bay and along the Bolivar Peninsula just to the north of Galveston. This was the second highest storm surge recorded in Texas, behind the 22.1 foot surge of Hurricane Carla in 1961. It is likely that the Great Galveston Hurricane of 1900 and the 1915 Galveston hurricane had higher storm surges, though, since they were both Category 4 storms. Although Ike was a strong Category 2 hurricane at landfall, its storm surge was characteristic of a strong Category 3 hurricane.
Ike's 10-13 foot storm surge pushed 30 miles inland in Southwest Louisiana, reaching the town of Lake Charles. Isolated areas in Jefferson County, Texas, and Cameron Parish, Louisiana, had surge heights up to 17 feet, and Ike's storm surge was 11 feet at Port Arthur, Texas.
Ike killed 20 people in Texas, Louisiana, and Arkansas. Another 34 people from Galveston and the hard-hit Bolivar Peninsula remain missing, according to the Laura Recovery Center, putting Ike's presumed U.S. death toll at 54. This makes Ike the 30th deadliest hurricane in U.S. history. An additional 64 indirect deaths occurred in Texas as a result of electrocution, carbon monoxide poisoning, and pre-existing medical complications. At least 28 direct and indirect deaths were reported in Tennessee, Ohio, Indiana, Illinois, Missouri, Kentucky, Michigan, and Pennsylvania from Ike's remnants. Counting direct and indirect deaths, with the missing presumed dead, Ike's total death toll is 146.
Ike disrupted power to 7.5 million people--the highest ever for a hurricane (Hurricane Frances of 2004 and Hurricane Isabel of 2003 are in second place, with 6 million people affected). The "Superstorm" Blizzard of 1993 (10 million people affected) was the only weather-related disaster to knock out power to more people than Ike in the U.S. Texas and Louisiana had 2.6 million affected, Ohio 2.6 million, and Kentucky 600,000. Power outage figures are difficult to verify and collect, so if anyone has a better list of power outage figures from major weather disasters, I'd like to hear them: firstname.lastname@example.org.
The oil industry was hit hard, with ten offshore rigs destroyed, two large pipelines damaged, and fourteen refineries forced to close. Damage to the Ports of Galveston and Houston, as well as debris in Galveston Bay and the Houston Ship Channel, kept those ports closed after the storm for several days, leaving almost 150 tankers, cargo vessels, and container ships waiting offshore.
Ike damaged Galveston's 14-17 foot high protective sea wall, exposing wooden pilings that support its older sections. The storm also washed away the 70-foot wide beach that helped protect the seawall. As a result, the U.S. Army Corps of Engineers is undertaking the seawall's first major repair job in its 105-year history. About $10 million will be spent repairing the seawall, and an additional $10 million will be spent dumping 400,000 cubic yards of sand to replenish the lost beach.
Outside the U.S.
Cuba suffered $3-$4 billion in damage, and 2.6 million people were forced to evacuate (23% of the population).
The Southeast Bahamas suffered $50-200 million in damage. Additional heavy damage occurred on the nearby Turks and Caicos Islands.
Haiti probably suffered the most from Ike, with 74 deaths and ruinous flooding.
Updated: 20:39 GMT on February 19, 2009
By: JeffMasters, 13:41 GMT on February 13, 2009
The lowest temperature ever recorded in the state of Maine, a -50°F reading taken on January 16, has been confirmed as real, according to a press release issued by the U.S. Geological Survey (USGS) and National Weather Service this week. The new record occurred at 7:15 a.m. January 16 at a remote river gauge on the Big Black River, about four miles from the Canadian border. It ties the record for New England's lowest temperature, set in 1933 at Bloomfield, Vermont. The old Maine record was -48°F, set in 1925 at Van Buren. All-time state records are difficult to break. The last state record low was set on January 5, 1999, when Congerville, Illinois recorded -36°F. Only one state record high temperature has been set in the past decade--the 120°F temperature measured in Usta, South Dakota on July 15, 2006.
All-time record lows are inconsistent with global warming, right?
An impressive cold wave hit the northern and eastern portions of the U.S. January 11-18, with 17 states reporting record daily lows. In addition to the coldest temperature ever measured in Maine, one station, Waterloo, Iowa, tied its 1962 record for all-time coldest temperature when the mercury hit -34°F on January 16. If global warming is occurring, we should expect all-time city or state cold records to become rare, but not to vanish entirely. The nation's January-December average temperature has increased at a rate of 0.12°F per decade since 1895, and at a faster rate of 0.41°F per decade during the last 50 years. This roughly 2°F rise over the past half-century has undoubtedly allowed more high temperature records than low temperature records to be broken. However, the warming is small enough that a few cold temperature records should still be set, since the weather is so highly variable.
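As a quick back-of-the-envelope check of the trend arithmetic above, using only the rates quoted in the text:

```python
# Contiguous-U.S. warming rates quoted above, in degrees F per decade
rate_since_1895 = 0.12   # full record, 1895 onward
rate_last_50_years = 0.41  # most recent 50 years

# Total warming implied by the faster recent rate over five decades
print(round(rate_last_50_years * 5, 2))  # 2.05, consistent with the ~2 F rise cited
```

The slower full-record rate, by contrast, implies only about 1.4°F of warming over the entire 114-year record, which is why the last 50 years dominate the total.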
The statistics support this position. The Waterloo, Iowa mark was only the second time this decade that an all-time record cold temperature has been set at a major U.S. city. The cities I consider are the 303 cities author Chris Burt tracks in his excellent Extreme Weather book. The cities were selected primarily on the length of their weather records (all go back to at least 1948, with most going back to the 1800s), and include all the largest cities in the U.S. The only other all-time coldest temperature record set at these cities this decade was the -44°F recorded in Grand Forks, North Dakota on 1/30/2004. By contrast, 49 all-time high temperature marks have been set this decade (Figure 1).
Perhaps a better gauge of the impact of global warming on extreme temperatures, though, is record warmest and coldest months. Month-long records are more reflective of the climate than an extreme event lasting just a few days. No all-time coldest month records were set at any U.S. cities during January 2009, and it was not even close: despite the cold blast of January 11-18, the month finished above average in temperature for the lower 48 states. So far this decade, no major U.S. city has set an all-time coldest month record. The last coldest month records were set in January 1994, when Caribou, Maine, and Bayfield, Wisconsin, recorded their coldest months. By contrast, there have been 61 all-time warmest month records set in those same 303 cities between 2000 and 2008 (Figure 1). The summer of 2007 alone saw 42 all-time high (or warmest month ever) records; just one was set in the summer of 2008.
Figure 1. Minimum and maximum temperature records for the U.S. for 303 major stations. The image has been updated through January 2009 to include the one record low set that month. The original version of this image was for 2007, and I modified it to incorporate four changes made in the 2008 data. The numbers for the decade of the 2000s are correct, but four (out of 606) records need to be subtracted from some of the earlier decades. Note that the 1930s were the most extreme decade for total number of records set, while the 1920s were the least extreme. U.S. weather has a high degree of variability from decade to decade. Image credit: Chris Burt, Extreme Weather.
Is the pattern of U.S. temperature records due to the Urban Heat Island effect?
There have been 110 all-time high temperature or all-time warmest month records set at the 303 major U.S. cities this decade, and only two such low temperature records. Is this disparity due to global warming, or the Urban Heat Island effect? The Urban Heat Island (UHI) effect occurs when development of former natural areas into pavement and buildings allows more heat to be trapped in cities, particularly at night. During the day, the UHI effect often leads to a slight cooling, since it can increase the amount of turbulence, allowing cooler air to be mixed down to the surface. For example, Moreno-Garcia (1994) found that Barcelona, Spain was 0.2°C cooler for daily maxima and 2.9°C warmer for minima than a nearby rural station.
However, temperature records are typically taken at parks and airports removed from the main heat-trapping areas of cities, and are not as strongly affected as one might expect. There are several reasons for this. One is that when tall buildings are present, they tend to block the view to the sky, meaning that not as much heat can escape upwards. In addition, the presence of moist vegetation keeps the atmosphere moister in park-like areas (which include the grassy fields near airports where temperature measurements are taken). This extra moisture helps cool the atmosphere on a local scale of tens of meters, due to latent heat effects (the energy required to convert liquid water to water vapor). Peterson (2003), using satellite-based night-light detection to identify urban areas, found that "Contrary to generally accepted wisdom, no statistically significant impact of urbanization could be found in annual temperatures." Spronken-Smith and Oke (1998) found a marked "park cool island" effect within the Urban Heat Island: parks in typical U.S. cities are 1-2°C cooler than the surrounding city, and sometimes more than 5°C cooler. While the Urban Heat Island effect has probably contributed to some of the reduction in record low temperatures in the U.S. in the past decade, research by Parker (2004, 2006) and Peterson (2003) suggests that the Urban Heat Island effect is a factor of ten or more less important than rising temperatures due to global warming.
Is the Urban Heat Island effect partially responsible for global warming?
Global warming is affecting the entire Earth, including rural areas far from cities and the 70% of the world covered by ocean. Thus, the Urban Heat Island effect, even if not corrected for, could have only a small impact on the global temperature figures. Since the Urban Heat Island effect is corrected for, its impact on the observed global warming signal should be negligible. For instance, NASA uses satellite-derived night-light observations to classify stations as rural or urban, and corrects the urban stations so that they match the trends of the rural stations before gridding the data. Other techniques, such as correcting for population growth, have also been used. Despite these corrections, and the fact that the Urban Heat Island effect impacts only a relatively small portion of the globe, global warming skeptics have persistently used it to attack the validity of global warming. No published peer-reviewed scientific studies support these attacks.
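The trend-matching idea behind NASA's urban adjustment can be sketched as follows. This is a simplified illustration of the concept, not NASA's actual algorithm; the `linear_trend` and `adjust_urban` helpers, and the example rates, are hypothetical. An urban station keeps its year-to-year variability but has its excess long-term trend, relative to the average of its rural neighbors, removed.

```python
def linear_trend(series):
    """Least-squares slope of an annual series, per time step."""
    n = len(series)
    x_mean = (n - 1) / 2
    y_mean = sum(series) / n
    numerator = sum((i - x_mean) * (y - y_mean) for i, y in enumerate(series))
    denominator = sum((i - x_mean) ** 2 for i in range(n))
    return numerator / denominator

def adjust_urban(urban, rural_neighbors):
    """Remove an urban station's excess warming trend relative to the
    mean trend of its rural neighbors, leaving variability intact."""
    rural_trend = sum(linear_trend(r) for r in rural_neighbors) / len(rural_neighbors)
    excess = linear_trend(urban) - rural_trend
    x_mean = (len(urban) - 1) / 2
    # Subtract only the excess linear component, centered on the series midpoint
    return [y - excess * (i - x_mean) for i, y in enumerate(urban)]

# A station warming at 0.3 deg per step amid rural neighbors warming at 0.1:
urban = [0.3 * i for i in range(10)]
rural = [[0.1 * i for i in range(10)], [0.1 * i + 0.2 for i in range(10)]]
adjusted = adjust_urban(urban, rural)
print(round(linear_trend(adjusted), 3))  # 0.1: the urban trend now matches rural
```

The key design point is that the correction is applied to the trend, not to the absolute temperatures, so the adjusted urban series still records its own heat waves and cold snaps.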
Parker, D.E., 2004: "Large-Scale Warming is not Urban", Nature, 432, 290, doi:10.1038/432290a.
Parker, D.E., 2006: "A Demonstration that Large-Scale Warming is not Urban", J. Climate, 19, 2882-2895.
Peterson, T.C., 2003: "Assessment of urban versus rural in situ surface temperatures in the contiguous United States: No difference found", J. Climate, 16, 2941-2959.
Spronken-Smith, R.A., and T.R. Oke, 1998: "The thermal regime of urban parks in two cities with different summer climates", Int. J. Remote Sens., 19, 2085-2104.
"The surface temperature record and the urban heat island", realclimate.org post, 2004.
My next post will be sometime Tue-Thu.
Updated: 19:49 GMT on August 16, 2011
By: JeffMasters, 14:29 GMT on February 11, 2009
The first killer tornado of 2009 swept through Lone Grove, Oklahoma yesterday, killing eight people and seriously injuring 14 more. Lone Grove is a town of 4,600 about 100 miles south of Oklahoma City. The large, half-mile-wide twister dropped from an isolated supercell thunderstorm that formed ahead of a strong cold front that swept through the Great Plains yesterday afternoon. Three other tornadoes were also reported yesterday, two in Oklahoma and one in Texas. It was the first significant severe weather outbreak of the year in the U.S., and it spawned the first tornadoes of the year in Texas and Oklahoma. This year is a far cry from 2008, which had one of the most severe and deadly early tornado seasons on record, and ended up as the second busiest tornado year since record keeping began in 1950. In January 2008, 84 tornadoes were observed, compared to the 10 (preliminary) observed in January 2009. The slower start to tornado season this year reflects the very dry January we had: January 2009 ranked as the 5th driest and 59th warmest in the 114-year record, according to the National Climatic Data Center. It was the driest January on record in portions of Oklahoma, Texas, New Mexico, Missouri, and upstate New York.
The storm that spawned yesterday's tornadoes will drag a cold front across much of the Ohio and Tennessee Valleys today. NOAA's Storm Prediction Center is giving a Slight Risk of severe weather over these regions today. The main threat will be severe thunderstorms with high winds and hail, rather than tornadoes.
Figure 1. Severe weather reports for February 10, 2009. Image credit: NOAA/SPC.
Figure 2. January 2009 precipitation was the 5th lowest on record for the contiguous U.S. Image credit: NOAA/NCDC.
Weather Underground Becomes National Sponsor of Portlight Relief Walks
Charleston, SC - Portlight Strategies, Inc., a national grassroots non-profit organization, announced this week that Weather Underground, Inc., will be National Sponsor of the First Annual Portlight Relief Walk series of national awareness and fundraising events. Hundreds of members of the Weather Underground blog community teamed with Portlight Strategies, Inc., throughout the fall of 2008 to provide much-needed relief services and supplies to victims of Hurricane Ike along the Texas coast.
These relief efforts focused on helping people with disabilities, as well as people in small towns and rural areas often marginalized by the larger institutional relief infrastructure.
Portlight Strategies, Inc., has committed to building on the stunning success of this collaborative, grassroots initiative.
In order to expand its community of supporters, and to create a financial reserve to enable strategic, proactive response in the future, Portlight is organizing walk-a-thon style events in 40 U.S. cities. Organizing committees have already begun meeting to plan these events in their hometowns and communities. Relief Walks will be held on weekends through the spring and early summer.
"Weather Underground stepping up to be our National Sponsor is a huge honor" said Paul Timmons, Jr., Portlight Strategies, Inc., Board Chair. "Our Ike efforts were born of hundreds of weather lovers coming together on the www.wunderground.com site. It's a perfect fit that Weather Underground will sponsor our first national event."
"Weather Underground is proud to have facilitated Portlight's initial success," said Toby Skinner of Weather Underground, Inc. "And we are pleased to be National Sponsor of Portlight's First Annual National Relief Walk event."
"The Weather Underground community was the genesis of our post-Ike success," said Timmons. "By agreeing to be National Sponsor of our First Annual National Relief Walk event, Weather Underground, Inc., becomes equally integral to our future ability to serve. We are grateful for their support and confidence."
For more information on Portlight Strategies, Inc., including the story of their Ike relief efforts and information on the Relief Walks, please visit www.portlight.org or their Featured Blog.
I'll have a new blog post on Thursday.
Updated: 20:43 GMT on October 24, 2011
By: JeffMasters, 15:15 GMT on February 9, 2009
Unprecedented heat, high winds, and years of record drought fanned weekend fires that claimed at least 200 lives in Australia's southeastern state of Victoria. It was Australia's deadliest natural disaster ever. The fires burnt 1200 square miles, an area 80% the size of Rhode Island. "Out there it has been hell on earth", Victorian Premier John Brumby said Saturday in a televised address. It is difficult to imagine more hellish fire conditions than those observed in Victoria state's capital, Melbourne, on Saturday (February 7). The temperature soared to 115.5°F (46.4°C), the hottest ever recorded in the city, besting the previous record of 114°F (45.6°C) set on January 13, 1939. Humidities as low as 4% and sustained north winds that reached 43 mph, gusting to 51 mph, accompanied the furnace-like heat. The dry winds were easily able to fan fires in the parched vegetation. Severe drought conditions reign in Southeast Australia, where some regions--including the city of Melbourne--have experienced their worst drought on record over the past eight years.
The previous worst Australian fires were the 1939 Black Friday fires, which killed 71 people and burned an area nearly four times the size of Rhode Island (5800 square miles), and the February 1983 Ash Wednesday fires, which killed 75 people and burned 1800 square miles.
Figure 1. This image from the Moderate Resolution Imaging Spectroradiometer (MODIS) on NASA's Aqua satellite shows multiple large fires (outlined in red) burning in Southeast Australia's Victoria state on February 7. Huge plumes of smoke spread southeast, driven by fierce winds. Image credit: NASA Earth Observatory.
World's hottest temperature so far south
The heat wave over Southeast Australia began on January 27. A slow-moving high pressure system over the Tasman Sea, combined with an intense tropical low off the northwest coast of Western Australia and an active monsoon, provided ideal conditions for hot, tropical air to move over the southern parts of Australia. The most exceptional heat in January occurred in northern and eastern Tasmania. The previous state record of 105°F (40.8°C), set at Hobart on 4 January 1976, was broken on 29 January when it reached 107°F (41.5°C) at Flinders Island Airport. This record only lasted one day, as Scamander, on the east coast, reached 108°F (42.2°C) on the 30th. Four other sites broke the previous Tasmanian record that day. Nearly half of the island of Tasmania had its hottest day on record on January 30, with many records broken by large margins, particularly in the north. Launceston Airport (104°F, 39.9°C) broke its previous record by 2.6°C. This is the second-largest margin by which a record high maximum has been broken at any of the 103 locations in the long-term high-quality Australian temperature data set. The January 2009 event has now been responsible for seven of the eight highest temperatures on record in Tasmania.
After a slight drop in temperatures the first few days of February, extreme heat moved back into Southeast Australia on February 6-7. The hottest temperature ever recorded in Victoria state was measured at Hopetoun, in the northwest, which hit 120°F (48.8°C) on February 7. This bested the old record of 117°F (47.2°C) set at Mildura in January 1939. The new record at Hopetoun is also the hottest temperature in the world so far south. Most of Victoria recorded its hottest day on record on February 7, and fourteen stations exceeded the previous state record temperature.
The January-February 2009 heat wave has also been unusually long lasting. Both Adelaide and Melbourne set records for the most consecutive days above 43°C. Adelaide's temperatures were at this level on each of the four days 27-30 January, and Melbourne's for three days from 28-30 January, breaking the previous records of two at both locations.
Spectacular nocturnal heat burst
On the morning of January 29, an exceptional nocturnal heat event occurred in the northern suburbs of Adelaide around 3 a.m. Strong northwesterly winds mixed hot air aloft to the surface. At RAAF Edinburgh, the temperature rose to 107°F (41.7°C) at 3:04 am. Such an event appears to be without known precedent in southern Australia.
Figure 2. Maximum temperature anomalies for February 7, 2009. Most of the southeast Australia state of Victoria experienced the highest temperatures ever recorded, with temperatures up to 32°F (18°C) above average. The cooler temperatures in northeastern Australia are due to the remnants of Tropical Cyclone Ellie and persistent monsoon rains. Image credit: "The exceptional January-February 2009 heat wave in south-eastern Australia", Australian Government Bureau of Meteorology.
Climate Change and Australia
As to whether this heat wave is due to climate change, Mr. Perry Wiles, senior climatologist with the NSW office of the Bureau of Meteorology said, "Climate change is not only increasing average temperatures, but also the frequency and severity of extreme temperature events. While any one event cannot be attributed to climate change, this heat wave is certainly consistent with that expectation. In a warming world we can expect similar extreme events more often."
Average annual temperatures have increased by about 1.3°F (0.75°C) in Australia over the past century, which matches the global average temperature increase over the same period. According to the 2007 Intergovernmental Panel on Climate Change (IPCC) report, average temperatures in Australia are expected to rise by about 4.7°F (2.6°C) by the year 2100. Droughts are expected to become more severe and more frequent in the southeastern regions affected by this weekend's fires (Figure 3), though other portions of Australia are expected to see increases in rainfall. These sorts of regional precipitation predictions are in their infancy, and we currently have low confidence in them. However, continued increases in summertime temperatures are likely to bring more frequent heat waves like this weekend's to Southeast Australia over the coming century.
Figure 3. Average temperature and precipitation changes over Australia and New Zealand from the 21 climate models used to formulate the 2007 IPCC climate change report (A1B scenario). Annual mean and summertime (December-January-February) changes are plotted for the period 1980-1999 vs. 2080-2099. Image credit: 2007 IPCC report, section 11, "Regional Climate projections".
For more information
"The exceptional January-February 2009 heat wave in south-eastern Australia", Australian Government Bureau of Meteorology.
Updated: 19:51 GMT on August 16, 2011
By: JeffMasters, 14:48 GMT on February 4, 2009
Recently, statements have been appearing in the media claiming that a "twelve-month long drop in world temperatures wipes out a century of warming" and that the Earth has been cooling since 1998. Let's take a look at the validity of these statements. The warmest year on record, according to both NASA and NOAA's National Climatic Data Center (NCDC), was 2005. However, 1998 was virtually tied with 2005 for warmth, and the United Kingdom Hadley Centre and Climatic Research Unit data set (HadCRU) rates 1998 as the warmest year on record. The three data sets use different methods, such as how they interpolate over regions with missing data, like the Arctic Ocean, and so they arrive at slightly different numbers for the global average temperature. All three data sets are considered equally valid, so ignoring two of the three major data sets to claim that the globe has been cooling since 1998 is "cherry picking" the data to show the result you want.
Furthermore, the 1997-1998 El Niño event was the second strongest of the past century. El Niño events directly warm a large part of the Pacific, and indirectly warm (via a large increase in water vapor) an even larger region. This extra warming--estimated to have boosted the global temperature an extra 0.1-0.2°C--made 1998's warmth spike sharply upwards from the globe's usual temperature. The climate is best measured by a multi-year average of global temperatures, in order to remove shorter-term oscillations in weather patterns like El Niño. It is not scientifically valid to base a cooling argument on a year that spiked sharply upwards from the norm because of one of the largest El Niño events in recorded history. A valid way to measure whether the globe is warming or cooling is to use the average global temperature for the past ten years or longer. The 1999-2008 period was significantly warmer (by 0.18°C, according to NOAA) than the previous ten-year period, despite the fact that the record (or near-record) warmest year 1998 was part of that previous period. Thus, it is scientifically correct to say the globe has been warming since 1998, not cooling. This warming rate has been about 0.16°C per decade over the past thirty years. Note that even over time periods as long as eight years, the average global temperature is not always a good measure of the long-term global warming trend--particularly if a large volcanic eruption in the tropics occurs.
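The effect of decade averaging can be illustrated with a quick sketch. The numbers below are synthetic, chosen only to mimic a warming trend with year-to-year weather noise plus a single El Niño-boosted year standing in for 1998; they are not real observations:

```python
import random

random.seed(42)

# Synthetic (not real) annual global temperature anomalies, for illustration:
# a 0.016 deg C/yr warming trend plus random year-to-year weather noise,
# with an artificial "El Nino" spike added to the 1998-analog year.
years = list(range(1989, 2009))
trend = 0.016      # deg C per year, roughly the recent observed rate
noise_sd = 0.08    # deg C, interannual variability (illustrative value)
anoms = [trend * (y - years[0]) + random.gauss(0.0, noise_sd) for y in years]
anoms[9] += 0.2    # El Nino boost to the 1998-analog year

first_decade = anoms[:10]    # 1989-1998 analog
second_decade = anoms[10:]   # 1999-2008 analog

mean_first = sum(first_decade) / len(first_decade)
mean_second = sum(second_decade) / len(second_decade)

# Despite the spike year sitting in the first decade, the decadal means
# still reveal the underlying warming trend.
print(f"first-decade mean:  {mean_first:.3f} C")
print(f"second-decade mean: {mean_second:.3f} C")
print(f"difference:         {mean_second - mean_first:.3f} C")
```

Comparing single years, the spike year can beat several of the later years; comparing decade averages, the second decade comes out clearly warmer, which is the point of the ten-year measure.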
How often should we expect to see a new global temperature record?
The climate should warm at a rate of about 0.19°C (0.34°F) per decade, according to the computer climate models used to formulate the "official word" on climate, the 2007 report of the Intergovernmental Panel on Climate Change (IPCC). Thus, we should expect to see frequent "warmest years on record". However, 2006, 2007, and 2008 were all cooler than 2005, and 2008 was merely the ninth warmest year on record. We know that the weather has a high degree of natural variability, with warmer than average years mixed in with cooler ones. How often, then, should we expect to set a new global temperature record if the climate is warming in accordance with global warming theory?
Figure 1. Predicted and observed global annual average temperatures between 1990-2008. The thin colored lines represent 55 individual runs of the twenty computer climate models used to formulate the 2007 IPCC report. These runs were done for the A1B "business as usual" scenario, which most closely matches recent emissions. The thick black line is the multi-model mean, and the thick colored lines with symbols denote actual observations, as computed by the three major research groups that estimate annual global temperatures. The sharp downward spike in 1991-1992 is due to the eruption of Mt. Pinatubo, which cooled the Earth for two years. You can make these types of plots yourself, using the publicly available PCMDI IPCC AR4 archive. Image credit: Dr. Gavin Schmidt, realclimate.org.
The twenty models used to formulate the 2007 IPCC report (Figure 1) all predict the climate will warm, but with a lot of year-to-year variability due to natural weather patterns such as El Niño. Some of the IPCC models forecast periods lasting many years (in the extreme case, twenty years) with no global warming, due to natural climate and weather oscillations. If one plots up the cumulative distribution of these IPCC model runs to see how often a global average temperature record should be broken (Figure 2), one sees that the models predict a 50% chance that we'll unambiguously break the record every six years. By an unambiguous record, I mean a record that exceeds the previous one by at least 0.1°C. We've now gone ten years without unambiguously breaking the global temperature record, which the models say should happen 25% of the time. There is a 5% chance we'll go eighteen years without unambiguously breaking the record, so it is quite possible for natural variability in the climate system to obscure the global warming signal for periods of nearly twenty years. If we still haven't had a new global temperature record by 2018, then it is time to question global warming theory. If the theory is correct, there is a good chance that we will break the global temperature record during the next year that has a moderate or stronger El Niño event (and no major volcanic eruption in the tropics, since such major eruptions can dramatically cool the climate). Since we have La Niña conditions to start 2009, it is unlikely this year will break the record.
Figure 2. Cumulative distribution of how long one would have to wait for a new global temperature record to be set between the years 1990 and 2030. Image is based on the twenty climate models used to formulate the 2007 IPCC report, using the A1B "business as usual" scenario. The curves should be read as the percentage chance of seeing a new record (Y axis) if you waited the number of years on the X axis. The two curves are for a new record of any size (black) and for an unambiguous record (> 0.1°C above the previous record, red). The 95% confidence line is marked in gray. The main result is that 95% of the time, a new record will be seen within 8 years, but that for an unambiguous record, you need to wait for 18 years to have similar confidence. Image credit: Realclimate.org.
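A toy Monte Carlo experiment conveys the flavor of this kind of calculation. The sketch below is not the IPCC model analysis itself; it assumes a simple "linear trend plus Gaussian weather noise" model with illustrative parameter values (the trend, noise level, and El Niño boost to the starting record year are all assumptions), and asks how long one typically waits for a new record:

```python
import random

random.seed(0)

# A toy Monte Carlo, NOT the IPCC models: each year's global anomaly is a
# linear warming trend plus Gaussian weather noise. The starting record
# year gets an extra boost, mimicking the 1998 El Nino. All parameter
# values below are illustrative assumptions.
TREND = 0.019        # deg C per year (model-mean warming rate)
NOISE_SD = 0.1       # deg C, interannual variability
RECORD_BOOST = 0.15  # extra warmth of the starting record year
MARGIN = 0.1         # an "unambiguous" record beats the old one by this
N_RUNS = 20000
MAX_WAIT = 60        # give up after this many years

def wait_for_record(margin):
    """Years until a new record beats the starting record by `margin`."""
    record = RECORD_BOOST + random.gauss(0.0, NOISE_SD)
    for year in range(1, MAX_WAIT + 1):
        anom = TREND * year + random.gauss(0.0, NOISE_SD)
        if anom > record + margin:
            return year
    return MAX_WAIT

waits_any = sorted(wait_for_record(0.0) for _ in range(N_RUNS))
waits_unamb = sorted(wait_for_record(MARGIN) for _ in range(N_RUNS))

median_any = waits_any[N_RUNS // 2]
median_unamb = waits_unamb[N_RUNS // 2]
p95_unamb = waits_unamb[int(N_RUNS * 0.95)]

print(f"median wait, any record:         {median_any} yr")
print(f"median wait, unambiguous record: {median_unamb} yr")
print(f"95th percentile, unambiguous:    {p95_unamb} yr")
```

As in Figure 2, demanding an unambiguous record stretches the waiting time considerably, and the tail of the distribution shows that decade-long gaps between records are unremarkable even in a steadily warming climate.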
Is global warming slowing down?
The global average temperature has declined over the past three years (Figure 1) and global average sea surface temperature (SST) has not increased over the past seven years (Figure 3). Is global warming slowing down, then, and taking a break? That was the theory advanced by a group of German climate modelers (Keenlyside et al., 2008) in the journal Nature in 2008. Using a climate model that offered a unique way to handle the initial distribution of SSTs, they concluded that over the next ten years, natural variations in the climate may temporarily mask the global warming due to greenhouse gases. They stated: "North Atlantic SST and European and North American surface temperatures will cool slightly, whereas tropical Pacific SST will remain almost unchanged. Our results suggest that global surface temperature may not increase over the next decade, as natural climate variations in the North Atlantic and tropical Pacific temporarily offset the projected anthropogenic warming". However, they go on to state that greenhouse-gas driven global warming will resume full-force after the ten-year break is over. Other climate modelers disagree with this predicted "break" in global warming. Both theories are reasonable ones, and it is possible that the recent cool years portend the ten-year "break" from global warming hypothesized by Keenlyside et al. It is too early to tell, since the relative coolness of the past few years could easily be natural "noise" (weather) imposed on the long-term global warming trend. The fact that we've had a cold winter in eastern North America and in the UK--or any other anecdotal cold or snow-related record you may hear about--can't tell us whether global warming may be slowing down or not. The amount of global warming over the past century has only been about 1.3°F (0.74°C). 
Thus, it should not surprise us, for example, if temperatures during tonight's hard freeze in Florida bottom out at 25°F, instead of the 24°F it would have reached 100 years ago. The long-term ten and thirty year trends in global temperature are solidly upwards in accordance with global warming theory, and claims that the globe is cooling cannot be scientifically defended.
Figure 3. Global average sea surface temperatures (SSTs) from 1990-2008. SSTs have not increased in the past seven years. Image credit: NASA/GISS.
Keenlyside, N.S., M. Latif, J. Jungclaus, L. Kornblueh, and E. Roeckner, 2008, "Advancing decadal-scale climate prediction in the North Atlantic sector", Nature 453, No. 7191, pp. 84-88, May 1, 2008
2008 temperature summaries and spin by Gavin Schmidt of realclimate.org.
My next post will be on Monday.
Updated: 19:51 GMT on August 16, 2011
By: JeffMasters, 13:39 GMT on February 3, 2009
Heavy snow in London, England yesterday did something the Nazi Blitz could not--stop the city bus system. The heaviest snow in 18 years hit the capital yesterday, when eight inches (20 cm) of snow blanketed the city. All five major airports in the region closed, all London city buses were pulled off the streets, and the subway system was nearly paralyzed by the snow. It was the heaviest snow since February 7, 1991, when 25 cm fell. In that storm, wind-driven snow drifts of up to three meters (ten feet) crippled road and rail networks across the country.
Heavy snows are rare in London, due to the moderating influence of the ocean waters that surround the British Isles. The prevailing winds from the west or northwest pass over these ocean waters, which are heated by the warm Gulf Stream Current. This makes rain the usual form of winter precipitation for London. Monday's snowstorm was made possible by a strong high pressure system over Scandinavia, whose clockwise flow pumped cold air from western Russia westward over the North Sea and into Britain. A trough of low pressure that formed in this cold air mass moved over Britain, bringing heavy snow. As the cold air accompanying the trough passed over the relatively warm waters of the North Sea, it picked up moisture and instability, enhancing the snows over England in a manner similar to Lake Effect snowstorms over the Great Lakes of North America. The instability of the air mass within Monday's snowstorm was so intense that several UK locations reported "thundersnow", a very rare occurrence in the British Isles. Monday's snow storm also caused traffic problems in France, Switzerland, and Spain. The storm spawned a possible EF2 tornado in southern Spain, as well.
This winter has been the coldest winter since 1996-1997 for England and Wales. Minimum temperatures were as much as 2°C below average in December in western portions of the UK. A cold air outbreak the first week of January brought some of the coldest temperatures seen in southern England since 1991.
Figure 1. AVHRR visible satellite image of the United Kingdom on February 2, 2009, at 13:28 GMT. A strong trough of low pressure, propelled by a cold arctic flow of air from Russia, is crossing over London, dumping heavy snow. Image credit: University of Bern, Switzerland.
I'll have a new post Wednesday or Thursday.
Updated: 19:05 GMT on January 5, 2012
By: JeffMasters, 13:55 GMT on February 2, 2009
Punxsutawney, Pennsylvania's famous prognosticating rodent, Punxsutawney Phil, saw his shadow this morning. According to tradition, this means that a solid six more weeks of winter can be expected across the U.S. From the official website of the Punxsutawney Groundhog Club, groundhog.org:
Hear Ye! Hear Ye! Hear Ye!
On Gobbler's Knob on this fabulous Groundhog Day, February 2nd, 2009
Punxsutawney Phil, the Seer of Seers, Prognosticator of all Prognosticators,
Awoke to the call of President Bill Cooper
And greeted his handlers, Ben Hughes and John Griffiths
After casting a joyful eye toward thousands of his faithful followers,
Phil proclaimed that his beloved Pittsburgh Steelers were World Champions one more time
And a bright sky above me
Showed my shadow beside me.
Six more weeks of winter it will be.
How did this crazy tradition start?
It all started in Europe, centuries ago, when February 2 was a holiday called Candlemas. On Candlemas, people prayed for mild weather for the remainder of winter. The superstition arose that if a hibernating badger woke up and saw its shadow on Candlemas, there would be six more weeks of severe winter weather. When Europeans settled the New World, they didn't find any badgers. So, instead of building wooden badgers, they decided to use native groundhogs (aka the woodchuck, land beaver, or whistlepig) as their prognosticating rodent.
What the models say
The latest 16-day run of the GFS model shows the jet stream retreating to a position over southern Canada in about a week, which will usher in milder-than-average temperatures over the eastern half of the U.S. However, the model predicts that a series of cold air outbreaks typical for February will occasionally dip down over northern regions of the U.S. over the coming two weeks, bringing colder than average temperatures to the Pacific Northwest, and near-average temperatures to the Midwest and Northeast. The latest 1-month and 3-month outlooks from NOAA's Climate Prediction Center show a continuation of this pattern for the remainder of winter and into spring, with a heightened chance of above-average temperatures in the southern U.S. Cooler than average temperatures in the Northern Plains are typical when weak La Niña conditions are present in the Eastern Pacific (Figure 1), as is currently the case.
Figure 1. Departure of winter temperature from average for winters when a weak La Niña event was present. Temperatures in the Northern Plains have typically been 1-2°F below average for the eight winters in the past 50 years that have had weak La Niña events. Image credit: Jan Null, Golden Gate Weather, and NOAA/ESRL.
Kentucky's ice storm
Six more weeks of winter is not what ice storm-battered Kentucky needs, as the state continues to recover from its most widespread power outage in history. High temperatures Tuesday and Wednesday will be below freezing over most of the state, but should warm into the 50s by the end of the week. More than half of the 600,000 customers affected by the outage have now had their power restored. The previous largest power outage in state history occurred just five months ago, when the remnants of Hurricane Ike brought wind gusts near hurricane force to Kentucky.
The Groundhog Oscillation: convincing evidence of climate change
According to a 2001 article published in the prestigious Annals of Improbable Research titled, "The Groundhog Oscillation: Evidence of Global Change", Punxsutawney Phil's forecasts have shown a high variability since 1980. This pattern, part of the larger "Groundhog Oscillation" or GO cycle, is convincing evidence of human-caused climate change.
More on climate change in my next post on Tuesday or Wednesday, when I'll look at claims that the Earth has been cooling since 1998.
Updated: 19:05 GMT on January 5, 2012
The views of the author are his/her own and do not necessarily represent the position of The Weather Company or its parent, IBM.