Archive for March, 2012

Four Hundred Chernobyls: Solar Flares, Electromagnetic Pulses and Nuclear Armageddon

An abandoned middle school, part of the contaminated area surrounding the Chernobyl Nuclear Power Plant, in Pripyat, Ukraine, March 17, 2011. The ghost town, which once had a population of about 50,000 people, was given a few hours to evacuate in April 1986 as radiation streamed into populated areas after an explosion at the reactor. (Photo: Joseph Sywenkyj / The New York Times)

 

There are nearly 450 nuclear reactors in the world, with hundreds more being planned or under construction. There are 104 of these reactors in the United States and 195 in Europe. Imagine what havoc it would wreak on our civilization and the planet’s ecosystems if we were to suddenly witness not just one or two nuclear meltdowns, but 400 or more! How likely is it that our world might experience an event that could ultimately cause hundreds of reactors to fail and melt down at approximately the same time? I venture to say that, unless we take significant protective measures, this apocalyptic scenario is not only possible, but probable.

Consider the ongoing problems caused by three reactor core meltdowns, explosions and breached containment vessels at Japan’s Fukushima Daiichi facility and the subsequent health and environmental issues. Consider the millions of innocent victims who have already died or continue to suffer from horrific radiation-related health problems (“Chernobyl AIDS,” epidemic cancers, chronic fatigue, etcetera) resulting from the Chernobyl reactor explosions, fires and fallout. If just two serious nuclear disasters, spaced 25 years apart, could cause such horrendous environmental catastrophes, it is hard to imagine how we could ever hope to recover from hundreds of similar nuclear incidents occurring simultaneously across the planet. Since more than one-third of all Americans live within 50 miles of a nuclear power plant, this is a serious issue that should be given top priority.[1]

In the past 152 years, Earth has been struck by roughly 100 solar storms that caused significant geomagnetic disturbances (GMDs), two of which were powerful enough to rank as "extreme GMDs." If an extreme GMD of that magnitude were to occur today, it would in all likelihood initiate a chain of events leading to catastrophic failures at the vast majority of the world's nuclear reactors, similar to, but more than 100 times worse than, the disasters at Chernobyl and Fukushima. When a massive solar flare launches a huge mass of highly charged plasma (a coronal mass ejection, or CME) directly toward Earth, the plasma collides with our planet's outer atmosphere and magnetosphere, and the result is a significant geomagnetic disturbance.

The last extreme GMD of a magnitude that could collapse much of the US grid was in May of 1921, long before the advent of modern electronics, widespread electric power grids, and nuclear power plants. We are, mostly, blissfully unaware of this threat and unprepared for its consequences. The good news is that relatively affordable equipment and processes could be installed to protect critical components in the electric power grid and its nuclear reactors, thereby averting this “end-of-the-world-as-we-know-it” scenario. The bad news is that even though panels of scientists and engineers have studied the problem, and the bipartisan Congressional electromagnetic pulse (EMP) commission has presented a list of specific recommendations to Congress, our leaders have yet to approve and implement any significant preventative measures.

Most of us believe that an emergency like this could never happen, and that, if it could, our “authorities” would do everything in their power to prevent such an apocalypse. Unfortunately, the opposite is true. “How could this happen?” you might ask.

Nuclear Power Plants and the Electric Power Grid

Our current global system of electrical power generation and distribution ("the grid"), upon which our modern lifestyles are utterly dependent, is extremely vulnerable to severe geomagnetic storms, which strike our planet on average roughly once every 70 to 100 years. We depend on this grid to maintain food production and distribution, telecommunications, Internet services, medical services, military defense, transportation, government, water treatment, sewage and garbage removal, refrigeration, oil refining, gas pumping and all forms of commerce.

Unfortunately, the world's nuclear power plants, as they are currently designed, are critically dependent on a connection to a functioning electrical grid for all but relatively short periods of blackout. They need that connection to keep their reactor cores continuously cooled and so avoid catastrophic core meltdowns and fires in the storage ponds that hold spent fuel rods.

If an extreme GMD were to cause widespread grid collapse (which it almost certainly would), then within as little as one or two hours after each nuclear facility's backup generators either failed to start or ran out of fuel, the reactor cores would begin to melt down. After a few days without electricity to run the cooling-system pumps, the water bath covering the spent fuel rods stored in "spent-fuel ponds" would boil away, allowing the stored fuel rods to melt down and burn.[2] Since the Nuclear Regulatory Commission (NRC) currently requires only about one week's supply of backup generator fuel to be stored at each reactor site, it is likely that, after we witness the spectacular nighttime celestial light show from the next extreme GMD, we will have about one week in which to prepare ourselves for Armageddon.

To do nothing is to behave like ostriches with our heads in the sand, blindly believing that “everything will be okay” as our world drifts towards the next natural, inevitable super solar storm and resultant extreme GMD. Such a storm would end the industrialized world as we know it, creating almost incalculable suffering, death and environmental destruction on a scale not seen since the extinction of the dinosaurs some 65 million years ago.

The End of “The Grid” as We Know It

There are records from the 1850s to today of roughly 100 significant geomagnetic solar storms, two of which, in the last 25 years, were strong enough to cause millions of dollars' worth of damage to key components that keep our modern grid powered. In March of 1989, a severe solar storm induced powerful electric currents in grid wiring that fried a main power transformer in the Hydro-Québec system, causing a cascading grid failure that knocked out power to 6 million customers for nine hours and damaged similar transformers in New Jersey and the United Kingdom. More recently, in 2003, a less intense but longer-lasting solar storm caused a blackout in Sweden and induced powerful currents in the South African grid that severely damaged or destroyed 14 of that country's major power transformers, impairing commerce and comfort over large portions of South Africa as it was forced to resort to massive rolling blackouts that dragged on for many months.[3]

During the great geomagnetic storm of May 14-15, 1921, brilliant aurora displays were reported in the Northern Hemisphere as far south as Mexico and Puerto Rico, and in the Southern Hemisphere as far north as Samoa.[4] This extreme GMD produced ground currents roughly ten times as strong as the 1989 Quebec incident. Just 62 years earlier, the great granddaddy of recorded GMDs, referred to as “the Carrington Event,” raged from August 28 to September 4, 1859. This extreme GMD induced currents so powerful that telegraph lines, towers and stations caught on fire at a number of locations around the world. Best estimates are that the Carrington Event was approximately 50 percent stronger than the 1921 storm.[5] Since we are headed into an active solar period much like the one preceding the Carrington Event, scientists are concerned that conditions could be ripe for the next extreme GMD.[6]

Prior to the advent of the microchip and of modern extra-high-voltage (EHV) transformers (key grid components first introduced in the late 1960s), most electrical systems were relatively robust and resistant to the effects of GMDs. Today, a simple electrostatic spark can fry a microchip, and thousands of miles of power lines can act like giant antennas, capturing massive amounts of GMD-spawned electromagnetic energy, which makes modern electrical systems far more vulnerable than their predecessors.

The federal government recently sponsored a detailed scientific study to better understand how much critical components of our national electrical power grid might be affected by either a naturally occurring GMD or a man-made EMP. Under the auspices of the EMP Commission and the Federal Emergency Management Agency (FEMA), and reviewed in depth by the Oak Ridge National Laboratory and the National Academy of Sciences, Metatech Corporation undertook extensive modeling and analysis of the potential effects of extreme geomagnetic storms on the US electrical power grid. Based upon a storm as intense as the 1921 storm, Metatech estimated that within the United States, induced voltage and current spikes, combined with harmonic anomalies, would severely damage or destroy over 350 EHV power transformers critical to the functioning of the US grid and possibly impact well over 2000 EHV transformers worldwide.[7]

EHV transformers are made to order and custom-designed for each installation, each weighing as much as 300 tons and costing well over $1 million. Given that there is currently a three-year waiting list for a single EHV transformer (due to recent demand from China and India, lead times grew from one to three years), and that the total global manufacturing capacity is roughly 100 EHV transformers per year when the world’s manufacturing centers are functioning properly, you can begin to grasp the implications of widespread transformer losses. 
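To put those numbers in perspective, here is a minimal back-of-envelope sketch (Python) using only the figures quoted above: roughly 350 damaged EHV transformers in the United States, perhaps 2,000 worldwide, and a global production capacity of about 100 units per year. It is illustrative only, and it optimistically assumes the factories could keep running at full capacity after such an event.

# Back-of-envelope replacement timeline for damaged EHV transformers,
# using only the rough figures quoted in the text above.
damaged_us = 350            # Metatech estimate for the US grid
damaged_worldwide = 2000    # possible worldwide losses
production_per_year = 100   # approximate global manufacturing capacity
print(f"US alone, if it received ALL production: {damaged_us / production_per_year:.1f} years")
print(f"Worldwide, at full capacity: {damaged_worldwide / production_per_year:.0f} years")
# Roughly 3.5 years for the US and about 20 years worldwide, before even
# counting the three-year lead time or the loss of power to the factories.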

The loss of thousands of EHV transformers worldwide would cause a catastrophic grid collapse across much of the industrialized world. It would take years, at best, for the industrialized world to put itself back together after such an event, especially since most of the manufacturing centers that make this equipment would themselves be grappling with widespread grid failure.

Our Nuclear “Achilles Heel”

Five years ago, I visited the still highly contaminated areas of Ukraine and the Belarus border where much of the radioactive plume from Chernobyl descended on 26 April 1986. I challenge chief scientist John Beddington and environmentalists like George Monbiot or any of the pundits now downplaying the risks of radiation to talk to the doctors, the scientists, the mothers, children and villagers who have been left with the consequences of a major nuclear accident. It was grim. We went from hospital to hospital and from one contaminated village to another. We found deformed and genetically mutated babies in the wards; pitifully sick children in the homes; adolescents with stunted growth and dwarf torsos; fetuses without thighs or fingers and villagers who told us every member of their family was sick. This was 20 years after the accident, but we heard of many unusual clusters of people with rare bone cancers…. Villagers testified that 'the Chernobyl necklace' – thyroid cancer – was so common as to be unremarkable.
– John Vidal, "Nuclear's Green Cheerleaders Forget Chernobyl at Our Peril," The Guardian, April 1, 2011 [8]

What do extended grid blackouts have to do with potential nuclear catastrophes? Nuclear power plants are designed to disconnect automatically from the grid in the event of a local power failure or major grid anomaly; once disconnected, they begin the process of shutting down the reactor’s core. In the event of the loss of coolant flow to an active nuclear reactor’s core, the reactor will start to melt down and fail catastrophically within a matter of a few hours, at most. In an extreme GMD, nearly every reactor in the world could be affected.

It was a short-term cooling-system failure that caused the partial reactor core meltdown in March 1979 at Three Mile Island, Pennsylvania. Similarly, according to Japanese authorities, it was not direct damage from Japan's 9.0-magnitude Tohoku earthquake on March 11, 2011, that caused the Fukushima Daiichi nuclear disaster, but the loss of electric power to the reactors' cooling-system pumps when the backup batteries and diesel generators were wiped out by the ensuing tsunami. In the hours and days after the waves knocked out the cooling systems, the cores of reactors 1, 2 and 3 went into full meltdown and released hydrogen gas, fueling explosions that breached several reactor containment vessels and blew the roof off the building housing reactor 4's spent-fuel storage pond.

Of even greater danger and concern than the reactor cores themselves are the spent fuel rods stored in on-site cooling ponds. Lacking a permanent repository for spent nuclear fuel, nearly all nuclear reactor facilities rely on these so-called "temporary" containment ponds, which typically hold the accumulated spent fuel from ten or more decommissioned reactor cores. Most of these ponds are greatly overloaded and packed far more tightly than their original designs intended. They are generally housed in ordinary light-industrial buildings with concrete walls and corrugated steel roofs. Unlike the active reactor cores, which are encased in massive containment vessels with thick walls of concrete and steel, the buildings surrounding spent-fuel storage ponds would do practically nothing to contain radioactive contaminants in the event of a prolonged cooling-system failure.

Since spent fuel ponds typically hold far greater quantities of highly radioactive material than the active nuclear reactors locked inside reinforced containment vessels, they clearly present a far greater potential for the catastrophic spread of highly radioactive contaminants over huge swaths of land, polluting the environment for generations. A study for the Nuclear Regulatory Commission (NRC) determined that the "boil-down time" for spent-fuel containment ponds after loss of cooling-system power runs from 4 to 22 days before a Fukushima-like situation develops, depending upon the type of nuclear reactor and how recently its latest batch of fuel rods was removed from the reactor.[9]
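The NRC's 4-to-22-day range can be roughly reproduced with a simple heat balance. The sketch below (Python) is a hedged back-of-envelope estimate, not a safety calculation: the pool size, starting temperature and decay-heat values are illustrative assumptions of my own, chosen to be typical of a large spent fuel pool, and the true answer depends on the reactor type and fuel age, exactly as the study says.

# Rough boil-off time for a spent fuel pool after loss of cooling.
# All input values below are illustrative assumptions, not plant data.
water_mass_kg = 1.4e6      # roughly 1,400 cubic meters of water
start_temp_c = 30.0        # assumed initial pool temperature
specific_heat = 4186.0     # J/(kg*K) for water
latent_heat = 2.26e6       # J/kg to vaporize water
energy_joules = water_mass_kg * (specific_heat * (100.0 - start_temp_c) + latent_heat)
for decay_heat_mw in (6.0, 2.0):   # recently offloaded core vs. older fuel
    days = energy_joules / (decay_heat_mw * 1e6) / 86400.0
    print(f"{decay_heat_mw:.0f} MW of decay heat -> about {days:.0f} days to boil dry")
# Prints roughly 7 days at 6 MW and 21 days at 2 MW, consistent with the
# 4-to-22-day range cited above.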

Reactor fuel rods have a protective zirconium cladding which, if superheated while exposed to air, will burn with intense, self-sustaining heat, much like a magnesium fire, releasing highly radioactive aerosols and smoke. According to nuclear whistleblower and former senior vice president for Nuclear Engineering Services Arnie Gundersen, once a zirconium fire has started, its extreme temperature and high reactivity mean that contact with water will dissociate the water into hydrogen and oxygen gases, almost certainly leading to violent explosions; the worst thing you could do, he says, is try to quench such a fire with streams of water. Gundersen believes the massive explosion that blew the roof off the spent fuel pond building at Fukushima was caused by zirconium-induced hydrogen dissociation.[10]

Had it not been for heroic efforts on the part of Japan’s nuclear workers to replenish waters in the spent fuel pool at Fukushima, those spent fuel rods would have melted down and ignited their zirconium cladding, which most likely would have released far more radioactive contamination than what came from the three reactor core meltdowns. Japanese officials have estimated that Fukushima Daiichi has already released just over half as much total radioactive contamination as was released by Chernobyl into the local environment, but other sources estimate it could be significantly more than at Chernobyl. In the event of an extreme GMD-induced long-term grid collapse covering much of the globe, if just half of the world’s spent fuel ponds were to boil off their water and become radioactive, zirconium-fed infernos, the ensuing contamination could far exceed the cumulative effect of 400 Chernobyls.

Electromagnetic Pulse (EMP) Attack

Many of the control systems we considered achieved optimal connectivity through Ethernet cabling. EMP coupling of electrical transients to the cables proved to be an important vulnerability during threat illumination…. The testing and analysis indicate that the electronics could be expected to see roughly 100 to 700 ampere current transients on typical Ethernet cables. Effects noted in EMP testing occurred at the lower end of this scale. The bottom line observation at the end of the testing was that every system failed when exposed to the simulated EMP environment.
— Report of the Commission to Assess the Threat to the United States from Electromagnetic Pulse (EMP) Attack [11]

Electromagnetic pulses (EMPs) and solar superstorms are two different, but related, categories of events that are often described as high-impact, low-frequency (HILF) events. Events categorized as HILF don't happen very often, but if and when they do, they have the potential to severely affect the lives of millions of people. Think of an EMP as a super-powerful radio wave capable of inducing damaging voltage spikes in electrical wires and electronic devices across vast geographical areas. (Note that the geomagnetic effects of solar storms are also described as "natural EMP.")

What is generally referred to as an EMP strike is the deliberate detonation of a nuclear device at high altitude, roughly defined as somewhere between 24 and 240 miles (40 and 400 kilometers) above the surface of the Earth. A nuclear detonation of this type can seriously damage electronics and electrical power grids along its line of sight, covering an area on the order of 1,500 miles (2,500 kilometers) in diameter, a span roughly equal to the distance between Quebec City in Canada and Dallas, Texas.
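That footprint follows from simple line-of-sight geometry. The short sketch below (Python) uses the standard horizon-distance approximation, d ≈ sqrt(2Rh), to estimate how far from ground zero a burst at a given altitude remains above the horizon; the two altitudes are the 40-kilometer and 400-kilometer bounds given above, and the results bracket the roughly 2,500-kilometer figure.

import math
# Line-of-sight footprint of a high-altitude burst, using the
# horizon-distance approximation d = sqrt(2 * R * h).
EARTH_RADIUS_KM = 6371.0
def footprint_radius_km(burst_altitude_km):
    return math.sqrt(2.0 * EARTH_RADIUS_KM * burst_altitude_km)
for altitude_km in (40.0, 400.0):
    diameter_km = 2.0 * footprint_radius_km(altitude_km)
    print(f"burst at {altitude_km:.0f} km -> footprint about {diameter_km:,.0f} km across")
# About 1,400 km across at 40 km altitude and 4,500 km across at 400 km,
# bracketing the ~2,500 km (1,500 mile) span quoted above.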

The concern is that some rogue state or terrorist organization might build its own nuclear device from scratch or buy one illegally, procure a Scud missile (or similar weapon) on the black market and launch the nuclear device from a large fishing boat or freighter somewhere off the coast of the United States, causing grid collapse and widespread damage to electronic devices across roughly 50 percent of America. Much like an extreme GMD, a powerful EMP attack would also cause widespread grid collapse, but one limited to a much smaller geographical area.

A powerful EMP from a sub-orbital nuclear detonation would cause extreme electromagnetic effects, starting with an initial, short-duration, “speed of light” pulse, referred to as an “E1” effect, followed by a middle-duration pulse called an “E2” effect, followed by a longer-duration disturbance known as an “E3” effect. The “E1” effect lasts a few nanoseconds and is similar to massive discharges of electrostatic sparks, which are particularly damaging to digital microelectronic chips used in most modern electronic equipment.

The "E2" effects last a fraction of a second and are comparable to many thousands to millions of lightning strikes hitting across a widespread area at almost exactly the same time. The E3 effect of a nuclear-induced EMP starts after about half a second and may continue for several minutes. It can be thought of as a "long, slow burn," and, electromagnetically, it is quite similar to the effects of an extreme GMD, except that the latter may continue for hours or days.

A “successful” EMP attack launched against the US would most likely result in the immediate collapse of the grid across roughly 50 percent of the country, a stock market crash and critical failures in many affected areas’ electronic systems that control nuclear reactors, chemical plants, telecommunications systems and industrial processes. These systems include programmable logic controllers (PLC), digital control systems (DCS), and supervisory control and data acquisition systems (SCADA).

The only good news about an EMP strike is that its effects would cover a much smaller area than an extreme GMD, so a significant portion of the rest of the United States, as well as the outside world, would be left intact and able to lend a hand toward rebuilding critical infrastructure in the affected areas. Still, imagine the near-total loss of functioning infrastructure across an area of about a million square miles (roughly 2.6 million square kilometers), the equivalent of 50 Hurricane Katrinas happening simultaneously, and you will have some idea of the potentially crippling effect of an EMP attack from a single, medium-sized, sub-orbital nuclear detonation!

Preventing Armageddon

The Congressionally mandated EMP Commission has studied the threat of both EMP and extreme GMD events and made recommendations to the US Congress to implement protective devices and procedures to ensure the survival of the grid and other critical infrastructures in either event. John Kappenman, author of the Metatech study, estimates that it would cost about $1 billion to build special protective devices into the US grid to protect its EHV transformers from EMP or extreme GMD damage and to build stores of critical replacement parts should some of these items be damaged or destroyed. Kappenman estimates that it would cost significantly less than $1 billion to store at least a year’s worth of diesel fuel for backup generators at each US nuclear facility and to store sets of critical spare parts, such as backup generators, inside EMP-hardened steel containers to be available for quick change-out in the event that any of these items were damaged by an EMP or GMD.[12]

For the cost of a single B-2 bomber or a tiny fraction of the Troubled Asset Relief Program (TARP) bank bailout, we could invest in preventative measures to avert what might well become the end of life as we know it. There is no way to protect against all possible effects from an extreme GMD or an EMP attack, but we could implement measures to protect against the worst effects. Since 2008, Congress has narrowly failed to pass legislation that would implement at least some of the EMP Commission’s recommendations.[13]

We have a long way to go to make our world EMP- and GMD-safe. Citizens can do their part by pushing for legislation that moves toward this goal and by working in their homes and communities to develop local resilience and self-reliance, so that in the event of a long-term grid-down scenario, we might make the most of a bad situation. The same tools espoused by the Transition movement for developing local self-reliance and resilience to cope with the twin effects of climate change and peak oil could also serve communities well in the event of an EMP attack or extreme GMD. If our country were to implement safeguards to protect our grid and nuclear power plants from EMP, it would also eliminate the primary incentive for a terrorist to launch an EMP attack. The sooner we take these actions, the less chance that an EMP attack will occur.

For more information or to get involved, see http://empactamerica.org, http://survive-emp.com and http://www.transitionnetwork.org, or contact your Congressperson at http://www.contactingthecongress.org.

Endnotes
[1] Bill Dedman, “Nuclear Neighbors: Population Rises Near Nuclear Reactors,” MSNBC.com. Accessed December 2011.

[2] Dina Cappiello, “Long Blackouts Pose Risk to U.S. Nuclear Reactors,” Associated Press, March 29, 2011.

[3] Lawrence E. Joseph, “The Sun Also Surprises,” New York Times, August 15, 2010. Accessed August 2010.

[4] S. M. Silverman and E. W. Cliver, "Low-Altitude Auroras: The Magnetic Storm of 14-15 May 1921," Journal of Atmospheric and Solar-Terrestrial Physics 63 (2001), pp. 523-535. Additionally, "High-Impact, Low-Frequency Event Risk to the North American Bulk Power System: A Jointly Commissioned Summary Report of the North American Electric Reliability Corporation and the U.S. Department of Energy's November 2009 Workshop," June 2010, p. 68.

[5] Committee on the Societal and Economic Impacts of Severe Space Weather Events: A Workshop, "Severe Space Weather Events: Understanding Societal and Economic Impacts Workshop Report," National Research Council of the National Academies (2008), pp. 7-13 and p. 100. Additionally, E. W. Cliver and L. Svalgaard, "The 1859 Solar-Terrestrial Disturbance and the Current Limits of Extreme Space Weather Activity," Solar Physics 224 (2004), pp. 407-422.

[6] Richard A. Lovett, “What if the Biggest Solar Storm on Record Happened Today?” National Geographic News, March 2, 2011. Accessed December 2011.

[7] John Kappenman, “Geomagnetic Storms and Their Impacts on the U.S. Power Grid,” Metatech Corporation, prepared for Oak Ridge National Laboratory, Meta-R-319, January 2010, p. 2-29.

[8] John Vidal, “Nuclear’s Green Cheerleaders Forget Chernobyl at Our Peril,” Guardian.co.uk, April 1, 2011.  Accessed May 2011.

[9] NUREG-1738, “Technical Study of Spent Fuel Pool Accident Risk at Decommissioning Nuclear Power Plants,” February 2001, as reported in “Petition for Rulemaking: Docket No. PRM-50-96,” Foundation for Resilient Societies before the Nuclear Regulatory Commission, p. 3-9 and 49-50. Accessed December, 2011.

[10] Arnold Gundersen, interview by author, November 2011.

[11] “Report of the Commission to Assess the Threat to the United States from Electromagnetic Pulse (EMP) Attack: Critical National Infrastructures,” April, 2008, p. 6.

[12] John Kappenman, interview by author, December 2011.

[13] Dr. Peter Vincent Pry, “Statement Before the Congressional Caucus on EMP,” EMPact America, February 15, 2011. Accessed November 2011.


Planet Under Pressure 2012

See URL: http://www.planetunderpressure2012.net/

 

 


 

Sustainable Cities: Meeting the Challenge of Rapid Urbanization the Focus of “Planet Under Pressure 2012”

Published March 28, 2012 10:21 AM



Addressing the social, environmental and economic challenges associated with rapid and growing urbanization is bringing some 3,000 experts from around the world together in London this week for the “Planet Under Pressure 2012” conference.

 

With world population forecast to increase from 7 billion today to more than 9 billion by 2050, humanity’s urban footprint will take up 1.5 million more square kilometers of land by 2030 at current rates, an area comparable to that of France, Germany and Spain combined. That translates into an average 1 million more city dwellers every week for the next 38 years, with the world’s total urban population forecast to increase from 3.5 billion today to 6.3 billion by 2050, according to Planet Under Pressure 2012 conference organizers.
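(As a quick check on that land-area comparison, the minimal Python sketch below adds up approximate country areas; the round figures are general reference values, not conference data.)

# Rough check of the "France + Germany + Spain" comparison,
# using approximate country areas in square kilometers.
areas_km2 = {"France": 551_000, "Germany": 357_000, "Spain": 506_000}
combined_km2 = sum(areas_km2.values())
projected_new_urban_km2 = 1_500_000   # additional urban land by 2030, per the article
print(f"Combined area of the three countries: {combined_km2:,} km^2")
print(f"Projected new urban land by 2030:     {projected_new_urban_km2:,} km^2")
# Roughly 1.41 million km^2 versus 1.5 million km^2, so the comparison holds.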

These trends are practically impossible to stop, which means the real question is not whether urbanization should take place, but how best to urbanize, states Dr. Michael Fragkias of Arizona State University, one of nearly 3,000 conference participants.

“Today’s ongoing pattern of urban sprawl puts humanity at severe risk due to environmental problems,” Fragkias adds, issues that conference attendees intend to discuss, debate and offer solutions to. “Dense cities designed for efficiency offer one of the most promising paths to sustainability, and urbanization specialists will share a wealth of knowledge available to drive solutions.”

Article continues: http://www.triplepundit.com/2012/03/sustainable-cities-meeting-challenge-rapid-urbanization-focus-planet-pressure-2012/

IPCC predicts rise in extreme climate events


T.V. Padma

 

28 March 2012

Children in a flooded street: heavy rainfall events are expected to become more common in many areas of the world this century. (Photo: Flickr/IFRC)

[LONDON] Climate change could mean unusually high temperatures occurring much more often in most parts of the world by the end of the century, according to a special report on extreme weather events from the Intergovernmental Panel on Climate Change (IPCC).

"A hottest day that occurs once in 20 years is likely to become a one-in-two year event, except in the high latitudes of the northern hemisphere, where it is likely to be one-in-five years," according to Sonia Seneviratne, a climate expert at ETH Zurich, the Swiss Federal Institute of Technology in Zurich.

Seneviratne was speaking at the Planet Under Pressure conference, which is being held in London this week (26–29 March).

She was a member of an IPCC group set up in 2009 to compile the Special Report on Managing the Risks of Extreme Events and Disasters to Advance Climate Change Adaptation (SREX), which was published today (28 March).


Contributors to the report include experts on disaster recovery and risk management, together with members of the physical sciences and climate change mitigation and adaptation disciplines.

The report marks the first time that the scientific literature on extreme events has been synthesised by a single team, Seneviratne told the conference.

It assesses observations and predicts changes in temperature extremes, heavy rainfall and drought for 26 regions. According to the IPCC, it offers “an unprecedented level of detail regarding observed and expected changes in weather and climate extremes, based on a comprehensive assessment of over 1,000 scientific publications”.

Seneviratne said that it is likely that both the frequency of heavy rainfall, and the proportion of total rainfall from heavy rains, will increase in many areas of the world this century. A “high” daily rainfall that has typically occurred once in 20 years is likely to happen every 5 to 15 years.
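(To make the return-period language concrete, the short Python sketch below converts a return period into an annual probability and into the chance of at least one such event over 20 years. It assumes, purely for illustration, that events are independent from year to year; the numbers are derived from the figures quoted above, not taken from the report itself.)

# Convert return periods into annual and 20-year probabilities,
# assuming (for illustration only) independence between years.
def prob_at_least_one(return_period_years, horizon_years):
    annual_p = 1.0 / return_period_years
    return 1.0 - (1.0 - annual_p) ** horizon_years
for label, rp in [("1-in-20-year hottest day (today)", 20.0),
                  ("1-in-2-year hottest day (projected)", 2.0)]:
    print(f"{label}: {100.0 / rp:.0f}% chance each year, "
          f"{prob_at_least_one(rp, 20):.0%} chance over 20 years")
# A 5% annual chance becomes about a 64% chance over two decades; a 50%
# annual chance makes such an extreme all but certain over the same period.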

The average tropical cyclone maximum wind speed is likely to increase, but the global frequency of tropical cyclones is likely to decrease or remain unchanged, she said.

Increased dry weather is predicted for southern Africa, north-eastern Brazil, central Europe, Mediterranean countries and central North America.

The report also notes that observations gathered since 1950 suggest it is “very likely” – with a 90 to 100 per cent probability – that there has been an overall decrease in the number of cold days and nights, and an increase in the number of warm days and nights.

Seneviratne said that the report found that many of the projections contained in the IPCC's fourth assessment report, which was released in 2007, are robust – for example, projections relating to temperature extremes. However, some projections, such as those relating to droughts and tropical cyclones, "need to be revised, based on new evidence and literature".

The IPCC says that “policies to avoid, prepare for, respond to and recover from the risks of disaster can reduce the impact of these events and increase the resilience of people exposed to extreme events”.  


A spectacular snap of one billion stars in the Milky Way

Far out: A spectacular snap of one billion stars in the Milky Way, which took astronomers 10 years to create

  • Scientists produced the picture by combining infra-red light images from two telescopes in the northern and southern hemispheres
  • The archived information, known as the Vista Data Flow System, will allow scientists to carry out future research without generating further data
 

By Jill Reilly

PUBLISHED: 18:02 EST, 28 March 2012 | UPDATED: 20:10 EST, 28 March 2012

Ten years of patience and star-gazing by astronomers has finally culminated in this incredible image showcasing around one billion stars in the Milky Way.

It was produced by scientists who combined infra-red light images from two telescopes in the northern and southern hemispheres.

Astronomers from the UK and Chile gathered the data, which was then processed and archived by teams at the universities of Edinburgh and Cambridge.

Star-filled sky: Around one billion stars in the Milky Way can be seen together for the first time in an image captured over a decade by astronomers

Star-filled sky: This incredible image shows in detail the star-forming area in the Milky Way. Scientists produced the picture by combining infra-red light images from two telescopes in the northern and southern hemispheres

 

They have now made it available for studies around the world and hope it will change the way scientists carry out future research.

Dr Nick Cross, of the University of Edinburgh’s school of physics and astronomy, said: ‘This incredible image gives us a new perspective of our galaxy and illustrates the far-reaching discoveries we can make from large sky surveys.

 

‘Having data processed, archived and published by dedicated teams leaves other scientists free to concentrate on using the data and is a very cost-effective way to do astronomy.’

Dr Cross said the archived information on the billions of stars, known as the Vista Data Flow System, will allow scientists to carry out research in future without needing to generate further data.

 
Zoomed out: Astronomers from the UK and Chile gathered the data which was then processed and archived by teams at the universities of Edinburgh and Cambridge who, in turn, have made it available to studies around the world


As well as being published online, the image is being presented at the National Astronomy Meeting in Manchester today and shows the plane of the Milky Way galaxy from Earth's perspective.

It combines data from the UK Infrared Telescope in Hawaii and the Vista telescope in Chile.

Astronomers used infra-red radiation instead of visible light to enable them to see through much of the dust in the Milky Way and record details of the centre of the galaxy.

The work was supported by government body the Science and Technology Facilities Council.

A Climate For Change

Katharine Hayhoe came out of the closet in 2009. In the decade and a half since she’d gotten her Ph.D. in atmospheric sciences, her professional colleagues had known Hayhoe as an increasingly prominent expert on climate change — the author or co-author of scientific papers, textbook chapters and major reports on the science and the impacts of global warming and, since 2005, a faculty member at Texas Tech University, in Lubbock.

But in the fall of 2009, Hayhoe and her husband, Andrew Farley, published a book titled A Climate for Change: Global Warming Facts for Faith-Based Decisions, and a fact she’d always kept out of her professional life was suddenly very public. Hayhoe and Farley are evangelical Christians, and Farley, an associate professor of applied linguistics at Texas Tech, is also the pastor of a local church.

 


“In the U.S., evangelical Christians tend to be politically conservative, and even anti-science,” said the Canadian-born Hayhoe. “So in scientific circles, saying you’re an evangelical Christian is like saying ‘I check my brain at the door.’ I seriously wondered what this would do to my scientific reputation — was I tossing everything I’d done in the toilet?”

She needn't have worried. The book won praise not only from religious leaders, but also from hard-nosed scientists and environmentalists, including a past president of the American Association for the Advancement of Science and the president of the National Wildlife Federation. Since it came out, moreover, Hayhoe has been busier than ever professionally: in 2011, she served on a National Academy of Sciences committee on stabilization targets for greenhouse gases, and spearheaded an effort to have Texas Tech co-host one of six Regional Climate Centers sponsored by the Department of the Interior.

Increasingly, though, Hayhoe sees her mission as one of outreach. At one level, that means talking to professionals who need information on climate to do their jobs. At Texas Tech, for example, she offers a course on climate science and policy for grad students — in any discipline.

“We have civil engineers,” she said, “water resource people, architects, natural resource managers, agricultural scientists, geoscientists, wildlife biologists . . . and we had so many requests from faculty to audit that I’m giving a one-week intensive course for them as well.” Hayhoe has also just finished a book for the U.S. Fish and Wildlife Service on how to use climate science and climate models to inform decisions about how to manage ecosystems.

But she’s also talking to ordinary people who are simply skeptical about the whole business. “When we first came to West Texas,” Hayhoe said, “I knew that most people in the area didn’t accept that climate change is real. I felt a little bit like a missionary going to Africa. I thought I might end up in a stew pot.”

Within a couple of months after her and Farley’s arrival, though, Hayhoe began getting speaking invitations at women’s groups, churches, grade schools. “People had good, legitimate questions about why they should believe climate change is caused by humans,” she said, “and telling them, ‘you’re an idiot’ is not going to change their minds. But many people in conservative communities feel that this is what they’re being told.”

She also got plenty of questions by way of her husband, who was invited to pastor the nondenominational, evangelical Ecclesia church soon after they came to Texas. “People started to realize that if the pastor’s wife took climate change seriously, maybe it wasn’t just a plot by liberal tree huggers who want Al Gore to rule the world.” The congregation was too polite to ask Hayhoe about the issue directly, but they did ask her husband. “Andrew got millions of questions,” she said. “He would tell them, ‘I’ll find out.’ He’s a very conservative person, went to a Southern Baptist school, and he would tell me, ‘this is a good question, you have to have a good answer.’ ”

 

But good answers about the science, Hayhoe said, are not always enough, because much of the opposition is emotional, not fact based. So she tries to make a connection based on what she has in common with her listeners. "I can't just say, 'I'm a scientist,'" she said. "I am a human, a mother, an evangelical Christian who knows that Jesus said to love God and love your neighbor as yourself. The impacts of climate change are going to fall disproportionately on the poorest. Who doesn't believe we should take care of the poor and needy? When I start from that place, I've seen dramatic shifts. People say 'what can I do about it?'"

But that’s not always the case. Last year, recalled Hayhoe, she went to speak to a group of petroleum geologists. “These are white male engineers who study fossil fuels, which may be four categories of people most hostile to hearing about climate change. I felt like I was going into the lion’s den.”

Indeed, at least one member of the audience accused her of making it all up in order to score money from the government. But afterward, she got an email from someone who'd been there, saying, "I still disagree, but you were courteous, and you don't deserve what was being said." It was, Hayhoe said, "the best email ever. If the entire U.S. were in that situation where we're respectfully disagreeing, but talking, we'd be in a very different position than we are today."

It doesn't look as though that will happen anytime soon, however. Back in December, Rush Limbaugh got wind that Hayhoe had contributed a chapter to a book Newt Gingrich was putting together on the environment. Limbaugh called out Gingrich for working with a "climate babe" — and Gingrich, already under fire from conservatives for once having taken climate change seriously, dropped the chapter like a hot potato. "Nice to hear that Gingrich is tossing my #climate chapter in the trash. 100+ unpaid hrs I cd've spent playing w my baby," she tweeted shortly after she found out.

Since the Limbaugh incident, Hayhoe has gotten more than her share of hate mail from people who have no interest in respectful disagreement, but it hasn’t slowed her. Next month, she’s going to speak to a cotton growers’ association — if anything, she said, they’re even more conservative than petroleum geologists. “The head of the association goes to our church,” Hayhoe said. “He told me, ‘if you want to come talk to us, just don’t mention global climate change or Al Gore.’”

So she won't, and she'll treat the growers with respect, and if anyone can get even some of them to take the threat seriously, Hayhoe's the one to do it. "I'm optimistic in one sense," she said. "I've seen that we can move people from debating science to debating solutions." But, she adds, "I'm not sure that we can do it in time to avoid serious impacts. I'm really struggling now with a question I can't yet answer: what could we be doing more effectively to move people from x to y?"

Given Hayhoe's energy and commitment, however, and maybe most of all the fact that she believes she's doing God's work, it would be foolish to doubt she'll come up with the answer.

Planet Under Pressure: ‘a Much Hotter Planet’

LONDON, UK, March 26, 2012 (ENS) – Scientists at the Planet under Pressure conference in London today added their stern warning to others issued over the past several weeks – time is running out to minimize the risk of irreversible, long-term climate change and other dramatic changes to Earth’s life support system.

The 2,800 scientists, policymakers and business representatives opened their four-day conference with a reading of Earth’s vital signs and an ominous prognosis, “without immediate action, societies everywhere face an uncertain future on what may become a much hotter planet.”

Lake Hume, the furthest downstream of the major reservoirs on Australia’s Murray River system (Photo by Tim Keegan)

Hosted by The Royal Society UK's Living with Environmental Change program, this is the largest-ever gathering of experts in global sustainability, held ahead of the UN Rio+20 summit in Brazil in June.

To Professor Will Steffen, a conference speaker, there are several potentially dangerous environmental “tipping points,” among them the melting of the polar ice sheets and the thawing of perennially frozen northern permafrost soils.

Steffen, a global change expert from the Australian National University, says, “The last 50 years have without doubt seen the most rapid transformation of the human relationship with the natural world in history.”

Steffen calls the “explosion in human activity” over the past several decades, “The Great Acceleration.”

“Many human activities reached take-off points sometime in the 20th century and sharply accelerated towards the end of the century. It is the scale and speed of the Great Acceleration that is truly remarkable,” Steffen said. “This has largely happened within one human lifetime.”

“Where on Earth are we going?” he asks.

Key indicators of the planet’s state, conference speakers agreed, include growing consumption of freshwater supplies and energy by swelling numbers of people worldwide, even as billions of people lack even the most basic elements of well-being.

Professor Will Steffen (Photo by TEDx Canberra)

There are higher levels of carbon dioxide in the atmosphere. Phosphorus extraction and fertilizer production send nutrient runoff into the sea, causing huge dead zones in coastal areas.

Then, the scientists report, we have rising air and ocean temperatures; melting sea ice, polar ice sheets and Arctic permafrost; rising sea levels and ocean acidification; biodiversity loss and land use changes.

At a planetary level, humanity is altering the global carbon cycle, water cycle and nitrogen cycle, warns Steffen. He worries about the release of the greenhouse gas carbon dioxide from melting permafrost, because permafrost stores the equivalent of twice the carbon currently in the atmosphere.

Steffen warns about the "compost bomb," another potential contributor to a hotter Earth: microbial respiration in thawed soils, he says, is "leading to a tipping point where heat is produced more rapidly than it can be dissipated."

Permafrost on the Yukon Delta National Wildlife Refuge, Alaska (Photo by USFWS Alaska)

"All these environmental tipping point phenomena are part of a single system," Steffen says, something that becomes clear "when we look at how the Earth has behaved in the past."

“The key point is,” he said, “we may reach a threshold for the Earth as a whole this century. Either we turn around a lot of these trends – the carbon dioxide trend, deforestation and so on – or we allow them to continue and push the Earth as a whole across a threshold whereby a lot of these tipping elements are activated and the world moves into a new, much warmer state.”

“There are signs that some drivers of global change are slowing or changing,” said another conference speaker, Professor Diana Liverman, co-director of the Institute of the Environment at the University of Arizona and visiting Oxford University academic.

Liverman says Earth has entered a new geological epoch hallmarked by the profound ecosystem impacts of a single species, humans, a timespan she calls the "Anthropocene."

Pressure will ease off the planet somewhat, says Liverman. “Population growth is slowing and will level off; the intensity of energy and carbon required for a unit of production is declining; agricultural intensification is slowing and forests are starting to expand in some regions.”

“On the other hand,” she said, “average resource consumption per person, already high in some regions, is growing steeply in emerging economies even as many poor people cannot meet basic human needs. In some countries people are consuming far too much, including carbon, water and other resources embodied in trade. We have a long way to go to turn things around.”

The 1:10-minute time-lapse history of human global CO2 emissions, online at http://www.youtube.com/watch?v=MEMse22h8c8, captures the growth of CO2 emissions from their start with the UK's Industrial Revolution in 1750, radiating across Europe to North America, then Asia and worldwide.

Signs of Thawing Permafrost Revealed from Space


Seasonal freezing patterns on land surfaces in the northern hemisphere have varied over recent years. (Credit: Vienna University of Technology)

ScienceDaily (Mar. 27, 2012) — Satellites are seeing changes in land surfaces at northern latitudes in high detail, indicating thawing permafrost. Thawing permafrost releases greenhouse gases into the atmosphere in parts of the Arctic, exacerbating the effects of climate change.
 
Permafrost is ground that remains at or below 0°C for at least two consecutive years and usually appears in areas at high latitudes such as Alaska, Siberia and Northern Scandinavia, or at high altitudes like the Andes, Himalayas and the Alps.

About half of the world’s underground organic carbon is found in northern permafrost regions. This is more than double the amount of carbon in the atmosphere in the form of the greenhouse gases carbon dioxide and methane.

The effects of climate change are most severe and rapid in the Arctic, causing the permafrost to thaw. When it does, it releases greenhouse gases into the atmosphere, exacerbating the effects of climate change.

Although permafrost cannot be directly measured from space, factors such as surface temperature, land cover and snow parameters, soil moisture and terrain changes can be captured by satellites.  

The use of satellite data, such as that from ESA's Envisat along with other Earth-observing satellites and intensive field measurements, allows the permafrost research community to get a panoptic view of permafrost phenomena, from local to circum-Arctic scales.

"Combining field measurements with remote sensing and climate models can advance our understanding of the complex processes in the permafrost region and improve projections of the future climate," said Dr Hans-Wolfgang Hubberten, head of the Alfred Wegener Institute Research Unit (Germany) and President of the International Permafrost Association.

Last month, more than 60 permafrost scientists and Earth observation specialists came together for the Third Permafrost User Workshop at the Alfred Wegener Institute in Potsdam, Germany, to discuss their latest findings.

“The already available Permafrost products provide researchers with valuable datasets which can be used in addition to other observational data for climate and hydrological modelling,” said Dr Leonid Bobylev, the director of the Nansen Centre in St. Petersburg.

“However, for climate change studies – and in particular for evaluation of the climate models’ performance – it is essential to get a longer time series of satellite observational data.

“Therefore, the Permafrost related measurements should be continued in the future and extended consistently in the past.”

ESA will continue to monitor the permafrost region with its Envisat satellite and the upcoming Sentinel satellite series for Europe’s Global Monitoring for Environment and Security (GMES) programme.

West Antarctic Ice Shelves Tearing Apart at the Seams


 

Rifts along the northern shear margin of Pine Island Glacier (upper right of image). (Credit: Michael Studinger, NASA’s Operation IceBridge.)

 

 

ScienceDaily (Mar. 27, 2012) — A new study examining nearly 40 years of satellite imagery has revealed that the floating ice shelves of a critical portion of West Antarctica are steadily losing their grip on adjacent bay walls, potentially amplifying an already accelerating loss of ice to the sea.

The most extensive record yet of the evolution of the floating ice shelves in the eastern Amundsen Sea Embayment in West Antarctica shows that their margins, where they grip onto rocky bay walls or slower ice masses, are fracturing and retreating inland. As that grip continues to loosen, these already-thinning ice shelves will be even less able to hold back grounded ice upstream, according to glaciologists at The University of Texas at Austin’s Institute for Geophysics (UTIG).

Reporting in the Journal of Glaciology, the UTIG team found that the extent of ice shelves in the Amundsen Sea Embayment changed substantially between the beginning of the Landsat satellite record in 1972 and late 2011. These changes were especially rapid during the past decade. The affected ice shelves include the floating extensions of the rapidly thinning Thwaites and Pine Island Glaciers.

“Typically, the leading edge of an ice shelf moves forward steadily over time, retreating episodically when an iceberg calves off, but that is not what happened along the shear margins,” says Joseph MacGregor, research scientist associate and lead author of the study. An iceberg is said to calve when it breaks off and floats out to sea.

“Anyone can examine this region in Google Earth and see a snapshot of the same satellite data we used, but only through examination of the whole satellite record is it possible to distinguish long-term change from cyclical calving,” says MacGregor.

The shear margins that bound these ice shelves laterally are now heavily rifted, resembling a cracked mirror in satellite imagery until the detached icebergs finally drift out to the open sea. The calving front then retreats along these disintegrating margins. The pattern of marginal rifting and retreat is hypothesized to be a symptom, rather than a trigger, of the recent glacier acceleration in this region, but this pattern could generate additional acceleration.

“As a glacier goes afloat, becoming an ice shelf, its flow is resisted partly by the margins, which are the bay walls or the seams where two glaciers merge,” explains Ginny Catania, assistant professor at UTIG and co-author of the study. “An accelerating glacier can tear away from its margins, creating rifts that negate the margins’ resistance to ice flow and causing additional acceleration.”

The UTIG team found that the largest relative glacier accelerations occurred within and upstream of the increasingly rifted margins.

The observed style of slow-but-steady disintegration along ice-shelf margins has been neglected in most computer models of this critical region of West Antarctica, partly because it involves fracture, but also because no comprehensive record of this pattern existed. The authors conclude that several rifts present in the ice shelves suggest that they are poised to shrink further.

Lawsuit Seeks Halt to TSA’s Use of Full-Body Scanners at Airports amid Safety Concerns

See URL: http://www.democracynow.org/2012/3/28/lawsuit_seeks_halt_to_tsas_use
