


Lack Of Earthquakes Shakes Up Quake Forecast Model
By Lidia Wasowicz
UPI Senior Science Writer
9-19-02

PALO ALTO, Calif. (UPI) -- To geophysicists' dismay, a method widely used to assess long-term earthquake hazards in the United States, Japan, New Zealand and other rattle-prone areas has failed to hold up at what was considered the ideal testing ground.
 
A long-anticipated temblor forecast by the technique for the tiny central California town of Parkfield -- one of the most seismically active and closely monitored sites in the world -- never materialized. The town's sustained stillness, when it was supposed to toss and turn within a predictable period of time, has shaken scientists' belief in the validity of what has become a standard tool for predicting quakes in the most active regions.
 
Government agencies in a number of Pacific Rim countries routinely use the strategy for long-range hazard assessments, such as the widely publicized 1999 U.S. Geological Survey report projecting a 70 percent probability a large quake will strike the San Francisco Bay area by 2030.
 
"The message I would send to my geophysical colleagues about this model is, 'Use with caution,'" said Eric Segall, professor of geophysics at Stanford University and lead author of the study, which will be published in the Sept. 19 issue of the British journal Nature.
 
"(The new results) should disturb many of us who have relied upon the concept of time-predictability in our forecasts," Ross Stein of the USGS in Menlo Park, Calif., who analyzed the findings, told United Press International. "Alternative approaches to earthquake forecasting must now be explored."
 
The time-predictable recurrence model assumes a temblor will strike on a segment of a fault -- a jolt-producing rupture in Earth's crust -- when the stress released by the previous shaker has built back up to its pre-quake level. Although that underlying premise remains intact, the model's method of timing the recurrence has not held up.
 
According to computations based on the model, the moderate, magnitude-6 earthquake that rocked Parkfield -- a picturesque rural town halfway between San Francisco and Los Angeles -- in 1966 should have been followed by another by 1987, said Segall and Stanford geophysics graduate student Jessica Murray. The fact that it has not casts serious doubt on the model, developed in 1980 by Japanese geophysicists building on the premise that earthquakes in fault zones are caused by the constant buildup and release of strain in Earth's crust.
 
"We think it's rather damning that it doesn't work here," Segall said in an interview with UPI. "If it's going to work anywhere, you'd think it would work here."
 
Parkfield provides the perfect testing ground, perched along a 15.5-mile (25-kilometer) stretch of the notorious 30-million-year-old San Andreas fault, whose northern section suddenly lurched by as much as 21 feet (6.4 meters) on April 18, 1906, setting off the great San Francisco earthquake.
 
More than 20 years of intense monitoring by instruments on or near the fault surface have produced heaps of geological clues to the physical processes underlying recurring earthquakes, including a wide array of measurements spanning the entire quake cycle since the last moderate event in 1966.
 
The time lapse between shakers at Parkfield is relatively short, an average of 22 years, compared with the century or two that typically separates temblors elsewhere in earthquake country. Moderate temblors of about magnitude 6 have struck the Parkfield section of the San Andreas fault at fairly regular intervals -- in 1857, 1881, 1901, 1922, 1934 and 1966 -- prompting the National Earthquake Prediction Evaluation Council in 1994 to declare Parkfield "the best identified locale to trap an earthquake."
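That 22-year average is straightforward arithmetic on the dates above. A quick sketch in Python (illustrative only; the variable names are this writer's, not the researchers') reproduces the figure:

# Parkfield earthquake years cited above.
events = [1857, 1881, 1901, 1922, 1934, 1966]

# Gaps between successive quakes: 24, 20, 21, 12 and 32 years.
intervals = [b - a for a, b in zip(events, events[1:])]

print(sum(intervals) / len(intervals))  # 21.8 -- the "average of 22 years"

Note the spread, though: the individual gaps range from 12 to 32 years, which hints at why pinning down the next event has proved so difficult.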
 
As an added bonus, Parkfield's tectonic setting is study-perfect in its simplicity. Nearly all the earthquake action centers on the San Andreas rupture, in contrast to such hotbeds of quake activity as the San Francisco Bay area where rumblings emanate from a network of faults, including the San Andreas, Calaveras and Hayward.
 
The San Andreas defines an 800-mile (1,300-km) section of the boundary between the Pacific and North American plates, two of a number of gigantic shifting slabs that break up Earth's surface.
 
"With a plate boundary like the San Andreas, you have the North American plate on one side and the Pacific plate on the other," Segall said. "The two plates are moving at a very steady rate with respect to one another, so strain is being put into the system at an essentially constant rate."
 
When an earthquake occurs on the fault, a certain amount of accumulated strain is released, Murray explained.
 
"Following the quake, strain builds up again because of the continuous grinding of the tectonic plates," she noted. "According to the time-predictable model, if you know the size of the most recent earthquake and the rate of strain accumulation afterwards, you should be able to forecast the time that the next event will happen simply by dividing the strain released by the strain-accumulation rate."
 
Applying the model to the Parkfield data, the scientists came up with a forecast of when the next earthquake would strike.
 
"According to the model, a magnitude-6 earthquake should have taken place between 1973 and 1987 -- but it didn't," Murray said. "In fact, 15 years have gone by. Our results show, with 95 percent confidence, that it should definitely have happened before now, and it hasn't, so that shows that the model doesn't work -- at least in this location."
 
The investigators doubt it would work any better elsewhere, including in the densely populated metropolitan areas of Northern and Southern California.
 
"We used the model at Parkfield where things are fairly simple," Murray observed, "but when you come to the Bay Area or Los Angeles, there are a lot more fault interactions, so it's probably even less likely to work in those places."
 
To know for certain, the method should be tested at other sites, including in Japan, Segall said.
 
"We're in a tough situation, because agencies like the USGS -- which have the responsibility for issuing forecasts so that city planners and builders can use the best available knowledge -- have to do the best they can with what information they have," he said.
 
"These findings illustrate that earthquake generation, like many natural processes, is not cut-and-dried -- it cannot be summed up in a simple rule," Murray told UPI. "However, these findings benefit the public in as much as they will be an impetus for further research and improvements in earthquake probability estimates."
 
The great Parkfield earthquake experiment will continue to delve into the mechanisms operating before, during and after an earthquake. Scientists hope to "capture" the elusive earthquake, whenever it strikes, on the array of instruments on or near the fault surface and, in the latest expansion of the project, deep underground. The tools, set 1.9 to 2.5 miles (3 to 4 kilometers) beneath the surface, will directly reveal, for the first time, the physical and chemical processes controlling earthquake generation within a seismically active fault.
 
These and other technological advances could turn long-range forecasting into a science, Murray said.
 
"As scientists we must continually question our assumptions and methods; test and revise our approaches, and develop new concepts and techniques that capture our evolving understanding of the earthquake machine," Stein told UPI. "We have an obligation to provide society with useful tools to assess and monitor for the hazards of earthquakes and to freely acknowledge what we know and what we don't know."
 
In the meantime, people living in earthquake-prone regions should plan for the inevitable.
 
"I always tell people to prepare," Segall said. "We know big earthquakes have happened in the past, we know they will happen again. We just don't know when."
 
 
Copyright © 2002 United Press International. All rights reserved.




