
Radiation exposure linked to aggressive thyroid cancers, researchers confirm for the first time

Fukushima disaster
For the first time, researchers have found that exposure to radioactive iodine is associated with more aggressive forms of thyroid cancer, according to a careful study of nearly 12,000 people in Belarus who were exposed when they were children or adolescents to fallout from the 1986 Chernobyl nuclear power plant accident.

Researchers examined thyroid cancers diagnosed up to two decades after the Chernobyl accident and found that higher thyroid radiation doses estimated from measurements taken shortly after the accident were associated with more aggressive tumor features.

"Our group has previously shown that exposures to radioactive iodine significantly increase the risk of thyroid cancer in a dose-dependent manner. The new study shows that radiation exposures are also associated with distinct clinical features that are more aggressive," said the paper's first author, Lydia Zablotska, MD, PhD, associate professor in the Department of Epidemiology and Biostatistics at UC San Francisco (UCSF). The paper will be published online in the journal Cancer.

Zablotska said the findings have implications for those exposed to radioactive iodine fallout from the 2011 nuclear reactor incidents in Fukushima, Japan, after the reactors were damaged by an earthquake-induced tsunami.

"Those exposed as children or adolescents to the fallout are at highest risk and should probably be screened for thyroid cancer regularly, because these cancers are aggressive, and they can spread really fast," Zablotska said. "Clinicians should be aware of the aggressiveness of radiation-associated tumors and closely monitor those at high risk."

Chernobyl studies led by Zablotska also showed for the first time that exposures to the radioactive iodine after the Chernobyl nuclear plant accident are associated with a whole spectrum of thyroid diseases, from benign to malignant. Benign encapsulated tumors of the thyroid gland are called follicular adenomas, and are treated in the same way as thyroid cancer -- by removing the thyroid gland, then giving patients pills to replace the hormones that are lost. Lifelong hormone supplementation treatment is both costly and complicated for patients.

Thyroid cancer is ordinarily rare among children, with less than one new case per million diagnosed each year. Among adults, about 13 new cases will be diagnosed each year for every 100,000 people, according to the Surveillance, Epidemiology and End Results (SEER) Program of the National Cancer Institute (NCI). But in the Belarus cohort, the researchers diagnosed 158 thyroid cancers among 11,664 subjects during three rounds of screening. Those who had received higher radiation doses also were more likely to have solid or diffuse variants of thyroid cancer, as well as to have more aggressive tumor features, such as spread to lymphatic vessels and several simultaneous cancer lesions in the thyroid gland.

Source: University of California, San Francisco (UCSF)

Should the Japanese give nuclear power another chance?

Japanese nuclear power
On September 9, 2014, the Japan Times reported an increasing number of suicides among the survivors of the March 2011 disaster. At Minami Soma Hospital, located 23 km from the power plant, the number of patients experiencing stress has also increased since the disaster. What's more, many of the survivors are now jobless and face an uncertain future.

This is not the first time that nuclear power has victimized the Japanese people. In 1945, atomic bombs exploded over Hiroshima and Nagasaki, creating massive fears about nuclear power among the Japanese population. It took 20 years for the public to recover from the trauma of these events. It was then, in the mid-1960s, that the Fukushima Daiichi Nuclear Power Plant was built.

According to Tetsuo Sawada, Assistant Professor in the Laboratory of Nuclear Reactors at Tokyo University, it took a lot of effort to assure people that nuclear power was safe and beneficial. The first step was legal: in 1955, the Japanese government passed a law decreeing that nuclear power could only be used for peaceful purposes.

"But that law was not enough to persuade people to accept the establishment of nuclear power," said Prof. Sawada.

He explained that the economy plays an important role in public acceptance of nuclear power. Through the establishment of nuclear power plants, more jobs were created, which boosted the economy of the Fukushima region at that time.

"Before the Fukushima disaster, we could find many pro-nuclear people in the area of nuclear power plants since it gave them money," said Prof. Sawada.

Now, more than forty years have passed and the public's former confidence has evolved into feelings of fear about nuclear power and distrust toward the government.

According to a study conducted by Noriko Iwai from the Japanese General Social Survey Research Center, the Fukushima nuclear accident has heightened people's perception of disaster risks, fears of nuclear accident, and recognition of pollution, and has changed public opinion on nuclear energy policy.
"Distance from nuclear plants and the perception of earthquake risk interactively correlate with opinions on nuclear issues: among people whose evaluation of earthquake risk is low, those who live nearer to the plants are more likely to object to the abolishment of nuclear plants," said Iwai.

This finding is in line with the perception of Sokyu Genyu, chief priest of Fukujuji temple in Miharu Town, Fukushima Prefecture. As a member of the Reconstruction Design Council in Response to the Great East Japan Earthquake, he argued that both the Fukushima Daiichi and Daini nuclear power plants should be shut down, citing the objections of 80% of Fukushima residents.

However, the Japanese government, local scientists and international authorities have announced that Fukushima is safe: radiation levels are below 1 mSv/y, a level that, according to them, is nothing to worry about. But the public does not trust the numbers.

Genyu was not saying that these numbers are scientifically false. Rather, he argued that the problem lies more in the realm of social psychology: despite the announcements about low radiation levels, the Japanese people are still afraid of radiation.

"It is reasonable for local residents in Fukushima to speak out very emotionally. Within three months of the disaster, six people had committed suicide. They were homeless and jobless," said Genyu.

It is heartbreaking that victims of the Fukushima Daiichi nuclear accident died not because of radiation, but because of depression. Besides the increasing number of suicides, the number of patients suffering from cerebrovascular disease (strokes) has also risen. In Minami-Soma Hospital, the number of stroke patients increased by more than 100% after the disaster.

Local doctors and scientists are now actively educating students in Fukushima, convincing them that the radiation will not affect their health.

Dr. Masaharu Tsubokura, a practicing doctor at Minami-Soma Hospital, has been informing students that Fukushima is safe. But sadly, their responses are mostly negative and full of apathy.

"I think the Fukushima disaster is not about nuclear radiation but is rather a matter of public trust in the technology," said Dr. Tsubokura.

Dr. Tsubokura has given dosimeters, devices used to measure radiation, to children living in Minami-Soma city. But apparently, this was not enough to eliminate people's fears.

In 2012, Professor Ryogo Hayano, a physicist from the University of Tokyo, joined Dr. Tsubokura at Minami-Soma Hospital and developed BABYSCAN, a whole-body scanner that measures radiation in small children and helps allay the fears of Fukushima parents.

"BABYSCAN is unnecessary but necessary. It is unnecessary because we know that the radiation is low. But it is necessary to assure parents that their children are going to be okay," said Prof. Hayano.
After witnessing the fears of the Fukushima people, Prof. Hayano thinks that nuclear power is no longer appropriate for Japan. He believes that the government should shut down nuclear power plants.

"As a scientist, I know that nuclear power is safe and cheap. But looking at the public's fear in Fukushima, I think it should be phased out," said Prof. Hayano.

But, does the government care about the public when it comes to politics?
It has been only three years since the disaster, and Prime Minister Shinzo Abe is already keen to revive the country's nuclear power plants. The operations of more than 50 nuclear reactors in Japan have been suspended since the Daiichi meltdown.

Last month, Japan's Nuclear Regulation Authority approved the restart of the Sendai nuclear power plant in 2015.

Source: ResearchSEA

Rising above the risk: America's first tsunami refuge

Artist rendering: entry view. Credit: TCF Architecture
Washington's coast is so close to the seismically active Cascadia Subduction Zone that if a megathrust earthquake were to occur, a tsunami would hit the Washington shoreline in just 25 minutes.

One coastal community is preparing for such a disaster by starting construction on the nation's first tsunami evacuation refuge, large enough to shelter more than 1,000 people who live within a 20-minute walk.

The vertical evacuation refuge will be the roof of the gym of the new school in Grays Harbor County, Washington. The Ocosta Elementary School and Tsunami Safe Haven will be the first of its kind in the nation and the culmination of 18 years of effort, said Tim Walsh, chief hazard geologist at the Department of Natural Resources, who has been working on this project since the National Tsunami Hazard Mitigation Program was formed in 1995.

Walsh will present the project design for the school and structure, along with the detailed tsunami modeling used to find the best location for the refuge, at the Annual Meeting for the Geological Society of America in Vancouver, Canada, on 21 October.

The Cascadia subduction zone is a 700-mile-long (over 1,000 kilometers) fault along the West Coast, where the Juan de Fuca Plate is being forced under the North American Plate. The subduction zone is capable of producing massive earthquakes; scientists have calculated that magnitude-9 earthquakes along this fault line could generate a massive tsunami that would hit the coastlines of British Columbia, Washington, Oregon, and California within 20 to 30 minutes.
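
The 20-to-30-minute arrival times quoted above follow from shallow-water wave physics: a tsunami's speed is roughly the square root of gravity times water depth. A minimal sketch of the arithmetic (the depth and distance figures below are illustrative assumptions, not values from the study):

```python
import math

def tsunami_speed(depth_m, g=9.81):
    """Shallow-water wave speed in m/s: sqrt(g * depth)."""
    return math.sqrt(g * depth_m)

def travel_time_minutes(distance_km, depth_m):
    """Arrival time for a tsunami crossing distance_km at a constant depth."""
    return distance_km * 1000 / tsunami_speed(depth_m) / 60

# Illustrative: a rupture ~150 km offshore over water ~2,500 m deep.
speed = tsunami_speed(2500)           # ~157 m/s, about 560 km/h
eta = travel_time_minutes(150, 2500)  # ~16 minutes of deep-water travel
```

Real arrival times run longer than this deep-water estimate because the wave slows as the water shallows near shore, which is consistent with the 20-to-30-minute window for a fault lying close off the coast.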

"It used to be thought that Cascadia was not an active fault," said Walsh. Not only has Cascadia been found to be an active fault, but there is a 10 percent chance that it will cause an earthquake in the next 50 years, he said.

"It is more than 10 times more likely than the chance you will be killed in a traffic accident," said Walsh. "But you aren't looking at the statistics of a single person, but an earthquake that would have an effect on thousands of miles of shoreline."
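
Walsh's comparison can be made concrete by converting the 50-year figure to an annual rate under a Poisson assumption. The traffic-fatality number below is an illustrative lifetime-odds assumption added for the comparison, not a figure from the article:

```python
import math

# 10 percent chance of a Cascadia earthquake in the next 50 years (per Walsh).
p_50yr = 0.10
annual_rate = -math.log(1 - p_50yr) / 50  # ~0.0021 events per year
recurrence = 1 / annual_rate              # ~475 years between events

# Illustrative assumption: ~1-in-100 lifetime odds of dying in a traffic accident.
p_traffic_lifetime = 0.01
ratio = p_50yr / p_traffic_lifetime       # ~10x, matching Walsh's comparison
```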

The biggest challenge was at the very beginning, trying to come up with a location that could be effective and accessible to people, said Walsh. "It was difficult in the beginning to go to the public meetings in these communities and present the hazards, but have no solution for them," he said.
Project Safe Haven brought together structural engineers, oceanographers, geographers, and scientists from many other disciplines to create a safe and accessible refuge.

Walsh and his colleagues used a model called GeoClaw to assess the risk of a tsunami, factoring in any potential landslides caused by the wave or the megaquake. Using this model for the Grays Harbor County community, the scientists determined the best place for the school and how much force the structure would have to withstand to protect refugees.

The school will be built on a dune ridge, so the roof of the evacuation shelter will be about 55 feet (almost 17 meters) above sea level. The structure is designed to withstand earthquakes and the impact of a storm surge, with reinforced concrete cores at each corner of the gym and staircases leading to the roof. The school and refuge are expected to be finished and operating for the 2015-2016 academic year.

Walsh would like to see other scientists and community groups working together to create novel solutions for tsunami risk, he said. Currently the Washington coast has very few tall buildings, and barely any are taller than three stories, leaving thousands of people at risk in the event of a tsunami, he said.

Source: Geological Society of America

Massive debris pile reveals risk of huge tsunamis in Hawaii

The researchers simulated earthquakes with magnitudes between 9.0 and 9.6 originating at different locations along the Aleutian-Alaska subduction zone, and found that the unique geometry of the eastern Aleutians would direct the largest post-earthquake tsunami energy directly toward the Hawaiian Islands. The red circles are centered on Kaua‘i and encircle the Big Island. Credit: Rhett Butler
A mass of marine debris discovered in a giant sinkhole in the Hawaiian islands provides evidence that at least one mammoth tsunami, larger than any in Hawaii's recorded history, has struck the islands, and that a similar disaster could happen again, new research finds. Scientists are reporting that a wall of water up to nine meters (30 feet) high surged onto Hawaiian shores about 500 years ago. A 9.0-magnitude earthquake off the coast of the Aleutian Islands triggered the mighty wave, which left behind up to nine shipping containers worth of ocean sediment in a sinkhole on the island of Kauai.

The tsunami was at least three times the size of a 1946 tsunami that was the most destructive in Hawaii's recent history, according to the new study that examined deposits believed to have come from the extreme event and used models to show how it might have occurred. Tsunamis of this magnitude are rare events. An earthquake in the eastern Aleutian Trench big enough to generate a massive tsunami like the one in the study is expected to occur once every thousand years, meaning that there is a 0.1 percent chance of it happening in any given year -- the same probability as the 2011 Tohoku earthquake that struck Japan, according to Gerard Fryer, a geophysicist at the Pacific Tsunami Warning Center in Ewa Beach, Hawaii.
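
The "once every thousand years, so 0.1 percent in any given year" framing treats the earthquake as a Poisson process; the same assumption gives the chance of seeing at least one such event over a longer horizon. A short sketch:

```python
import math

MEAN_RECURRENCE_YEARS = 1000  # eastern Aleutian source, per the study

def prob_at_least_one(horizon_years, recurrence=MEAN_RECURRENCE_YEARS):
    """P(at least one event within the horizon) for a Poisson process."""
    return 1 - math.exp(-horizon_years / recurrence)

p_1 = prob_at_least_one(1)    # ~0.001, the 0.1 percent quoted above
p_50 = prob_at_least_one(50)  # ~4.9 percent over a 50-year planning window
```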

Nevertheless, the new research has prompted Honolulu officials to revise their tsunami evacuation maps to account for the possibility of an extreme tsunami hitting the county of nearly 1 million people. The new maps would more than double the area of evacuation in some locations, according to Fryer.
"You're going to have great earthquakes on planet Earth, and you're going to have great tsunamis," said Rhett Butler, a geophysicist at the University of Hawaii at Manoa and lead author of the new study published online in Geophysical Research Letters, a journal of the American Geophysical Union. "People have to at least appreciate that the possibility is there."

Hawaiians have told stories about colossal tsunamis hitting the islands for generations, but possible evidence of these massive waves was only first detected in the late 1990s when David Burney, a paleoecologist at the National Tropical Botanical Garden in Kalaheo, was excavating the Makauwahi sinkhole, a collapsed limestone cave on the south shore of Kauai.

Two meters (six and a half feet) below the surface he encountered a layer of sediment marked by coral fragments, mollusk shells and coarse beach sand that could only have come from the sea. But the mouth of the sinkhole was separated from the shore by 100 meters (328 feet) of land and seven-meter (23-foot) high walls. Burney speculated that the deposit could have been left by a massive tsunami, but he was unable to verify the claim.

The deposits remained a mystery until the Tohoku earthquake hit Japan in 2011. It caused water to surge inland like a rapidly rising tide, reaching heights up to 39 meters (128 feet) above the normal sea level. After that tsunami deluged the island nation, scientists began to question Hawaii's current tsunami evacuation maps. The maps are based largely upon the 1946 tsunami, which followed a magnitude 8.6 earthquake in the Aleutian Islands and caused water to rise only two and a half meters (8 feet) up the side of the Makauwahi sinkhole.

"[The Japan earthquake] was bigger than almost any seismologist thought possible," said Butler. "Seeing [on live TV] the devastation it caused, I began to wonder, did we get it right in Hawaii? Are our evacuation zones the correct size?"

To find out, the study's authors used a wave model to predict how a tsunami would flood the Kauai coastline. They simulated earthquakes with magnitudes between 9.0 and 9.6 originating at different locations along the Aleutian-Alaska subduction zone, a 3,400-kilometer (2,113-mile) long ocean trench stretching along the southern coast of Alaska and the Aleutian Islands where the Pacific tectonic plate is slipping under the North American plate.

The researchers found that the unique geometry of the eastern Aleutians would direct the largest post-earthquake tsunami energy directly toward the Hawaiian Islands. Inundation models showed that an earthquake with a magnitude greater than 9.0 in just the right spot could produce water levels on the shore that reached eight to nine meters (26 to 30 feet) high, easily overtopping the Makauwahi sinkhole wall where the ocean deposits were found.

The authors used radiocarbon-dated marine deposits from Sedanka Island off the coast of Alaska and along the west coasts of Canada and the United States dating back to the same time period as the Makauwahi deposit to show that all three sediments could have come from the same tsunami and provide some evidence that the event occurred, according to the study.

"[The authors] stitched together geological evidence, anthropological information as well as geophysical modeling to put together this story that is tantalizing for a geologist but it's frightening for people in Hawaii," said Robert Witter, a geologist at the U.S. Geological Survey in Anchorage, Alaska who was not involved in the study.

According to Witter, it is possible that a massive tsunami hit Hawaii hundreds of years ago, based on the deposits found in the Kauai sinkhole, but he said it is difficult to determine if all three locations experienced the same event based on radiocarbon dating alone.

Radiocarbon dating only gives scientists a rough estimate of the age of a deposit, he said. All three locations offer evidence of a great tsunami occurring between 350 and 575 years ago, but it is hard to know if it was the same tsunami or ones that occurred hundreds of years apart.

"An important next thing to do is to look for evidence for tsunamis elsewhere in the Hawaiian island chain," said Witter.

Fryer, of the Pacific Tsunami Warning Center, is confident that more evidence of the massive tsunami will be found, confirming that events of this magnitude have rocked the island chain in the not-so-distant past.

"I've seen the deposit," said Fryer, who was not involved in the study. "I'm absolutely convinced it's a tsunami, and it had to be a monster tsunami."

Fryer is so convinced that he has worked with the city and county of Honolulu to update their tsunami evacuation maps to include the possibility of a massive tsunami the size of the one detailed in the new study hitting the islands. The county hopes to have the new maps distributed to residents by the end of the year, he said.

"We prepared ourselves for the worst tsunami that's likely to happen in one hundred years," Fryer said of the current tsunami evacuation maps based on the 1946 event. "What hit Japan was a thousand-year event … and this scenario [in the eastern Aleutians] is a thousand year event."

Source: American Geophysical Union

Hydraulic fracturing linked to earthquakes in Ohio

Seismograph (stock image). Hydraulic fracturing triggered a series of small earthquakes in 2013 on a previously unmapped fault in Harrison County, Ohio, according to a study. Credit: © hakandogu / Fotolia
Hydraulic fracturing triggered a series of small earthquakes in 2013 on a previously unmapped fault in Harrison County, Ohio, according to a study published in the journal Seismological Research Letters (SRL).

Nearly 400 small earthquakes occurred between Oct. 1 and Dec. 13, 2013, including 10 "positive" magnitude earthquakes, none of which were reported felt by the public. The 10 positive-magnitude earthquakes, which ranged from magnitude 1.7 to 2.2, occurred between Oct. 2 and 19, coinciding with hydraulic fracturing operations at nearby wells.

This series of earthquakes is the first known instance of seismicity in the area.
Hydraulic fracturing, or fracking, is a method for extracting gas and oil from shale rock: water, sand and chemicals are injected into the rock under high pressure to create cracks that release the gas inside. Cracking the rock produces micro-earthquakes, usually very small ones, with magnitudes in the range of negative 3 (−3) to negative 1 (−1).
"Hydraulic fracturing has the potential to trigger earthquakes, and in this case, small ones that could not be felt; however, the earthquakes were three orders of magnitude larger than normally expected," said Paul Friberg, a seismologist with Instrumental Software Technologies, Inc. (ISTI) and a co-author of the study.
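
"Three orders of magnitude" here refers to ground-motion amplitude: each whole step on the magnitude scale corresponds to a tenfold increase in amplitude and roughly a 31.6-fold (10^1.5) increase in radiated energy. A quick check of the jump from the expected magnitude −1 microseisms to the observed magnitude 2 events:

```python
def amplitude_ratio(m1, m2):
    """Ground-motion amplitude ratio between two magnitudes: 10 ** dM."""
    return 10 ** (m2 - m1)

def energy_ratio(m1, m2):
    """Radiated-energy ratio: 10 ** (1.5 * dM) (Gutenberg-Richter relation)."""
    return 10 ** (1.5 * (m2 - m1))

dA = amplitude_ratio(-1, 2)  # 1,000x amplitude: "three orders of magnitude"
dE = energy_ratio(-1, 2)     # ~31,600x radiated energy
```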

The earthquakes revealed an east-west trending fault that lies in the basement formation at approximately two miles deep and directly below the three horizontal gas wells. The EarthScope Transportable Array Network Facility identified the first earthquakes on Oct. 2, 2013, locating them south of Clendening Lake near the town of Uhrichsville, Ohio. A subsequent analysis identified 190 earthquakes during a 39-hour period on Oct. 1 and 2, just hours after hydraulic fracturing began on one of the wells.

The micro-seismicity varied, corresponding with the fracturing activity at the wells. The timing of the earthquakes, along with their tight linear clustering and similar waveform signals, suggest a unique source for the cause of the earthquakes -- the hydraulic fracturing operation. The fracturing likely triggered slip on a pre-existing fault, though one that is located below the formation expected to confine the fracturing, according to the authors.

"As hydraulic fracturing operations explore new regions, more seismic monitoring will be needed, since many faults remain unmapped," said Friberg, who co-authored the paper with Ilya Dricker, also of ISTI, and Glenda Besana-Ostman, formerly with the Ohio Department of Natural Resources and now with the Bureau of Reclamation at the U.S. Department of the Interior.

Source: Seismological Society of America

Many older adults still homebound after 2011 Great East Japan Earthquake

2011 Great East Japan Earthquake
A new study, published online in the journal Age and Ageing, shows that the homebound status of adults over the age of 65 in the aftermath of the 2011 Great East Japan Earthquake is still a serious public health concern. Of 2,327 older adults surveyed, approximately 20% were found to be homebound.

A team of researchers led by Naoki Kondo of the University of Tokyo's School of Public Health studied data from the city of Rikuzentakata, an area that was seriously damaged by the disaster. Of its total population of 23,302 before the events of 2011, 1,773 people died or are still missing. Of 7,730 houses, 3,368 (43.6%) were damaged, with 3,159 "completely destroyed." Much of the population had been concentrated in flat coastal areas, and since the community infrastructure was totally shattered, many people who lost their houses had to move to areas in the mountains.

This study used home-visit interviews with 2,327 adults over 65 years old (1,027 men; 1,300 women), carried out between August 2012 and October 2013. Interviewers gathered information on current morbidity, socio-economic status, health behaviour (diet, smoking, and alcohol intake), frequency of going out, and social support. 19.6% of men and 23.2% of women were found to be homebound, defined as leaving the house only once every four or more days. Of those older adults classified as homebound, around 40% also had no contact with neighbours.
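
The overall figure of "approximately 20%" can be recovered from the sex-specific rates as a weighted average; a small sketch using the counts reported above:

```python
# Sample sizes and homebound rates from the survey.
men, women = 1027, 1300
rate_men, rate_women = 0.196, 0.232

homebound = rate_men * men + rate_women * women  # ~503 respondents
overall = homebound / (men + women)              # ~0.216, i.e. roughly 20%
```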

Information was also obtained on the locations of grocery stores, convenience stores, and shopping centres from the online community directory database in August 2012. Information on shopper bus stops and hawker sites was provided by a disaster support team, and the team also collated road network data. This geographical analysis indicated that distance to retail stores was associated with the risk of people being homebound.

Lead author Naoki Kondo says: "This study has important implications for public health, especially in the setting of post-disaster community reconstruction. First, community diagnoses in a post-disaster setting should cover the built environment, including access to shopping facilities. Second, to prevent older victims of a disaster such as the Great East Japan Earthquake being homebound, it is clearly essential to provide access to the facilities that fulfil their daily needs.

"Given the findings of this study, such access could be increased by the private sector, suggesting the importance of public-private partnerships for post-disaster reconstruction."

Key messages:
  • The homebound status of older victims of the 2011 Great East Japan Earthquake is a matter of public health concern
  • Geographical analysis indicated that distance to retail stores was associated with the risk of people being homebound
  • Hawker and shopping bus services contributed to improved access, providing more opportunities for going out

Source: Oxford University Press (OUP)

San Andreas Fault system in San Francisco Bay Area is locked, overdue

San Andreas Fault. Credit: © davetroesh123 / Fotolia
Four urban sections of the San Andreas Fault system in Northern California have stored enough energy to produce major earthquakes, according to a new study that measures fault creep. Three fault sections -- Hayward, Rodgers Creek and Green Valley -- are nearing or past their average recurrence interval, according to the study published in the Bulletin of the Seismological Society of America (BSSA).

The earthquake cycle reflects the accumulation of strain on a fault, its release as slip, and its re-accumulation and re-release. Fault creep is the slip and slow release of strain in the uppermost part of the Earth's crust that occurs on some faults between large earthquakes, when much greater stress is released in only seconds. Where no fault creep occurs, a fault is considered locked and stress will build until it is released by an earthquake.

This study estimates how much creep occurs on each section of the San Andreas Fault system in Northern California. Enough creep on a fault can diminish the potential size of its next earthquake rupture.

"The extent of fault creep, and therefore locking, controls the size and timing of large earthquakes on the Northern San Andreas Fault system," said James Lienkaemper, a co-author of the study and research geophysicist at U.S. Geological Survey (USGS). "The extent of creep on some fault sections is not yet well determined, making our first priority to study the urban sections of the San Andreas, which is directly beneath millions of Bay Area residents."

Understanding the amount and extent of fault creep directly impacts seismic hazard assessments for the region. The San Andreas Fault system in Northern California consists of five major branches that combine for a total length of approximately 1,250 miles. Sixty percent of the fault system releases energy through fault creep, ranging from 0.1 to 25.1 mm (0.004 to 1 inch) per year, and about 28 percent remains locked at depth, according to the authors.
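
The link between a locked patch, its accumulated slip deficit, and the size of the earthquake it could produce runs through the seismic moment, M0 = rigidity x area x slip, and the Hanks-Kanamori moment magnitude, Mw = (2/3)(log10 M0 − 9.1). A sketch with illustrative Hayward-like numbers (the patch dimensions and slip deficit below are assumptions for demonstration, not values from the study):

```python
import math

def moment_magnitude(area_m2, slip_m, rigidity_pa=3.0e10):
    """Mw from seismic moment M0 = rigidity * area * slip (Hanks-Kanamori)."""
    m0 = rigidity_pa * area_m2 * slip_m  # newton-meters
    return (2 / 3) * (math.log10(m0) - 9.1)

# Illustrative: a 70 km x 12 km locked patch that has accumulated 1.8 m of
# slip deficit (e.g., ~150 years at ~12 mm/yr).
mw = moment_magnitude(70e3 * 12e3, 1.8)  # ~7.0
```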

Monitoring of creep on Bay Area faults has expanded in recent years. The alignment array measurements made by the San Francisco State University Creep Project and recently expanded GPS station networks provide the primary data on surface creep, which the authors used to estimate the average depth of creep for each fault segment. Where available, details of past ruptures of individual faults, unearthed in previous paleoseismic studies, allowed the authors to calculate recurrence rates and the probable timing and size of future earthquakes.

According to the study, four faults have accumulated sufficient strain to produce a major earthquake. Three creeping faults have large locked areas (less than 1 mm, or 0.04 inches, of creep per year) that have not ruptured in a major earthquake of at least magnitude 6.7 since local inhabitants began reporting earthquakes: Rodgers Creek, northern Calaveras and southern Green Valley. The southern Hayward fault, which produced a magnitude 6.8 earthquake in 1868, is now approaching its mean recurrence time based on paleoseismic studies.

The authors also estimate three faults appear to be nearing or have exceeded their mean recurrence time and have accumulated sufficient strain to produce large earthquakes: the Hayward (M 6.8), Rodgers Creek (M 7.1) and Green Valley (M 7.1).

"The San Andreas Fault and its two other large branches, the Hayward and Northern Calaveras, have been quiet for decades. This study offers a good reminder to prepare today for the next major earthquake," said Lienkaemper.

Source: Seismological Society of America

Underwater landslide doubled size of 2011 Japanese tsunami

An ocean engineer at the University of Rhode Island has found that a massive underwater landslide, combined with the 9.0 earthquake, was responsible for triggering the deadly tsunami that struck Japan in March 2011.

Professor Stephan Grilli, an international leader in the study of tsunamis, said the generally accepted explanation for the cause of the tsunami had been the earthquake, the fifth largest ever measured, which created significant uplift and subsidence of the seafloor. While that adequately explains the 10-meter surge seen over much of the affected area, Grilli said it cannot account for the 40-meter waves that struck a 100-kilometer stretch of Japan's mountainous Sanriku Coast.

"Computer models have not been able to explain the large inundation and run-up on the Sanriku Coast using the earthquake alone," Grilli said. "Our model could only get inundation up to 16 or 18 meters, not 40. So we knew there must be another cause."

His findings were published this week in the journal Marine Geology.
In a series of models, Grilli and his former doctoral student Jeff Harris worked backwards in time to recreate the movement of the seafloor from the earthquake and concluded that an additional movement underwater about 100 kilometers north of the earthquake's epicenter must have occurred to propagate the large waves that struck Sanriku. So the URI engineers and colleagues at the British Geological Survey and the University of Tokyo went looking for evidence that something else happened there.
Reviewing surveys of the seafloor conducted by Japanese scientists before and after the earthquake, the scientists found signs of a large slump on the seafloor -- a rotational landslide 40 kilometers by 20 kilometers in extent and 2 kilometers thick that traveled down the slope of the Japan Trench, leaving a horizontal footprint the size of Paris that could only have been created by a 100-meter uplift in the seafloor. The earthquake only raised the seafloor 10 meters.

"Underwater landslides tend to create shorter period tsunami waves, and they tend to concentrate their energy in a small stretch of coastline," said Grilli. "The train of waves from the landslide, combined with the earthquake generated waves, together created the 40 meter inundation along the Sanriku Coast."

Grilli said it has been difficult to convince his Japanese colleagues of his research group's results. Most assumed that the massive size of the earthquake was enough to create the waves that were observed.
"It raises questions about how we've been doing tsunami predictions in the past," he said. "We generally have just considered the largest possible earthquake, but we seldom consider underwater landslides as an additional source," even though large tsunamis in 1998 in Papua New Guinea and in 1946 in the Aleutian Islands were found to be generated by a combination of earthquakes and underwater landslides.

Grilli also said that his analysis is under considerable scrutiny because it brings into question whether Japan had adequately prepared for natural disasters prior to the 2011 event.

"There is a lot at stake in Japan," he said. "Tsunami scientists working for government agencies use tsunami return periods that are much too low in their calculations, leading them to underestimate the tsunami risk. All of the safety procedures they have in place, including at nuclear power plants, are still based on underestimating the maximum earthquake likely to strike Japan, and they underestimate the maximum tsunami, too. Japan is working toward revising their approach to tsunami hazard assessment, but this will take time."

Source: University of Rhode Island

Drilling Into an Active Earthquake Fault in New Zealand

An aerial view of the Alpine Fault at Gaunt Creek, where the Deep Fault Drilling Project is scheduled to begin next month. Three University of Michigan geologists are participating in the $2.5 million international project, which will drill nearly a mile beneath the surface and return rock samples from an active fault known to generate major earthquakes. Credit: Photo by Ben van der Pluijm
Three University of Michigan geologists are participating in an international effort to drill nearly a mile beneath the surface of New Zealand this fall to bring back rock samples from an active fault known to generate major earthquakes.

The goal of the Deep Fault Drilling Project is to better understand earthquake processes by sampling the Alpine Fault, which is expected to trigger a large event in the coming decades.

"We're trying to understand why some faults are more earthquake-prone than others, and that requires fundamental knowledge about the processes at work," said Ben van der Pluijm, the Bruce R. Clark Collegiate Professor of Geology in the U-M Department of Earth and Environmental Sciences.

Van der Pluijm and two of his EES colleagues -- doctoral student Austin Boles and research scientist Anja Schleicher -- are part of the team scheduled to start the two-month drilling project early next month. Schleicher will spend October at the site, and Boles will be there for about six weeks starting in early November.

It will be only the second science project to drill deep into an active earthquake fault and return samples. Several years ago, scientists drilled a nearly 2-mile-deep hole into California's San Andreas Fault. Van der Pluijm was a member of that team, as well.

"I hope we find something different this time, a different rock signature that contrasts with what we saw at the San Andreas," he said.

The goal is to drill 0.8 miles (1.3 kilometers) into the 530-mile-long Alpine Fault, which marks the boundary between the Australian and Pacific tectonic plates, on New Zealand's South Island. Though most of the movement along the fault is lateral rather than vertical, the fault is responsible for lifting the Southern Alps, the rugged mountain range featured in the "Lord of the Rings" movies.

Earthquakes occur on the Alpine Fault every 200 to 400 years at magnitudes of 7.5 to 8.0, with an average time between successive large earthquakes of about 330 years. Though earthquakes of that size that originate at shallow depths are capable of tremendous damage, the region is sparsely populated.

The last Alpine Fault quake occurred in 1717, and the probability of another big one occurring there in the next 50 years has been calculated at about 28 percent. So the $2.5 million Deep Fault Drilling Project presents a rare opportunity to collect and analyze samples from a major fault before it breaks.
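For a feel for where such probability figures come from, here is a minimal sketch using a time-independent Poisson model and the roughly 330-year average recurrence quoted above. This is an illustrative simplification, not the researchers' method: published hazard estimates such as the 28 percent figure typically incorporate additional information, including the time already elapsed since the 1717 rupture.

```python
import math

def poisson_prob(mean_recurrence_yr: float, window_yr: float) -> float:
    """Probability of at least one event in `window_yr` years,
    assuming events follow a Poisson process with the given mean
    recurrence interval (a simplifying assumption; real hazard
    models are often time-dependent)."""
    rate = 1.0 / mean_recurrence_yr
    return 1.0 - math.exp(-rate * window_yr)

# Alpine Fault: ~330-year average recurrence, 50-year window
p = poisson_prob(330, 50)
print(f"{p:.1%}")  # roughly 14% under the Poisson assumption
```

That the simple Poisson answer comes in below the published 28 percent illustrates why elapsed time since the last rupture matters in these assessments.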

The task for van der Pluijm and his colleagues is to analyze the possible role of clay minerals and friction melting in the fault zone. Radiometric dating, X-ray studies and isotopic-analysis techniques will be used to determine how much clay is in the rock samples and when those clays formed, as well as the likely source of the water that helped produce them.

"The information we can extract from these clays is remarkably rich," said Boles, who will use data from the New Zealand study in his doctoral dissertation. "These clay minerals are a key tool that we can use to better understand the physical and chemical processes happening in an active fault."

Clay minerals can help reduce friction and heat generation along a fault, lubricating it so that pressure is released through steady, relatively small and nondestructive "creeping" motions rather than the periodic violent jolts known as earthquakes.

Creeping motions were observed along the portion of the San Andreas Fault drilled by scientists several years ago. Temperatures in that fault were relatively low, and clay-rich rocks from the active zone were returned to the surface.

"We think that clays are a significant player in making faults less earthquake-prone," van der Pluijm said. "We know that the section of the Alpine Fault we'll be drilling has a history of producing large earthquakes. So finding little clay and, instead, evidence for frictional melting in the rock would better fit the large-earthquake scenario. That would be a fantastic breakthrough."

In addition to sampling the fault during the two-month drilling program, researchers will install permanent pressure, temperature and seismic-monitoring sensors in the borehole.

The U-M researchers are hoping to obtain a rock sample about the volume of a baseball from deep within the Alpine Fault. That would be plenty to complete their various studies, which are funded by the National Science Foundation and the International Continental Scientific Drilling Program.

"Getting the right samples is more important than the amount," van der Pluijm said. "Returning samples to the surface from depth is always a challenge, but I'm confident that it will work."

Source: University of Michigan

New explanation for origin of plate tectonics: What set Earth's plates in motion?

The image shows a snapshot from the film after 45 million years of spreading. The pink is the region where the mantle underneath the early continent has melted, facilitating its spreading, and the initiation of the plate tectonic process. Credit: Patrice Rey, Nicolas Flament and Nicolas Coltice
The mystery of what kick-started the motion of Earth's massive tectonic plates across its surface has been explained by researchers at the University of Sydney.

"Earth is the only planet in our solar system where the process of plate tectonics occurs," said Professor Patrice Rey, from the University of Sydney's School of Geosciences.

"The geological record suggests that until three billion years ago the Earth's crust was immobile, so what sparked this unique phenomenon has fascinated geoscientists for decades. We suggest it was triggered by the spreading of early continents then eventually became a self-sustaining process."

Professor Rey is lead author of an article on the findings published in Nature on Wednesday, 17 September.

The other authors on the paper are Nicolas Flament, also from the School of Geosciences, and Nicolas Coltice, from the University of Lyon.

There are eight major tectonic plates that move above Earth's mantle at rates up to 150 millimetres every year.

In simple terms the process involves plates being dragged into the mantle at certain points and moving away from each other at others, in what has been dubbed 'the conveyor belt'.

Plate tectonics depends on the inverse relationship between density of rocks and temperature.

At mid-oceanic ridges, rocks are hot and their density is low, making them buoyant, or more able to float. As they move away from those ridges they cool down and their density increases until they become denser than the underlying hot mantle, at which point they sink and are 'dragged' under.
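The inverse density-temperature relationship described above can be made concrete with a one-line linear expansion model. The expansivity and reference density below are typical textbook values assumed for illustration, not figures from the Sydney study.

```python
ALPHA = 3e-5      # thermal expansivity of mantle rock, 1/K (assumed)
RHO_HOT = 3300.0  # density of hot mantle rock, kg/m^3 (assumed)

def density(temp_c: float, ref_temp_c: float = 1350.0) -> float:
    """Density of mantle rock after cooling from ref_temp_c,
    using rho = RHO_HOT * (1 + ALPHA * (ref_temp_c - temp_c))."""
    return RHO_HOT * (1.0 + ALPHA * (ref_temp_c - temp_c))

print(round(density(1350)))  # 3300 -> hot and buoyant at the ridge
print(round(density(0)))     # ~3434 -> denser than the hot mantle, so it sinks
```

A plate that has cooled all the way ends up a few percent denser than the mantle beneath it, which is the slab-pull "driving engine" the article says was missing on the early, hotter Earth.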

But three to four billion years ago, Earth's interior was hotter, volcanic activity was more prominent and tectonic plates did not become cold and dense enough to sink spontaneously.

"So the driving engine for plate tectonics didn't exist," Professor Rey said.

"Instead, thick and buoyant early continents erupted in the middle of immobile plates. Our modelling shows that these early continents could have placed major stress on the surrounding plates. Because they were buoyant they spread horizontally, forcing adjacent plates to be pushed under at their edges."

"This spreading of the early continents could have produced intermittent episodes of plate tectonics until, as the Earth's interior cooled and its crust and plate mantle became heavier, plate tectonics became a self-sustaining process which has never ceased and has shaped the face of our modern planet."

The new model also makes a number of predictions explaining features that have long puzzled the geoscience community.

Source: University of Sydney

Wastewater injection is culprit for most earthquakes in southern Colorado and northern New Mexico, study finds

The deep injection of wastewater underground is responsible for the dramatic rise in the number of earthquakes in Colorado and New Mexico since 2001, according to a study to be published in the Bulletin of the Seismological Society of America (BSSA).


The Raton Basin, which stretches from southern Colorado into northern New Mexico, was seismically quiet until shortly after major fluid injection began in 1999. Since 2001, there have been 16 magnitude > 3.8 earthquakes (including M 5.0 and 5.3), compared to only one (M 4.0) in the previous 30 years. The increase in earthquakes is limited to the area of industrial activity and within 5 kilometers (3.1 miles) of wastewater injection wells.

In 1994, energy companies began producing coal-bed methane in Colorado and expanded production to New Mexico in 1999. Along with the production of methane, there is the production of wastewater, which is injected underground in disposal wells and can raise the pore pressure in the surrounding area, inducing earthquakes. Several lines of evidence suggest the earthquakes in the area are directly related to the disposal of wastewater, a by-product of extracting methane, and not to hydraulic fracturing occurring in the area.

Beginning in 2001, the production of methane expanded, with the number of high-volume wastewater disposal wells increasing (21 presently in Colorado and 7 in New Mexico) along with the injection rate. Since mid-2000, the total injection rate across the basin has ranged from 1.5 to 3.6 million barrels per month.

The authors, all scientists with the U.S. Geological Survey, detail several lines of evidence directly linking the injection wells to the seismicity. The timing and location of seismicity correspond to the documented pattern of injected wastewater. Detailed investigations of two seismic sequences (2001 and 2011) place them in proximity to high-volume, high-injection-rate wells, and both sequences occurred after a nearby increase in the rate of injection. A comparison between seismicity and wastewater injection in Colorado and New Mexico reveals similar patterns, suggesting seismicity is initiated shortly after an increase in injection rates.

Source: Seismological Society of America

Mega-quake possible for subduction zones along 'Ring of Fire,' new study suggests

The magnitude of the 2011 Tohoku quake (M 9.0) caught many seismologists by surprise, prompting some to revisit the question of calculating the maximum magnitude earthquake possible for a particular fault. New research offers an alternate view that uses the concept of probable maximum magnitude events over a given period, providing the magnitude and the recurrence rate of extreme events in subduction zones for that period. Most circum-Pacific subduction zones can produce earthquakes of magnitude greater than 9.0, suggests the study.

The idea of identifying the maximum magnitude for a fault isn't new, and its definition varies based on context. This study, published online by the Bulletin of the Seismological Society of America (BSSA), calculates the "probable maximum earthquake magnitude within a time period of interest," estimating the probable magnitude of subduction zone earthquakes for various time periods, including 250, 500 and 10,000 years.

"Various professionals use the same terminology -- maximum magnitude -- to mean different things. The most interesting question for us was what was going to be the biggest magnitude earthquake over a given period of time?" said co-author Yufang Rong, a seismologist at the Center for Property Risk Solutions of FM Global, a commercial and industrial property insurer. "Can we know the exact, absolute maximum magnitude? The answer is no, however, we developed a simple methodology to estimate the probable largest magnitude within a specific time frame."

The study's results indicated most of the subduction zones can generate M 8.5 or greater over a 250-year return period; M 8.8 or greater over 500 years; and M 9.0 or greater over 10,000 years.

"Just because a subduction zone hasn't produced a magnitude 8.8 in 499 years, that doesn't mean one will happen next year," said Rong. "We are talking about probabilities."
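The pattern in the results above, in which longer time windows admit larger probable maximum magnitudes, follows from standard magnitude-frequency reasoning. The sketch below uses a plain Gutenberg-Richter rate with hypothetical a and b values chosen for illustration; it is not the authors' methodology, which is tailored to subduction zones, but it shows why the probable maximum grows with the window.

```python
import math

def probable_max_magnitude(a: float, b: float, years: float) -> float:
    """Magnitude m at which one exceedance is expected within `years`,
    from an annual Gutenberg-Richter rate  log10 N(>=m) = a - b*m.
    Solving N(>=m) * years = 1 gives  m = (a + log10(years)) / b."""
    return (a + math.log10(years)) / b

# Hypothetical a, b for a subduction zone (illustrative only)
a, b = 5.0, 1.0
for t in (250, 500, 10_000):
    print(t, round(probable_max_magnitude(a, b, t), 2))
# 250 -> 7.4, 500 -> 7.7, 10000 -> 9.0 under these assumed values
```

With these assumed parameters the spread across windows is wider than the study's 8.5-to-9.0 range, which reflects how sensitive such estimates are to the rate model used.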

The instrumental and historical earthquake record is brief, complicating any attempt to confirm recurrence rates and estimate with confidence the maximum magnitude of an earthquake in a given period. The authors validated their methodology by comparing their findings to the seismic history of the Cascadia subduction zone, revealed through deposits of marine sediment along the Pacific Northwest coast. While some subduction zones have experienced large events during recent history, the Cascadia subduction zone has remained quiet. Turbidite and onshore paleoseismic studies have documented a rich seismic history, identifying 40 large events over the past 10,000 years.

"Magnitude limits of subduction zone earthquakes" is co-authored by Rong, David Jackson of UCLA, Harold Magistrale of FM Global, and Chris Goldfinger of Oregon State University. The paper will be published online Sept. 16 by BSSA as well as in its October print edition.

Source: Seismological Society of America

Major earthquake may occur off coast of Istanbul, seismic shifts suggest

When a segment of a major fault line goes quiet, it can mean one of two things: The "seismic gap" may simply be inactive -- the result of two tectonic plates placidly gliding past each other -- or the segment may be a source of potential earthquakes, quietly building tension over decades until an inevitable seismic release.

Researchers from MIT and Turkey have found evidence for both types of behavior on different segments of the North Anatolian Fault -- one of the most energetic earthquake zones in the world. The fault, similar in scale to California's San Andreas Fault, stretches for about 745 miles across northern Turkey and into the Aegean Sea.

The researchers analyzed 20 years of GPS data along the fault, and determined that the next large earthquake to strike the region will likely occur along a seismic gap beneath the Sea of Marmara, some five miles west of Istanbul. In contrast, the western segment of the seismic gap appears to be moving without producing large earthquakes.

"Istanbul is a large city, and many of the buildings are very old and not built to the highest modern standards compared to, say, southern California," says Michael Floyd, a research scientist in MIT's Department of Earth, Atmospheric and Planetary Sciences. "From an earthquake scientist's perspective, this is a hotspot for potential seismic hazards."

Although it's impossible to pinpoint when such a quake might occur, Floyd says this one could be powerful -- on the order of a magnitude 7 temblor, or stronger.

"When people talk about when the next quake will be, what they're really asking is, 'When will it be, to within a few hours, so that I can evacuate?' But earthquakes can't be predicted that way," Floyd says. "Ultimately, for people's safety, we encourage them to be prepared. To be prepared, they need to know what to prepare for -- that's where our work can contribute."

Floyd and his colleagues, including Semih Ergintav of the Kandilli Observatory and Earthquake Research Institute in Istanbul and MIT research scientist Robert Reilinger, have published their seismic analysis in the journal Geophysical Research Letters.

In recent decades, major earthquakes have occurred along the North Anatolian Fault in a roughly domino-like fashion, breaking sequentially from east to west. The most recent quake occurred in 1999 in the city of Izmit, just east of Istanbul. The initial shock, which lasted less than a minute, killed thousands. As Istanbul sits at the fault's western end, many scientists have thought the city will be near the epicenter of the next major quake.

To get an idea of exactly where the fault may fracture next, the MIT and Turkish researchers used GPS data to measure the region's ground movement over the last 20 years. The group took data along the fault from about 100 GPS locations, including stations where data are collected continuously and sites where instruments are episodically set up over small markers on the ground, the positions of which can be recorded over time as the Earth slowly shifts.

"By continuously tracking, we can tell which parts of the Earth's crust are moving relative to other parts, and we can see that this fault has relative motion across it at about the rate at which your fingernail grows," Floyd says.

From their ground data, the researchers estimate that, for the most part, the North Anatolian Fault must move at about 25 millimeters -- or one inch -- per year, sliding quietly or slipping in a series of earthquakes.

As there's currently no way to track the Earth's movement offshore, the group also used fault models to estimate the motion off the Turkish coast. The team identified a segment of the fault under the Sea of Marmara, west of Istanbul, that is essentially stuck, with the "missing" slip accumulating at 10 to 15 millimeters per year. This section -- called the Princes' Island segment, for a nearby tourist destination -- last experienced an earthquake 250 years ago.

Floyd and colleagues calculate that the Princes' Island segment should have slipped about 8 to 11 feet -- but it hasn't. Instead, strain has likely been building along the segment for the last 250 years. If this tension were to break the fault in one cataclysmic earthquake, the Earth could shift by as much as 11 feet within seconds.
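The slip-deficit figure quoted above is essentially the locked segment's loading rate multiplied by the time since its last rupture. A minimal sketch, using the 10 to 15 millimeters per year and 250 years from the article:

```python
MM_PER_FOOT = 304.8

def slip_deficit_feet(rate_mm_per_yr: float, years: float) -> float:
    """Slip deficit accumulated on a locked fault segment:
    (loading rate) x (time since last rupture), converted to feet."""
    return rate_mm_per_yr * years / MM_PER_FOOT

low = slip_deficit_feet(10, 250)   # ~8.2 ft
high = slip_deficit_feet(15, 250)  # ~12.3 ft
print(round(low, 1), round(high, 1))
```

This back-of-the-envelope range brackets the 8 to 11 feet the researchers calculate; their estimate also folds in fault geometry that this simple product ignores.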

Although such accumulated strain may be released in a series of smaller, less hazardous rumbles, Floyd says that given the historical pattern of major quakes along the North Anatolian Fault, it would be reasonable to expect a large earthquake off the coast of Istanbul within the next few decades.

"Earthquakes are not regular or predictable," Floyd says. "They're far more random over the long run, and you can go many lifetimes without experiencing one. But it only takes one to affect many lives. In a location like Istanbul that is known to be subject to large earthquakes, it comes back to the message: Always be prepared."

Source: Massachusetts Institute of Technology

New study reconstructs mega-earthquake timeline in Indian Ocean

UM Rosenstiel School Geologist Kelly Jackson documents sediments deposited by the 2004 Indian Ocean tsunami on the southeastern coast of Sri Lanka. Credit: UM Rosenstiel School
A new study on the frequency of past giant earthquakes in the Indian Ocean region shows that Sri Lanka, and much of the Indian Ocean, is affected by large tsunamis at highly variable intervals, from a few hundred to more than one thousand years. The findings suggest that the accumulation of stress in the region could generate tsunamis as large as, or even larger than, the one that resulted from the 2004 magnitude-9.2 Sumatra earthquake.


Researchers from the University of Miami (UM) Rosenstiel School of Marine and Atmospheric Science and the University of Peradeniya in Sri Lanka collected and analyzed 22 sediment cores from Karagan Lagoon, Hambantota in southeastern Sri Lanka, to expand the historical record of giant earthquakes along the Sumatra-Andaman subduction zone, where the Indo-Australian plate and Eurasian plate meet. Using sand deposited in the lagoon during the 2004 Indian Ocean tsunami and seven older paleo-tsunami deposits as proxies for large earthquakes in the region, the scientists reconstructed the timeline for mega-earthquakes along the Indian Ocean's plate boundary from Myanmar to Indonesia, assuming that the tsunamis were all generated by large earthquakes.

"In Sri Lanka, coastal lagoons were inundated by this tsunami and others that occurred over thousands of years," said Gregor Eberli, professor of Marine Geosciences and director of UM's CSL -- Center for Carbonate Research. "These lagoons are ideal repositories for tsunami sand layers because after deposition, the tsunami sands were sealed with mud."

The Dec. 26, 2004 M-9.2 Sumatra earthquake resulted in a trans-oceanic tsunami, with wave heights up to 100 feet (30 meters) in some places, which impacted much of the Indian Ocean region, causing widespread damage in southeastern Sri Lanka.

Within the 7,000-year record of Indian Ocean tsunamis preserved in the sediment, the research team found evidence that the time period between consecutive tsunamis ranged from 181 (up to 517) years to 1,045 (± 334) years. The longest period was nearly twice the time period prior to the 2004 earthquake.

"These results are very important to better understand the tsunami hazard in Sri Lanka," said Kelly Jackson, UM Rosenstiel School Ph.D. candidate and lead author of the study.

"A scary result is a 1000-year time period without a tsunami, which is nearly twice as long as the lull period prior to the 2004 earthquake," said Falk Amelung, professor of geophysics within the department of Marine Geosciences at the UM Rosenstiel School. "This means that the subduction zone is capable of generating earthquakes almost twice as big as in 2004, although we don't have any evidence yet that this actually happened."

"The 2004 tsunami caught us completely by surprise, although we should have known better because there is a Sri Lankan legend in which the sea came ashore in 200 B.C.," says Chandra Jayasena, a geologist at the University of Peradeniya. "We now need to study other lagoons to further expand the historical record of large tsunami-generating earthquakes in the region and get a better understanding of the earthquake frequency in this highly populated region."

The region's subduction zone exhibits great variability in rupture modes, putting it on the list with the Cascadia Subduction Zone (which stretches from Vancouver Island to northern California) and Chile, according to the authors.

Source: University of Miami Rosenstiel School of Marine & Atmospheric Science

Textbook theory behind volcanoes may be wrong

Tungurahua volcano eruption. Credit: © Sunshine Pics / Fotolia
In the typical textbook picture, volcanoes, such as those that are forming the Hawaiian islands, erupt when magma gushes out as narrow jets from deep inside Earth. But that picture is wrong, according to a new study from researchers at Caltech and the University of Miami in Florida.

New seismology data are now confirming that such narrow jets don't actually exist, says Don Anderson, the Eleanor and John R. McMillian Professor of Geophysics, Emeritus, at Caltech. In fact, he adds, basic physics doesn't support the presence of these jets, called mantle plumes, and the new results corroborate those fundamental ideas.

"Mantle plumes have never had a sound physical or logical basis," Anderson says. "They are akin to Rudyard Kipling's 'Just So Stories' about how giraffes got their long necks."

Anderson and James Natland, a professor emeritus of marine geology and geophysics at the University of Miami, describe their analysis online in the September 8 issue of the Proceedings of the National Academy of Sciences.

According to current mantle-plume theory, Anderson explains, heat from Earth's core somehow generates narrow jets of hot magma that gush through the mantle and to the surface. The jets act as pipes that transfer heat from the core, and how exactly they're created isn't clear, he says. But they have been assumed to exist, originating near where Earth's core meets the mantle, almost 3,000 kilometers underground -- nearly halfway to the planet's center. The jets are theorized to be no more than about 300 kilometers wide, and when they reach the surface, they produce hot spots.

While the top of the mantle is a sort of fluid sludge, the uppermost layer is rigid rock, broken up into plates that float on the magma-bearing layers. Magma from the mantle beneath the plates bursts through the plate to create volcanoes. As the plates drift across the hot spots, a chain of volcanoes forms -- such as the island chains of Hawaii and Samoa.

"Much of solid-Earth science for the past 20 years -- and large amounts of money -- has been devoted to looking for elusive narrow mantle plumes that wind their way upward through the mantle," Anderson says.

To look for the hypothetical plumes, researchers analyze global seismic activity. Everything from big quakes to tiny tremors sends seismic waves echoing through Earth's interior. The type of material that the waves pass through influences the properties of those waves, such as their speeds. By measuring those waves using hundreds of seismic stations installed on the surface, near places such as Hawaii, Iceland, and Yellowstone National Park, researchers can deduce whether there are narrow mantle plumes or whether volcanoes are simply created from magma that's absorbed in the sponge-like shallower mantle.

No one has been able to detect the predicted narrow plumes, although the evidence has not been conclusive. The jets could have simply been too thin to be seen, Anderson says. Very broad features beneath the surface have been interpreted as plumes or super-plumes, but, still, they're far too wide to be considered narrow jets.

But now, thanks in part to more seismic stations spaced closer together and improved theory, analysis of the planet's seismology is good enough to confirm that there are no narrow mantle plumes, Anderson and Natland say. Instead, data reveal that there are large, slow, upward-moving chunks of mantle a thousand kilometers wide.

In the mantle-plume theory, Anderson explains, the heat that is transferred upward via jets is balanced by the slower downward motion of cooled, broad, uniform chunks of mantle. The behavior is similar to that of a lava lamp, in which blobs of wax are heated from below and then rise before cooling and falling. But a fundamental problem with this picture is that lava lamps require electricity, he says, and that is an outside energy source that an isolated planet like Earth does not have.

The new measurements suggest that what is really happening is just the opposite: Instead of narrow jets, there are broad upwellings, which are balanced by narrow channels of sinking material called slabs. What is driving this motion is not heat from the core, but cooling at Earth's surface. In fact, Anderson says, the behavior is the regular mantle convection first proposed more than a century ago by Lord Kelvin. When material in the planet's crust cools, it sinks, displacing material deeper in the mantle and forcing it upward.

"What's new is incredibly simple: upwellings in the mantle are thousands of kilometers across," Anderson says. The formation of volcanoes then follows from plate tectonics -- the theory of how Earth's plates move and behave. Magma, which is less dense than the surrounding mantle, rises until it reaches the bottom of the plates or fissures that run through them. Stresses in the plates, cracks, and other tectonic forces can squeeze the magma out, like how water is squeezed out of a sponge. That magma then erupts out of the surface as volcanoes. The magma comes from within the upper 200 kilometers of the mantle and not thousands of kilometers deep, as the mantle-plume theory suggests.

"This is a simple demonstration that volcanoes are the result of normal broad-scale convection and plate tectonics," Anderson says. He calls this theory "top-down tectonics," based on Kelvin's initial principles of mantle convection. In this picture, the engine behind Earth's interior processes is not heat from the core but cooling at the planet's surface. This cooling and plate tectonics drives mantle convection, the cooling of the core, and Earth's magnetic field. Volcanoes and cracks in the plate are simply side effects.

The results also have an important consequence for rock compositions -- notably the ratios of certain isotopes, Natland says. According to the mantle-plume idea, the measured compositions derive from the mixing of material from reservoirs separated by thousands of kilometers in the upper and lower mantle. But if there are no mantle plumes, then all of that mixing must have happened within the upwellings and nearby mantle in Earth's top 1,000 kilometers.

The paper is titled "Mantle updrafts and mechanisms of oceanic volcanism."

Source: California Institute of Technology

New, inexpensive method for understanding earthquake topography

Using high-resolution topography models not available in the past, geologists can greatly enrich their research. However, current methods of acquisition are costly and require trained personnel with high-tech, cumbersome equipment. In light of this, Kendra Johnson and colleagues have developed a new system that takes advantage of affordable, user-friendly equipment and software to produce topography data over small, sparsely vegetated sites at comparable (or better) resolution and accuracy to standard methods.

Their workflow is based on structure from motion (SfM), which uses overlapping photographs of a scene to produce a 3-D model that represents the shape and scale of the terrain. To acquire the photos, Johnson and colleagues attached a camera programmed to take time-lapse photos to a helium balloon or small, remote-controlled glider. They augmented the aerial data by recording a few GPS points of ground features that would be easily recognized in the photographs.

Using a software program called Agisoft Photoscan, they combined the photographs and GPS data to produce a robust topographic model.

Johnson and colleagues note that this SfM workflow can be used for many geologic applications. In this study for Geosphere, Johnson and colleagues focused on its potential in studying active faults that pose an earthquake hazard.

They targeted two sites in southern California, each of which has existing topography data collected using well-established, laser-scanning methods.

The first site covers a short segment of the southern San Andreas fault that historically has not had a large earthquake; however, the ground surface reveals evidence of prehistoric ruptures that help estimate the size and frequency of earthquakes on this part of the fault. The team notes that this evidence is more easily quantified using high-resolution topography data than by geologists working in the field.

The second site covers part of the surface rupture formed during the 1992 Landers earthquake (near Palm Springs, California, USA). Johnson and colleagues chose this site to test the capability of their workflow as part of the scientific response that immediately follows an earthquake.

At each site, they compared their SfM data to the existing laser scanner data and found that the values closely matched. Johnson and colleagues conclude that their new SfM workflow produces topography data at sufficient quality for use in earthquake research.

Source: Geological Society of America

Can a stack of computer servers survive an earthquake?


The rack of servers shook, but did not fall, during a simulation that mimicked 80 percent of the force of 1994's Northridge earthquake. Credit: Cory Nealon, University at Buffalo

How do you prevent an earthquake from destroying expensive computer systems?

That's the question earthquake engineer Claudia Marin-Artieda, PhD, associate professor of civil engineering at Howard University, aims to answer through a series of experiments conducted at the University at Buffalo.

"The loss of functionality of essential equipment and components can have a disastrous impact. We can limit these sorts of equipment losses by improving their seismic performance," Marin-Artieda said.

In buildings such as data centers, power plants and hospitals, it could be catastrophic to have highly sensitive equipment swinging, rocking, falling and generally bashing into things.

In high-seismic regions, new facilities often are engineered with passive protective systems that provide overall seismic protection. But often, existing facilities are conventional fixed-base buildings in which seismic demands on sensitive equipment located within are significantly amplified. In such buildings, sensitive equipment needs to be secured from these damaging earthquake effects, Marin-Artieda said.

The stiffer the building, the greater the magnification of seismic effects, she added.

"It is like when you are riding a rollercoaster," she said. "If your body is relaxed, you don't feel strong inertial effects. But if you hold your body rigid, you'll feel the inertial effects much more, and you'll get knocked about in the car."

The experiments were conducted this month at the University at Buffalo's Network for Earthquake Engineering Simulation (NEES), a shared network of laboratories based at Purdue University.

Marin-Artieda and her team used different devices for supporting 40 computer servers donated by Yahoo Labs. The researchers attached the servers to a frame in multiple configurations on seismically isolated platforms. They then subjected the frame to a variety of three-directional ground motions with the servers in partial operation to monitor how they react to an earthquake simulation.

Preliminary work confirmed, among other things, that globally and locally installed seismic isolation and damping systems can significantly reduce damage to computer systems and other electronic equipment.

Base isolation is a technique that sets objects atop an energy-absorbing base; damping employs energy-absorbing devices within the object to be protected from an earthquake's damaging effects.
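The benefit of a soft, isolated base over a stiff mounting can be illustrated with the textbook single-degree-of-freedom transmissibility formula. This is a generic vibration-isolation sketch, not a model of the specific devices tested; the frequency and damping values are illustrative.

```python
import math

def transmissibility(freq_ratio, damping_ratio):
    """Steady-state transmissibility of a single-degree-of-freedom system
    under harmonic base motion (standard textbook formula).

    freq_ratio r = forcing frequency / system natural frequency;
    isolation (transmissibility < 1) requires r > sqrt(2).
    """
    r, z = freq_ratio, damping_ratio
    num = 1.0 + (2.0 * z * r) ** 2
    den = (1.0 - r ** 2) ** 2 + (2.0 * z * r) ** 2
    return math.sqrt(num / den)

# Stiff mounting: natural frequency near the shaking frequency amplifies motion.
print(round(transmissibility(0.5, 0.05), 2))
# Isolated base: natural frequency well below the shaking frequency attenuates it.
print(round(transmissibility(5.0, 0.05), 2))
```

This is why isolation platforms are deliberately made flexible: pushing the system's natural frequency far below the dominant shaking frequency means most of the ground motion never reaches the equipment.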
Marin-Artieda plans to expand the research by developing a framework for analysis, design and implementation of the protective measures.

The research is funded by the National Science Foundation. In addition to Yahoo Labs, industry partners include Seismic Foundation Control Inc., The VMC Group, Minus K Technology Inc., Base Isolation of Alaska, and Roush Industries Inc. All provided in-kind materials for the experiments.

Video showing one of the tests, which mimics 80 percent of the force of 1994's Northridge earthquake: https://www.youtube.com/watch?v=hTkemnt8hR4

Source: University at Buffalo

Seismic hazards reassessed in the Andes

Gulf of Guayaquil and Andes. Credit: © IRD / L. Audin
Although predicting the date of the next big earthquake is still some way off, it is now possible to identify the areas where one will occur. IRD researchers and their French, Ecuadorian and Peruvian partners have, for the first time, used GPS to measure the current deformation in the northern part of the Andes, where the tectonics of the Pacific and South American plates drive the region's high seismic activity. The scientists then identified the areas where the fault at the interface of these two plates is, or is not, capable of generating large earthquakes.

This work, which was published in Nature Geoscience, also shed light on the formation of large tectonic structures such as the Bolivian highlands and the Gulf of Guayaquil in Ecuador, with the discovery of a continental microplate in Peru and southern Ecuador.

First measurement of the deformation in the northern Andes
The Andes have experienced three of the largest earthquakes ever recorded: on the border between Colombia and Ecuador in 1906, and in Chile in 1960 and again in 2010. When will one of these major earthquakes happen there again? It is impossible to say... But scientists can now identify the areas where it will occur. Researchers from the Géoazur, ISTerre and ISTEP laboratories and their partners from geophysical and geographical institutes in Ecuador and Peru have just measured the deformation in the northern Andes caused by the subduction of the Pacific oceanic plate under the South American continental plate. Using a vast GPS network deployed since 2008 and observational data collected since the 1990s, they have quantified the movements of 100 measurement points from central Peru to southern Colombia, with an accuracy of about one millimetre per year.

Clearly determined seismic areas
The researchers were able to locate the areas at risk. Only two fault segments can produce mega-earthquakes (greater than 8.5 on the Richter scale), potentially accompanied by tsunamis: the first is located in central Peru and the second is further north, extending from northern Ecuador to southern Colombia. In between these two active segments, the research team identified a third subduction segment. Somewhat surprisingly, this is characterised by sliding that is mainly "aseismic." So in this area spanning more than 1,000 km from the north of Peru to the south of Ecuador, or 20% of the length of the Andean subduction, the accumulated energy seems insufficient to produce a mega-earthquake. Across the region, earthquakes remain more superficial and more modest in magnitude, as shown in recent history.

Andean structures explained
These studies have also enabled the researchers to discover a large continental block, wedged between the Pacific and South American plates. This piece of continent was called the "sliver Inca" by the authors of the study and is more than 1,500 km long and 300 to 400 km wide. It is separated from the continental plate and moves 5 to 6 mm per year towards the south-east in relation to it. This finding suggests that the current deformation of the Andes from Venezuela to southern Chile, and the seismic activity in the region are dominated by the movements of several microplates of that type.

The discovery of the "sliver Inca" also explains the location of major tectonic structures. For example, the Bolivian highlands, the second highest plateau in the world, was created by the "sliver Inca" and the central Andes microplate coming together. In contrast, the opening of the Gulf of Guayaquil in Ecuador is a result of the divergence of the Inca block and the northern Andes microplate.

These studies allow a better understanding of recent developments in the Andes and their continental margins. They therefore make better estimates of seismic hazards in the region possible.

Source: Institut de Recherche pour le Développement (IRD)

Likely near-simultaneous earthquakes complicate seismic hazard planning for Italy

Before the shaking from one earthquake ends, shaking from another might begin, amplifying the effect of ground motion. Such sequences of closely timed, nearly overlapping, consecutive earthquakes account for devastating seismic events in Italy's history and should be taken into account when building new structures, according to research published in the September issue of the journal Seismological Research Letters (SRL).

"It's very important to consider this scenario of earthquakes, occurring possibly seconds apart, one immediately after another," said co-author Anna Tramelli, a seismologist with the Istituto Nazionale di Geofisica e Vulcanologia in Naples, Italy. "Two consecutive mainshocks of magnitude 5.8 could have the effect of a magnitude 6 earthquake in terms of energy release. But the effect on a structure could be even larger than what's anticipated from a magnitude 6 earthquake due to the longer duration of shaking that would negatively impact the resilience of a structure."
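Tramelli's energy comparison can be checked with the standard magnitude-energy relation, in which radiated energy scales as 10^(1.5M); the constant of proportionality cancels when comparing events. A quick sketch:

```python
import math

def combined_magnitude(magnitudes):
    """Magnitude whose radiated energy equals the summed energy of the events.

    Uses the standard scaling log10(E) = 1.5*M + const; the constant
    cancels when only energy ratios matter.
    """
    total_energy = sum(10.0 ** (1.5 * m) for m in magnitudes)
    return math.log10(total_energy) / 1.5

# Two back-to-back magnitude-5.8 mainshocks release the energy of
# roughly one magnitude-6.0 event, as stated in the article.
print(round(combined_magnitude([5.8, 5.8]), 1))  # → 6.0
```

Note the asymmetry Tramelli highlights: the energy adds up to a magnitude 6, but the shaking lasts twice as long, which the magnitude alone does not capture.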

Historically, multiple triggered mainshocks, with time delays of seconds to days, have caused deadly earthquakes along the Italian Apennine belt, a series of central mountain ranges extending the length of Italy. The 1997-98 Umbria-Marche seismic sequence numbered six mainshocks of moderate magnitude, ranging M 5.2 -- 6.0. The 1980 Irpinia earthquake included a sequence of three events occurring within 20 seconds of each other. The 2012 Emilia sequence started with an M 5.9 event, with the second-largest mainshock (M 5.8) occurring nine days later, and included more than 2,000 aftershocks.

In this study, Tramelli and her colleagues used the recorded waveforms from the 2012 Emilia seismic sequence to simulate a seismic sequence that triggered end-to-end earthquakes along adjacent fault patches, observing the effect of continuous ruptures on the resulting ground motion and, consequently, its impact on critical structures, such as dams, power plants, hospitals and bridges.

"We demonstrated that consecutively triggered earthquakes can enhance the amount of energy produced by the ruptures, exceeding the design specifications expected for buildings in moderate seismic hazard zones," said Tramelli, whose analysis suggests that the shaking from multiple magnitude 5.0 earthquakes would be significantly greater than from an individual magnitude 5.0 event.

And back-to-back earthquakes are more than theoretical, say the authors, who note that this worst-case scenario has happened at least once in Italy's recent history. Previous studies identified three sub-events at intervals of 20 seconds in the seismic signals recorded during the 1980 Irpinia earthquake sequence, whose combined ground motion caused more than 3,000 deaths and significant damage to structures.

A "broader and modern approach" to seismic risk mitigation in Italy, suggest the authors, would incorporate the scenario of multiple triggered quakes, along with the present understanding of active fault locations, mechanisms and interaction.

Source: Seismological Society of America

Pacific plate shrinking as it cools

A map produced by scientists at the University of Nevada, Reno, and Rice University shows predicted velocities for sectors of the Pacific tectonic plate relative to points near the Pacific-Antarctic ridge, which lies in the South Pacific ocean. The researchers show the Pacific plate is contracting as younger sections of the lithosphere cool.
Credit: Corné Kreemer and Richard Gordon

The tectonic plate that dominates the Pacific "Ring of Fire" is not as rigid as many scientists assume, according to researchers at Rice University and the University of Nevada.

Rice geophysicist Richard Gordon and his colleague, Corné Kreemer, an associate professor at the University of Nevada, Reno, have determined that cooling of the lithosphere -- the outermost layer of Earth -- makes some sections of the Pacific plate contract horizontally at faster rates than others, causing the plate to deform.

Gordon said the effect, detailed this month in Geology, is most pronounced in the youngest parts of the lithosphere -- about 2 million years old or less -- that make up some of the Pacific Ocean's floor. The researchers predict these sections contract 10 times faster than older parts of the plate created about 20 million years ago and 80 times faster than very old parts created about 160 million years ago.

The tectonic plates that cover Earth's surface, including both land and seafloor, are in constant motion; they imperceptibly surf the viscous mantle below. Over time, the plates scrape against and collide into each other, forming mountains, trenches and other geological features.

On the local scale, these movements cover only inches per year and are hard to see. The same goes for deformations of the type described in the new paper, but when summed over an area the size of the Pacific plate, they become statistically significant, Gordon said.

The new calculations showed the Pacific plate is pulling away from the North American plate a little more -- approximately 2 millimeters a year -- than the rigid-plate theory would account for, he said. Overall, the plate is moving northwest about 50 millimeters a year.

"The central assumption in plate tectonics is that the plates are rigid, but the studies that my colleagues and I have been doing for the past few decades show that this central assumption is merely an approximation -- that is, the plates are not rigid," Gordon said. "Our latest contribution is to specify or predict the nature and rate of deformation over the entire Pacific plate."

The researchers already suspected cooling had a role from their observation that the 25 large and small plates that make up Earth's shell do not fit together as well as the "rigid model" assumption would have it. They also knew that lithosphere as young as 2 million years was more malleable than hardened lithosphere as old as 170 million years.

"We first showed five years ago that the rate of horizontal contraction is inversely proportional to the age of the seafloor," he said. "So it's in the youngest lithosphere (toward the east side of the Pacific plate) where you get the biggest effects."
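Taking the quoted inverse-age relation at face value reproduces the contraction ratios cited earlier; the proportionality itself is the only assumption in this sketch (ages in millions of years):

```python
def contraction_ratio(young_age_myr, old_age_myr):
    """Ratio of horizontal contraction rates for two seafloor ages,
    assuming the rate is inversely proportional to lithosphere age."""
    return old_age_myr / young_age_myr

# 2-Myr-old lithosphere vs. 20-Myr-old and 160-Myr-old sections:
print(contraction_ratio(2, 20))   # → 10.0
print(contraction_ratio(2, 160))  # → 80.0
```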

The researchers saw hints of deformation in a metric called plate circuit closure, which describes the relative motions where at least three plates meet. If the plates were rigid, their angular velocities at the triple junction would have a sum of zero. But where the Pacific, Nazca and Cocos plates meet west of the Galápagos Islands, the nonclosure velocity is 14 millimeters a year, enough to suggest that all three plates are deforming.
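The closure test itself is simple vector arithmetic: sum the relative velocities around the circuit and check whether the residual is zero. The component values below are illustrative placeholders, not the measured Galápagos data:

```python
import math

def circuit_nonclosure(relative_velocities):
    """Magnitude of the vector sum of relative plate velocities around a
    triple junction (east, north components in mm/yr). For perfectly
    rigid plates the circuit closes and the result is zero."""
    east = sum(v[0] for v in relative_velocities)
    north = sum(v[1] for v in relative_velocities)
    return math.hypot(east, north)

# Hypothetical Pacific→Nazca, Nazca→Cocos, Cocos→Pacific velocities that
# fail to close, mimicking the kind of misfit reported west of the
# Galápagos Islands (these particular numbers are made up).
velocities = [(60.0, 5.0), (-30.0, 40.0), (-20.0, -45.0)]
print(round(circuit_nonclosure(velocities), 1))
```

A persistent nonzero residual, as in the Pacific-Nazca-Cocos circuit, is the signature that at least one of the plates is deforming internally.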

"When we did our first global model in 1990, we said to ourselves that maybe when we get new data, this issue will go away," Gordon said. "But when we updated our model a few years ago, all the places that didn't have plate circuit closure 20 years ago still didn't have it."

There had to be a reason, and it began to become clear when Gordon and his colleagues looked beneath the seafloor. "It's long been understood that the ocean floor increases in depth with age due to cooling and thermal contraction. But if something cools, it doesn't just cool in one direction. It's going to be at least approximately isotropic. It should shrink the same in all directions, not just vertically," he said.

A previous study by Gordon and former Rice graduate student Ravi Kumar calculated the effect of thermal contraction on vertical columns of oceanic lithosphere and determined its impact on the horizontal plane, but viewing the plate as a whole demanded a different approach. "We thought about the vertically integrated properties of the lithosphere, but once we did that, we realized Earth's surface is still a two-dimensional problem," he said.

For the new study, Gordon and Kreemer started by determining how much the contractions would, on average, strain the horizontal surface. They divided the Pacific plate into a grid and calculated the strain on each of the nearly 198,000 squares based on their age, as determined by the seafloor age model published by the National Geophysical Data Center.
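The per-cell calculation can be sketched as follows. The actual strain-rate-versus-age function the authors used is not given in this article, so an inverse-age form, consistent with the relation quoted earlier, stands in for it; the tiny grid and the constant K are placeholders.

```python
import numpy as np

# Hypothetical seafloor-age grid (Myr) standing in for the National
# Geophysical Data Center age model; the real grid has ~198,000 cells.
ages = np.array([[2.0, 5.0, 20.0],
                 [10.0, 40.0, 160.0]])

K = 1.0e-9  # assumed proportionality constant (strain per year)
strain_rates = K / ages  # assumed inverse-age form of the rate-age relation

# Younger cells contract fastest; summing cell strains over the grid
# gives the plate-wide deformation budget.
ratio = float(strain_rates.max() / strain_rates.min())
print(round(ratio, 3))
```

Because each cell depends only on its own age, the whole grid is an element-wise array operation, which is why, as Gordon notes below, it runs comfortably on a laptop.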

"That we could calculate on a laptop," Gordon said. "If we tried to do it in three dimensions, it would take a high-powered computer cluster."

The surface calculations were enough to show likely strain fields across the Pacific plate that, when summed, accounted for the deformation. As further proof, the distribution of recent earthquakes in the Pacific plate, which also relieve the strain, showed a greater number occurring in the plate's younger lithosphere. "In the Earth, those strains are either accommodated by elastic deformation or by little earthquakes that adjust it," he said.

"The central assumption of plate tectonics assumes the plates are rigid, and this is what we make predictions from," said Gordon, who was recently honored by the American Geophysical Union for writing two papers about plate movements that are among the top 40 papers ever to appear in one of the organization's top journals. "Up until now, it's worked really well."

"The big picture is that we now have, subject to experimental and observational tests, the first realistic, quantitative estimate of how the biggest oceanic plate departs from that rigid-plate assumption."

The National Science Foundation supported the research. Gordon is the Keck Professor of Geophysics and chairman of the Earth Science Department at Rice.

Source: Rice University