
ANU researchers gain greater insight into speed, source of tsunamis

Megan Gorrey

Researchers at the Australian National University have used new data to tweak tsunami simulation models and provide more accurate information about the timing and shape of the deadly giant waves.

Phil Cummins and Sebastien Allgeyer have used the country's biggest supercomputer, nicknamed "Raijin", at the National Computational Infrastructure centre at ANU, to investigate the scientific models used to detect and monitor tsunamis.

The pair used the supercomputer to simulate the giant wave that hit Chile in 2010 and the Japanese tsunami of 2011. They noticed the tsunamis travelled more slowly than the computerised models predicted.

They discovered the standard model did not take into account the huge amount of energy generated during a tsunami, which was enough to "bend" the ocean floor.

Accounting for that effect meant scientists could, for the first time, more accurately predict when a tsunami would arrive at shore and what shape the wave would take, Professor Cummins said.

"This is the first time this effect has been included in practical tsunami simulation," he said.

"We believe the source models from tsunamis will change significantly once this effect is taken into account."

Dr Allgeyer said the findings were important for understanding how tsunamis originated.

"The quality of tsunami data has improved tremendously over the past decade, but our deep-ocean tsunami modelling hasn't," he said.

"In many cases the only data we have to model the tsunami excitation is far-field data, and now we can use that far-field data to better understand the tsunami source."

Professor Cummins said scientists across the world saw a pressing need for better observational data on the giant waves following the devastating Boxing Day tsunami in 2004, which killed more than 230,000 people.

It prompted researchers at the National Oceanic and Atmospheric Administration in the United States to develop a worldwide monitoring system known as DART, or Deep-ocean Assessment and Reporting of Tsunamis.

It is made up of a network of pressure sensors anchored to the ocean floor, each paired with a surface buoy, with most located around the volatile "Ring of Fire" in the Pacific Ocean. The sensors detect changes in pressure in the surrounding water.

Data collected from the network has been used to build the scientific models that predict the timing and impact of a tsunami.

However, researchers noticed a couple of discrepancies between the model's predictions and observations that they could not explain.

"The shape looked different," Professor Cummins said.

"The tsunami always had this trough in front of it, and we couldn't work out why, and it was also delayed a bit."

Professor Cummins said the standard model assumed the sea floor was rigid, and did not factor in the elasticity of the earth's surface. 

The amount of energy generated during a tsunami caused the ocean floor to buckle and shift by a matter of millimetres, and generally no more than one centimetre. 

Including those tiny movements in the simulations brought them much closer to the real-life behaviour of the tsunamis the researchers studied.
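The scale of the delay involved can be illustrated with the standard shallow-water relationship, in which a tsunami's speed depends only on gravity and ocean depth (c = √(gh)). The sketch below is illustrative only: the depth, distance and one-per-cent slowdown figures are assumptions for the example, not numbers from the ANU research.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2


def shallow_water_speed(depth_m):
    """Tsunami phase speed in the shallow-water limit: c = sqrt(g * h)."""
    return math.sqrt(G * depth_m)


def travel_time_s(distance_m, depth_m, speed_factor=1.0):
    """Travel time over a flat-ocean path.

    speed_factor < 1 models a wave that travels slightly slower than the
    rigid-sea-floor prediction (e.g. because the sea floor flexes).
    """
    return distance_m / (shallow_water_speed(depth_m) * speed_factor)


# Illustrative numbers: a 4 km deep ocean, a 10,000 km trans-Pacific path,
# and an assumed ~1 per cent slowdown from sea-floor flexure.
depth = 4000.0
distance = 10_000e3
rigid = travel_time_s(distance, depth)          # rigid-sea-floor model
elastic = travel_time_s(distance, depth, 0.99)  # with the assumed slowdown
print(f"arrival delay: {(elastic - rigid) / 60:.1f} minutes")
```

Even a slowdown of around one per cent compounds into a delay of several minutes over a trans-Pacific path, which is the kind of arrival-time discrepancy a rigid-sea-floor model cannot explain.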

Professor Cummins said he did not expect the new modelling would change the emergency warning systems in place. 

"But we're trying to use some of the data to get better source models for some of the tsunamis that have taken place in the past few years."
