Battling data in the eXtreme

By Nathan Statz

March/April Edition, 2008: Some of the most remote places on the planet are also home to sophisticated IT deployments for everything from monitoring the ash plumes of an active volcano to tracking tidal levels at tsunami warning centres.

The first reaction humans have to something out of the ordinary is to gaze intently and soak up as much information as our confused brains can handle. It’s no surprise, then, that the sight of waves at the beach suddenly receding fifty metres, exposing coastline that’s usually nestled beneath a blanket of water, draws all the beachgoers forward for a better look.

Unfortunately for most people, the water being sucked out to sea is a strong indication that a tsunami is about to hit, and by then it’s too late to outrun the wave on foot. Only those in the know realise that this is a sign to climb to the highest ground you can find.

Relying on the ocean to hike its skirts up right before splashing all over the countryside is hardly a reliable detection method, particularly when this phenomenon only occurs if the leading edge of the tsunami is a trough rather than a crest.

Paul Whitmore, director of the West Coast/Alaska Tsunami Warning Center, explains that tsunami detection is based on seismic and sea level data streamed directly to research teams every day. The warnings come from real-time processing of that data, where even a few minutes of delay could mean the difference between a coastline being evacuated in time or not.

On the hardware side, Whitmore and his team are running standard off-the-shelf technology, though it’s a different story for the actual monitoring equipment. “We have had to design and install most of our recording sites. There is no turn-key system for this type of work,” he says.

Keeping a firm eye on the sea means tracking everything from coastal tide gauge data to deep-ocean pressure sensor readings, all of which are transmitted to and processed at the warning centre.

Software-wise, it’s a little hard to buy a shrink-wrapped copy of “Tsunami Warning for Dummies”, so most of the software gets written in-house. “We use the USGS (United States Geological Survey) Earthworm architecture as a base platform, and create code as necessary. Software includes seismic data analysis, sea level data analysis, GIS, message and graphic creation, tsunami forecast models, and some other specialized codes,” says Whitmore.
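
To give a sense of what that in-house layer might involve, here is a minimal, hypothetical sketch of a sea level screen of the sort that could sit on top of a platform like Earthworm. The station IDs, data format and half-metre threshold are illustrative assumptions, not the centre’s actual code.

# Hypothetical sketch only -- not the warning centre's code. A simple screen
# over incoming sea level samples of the kind that might sit on top of a
# platform like Earthworm. Station IDs, format and threshold are illustrative.
from dataclasses import dataclass

@dataclass
class SeaLevelSample:
    station: str      # tide gauge identifier (made up for this example)
    time_utc: float   # Unix timestamp of the reading
    level_m: float    # sea level in metres relative to the station datum

def flag_anomalies(samples, baseline_m, threshold_m=0.5):
    """Return samples whose level deviates from the baseline by more than
    the threshold -- a crude first pass before an analyst looks at the trace."""
    return [s for s in samples if abs(s.level_m - baseline_m) > threshold_m]

readings = [
    SeaLevelSample("GAUGE_A", 1206950400.0, 1.02),
    SeaLevelSample("GAUGE_A", 1206950460.0, 1.74),  # suspicious jump
]
print(flag_anomalies(readings, baseline_m=1.0))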

Most of the data analysis begins when seismic activity is first recorded, which often starts far out to sea with an undersea earthquake, landslide or volcanic activity. Whitmore explains that this data is extremely important: without it there could be no prediction of when a wave may arrive, or how large it will be when it does. Tsunamis often travel at more than 500km/h, so it isn’t unheard of for waves to cross entire oceans at top speed before expending their energy on dry land.
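
That 500km/h figure makes for a simple back-of-the-envelope arrival estimate. The sketch below is just that arithmetic, with an assumed distance; real warnings come from the centre’s forecast models, not a constant-speed calculation.

# Constant-speed arrival estimate using the ~500km/h figure quoted above.
# The 2,000km distance is assumed for illustration; real warnings come from
# proper forecast models, not this arithmetic.
def eta_hours(distance_km, speed_kmh=500.0):
    """Hours for a tsunami front to cover the distance at a constant speed."""
    return distance_km / speed_kmh

print(f"{eta_hours(2000):.1f} hours of warning")  # prints "4.0 hours of warning"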

“Tsunami formation is not directly monitored. We base our warnings initially on seismic data, and refine them based on sea level readings, forecast models, and historical data. Unfortunately, there is not a one-to-one correspondence between earthquake recordings and tsunami size,” said Whitmore.

Tsunami monitoring is a highly regionalised affair, with no centralised location acting as the brains of a global co-operation scheme. Whitmore explains that while there is some collaboration between centres, each centre is responsible for a certain area of the world and reports to that region. These reports go out to all levels of authority, from local, state and federal emergency management agencies to the coast guard and military units. According to Whitmore, the centre records several tsunamis each year, but on average only one or two are actually damaging.

When the ground starts shaking

If you don’t live in an area like eastern Tokyo or the southern tip of Florida, it’s easy to forget that many communities live under the constant threat of the ground getting a good shaking. IT has its place in this extreme field too: no pen flying across paper could ever keep pace with supercomputers crunching numbers and spitting out data that saves lives.

Dr. Harley Benz, scientist-in-charge at the National Earthquake Information Center (NEIC), explains that monitoring is a globally linked affair, with seismic data streaming in from 1,000 different stations worldwide.

“We integrate seismic data in real-time and generate about 10 Gigabytes (GB) of data per day from what we acquire,” says Benz.
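
Benz’s figure invites a quick bit of arithmetic: spread evenly across roughly 1,000 stations, 10GB a day works out to only about 10MB per station, or a little over 120 bytes a second on average. The snippet below shows the working; the even per-station split is an illustrative simplification.

# Rough arithmetic on the volumes Benz quotes: ~10GB a day across roughly
# 1,000 stations. The even per-station split is a simplification.
stations = 1_000
daily_total_bytes = 10 * 1024**3            # ~10GB per day
per_station_per_day = daily_total_bytes / stations
per_station_per_sec = per_station_per_day / 86_400

print(f"{per_station_per_day / 1024**2:.1f} MB per station per day")  # ~10.2 MB
print(f"{per_station_per_sec:.0f} bytes per station per second")      # ~124 bytes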

This data load is handled by a server nest of Sun and Linux boxes, with several USGS IBM servers adding to the mix. The NEIC is connected to the Global Seismographic Network, which is run by the USGS and encompasses 140 stations worldwide, plus a further 100 real-time broadband stations within North America.

“These data streams are viewed as critical networks, aimed at complementing the data we get from national providers like Geoscience Australia, who provide us with a link to their high-quality network as part of their tsunami monitoring mission,” says Benz.

The stations the NEIC sets up are generally seismometers for recording tremors in the earth’s crust, and they are located in extremely remote places to cut down on background noise.

Benz explains that this background noise, known as ‘cultural noise’ in the industry, can corrupt seismic readings and has motivated the USGS to bury stations in abandoned mine shafts or place them on extremely remote and hard-to-access islands. Undersea stations are also in operation but are being phased out due to the cost of maintaining them and the high probability of losing one.

“There are very few, if any, stations left under the sea, with almost all data now coming from island sites. In the past there have been a few cases of ocean-floor seismometers, but these are difficult to deploy and maintain and can be unreliable,” says Benz.

The earthquake monitoring networks now all use common protocols, which Benz believes is a major step forward, as no one has to do it entirely on their own any more.

The remote earthquake monitoring stations are completely automated: a data logger records the seismic activity, which is then digitised into IP packets and sent via wireless links such as VSATs to NEIC headquarters.
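
What “digitised into IP packets” might look like in the simplest possible terms is sketched below, assuming a plain UDP sender and a made-up binary record layout. Real data loggers and VSAT uplinks use their own protocols; this is only an illustration of packetising samples and pushing them to a collector.

# Illustration only: packetising digitised samples and pushing them over UDP,
# the way a remote logger might hand readings to a VSAT uplink. The record
# layout, host and port are made up; real loggers use vendor protocols.
import socket, struct, time

def send_samples(samples, host="127.0.0.1", port=9999, station_id=42):
    """Pack each (timestamp, count) pair into a fixed-width binary record and
    send one UDP datagram per sample to the collector address."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    for ts, count in samples:
        # 16-bit station ID, 64-bit float timestamp, 32-bit signed sample count
        packet = struct.pack("!Hdi", station_id, ts, count)
        sock.sendto(packet, (host, port))
    sock.close()

# "127.0.0.1" stands in for the centre's collector endpoint in this demo.
send_samples([(time.time(), 1287), (time.time() + 0.01, 1301)])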

Monitoring stations are now so numerous that in Japan alone there are tens of thousands of them, according to Benz, though the NEIC doesn’t take in data from these sources if the earthquake is weaker than 4.5 on the Richter scale. There are other reasons why earthquakes might not find their way into NEIC databases, such as an event failing to be recorded by a minimum of 14 stations. When that happens, there aren’t enough stations to compare and contrast the data against, so the readings aren’t considered accurate enough to process.
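
Those two cut-offs, a magnitude of at least 4.5 and readings from at least 14 stations, amount to a very simple screening rule. A toy version might look like the following; the function and its inputs are assumptions made purely for illustration, not the NEIC’s actual logic.

# Toy version of the two cut-offs described above: magnitude at least 4.5 and
# detections from at least 14 stations. Purely illustrative.
MIN_MAGNITUDE = 4.5
MIN_STATIONS = 14

def accept_event(magnitude, reporting_stations):
    """True if the event is strong enough and corroborated by enough stations
    to be worth entering into the database."""
    return magnitude >= MIN_MAGNITUDE and reporting_stations >= MIN_STATIONS

print(accept_event(5.1, 22))   # True  -- logged
print(accept_event(5.1, 9))    # False -- too few stations to cross-check
print(accept_event(3.8, 30))   # False -- below the magnitude threshold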

This also affects reaction time in earthquake monitoring: Benz explains that a recent earthquake in Indonesia took eight minutes to register, because that is how long it took from the first tremor for a 14th station to pick it up and for the event to be properly recorded. This places even more importance on the IT deployment, as many lives could be lost should critical monitoring stations fail.

“We’re heavily dependent on technology. Operationally it’s a very small staff, but from an IT perspective it’s a highly automated flow-through process, and in order to make it far more efficient, everything we do is in real time and that’s why we are so reliant on the data,” says Benz.

The NEIC database hooks directly into an email and SMS messaging service that shoots out earthquake warnings. This system sent out over 900,000 emails during the infamous Solomon Islands earthquakes, as savage magnitude 6 aftershocks went thundering across the islands.
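
A bare-bones sketch of that kind of fan-out is shown below: one event record becomes one message per subscriber. The message format, subscriber list and the print stand-in for the delivery gateway are all placeholders; the real service sits behind the NEIC database and its own infrastructure.

# Placeholder sketch of an alert fan-out: one event becomes one message per
# subscriber. Message format and addresses are invented; print() stands in
# for the real email/SMS delivery gateway behind the NEIC database.
def format_alert(magnitude, region, depth_km):
    return (f"EQ ALERT: M{magnitude:.1f} earthquake, {region}, "
            f"depth {depth_km:.0f} km")

def fan_out(alert_text, subscribers, send):
    """Call the supplied send() function once for every subscriber address."""
    for address in subscribers:
        send(address, alert_text)

fan_out(format_alert(6.0, "Solomon Islands region", 10),
        ["user@example.com", "+61400000000"],
        lambda addr, msg: print(f"-> {addr}: {msg}"))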

“It’s surprising that 75% of our subscribers are .com or .net registrants, so public users in effect. We’re also the primary contact for a number of federal agencies,” says Benz.

Mountains of Fire

While the fires of Mount Doom in J.R.R. Tolkien’s classic Lord of the Rings were a fictional creation, the shadow of the active volcanoes that many millions of people live under today is far from storytelling.

For those who study the ruptures in the earth’s crust that spew forth molten rock, the perils to man are also perils to the IT equipment that monitors them. Specially designed sensors relay potentially life-saving information, ranging from seismic activity readings to satellite imagery of ash plumes.

According to Dr. Jonathan Dehn, associate research professor at the Alaska Volcano Observatory (AVO), the main body of work in volcano monitoring is around thermal anomalies and ash plume detection, with the main limitation being programmer time. This constraint is born of the need for all software to be written in-house, along with most of the algorithms, with only some coming from international collaborators.

The AVO has been entrenched in an automated detection project in an attempt to alleviate this bottleneck, with constantly improving success rates, particularly around ash plume detection during the day. This is of particular importance because some volcanoes give little to no warning before they start spewing forth pyroclastic material.

“Some volcanoes may give us days of warnings, others none at all and our first clue is a full-fledged eruption. Gathering the best baseline data to look for any deviation is the best predictive tool. We have over 15 years of satellite data for these volcanoes, and are now able to generate very accurate profiles of their activity,” says Dehn.
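
Dehn’s baseline-and-deviation approach can be illustrated with a deliberately simplified sketch: compare the latest satellite-derived thermal reading for a volcano against its long-run history and flag large departures. The threshold and brightness temperatures below are assumptions for illustration, not AVO’s algorithms.

# Deliberately simplified version of the baseline-and-deviation idea: flag a
# satellite thermal reading that sits well above a volcano's long-run history.
# Threshold and brightness temperatures are illustrative assumptions.
from statistics import mean, stdev

def is_thermal_anomaly(history, latest, sigma=3.0):
    """Flag the latest reading if it is more than `sigma` standard deviations
    above the historical mean."""
    mu, sd = mean(history), stdev(history)
    return latest > mu + sigma * sd

baseline = [271.2, 270.8, 271.5, 270.9, 271.1, 271.3]  # brightness temps (K)
print(is_thermal_anomaly(baseline, 271.4))  # False -- within normal scatter
print(is_thermal_anomaly(baseline, 278.0))  # True  -- possible hotspot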

The AVO has working relationships with other volcano observatories such as Cascades, Kamchatka and Sakhalin, although in many cases they receive AVO alerts rather than assist in automated detection.

“We provide the remote sensing service that they do not have. In monitoring sadly, support is hard to obtain, so we avoid duplicating effort. Our only redundancy should be in systems needed to make sure the data keeps flowing.”

Guy Tytgat, researcher at the AVO, explains that one of the biggest variables in volcano monitoring is the activity of the menacing mountains themselves. “Along the Aleutian Islands, there are approximately 45 active volcanoes, but some of them are much more active than others,” he says. “Some of the active volcanoes will have an eruption every few years; others won’t do anything for hundreds of years.

“The computerization of the data flow and data analysis has helped the monitoring of volcanoes in a tremendous way,” continues Tytgat. “It has allowed us to look at details from signals and to look at them in totally new ways, which would have been impossible before with plain analog data. This in turn allows us to ‘see’ the first signs of unrest much earlier than before, and we are more confident in projecting scenarios of eruptive behaviors than without those additional views of the data stream.”

This reliance on IT is a common theme across the extreme situations in which monitoring and research stations operate, and for the countless millions living in high-risk areas, there is a great deal of relief that it is there. While many executives worry about their datacentres and redundancy, spare a thought for those monitoring 30-metre-high waves or rivers of molten fire, and what they have to put up with should critical systems fail.