World’s largest science project gets some local Supercomputing help

By XXXXXX

Month Date, 2007: 2008 is shaping up to be an interesting year for scientific breakthroughs, and the University of Melbourne will be no stranger to the action. With supercomputing power on hand, the Melbourne Physics Department will assist in the Large Hadron Collider (LHC) project, a massive physics experiment expected to become the world’s largest high-energy particle accelerator and to provide answers to a vital missing link in the Standard Model of physics.

While most of us don’t often focus on what physicists and other scientists are up to with their intricate experiments, one such project is so massive in scale that it’s difficult not to notice. Australian involvement in the international LHC project focuses on ATLAS, the largest of six particle detector experiments being conducted in Geneva, Switzerland, and an integral part of the world’s biggest science project.

This project aims to shed light on questions about a wide range of topics, including the fundamental nature of matter and the basic forces that shape our universe. Priced at around $6 billion, the project is financed by thirty-four different countries, universities and laboratories, and will be housed in a 27km underground tunnel that runs beneath both France and Switzerland.

The experiment is being conducted using the LHC, a giant particle accelerator and collider. The LHC will be fired up in 2008, when it will begin to generate 14 petabytes of data, the equivalent of 14,000 terabytes or 14 million gigabytes. It’s this staggering data requirement that motivated the University of Melbourne to build a supercomputer array in order to become a part of the project.
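For readers who want to sanity-check those figures, here is a quick sketch of the unit conversion, using decimal (SI) prefixes where 1 petabyte equals 1,000 terabytes:

```python
# Back-of-the-envelope check of the article's quoted data volume,
# using decimal (SI) unit prefixes: 1 PB = 1,000 TB = 1,000,000 GB.
petabytes = 14
terabytes = petabytes * 1_000
gigabytes = petabytes * 1_000_000

print(f"{petabytes} PB = {terabytes:,} TB = {gigabytes:,} GB")
# → 14 PB = 14,000 TB = 14,000,000 GB
```

The figures line up with the article's "14,000 Terabytes or 14 million Gigabytes" when decimal prefixes are assumed; binary prefixes (1 PiB = 1,024 TiB) would give slightly different numbers.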

“In 2008, when the LHC gets turned on, it will become the world’s largest scientific instrument. There will be a number of experiments running; the largest of those is the ATLAS experiment, which we’re involved in,” says Dr Glen Maloney from the University of Melbourne’s Physics Department.

The supercomputer runs on sixteen Dell PowerEdge servers, eight with Intel Woodcrest processors and eight with AMD Opteron processors, each with 2GB of RAM per core. The setup uses Dell MD1000 storage units, which will need to endure the punishment of several terabytes of data streaming through the system.

While there is an incredible level of co-operation occurring to get the experiment up and running, once the switch is flicked a competitive race to new scientific discoveries will begin. According to Maloney, this has led to an enormous amount of pressure to get to the data as quickly as possible.

The project is being carried out by the European Organisation for Nuclear Research, which you might assume would be known as EONR but is officially known as CERN. The acronym comes from the organisation’s original French name, Conseil Européen pour la Recherche Nucléaire, and has been kept in use ever since.

“The University of Melbourne will become a major node of a global supercomputer; we will be able to process data on the computers of other centres all over the globe if we need to, and vice versa,” says Maloney.

Although Melbourne will feature as the main hub for Australia, the University of Sydney will also help out in the experiment. Co-operation for the project extends far beyond Australia, however: CERN has over 2,600 full-time employees, 7,931 scientists and engineers from more than 500 universities, and half of the world’s particle physics community involved in projects at the site.

In order to become a part of the project, the physics department at Melbourne University had to meet a standard set down by CERN to ensure it could meet the data processing power and broadband speed requirements. “We require essential nominal data speeds of around 300Mb/sec between the University and the Asia-Pacific Tier 1 centre in Taipei,” says Maloney.
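To put that link speed in perspective, a quick calculation shows how long a single terabyte would take to move at the quoted rate. This sketch assumes "300mb/sec" means 300 megabits per second, which the article does not state explicitly:

```python
# Rough transfer-time estimate for the quoted link speed.
# Assumption: "300mb/sec" means 300 megabits per second (not megabytes).
link_mbps = 300                           # megabits per second
terabyte_bits = 1_000_000_000_000 * 8     # 1 TB (decimal) expressed in bits

seconds = terabyte_bits / (link_mbps * 1_000_000)
hours = seconds / 3600
print(f"1 TB at {link_mbps} Mb/s ≈ {hours:.1f} hours")
# → 1 TB at 300 Mb/s ≈ 7.4 hours
```

At roughly seven and a half hours per terabyte, it becomes clear why petabyte-scale output demands dedicated high-priority links rather than ordinary internet connections.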

According to Maloney, the LHC will feed data into a worldwide computing grid which incorporates a federation of three different scientific grids around the world. To meet this monstrous data requirement, both Sydney and Melbourne University have been put on high-priority international data links to cope with the demand. These links have already been put to the test during trials of the project’s data network.

Since the LHC’s conception, the construction of the device and facility has gone through a number of setbacks. The original schedule had the LHC going live in 2007, but due to unforeseen circumstances this was pushed back to 2008. Considering the enclosure on its own is colossal in size and sits underneath a decent chunk of Europe, it’s no great surprise that several hurdles stood in the way. These setbacks led to a $957 million budget blowout for the project, but the official line is that the LHC will be operational in 2008.

“There were problems with the commissioning phases of the accelerator, plus the pilot study in 2007 was delayed,” says Maloney. “There was also a design flaw in one of the very final focus magnets which had to be redesigned.”

The project will conduct experiments that were previously beyond the reach of modern physics. The LHC will also claim a string of world firsts, including the world’s biggest science project and the world’s biggest scientific instrument, while the ATLAS experiment will have the world’s biggest superconducting magnet.

It’s hoped the LHC project will be able to produce the ‘God particle’, otherwise known as the Higgs boson, which thus far has never been produced or proven to exist. The Higgs boson, if produced, would shed light on the origin of the mass of other particles. Another part of the project involves a detector codenamed ALICE, which stands for A Large Ion Collider Experiment and will use a giant time projection chamber to study collisions of heavy ions.

Whether or not the experiments actually find any divine particles, the project will still bring in reams of data about previously unexplored topics. The potential for discovery has the scientific community buzzing with expectation and excitement. Assuming the project suffers no further delays, 2008 is shaping up to be an interesting year for scientific breakthroughs.
