Hawking Team Uses Supercomputer To Explore Space
COSMOS is using one of the world’s fastest supercomputers to analyse data and research the origins of the universe
As the science of cosmology unravels the secrets of the universe, researchers are increasingly making use of supercomputers to help them answer questions and prove theories about how the universe came to be, of what it is made, how it has evolved and what the future holds.
Today, a team led by renowned astrophysicist Professor Stephen Hawking at the UK Computational Cosmology Consortium (COSMOS) in Cambridge has announced its choice of SGI’s Altix UV 1000 – described as the world’s fastest supercomputer – to support its research into the origins of space.
“Recent progress towards a complete understanding of the universe has been impressive, but many puzzles remain,” said Hawking. “Cosmology is now a precise science, and we need supercomputers to calculate what our theories of the early universe predict and test them against observations of the present universe.”
Scalable, big memory supercomputing
According to SGI, the Altix UV 1000 is capable of high-performance, scalable, big-memory supercomputing to facilitate vast amounts of data analysis. The fully integrated unit is based on Intel Xeon 7500 series processors, and contains up to 256 sockets (2,048 cores) and 16TB of global shared memory in four racks. SGI claims this is a significant upgrade from the Altix 4700 supercomputer currently used by COSMOS, which supports only half as many cores (up to 1,024).
The system delivers up to 18.5 teraflops of compute power in a single system image, a rapid time-to-solution that means research findings can be processed extremely quickly.
“The new Altix UV system gives us a strategic advantage as we seek to advance the confrontation between fundamental and observational cosmology,” said Professor Paul Shellard, director of COSMOS. “This flexible, scalable and cost-effective architecture will ensure that COSMOS maintains international leadership.”
COSMOS is not the only organisation using supercomputers for space-related research. In July, Dell signed a $5.1 million deal with NASA to provide the space agency with PowerEdge C6100 servers to revamp a high-performance computer facility dedicated to examining the impact of climate change on the planet.
The C6100 servers are used by researchers to create better computer models of how climate change is affecting the planet, and will help in establishing the impact climate change has on the oceans and the atmosphere.
Meanwhile, an Intel supercomputer, known as “Encanto”, was used to monitor the oil from BP’s broken underwater well, which began gushing as many as 60,000 barrels of crude a day into the Gulf of Mexico after it burst on 20 April. The supercomputer ran a complex piece of software called the Parallel Ocean Program model – which simulates ocean currents – in order to predict the movement of the slick.
“A number of us were discussing why there were no longer-term scenarios about the impact of the spill,” said Synte Peacock, an NCAR scientist. “Then we realised we had a perfect model to do just that. We basically dropped a ‘virtual dye’ in the water, and then watched to see where it would go.”
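The “virtual dye” approach Peacock describes is, in essence, passive tracer advection: seed particles in a simulated current field and step them forward in time. The sketch below is a minimal illustration of that idea only, not the Parallel Ocean Program itself; the `gyre` velocity field and all function names here are hypothetical stand-ins for a real ocean-current model.

```python
import numpy as np

def advect_tracer(positions, velocity_field, dt, steps):
    """Advect passive tracer particles through a velocity field.

    positions: (N, 2) array of particle (x, y) coordinates.
    velocity_field: function (x, y) -> (u, v) giving the current at each point.
    Returns particle positions after `steps` forward-Euler time steps.
    """
    pos = np.array(positions, dtype=float)
    for _ in range(steps):
        u, v = velocity_field(pos[:, 0], pos[:, 1])
        pos[:, 0] += u * dt
        pos[:, 1] += v * dt
    return pos

# Idealised rotating gyre standing in for real simulated ocean currents.
def gyre(x, y):
    return -y, x

# "Drop virtual dye in the water": seed a small cluster of particles
# and watch where the current carries it.
dye = np.array([[1.0, 0.0], [1.1, 0.0], [1.0, 0.1]])
final = advect_tracer(dye, gyre, dt=0.01, steps=157)
```

In a real model run, the velocity field would come from the ocean simulation’s gridded output rather than an analytic function, and a higher-order integrator would typically replace the simple Euler step.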
Professor Stephen Hawking has been in the news this week, after saying in an interview that mankind’s only chance of long-term survival is to spread out into space. “It will be difficult enough to avoid disaster in the next hundred years, let alone the next thousand or million,” he said. “If we want to continue beyond the next hundred years, our future is in space.”