Thursday, May 22, 2014

Breaking the unknown... to know... (ZDNet)

You make it, we break it: CERN's particle smashers list their toughest tech challenges

Summary: Around 96 percent of the universe is still unknown, and the Large Hadron Collider needs new tech to help solve its mysteries.
By Steve Ranger
CERN's computer centre. Image: CERN
Researchers at CERN have detailed some of the big technology challenges they need to overcome if the Large Hadron Collider (LHC) is to answer some of the fundamental questions about the nature of the universe.
CERN openlab is a partnership between CERN and IT companies, including Intel and Oracle, to develop new technology for scientific researchers. CERN provides access to its engineers and its hugely complex IT infrastructure to test new hardware and software; in return, it gets early access to new products so it can assess them for future use. 'You make it, we break it' is the CERN openlab motto.
The lab's work has ranged across areas from databases and security to networking, and it is currently investigating cloud computing, business analytics, next-generation hardware, and security for networked devices.
The lab has set out its take on the future IT challenges in scientific research in a document which will shape its next three-year research phase starting in 2015.
It has identified six new areas of research and development that it wants to look at: data acquisition, computing platforms, data storage architectures, compute management and provisioning, networks and connectivity, and data analytics.
CERN's LHC is the world's most powerful particle accelerator and also the largest and most complex scientific instrument ever built. Located in a 27km-long circular tunnel buried 50m to 150m below the ground, it is able to accelerate particles to more than 99.9 percent of the speed of light.
Four very large detectors, effectively gigantic 100-megapixel 3D cameras, record the 'mini big bangs' created by the particle beam collisions up to 600 million times per second. This generates huge amounts of data to be analysed: by the last weeks of the LHC's first run, it had stored 100PB of data, gathered at the rate of a petabyte a second.
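Those figures are worth a quick sanity check. The short Python sketch below uses only the numbers quoted in this article (600 million recorded collisions per second, a petabyte a second, 100PB stored); the per-event size it derives is an implication of those figures rather than an official CERN number.

    # Back-of-the-envelope check of the data rates quoted in this article.
    # All inputs are the article's figures; the derived per-event size is
    # an implication of those numbers, not an official CERN figure.

    COLLISION_RATE = 600e6   # recorded 'mini big bangs' per second
    RAW_DATA_RATE = 1e15     # bytes per second (a petabyte a second)
    STORED_TOTAL = 100e15    # bytes stored in the first run (100PB)

    # Implied average size of one recorded event.
    bytes_per_event = RAW_DATA_RATE / COLLISION_RATE
    print(f"Implied event size: {bytes_per_event / 1e6:.1f} MB")  # ~1.7 MB

    # At the raw rate, the entire 100PB archive would fill in minutes,
    # which is why the experiments filter events in real time and store
    # only a tiny fraction of what the detectors see.
    seconds_to_fill = STORED_TOTAL / RAW_DATA_RATE
    print(f"Time to fill 100PB at the raw rate: {seconds_to_fill:.0f} seconds")

The second figure makes the scale of the problem concrete: at a petabyte a second, everything stored over the first run could be regenerated in well under two minutes, so real-time filtering is not optional.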
The LHC has already confirmed the existence of the Higgs boson, but there are still plenty of questions it seeks to answer. To do this, it needs to operate at higher energies and collision rates; this means the experiments will produce more data, which in turn means upgrades to the storage and analysis systems.
And it's not just CERN that needs new ways of handling enormous datasets: neurology, radio astronomy, and genetics projects with tools as diverse as Earth observation satellites, high-performance genomic sequencers, neutron diffractometers and X-ray antennas all need better ways of working with the data they are creating.
As such, CERN openlab has worked with other European research labs to identify the "ambitious challenges" covering the most crucial needs of their IT infrastructures. The CERN openlab document can be found here.
Steve Ranger is the UK editor-in-chief of ZDNet and TechRepublic, and has been writing about technology, business and culture for more than a decade. Previously he was the editor of silicon.com.
