www.fbbva.es
DEPARTMENT OF COMMUNICATION
AND INSTITUTIONAL RELATIONS
PRESS RELEASE
BBVA Foundation lecture series on particle physics
The LHC is the biggest producer of data
in science, and can also “learn” to select
the most important, explain Maite
Barroso and Pippa Wells

According to Maite Barroso, Deputy Head of CERN’s Information Technology
Department, in its current phase, the LHC accelerator should pour out some 50
million gigabytes of data per year, nearly doubling the output of its initial run. Barroso
has played a key role in developing the grid computing network.

Pippa Wells, of the CERN Physics Department, affirms that “there are many ways
that the ability to accumulate large data sets and then analyze them will have an
impact on humanity.”

This edition of the series comprises six encounters with experts from CERN and its
collaborating institutions, aimed at exploring current and future challenges in
particle physics, introducing the technologies used in large scientific facilities and
highlighting the benefits that science brings to society.
Madrid, May 13, 2016.- “In the world of science, the LHC is right now the biggest
producer of data,” says Maite Barroso, Deputy Head of CERN’s Information Technology
Department. The numbers she refers to are practically off the scale: each second, the
Large Hadron Collider (LHC) captures the equivalent of forty million high-definition images,
and has to decide instantaneously which ones to store for subsequent analysis. Meeting
this challenge has called for specific mathematical algorithms, and the world’s largest
network of computing resources. Barroso and her colleague Pippa Wells, from the CERN
Physics Department, talked about the technology involved at the BBVA Foundation.
The two were in Madrid to take part in the lecture series “CERN Resumes LHC Operation
and Prepares Its Future,” with a talk titled “CERN Computing for Science and Its Impact on
Society.”
In the LHC, subatomic particles – chiefly protons – are accelerated to almost the
speed of light in a 27-kilometer underground ring. On colliding, the energy they carry
produces new, more massive particles, and it is to study these events that physicists must
collect, store and process the vast quantities of data generated. In the LHC’s first run “the
data collected came to 30 million gigabytes a year, the equivalent of nine million high-definition movies,” Barroso remarks.
In the current phase, known as Run 2, the accelerator is operating at higher levels of
energy and intensity, and this means more data: “We expect almost double the amount
of data produced in Run 1,” she adds, “around 50 million gigabytes a year.”
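As a back-of-envelope check, those figures can be unpacked with a little illustrative arithmetic (using only the numbers quoted in this release, not any official CERN accounting):

```python
# Illustrative arithmetic based only on the figures quoted in this release.
RUN1_GB_PER_YEAR = 30e6   # "30 million gigabytes a year" collected in Run 1
RUN2_GB_PER_YEAR = 50e6   # "around 50 million gigabytes a year" expected in Run 2
HD_MOVIES_RUN1 = 9e6      # "nine million high-definition movies"

gb_per_movie = RUN1_GB_PER_YEAR / HD_MOVIES_RUN1   # ~3.3 GB per movie
growth = RUN2_GB_PER_YEAR / RUN1_GB_PER_YEAR       # ~1.7x, close to double

print(f"Implied size of one HD movie: ~{gb_per_movie:.1f} GB")
print(f"Run 2 data volume relative to Run 1: ~{growth:.1f}x")
```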
From the start of the LHC project, CERN scientists knew that dealing with such torrents of
data would demand new computing techniques and capacities. Their response was to
lead the setup of a new global computing network to process the data in coordinated
fashion, sharing computational resources. The resulting Worldwide LHC Computing Grid, or
simply “grid”, is, Barroso explains, “a global collaboration, coordinated by CERN, and
made up of over 170 data centers in 42 countries, linking up national and international
grid infrastructures.”
The CERN computing center provides 30 percent of the necessary CPU and is in charge of
“the archiving and reprocessing of these data and their swift and efficient round-the-world
distribution,” she continues. “The remainder comes from the computing network that is the
Worldwide LHC Computing Grid.”
The grid handles an average of 2 million jobs per day, and the figures keep on rising: “For
Run 2, we needed more storage space and computing capacity to analyze the data,”
Barroso points out. The solution was to open a new data center in Budapest and to set up
a private computing cloud based on the OpenStack open-source platform.
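The basic idea behind the grid, spreading work across many cooperating data centers, can be illustrated with a minimal scheduling sketch in Python. The site names and slot counts below are invented for illustration and do not describe the real WLCG sites or middleware:

```python
import heapq

# Hypothetical site capacities in free job slots; real WLCG sites differ.
sites = {"CERN": 300, "Budapest": 200, "OtherCentre": 100}

def distribute(jobs, capacity):
    """Assign each job to the site with the most free slots (toy load balancing)."""
    # heapq is a min-heap, so negate the free-slot counts to pop the largest first.
    heap = [(-slots, name) for name, slots in capacity.items()]
    heapq.heapify(heap)
    assigned = {name: 0 for name in capacity}
    for _ in range(jobs):
        neg_free, name = heapq.heappop(heap)
        assigned[name] += 1
        heapq.heappush(heap, (neg_free + 1, name))  # one fewer slot is now free
    return assigned

# Example: 500 jobs spread across the toy sites according to their free capacity.
print(distribute(500, sites))
```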
One of the big challenges is to know which data to store and process and which are the
dross that hides the gold. The first step, then, is for computers to quasi-instantaneously
reconstruct the particle collisions using complex synchronization and filtering algorithms.
This process enables physicists to identify the particles generated and the most interesting
collisions so they can be stored for more detailed analysis.
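The idea of keeping only the rare, interesting collisions can be sketched with a toy filter in Python. The event model and threshold below are invented purely for illustration; the real trigger chain combines dedicated hardware and large software farms:

```python
import random

random.seed(42)

def toy_event():
    """A toy 'event': ten simulated energy deposits in arbitrary units."""
    return [random.expovariate(1.0) for _ in range(10)]

def passes_trigger(event, threshold=19.0):
    """Keep the event only if its total deposited energy is unusually large."""
    return sum(event) > threshold

events = [toy_event() for _ in range(100_000)]
kept = [e for e in events if passes_trigger(e)]
print(f"Kept {len(kept)} of {len(events)} events "
      f"({100 * len(kept) / len(events):.2f}%)")
# With this toy threshold only on the order of 1% of events survive,
# mirroring the selectivity described in the text.
```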
One percent of relevant information
“Filtering leads to 99% of data being discarded,” Barroso explains. “The algorithms
developed for this purpose and their deployment via machine-learning techniques are
subject to regular upgrade to ensure that the near-1% conserved is truly key information.”
This is among the responsibilities of Pippa Wells, who adds: “We use sophisticated
algorithms to pick out the events of interest. (…) The biggest challenge, as I see it, is to
make sure we are triggering on the right events. Once the data have been recorded,
then we have a chance to improve our knowledge of the detector response and run
through the events more than once to be sure nothing is missed.”
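As a purely illustrative sketch of how machine learning can help “pick out the events of interest” (synthetic data and an off-the-shelf classifier, not the experiments’ actual trigger software), one could train a model to separate a rare signal from an abundant background:

```python
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier

rng = np.random.default_rng(0)

# Two invented per-event features, e.g. total transverse energy and track count.
n_bkg = 20_000
background = rng.normal(loc=[10.0, 5.0], scale=[3.0, 2.0], size=(n_bkg, 2))
signal = rng.normal(loc=[18.0, 9.0], scale=[3.0, 2.0], size=(n_bkg // 100, 2))

X = np.vstack([background, signal])
y = np.concatenate([np.zeros(len(background)), np.ones(len(signal))])

clf = GradientBoostingClassifier().fit(X, y)

# Keep events the model scores as likely signal; the cut value controls how
# much of the data stream is retained for detailed offline analysis.
scores = clf.predict_proba(X)[:, 1]
keep = scores > 0.5
print(f"Retained {keep.mean():.1%} of events for further analysis")
```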
In her view: “There are many ways that the ability to accumulate large data sets and then
analyze them will have an impact on humanity. The techniques at CERN improve our
ability to handle big data samples, and to pick out certain patterns or characteristics.
Where the future is less predictable is in bringing together data from different aspects of
life, and inferring new information about people and their behavior.”
Barroso refers to another concern at the heart of CERN computing: “It is a fact that hacker
attacks are a constant at the CERN data center, and the same is true for the other
members of the WLCG grid. Security is therefore a priority, the goal being to strike a balance between the freedom appropriate to academic centers and an acceptable level of computer security.”
Bio notes
Maite Barroso studied telecommunications engineering at the Universidad Politécnica de
Madrid because she enjoyed math and physics, “and wanted a degree that combined
the two but also had a practical side; a direct, visible application in daily life,” she affirms.
She joined CERN in 2001, after a time in a private telephony company: “It was an
opportunity. I was working in Geneva when they started the grid development projects,
and I put myself forward.”
She is currently Deputy Head of Information Technology, the CERN department that handles data processing and storage, as well as the communication and support networks, for the whole experimental program. Its remit also takes in R&D on future technologies in
partnership with private corporations and other research centers worldwide.
From her first years with CERN, she has worked in grid computing as a researcher and
coordinator on CERN-led R&D projects financed by the European Union: DataGrid, EGEE
and the Worldwide LHC Computing Grid (LCG).
Pippa Wells, a British physicist, has played the violin since childhood and lists playing in
orchestras among her passions in life – in recent years she has performed with the
Orchestre Symphonique Genevois. She is also an experimental physicist at CERN, where
she took charge of upgrade work on the ATLAS detector at the LHC, a key
piece in the 2012 discovery of the Higgs boson.
ATLAS – like the rest of the LHC – is subject to regular upgrades as part of the drive to keep
the particle accelerator at the forefront of science. The high-luminosity LHC (HL-LHC) is the
most ambitious project of the next ten years in high-energy physics.
Wells earned a PhD from the University of Cambridge in 1990 and was recruited by CERN
to work on the forerunner to the LHC. Her earlier research involved taking into account the
effects of Earth tides on the accelerator tunnel. As part of the ATLAS team, she was in
charge of measuring the tracks of charged particles as they emerged from the collisions, an essential step in their identification.
CERN and the BBVA Foundation
The collaboration between CERN and the BBVA Foundation dates from 2014, when the
supranational organization opted to celebrate its 60th anniversary in Spain in partnership
with the Foundation. The result was the lecture series “The Secrets of Particles.
Fundamental Physics in Everyday Life,” whose closing speaker was CERN’s outgoing
Director-General, Rolf Heuer. This was followed by a second series where speakers
included Heuer’s successor, Fabiola Gianotti. All lectures are available in full on the
website fbbva.es.
This third edition will comprise six talks with the participation of fifteen experts from CERN
and various collaborating institutions. Its goal is to present current and future challenges in
the field of particle physics, introduce the technologies used in major scientific facilities
and raise awareness of the multiple benefits that science brings to society. The format of the
series responds to the close relationship between CERN and the Spanish universities and
research centers working in the field.
For more information, contact the BBVA Foundation Department of Communication and Institutional Relations
(+34 91 374 5210; 91 537 3769/[email protected]) or visit www.fbbva.es