Enabling Grids for E-sciencE
Presentation of DNA4.3.2
Vincent Breton
On behalf of NA4
www.eu-egee.org
INFSO-RI-508833
V. Breton, 30/09/05, Génopôle Lille
Table of contents
• Introduction
• Executive summary
• Overview of migration to the production infrastructure
• List of deployed applications
• Migration report on preproduction service and testing of gLite
• Update on integration of user communities
• Conclusion
Editorial team
• Overall editing of the document
  – HEP: Massimo Lamanna, Frank Harris, Birger Koblitz
  – Biomed: Johan Montagnat
  – Generic: Roberto Barbera
  – V. B. & Florence Jacq
• Contributions from the application supervisors for chapter 4
  – Good overall feedback
  – Some delayed contributions still missing (ESR, E-GRID)
Overview of migration to the production infrastructure
• 2-page overviews by application sector (HEP, biomed, generic)
• Highlights
  – HEP: Data and Service challenges
  – Biomed: data challenge
  – Generic: overall growth of the activity
• Main issues
  – Migration to gLite
  – Middleware efficiency of the order of 80%
  – Large-scale usage is still labour-intensive (e.g. data challenge)
  – Relationship to external projects
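The ~80% middleware efficiency figure is, in essence, the fraction of submitted jobs that complete successfully. A minimal sketch of that calculation (the function name and numbers are illustrative, not taken from the deliverable):

```python
def middleware_efficiency(submitted: int, succeeded: int) -> float:
    """Fraction of submitted jobs that completed successfully."""
    if submitted <= 0:
        raise ValueError("no jobs submitted")
    return succeeded / submitted

# Illustrative figures only: 50,000 jobs submitted, 40,000 succeeded
print(middleware_efficiency(50_000, 40_000))  # 0.8, i.e. ~80% efficiency
```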
HEP service challenges
LCG Service Challenges – ramp up to LHC start-up service (LCG Project, Service Challenges)
• June 05 – Technical Design Report
• Sep 05 – SC3 Service Phase
• May 06 – SC4 Service Phase
• Sep 06 – Initial LHC Service in stable operation
• Apr 07 – LHC Service commissioned
[Timeline figure: SC2 → SC3 → SC4 → LHC Service Operation across 2005–2008, with cosmics runs, first beams, first physics and the full physics run]
SC2 – Reliable data transfer (disk-network-disk) – 5 Tier-1s, aggregate 500 MB/sec sustained at CERN
SC3 – Reliable base service – most Tier-1s, some Tier-2s – basic experiment software chain – grid data throughput 500 MB/sec, including mass storage (~25% of the nominal final throughput for the proton period)
SC4 – All Tier-1s, major Tier-2s – capable of supporting full experiment software chain inc. analysis – sustain nominal final grid data throughput
LHC Service in Operation – September 2006 – ramp up to full operational capacity by April 2007 – capable of handling twice the nominal data throughput
Biomed data challenge
[Figure: number of registered jobs per day, July–August, for Autodock – Chembridge, FlexX – drug like and FlexX – Chembridge; up to ~5000 jobs/day, showing an intensive phase followed by a resubmission phase]
Source: JRA2 statistics
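The resubmission phase in the figure reflects the usual pattern for large grid runs: jobs that fail are detected and resubmitted until they succeed or a retry budget is exhausted. A minimal sketch of that pattern, assuming a stand-in `submit` function (the names are hypothetical, not EGEE middleware APIs):

```python
import random

def submit(job_id: int) -> bool:
    """Stand-in for a grid job submission; succeeds ~80% of the time,
    roughly the middleware efficiency quoted in the deliverable.
    A real run would call the workload management system instead."""
    return random.random() < 0.8

def run_with_resubmission(jobs, max_retries: int = 5):
    """Run every job, resubmitting each failure up to max_retries times."""
    done, failed = [], []
    for job in jobs:
        for _attempt in range(max_retries):
            if submit(job):
                done.append(job)
                break
        else:  # retry budget exhausted
            failed.append(job)
    return done, failed

done, failed = run_with_resubmission(range(100))
print(f"{len(done)} completed, {len(failed)} abandoned")
```

With ~80% per-attempt success and 5 retries, almost every job eventually completes, which is why the resubmission phase in the figure is much smaller than the intensive phase.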
Evolution of the number of users

Name       Discipline             Users at PM9   Users at PM18
ALICE      HEP (LHC experiment)   27             50
ATLAS      HEP (LHC experiment)   203            400
CMS        HEP (LHC experiment)   161            350
LHCb       HEP (LHC experiment)   41             58
ESR        Earth Sciences         18             33
Biomed     Biomed                 33             67
CompChem   Chemistry              9              5
Magic      Astronomy              5              10
EGEODE     GeoPhysics             0              3
Planck     Astrophysics           0              5
TOTAL                             497            981
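As a sanity check, the TOTAL row can be recomputed from the per-application figures; a short sketch (numbers transcribed from the table above):

```python
# (PM9, PM18) user counts per application, transcribed from the table above
users = {
    "ALICE": (27, 50), "ATLAS": (203, 400), "CMS": (161, 350),
    "LHCb": (41, 58), "ESR": (18, 33), "Biomed": (33, 67),
    "CompChem": (9, 5), "Magic": (5, 10), "EGEODE": (0, 3),
    "Planck": (0, 5),
}

pm9 = sum(v[0] for v in users.values())
pm18 = sum(v[1] for v in users.values())
print(pm9, pm18)             # 497 981, matching the TOTAL row
print(round(pm18 / pm9, 2))  # 1.97: the user base roughly doubled in 9 months
```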
List of deployed applications
• All applications were invited to fill a 2-page template providing
  – Brief description of the application
  – Resources
  – Issues
  – Results
• List of applications described in the deliverable
  – High Energy Physics: ALICE, ATLAS, CMS, LHCb, BaBar, CDF, D0
  – Biomedical applications: Drug Discovery, GATE, CDSS, GPS@, gPTM3D, SiMRI3D, Bronze Standards, GridGRAMM-GROCK, Pharmacokinetics
  – Generic applications: computational chemistry, astroparticle physics, astrophysics, earth sciences (EGEODE), ...
  – Applications incubated on GILDA: Nemo and Antares, Patsearch, GA4tS, SPM, Splatche, "sonification" of seismograms, gMOD
Migration report on preproduction service and testing of gLite
• Presentation of the work done within NA4 by
  – ARDA
  – Biomedical task force
  – GILDA team
  – NA4 test team
• Edited by B. Koblitz
• Main conclusions
  – A lot of work done on the PPS and GILDA
  – gLite middleware offers the necessary functionalities
  – PPS instability and scale have slowed down application deployment
gLite brought to 100s by NA3/NA4 tutorials
• First gLite tutorial on GILDA, Catania, 13-15 June 2005
• ESR retreat to move to gLite, Bratislava, 27-30 June 2005
• GGF Grid School, Vico Equense, 10-22 July 2005
• EGEE Summer School, Budapest, 11-16 July 2005
• Healthgrid Workshop, Clermont-Ferrand, 25-27 July 2005
• UK grid event, Swansea, 6 August 2005
• EGEE Tutorial, Taipei, 22-23 August 2005
• EGEE tutorial for summer students, CERN, 24 August 2005
• EGEE Tutorial, Tokyo, 25-26 August 2005
• EGEE Tutorial, Seoul, 29-30 August 2005
• EGEE Tutorial for the CrossGrid Project, Lausanne, 5-9 September 2005
• CERN School of Computing, Saint-Malo, 12-15 September 2005
• GridKa School for Grid Application Developers, Karlsruhe, 26-30 September 2005
Update on integration of user communities
• Presentation of the user survey (achievement of MNA4.3)
• Life cycle of an application on EGEE (work done in preparation of EGEE-II)
  – Application incubation
  – Initial deployment as local applications
  – Transition to supported applications
  – Application deployment
• Main issues
  – Slow uptake of gLite by generic applications
  – Efficiency/stability of production, and the fact that it is labour-intensive
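The four life-cycle stages above form a simple linear progression; a minimal sketch of that progression (stage names follow the slide, everything else is illustrative):

```python
# Life-cycle stages of an application on EGEE, in order, as listed above
STAGES = ["incubation", "local deployment", "supported", "deployed"]

def next_stage(current: str) -> str:
    """Advance an application to the next life-cycle stage;
    a fully deployed application stays where it is."""
    i = STAGES.index(current)  # raises ValueError for unknown stages
    return STAGES[min(i + 1, len(STAGES) - 1)]

print(next_stage("incubation"))  # local deployment
print(next_stage("deployed"))    # deployed
```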