
Community/Application 1
-----------------------
Scientific field
Macro area: Earth science
Specific area: Climate research
Application name – version – License
EC-EARTH 3
* Atmosphere: IFS cycle 36r1 or later
* Ocean: NEMO 3.3beta or later
* Coupler: OASIS3 (OASIS4 under development)
Application web site
http://ecearth.knmi.nl
http://www.ecmwf.int
http://www.nemo-ocean.eu
https://oasistrac.cerfacs.fr
Short description of the involved communities:
In this part we expect to collect information about:
* The group that has developed the proposed application
* The main research groups worldwide that use the proposed application
* The relationship between the proponents and the developers team
* The relationship between the proponents and one or more of the research groups adopting the application
* The degree of involvement of the developers team in the present project
* The degree of involvement of one or more scientific users in the present project
This project is a contribution to the European Network for Earth
System Modelling (ENES), which gathers the climate modelling community.
This project focuses on one of the 6 coupled models gathered in
IS-ENES, the infrastructure project of ENES. It is expected to
facilitate the use of Tier-0 machines by the wider climate modelling
community by sharing developments and experience, especially on NEMO,
the coupler and I/O. The project was discussed at a PRACE-ENES meeting
organized in Paris on December 1st, 2010 by the IS-ENES project, which
has among its main objectives to foster the interaction with PRACE and
the use of high-end simulations. IS-ENES will contribute to this
project.
EC-EARTH is a project, a consortium and a model system. The EC-EARTH
model is a state-of-the-art numerical earth system model (ESM) based
on ECMWF's Seasonal Forecasting System. Currently, the EC-EARTH
consortium consists of 24 academic institutions and meteorological
services from 11 European countries. EC-EARTH version 3, the most
recent version, is a coupled ESM comprising ECMWF's atmospheric model
IFS (including the H-TESSEL land model), the general ocean circulation
model NEMO, the LIM sea ice model, and the OASIS coupler.
The developers of EC-EARTH are keen to collaborate on scaling up their
application. Several groups willing to work in Task 7.2e already
contribute to EC-EARTH (SNIC-KTH), IFS (ICHEC) or NEMO (SARA).
EC-EARTH is used in many EU FP7 projects, including THOR, COMBINE,
EUCLIPSE, ECLISE and IS-ENES. The model is used for research on
understanding the earth system and for developing climate scenarios;
its output will contribute to the IPCC's 5th Assessment Report.
ECMWF's IFS atmosphere model and the NEMO ocean model are the main
components of EC-Earth. A variety of resolutions is used.
High-resolution runs target the simulation of physically realistic
climate phenomena, while coarse-resolution runs are done in ensembles
(i.e. many slightly different simulations) in order to capture
statistical uncertainty. Ideally, high-resolution simulations would
also be performed in an ensemble sense.
ECMWF is the European Centre for Medium-Range Weather Forecasts and
runs the global weather prediction model IFS every 12 hours at very
high resolution (currently 16 km). At ECMWF, the IFS model is used
only for weather prediction, but it also serves as the atmosphere
component of EC-EARTH. As such, the user community is much larger than
ECMWF alone: the model is used throughout Europe by many institutes
and universities for weather and climate forecasts and research.
The core development team of NEMO consists of members of CNRS and
Mercator-Ocean in France and UKMO and NERC in the United Kingdom. The
NEMO ocean model is used in several climate models that contribute to
the IPCC reports on climate change (notably HadGEM3, EC-EARTH and
IPSL-CM4), and it is also used for standalone simulations. Several
computing centres are interested in working on improving the
performance and/or scalability of NEMO: e.g. Andrew Porter (STFC) is
working on implementing a decomposition with variable domains. The
IPSL team in Paris is involved in both the development and the
scientific use of NEMO and has been involved in PRACE1PP as well.
The OASIS coupler, currently developed by CERFACS (France), DKRZ
(Germany), and the Centre National de la Recherche Scientifique
(France) in the framework of the EU FP7 IS-ENES project, is software
allowing synchronized exchanges of coupling information between
numerical codes representing different components of the climate
system. Portability and flexibility are the key design concepts of
OASIS3, which has been developed and maintained at CERFACS for more
than 15 years. OASIS3 is currently used by approximately 30 climate
modelling and operational weather forecasting groups in Europe, the
USA, Canada, Australia, India and China. As the climate modelling
community is progressively targeting higher-resolution climate
simulations run on massively parallel platforms, the development of a
new fully parallel coupler, OASIS4, started during the EU FP5 PRISM
project. Parallelism and efficiency drove the OASIS4 developments.
OASIS4 is currently in a beta testing phase.
Main publications related to the proposed application
* Hazeleger, W., et al., 2010: EC-Earth: A Seamless Earth System Prediction Approach in Action. Bull. Amer. Meteor. Soc., 1357-1363.
* Haarsma, R.J., F.M. Selten, B. van den Hurk, W. Hazeleger and X. Wang, 2009: Drier Mediterranean Soils due to Greenhouse Warming bring easterly Winds over Summertime Europe. Geophys. Res. Lett., 36, L04705, doi:10.1029/2008GL036617.
* IFS documentation on cycle 36r1 (see http://ecmwf.int/research/ifsdocs/CY36r1/index.html)
* Madec, G., 2008: "NEMO ocean engine". Note du Pôle de modélisation, Institut Pierre-Simon Laplace (IPSL), France, No. 27, ISSN 1288-1619.
* Biastoch, A., et al., 2008: Agulhas leakage dynamics affects decadal variability in Atlantic overturning circulation. Nature.
* Brodeau, L., et al., 2010: An ERA40-based atmospheric forcing for global ocean circulation models. Ocean Modelling, 31.
* Valcke, S., 2006: OASIS3 User Guide (prism_2-5). CERFACS Technical Report TR/CMGC/06/73, PRISM Report No. 3, Toulouse, France, 60 pp.
* Redler, R., S. Valcke and H. Ritzdorf, 2010: OASIS4 - A Coupling Software for Next Generation Earth System Modelling. Geoscientific Model Development, 3, 87-104, doi:10.5194/gmd-3-87-2010.
Confidentiality
The EC-Earth model has a license via ECMWF that is not open source,
but it allows free 'academic use' for users in the member states of
ECMWF (most European countries). The NEMO ocean model is open source
(French CeCILL license) and as such there is no confidentiality.
However, specific setups might be confidential, since research groups
have invested a lot of time in the preparation of files with initial
and boundary conditions; this was also the case for the NEMO
configuration used in PRACE1PP. The OASIS3 and OASIS4 couplers use the
LGPL open-source license.
Short description of the application (algorithms, I/O, parallelization
strategy, current performance, performance bottlenecks)
ECMWF's IFS code is a parallel spectral weather model that is also
used for seasonal climate prediction. Its structure is similar to
climate codes from NCAR, such as CCM, but its parallel execution model
is highly evolved. It uses a domain decomposition in two dimensions
and performs spectral (Fourier and Legendre) transformations on the
grid data. Furthermore, it includes many state-of-the-art physical
parametrizations that are adjusted to scale with the resolution. The
file format for I/O is GRIB, an international standard defined by the
WMO. A parallel I/O library for GRIB is being developed outside ECMWF,
but as of yet the I/O of IFS is serial. This is a serious bottleneck
for scaling up this application. Since GRIB is used worldwide,
(preferably open-source) solutions would benefit the whole
meteorological community.
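As an illustration of the bottleneck, the minimal sketch below contrasts the current funnel-everything-through-one-process output pattern with per-rank parallel writes via MPI-IO. It is a hypothetical example (toy field sizes, raw binary output instead of GRIB encoding, mpi4py for brevity), not IFS code.

# Minimal sketch (not IFS code): contrast gather-to-rank-0 output with
# per-rank parallel writes via MPI-IO. Field names and sizes are
# illustrative only; real IFS output would additionally encode GRIB records.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

# Each rank owns a slice of a global 2-D field (toy domain decomposition).
nlat_local, nlon = 8, 64
local_field = np.full((nlat_local, nlon), float(rank), dtype=np.float64)

# (a) Serial output: everything funnels through rank 0 -- the current bottleneck.
gathered = comm.gather(local_field, root=0)
if rank == 0:
    np.concatenate(gathered).tofile("field_serial.bin")

# (b) Parallel output: every rank writes its own slice at the right offset.
fh = MPI.File.Open(comm, "field_parallel.bin",
                   MPI.MODE_WRONLY | MPI.MODE_CREATE)
offset = rank * local_field.nbytes
fh.Write_at_all(offset, local_field)   # collective write, no single-rank funnel
fh.Close()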
The NEMO model consists of several components, all based around the
OPA physical ocean model. It uses MPI with a regular domain
decomposition and finite differences. It needs to calculate the free
surface height, which is currently done using a less-than-ideal
conjugate gradient or over-relaxation method that limits the
scalability. A new method has been implemented that calculates the
free surface explicitly and does not have the scalability issues of
the elliptic solvers. I/O is done using the IOIPSL library, which now
supports parallel I/O through NetCDF4; detailed experiments have been
conducted to obtain good output performance using, e.g., chunking. The
model can be run either with realistic topography and forcing or in an
idealized setup. With realistic topography, the domain decomposition
strategy should be adapted to the depth and coastline; this is work
done by STFC.
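A minimal sketch of the decomposition issue follows, assuming a toy rectangular "continent" in an otherwise ocean-covered grid (the mask, grid size and subdomain counts are illustrative assumptions, not NEMO's): a regular decomposition produces subdomains containing only land, which could be dropped, and leaves the ocean points unevenly distributed over the remaining subdomains.

# Minimal sketch (toy land/sea mask; not NEMO's actual layout code). It shows
# why a purely regular decomposition is wasteful over a realistic coastline.
import numpy as np

ny, nx = 120, 180                        # toy global grid
jpni, jpnj = 6, 4                        # subdomains in x and y (NEMO-style naming)

ocean_mask = np.ones((ny, nx), dtype=bool)
ocean_mask[20:90, 30:100] = False        # a rectangular "continent"

counts = []
for j in range(jpnj):
    for i in range(jpni):
        sub = ocean_mask[j * ny // jpnj:(j + 1) * ny // jpnj,
                         i * nx // jpni:(i + 1) * nx // jpni]
        counts.append(int(sub.sum()))    # ocean points per subdomain

active = [c for c in counts if c > 0]    # land-only subdomains need no process
print(f"{len(counts) - len(active)} of {len(counts)} subdomains are land-only")
print(f"load imbalance (max/mean ocean points): {max(active) / np.mean(active):.2f}")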
At run time, OASIS3 acts both as a separate executable, whose main
function is to interpolate the coupling fields exchanged between the
component models, and as a communication library linked to the
component models. The OASIS3 executable can run in parallel, with each
process regridding a subset of the coupling fields, resulting in a
pseudo-parallelisation of OASIS3 on a field-per-field basis. OASIS4
has the same function as OASIS3 but implements a fully parallel
regridding of the coupling fields, in which the neighbourhood search
is performed in the parallel source model library using an efficient
multigrid algorithm on the intersections of source and target process
domains. OASIS3 is stable and well debugged, while OASIS4 is newer and
still needs some validation, especially in the fully parallel cases.
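The sketch below illustrates the OASIS3-style field-per-field pseudo-parallelism, under the assumption of toy coupling fields and a crude block-average regridding; it does not use the OASIS library API. Each coupler process takes a subset of the fields and regrids only those.

# Minimal sketch of field-per-field pseudo-parallelism (toy fields, crude
# block-average regridding; not the OASIS API). Each coupler process handles
# a subset of the coupling fields.
from mpi4py import MPI
import numpy as np

comm = MPI.COMM_WORLD
rank, nprocs = comm.Get_rank(), comm.Get_size()

fields = ["sst", "ice_frac", "taux", "tauy", "heat_flux", "freshwater"]
my_fields = fields[rank::nprocs]          # round-robin: one field subset per process

def regrid(src, factor=2):
    """Crude conservative-style regridding: average factor x factor source cells."""
    ny, nx = src.shape
    return src.reshape(ny // factor, factor, nx // factor, factor).mean(axis=(1, 3))

rng = np.random.default_rng(rank)
for name in my_fields:
    src = rng.random((128, 256))          # stand-in for a source-grid coupling field
    tgt = regrid(src)
    print(f"rank {rank}: regridded {name} {src.shape} -> {tgt.shape}")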
The case for Petascale
Describe
* Why the code should be enabled to petascale (scientific cases, proof of concepts, size of the community, who would exploit the petascale version…)
* The work to be accomplished on the code to be petascaled
* Possible risks of failure and contingency plan
* Expected effort (PMs) and who will do what: state both the total effort (sum of contributions of all involved people - unlimited) and the effort due to the PRACE action (max 12 PM).
ECMWF plans to increase the resolution of the weather forecast model
(T4999, ~5 km globally in 2020), which will also positively affect the
EC-EARTH development. It has been shown that IFS can scale to more
than 10k cores at high resolution. KNMI performed ensemble simulations
with the atmospheric component of EC-Earth at T799 (~20 km) resolution
using about 1000 cores. Furthermore, SMHI and SNIC-KTH have set up a
high-resolution configuration of EC-EARTH 3 (currently 0.25 degrees in
both atmosphere and ocean) to study performance and scalability. This
version has shown good scalability up to at least 2000 cores. The
challenge in achieving high scalability lies in the coupled nature of
the EC-EARTH model, which makes it essential to find a balanced
configuration of three executables (IFS, NEMO, and OASIS)
communicating through MPI. An additional challenge is the
input/output, which currently uses a single CPU. A parallel setup
would increase the performance of EC-Earth.
Recent work has shown that the OASIS3 coupler, allowing a
field-per-field parallelisation of the coupling, introduces no
significant overhead (in terms of elapsed time of the coupled
simulation with respect to the elapsed time of the slowest component)
when coupling global component models with resolutions up to ~25 km.
It is, however, expected that at higher resolution and on massively
parallel platforms with low memory per node, the fully parallel
coupling implemented by OASIS4 will be mandatory.
Finally, the post-processing and archiving of the model output hamper
its performance. Since the output is used by many scientists within
many applications, almost the entire model state is archived
regularly (every 6 hours at model levels). There is a strong need for
more efficient post-processing tools (e.g. the tool 'CDO'). One option
is to perform the post-processing within the model code itself.
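One possible form of such in-model post-processing is sketched below: instead of archiving every 6-hourly state and averaging it offline (e.g. with CDO), a running mean is accumulated online and only the reduced field is written. Field names, sizes and the output format are illustrative assumptions, not part of EC-EARTH.

# Minimal sketch (illustrative only): accumulate a monthly mean online instead
# of archiving every 6-hourly model state and averaging it offline.
import numpy as np

nsteps_per_day = 4                      # 6-hourly "output" steps
ndays = 30
field_shape = (96, 192)                 # toy model grid

running_sum = np.zeros(field_shape)
nsamples = 0

for step in range(nsteps_per_day * ndays):
    # stand-in for one model time step producing a diagnostic field
    field = np.sin(step / 10.0) + np.random.default_rng(step).random(field_shape)
    running_sum += field
    nsamples += 1

monthly_mean = running_sum / nsamples   # only this reduced field needs archiving
np.save("t2m_monthly_mean.npy", monthly_mean)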
Actions:
* Make a performance analysis of the high-resolution configuration of the coupled EC-EARTH 3 on different platforms. It has so far been run with good scalability on a Linux cluster. Is the bottleneck I/O, communication, or load balancing in the coupler? PRACE 2PM, IS-ENES 2PM.
* Validate the OASIS4 coupler for the EC-Earth coupling configuration; IS-ENES/CERFACS 2PM.
* Assess the performance of and improve the coupling implementation, including the potential benefits and costs of upgrading to OASIS4. If beneficial, upgrade EC-EARTH 3 to OASIS4. PRACE/KTH-SNIC 1PM, IS-ENES/SMHI 2.5PM.
* Map MPMD hybrid MPI+OpenMP and MPI-only tasks efficiently onto cores on different architectures: Cray XT6 with 24 cores/node, Power6/7, possibly BG/P. PRACE 2PM + IS-ENES 2PM.
* Investigate if CUDA-enabled routines can improve the scalability of the coupled or atmosphere-only model. This excludes rewriting routines into CUDA, which is T7.5. PRACE/ICHEC 2PM.
* Investigate and improve the scalability of I/O within IFS and NEMO. PRACE 4.5PM + IS-ENES 1PM.
* Implement ensemble simulations as one application (see the sketch after this list). The technical issues of setting up the MPI configuration, I/O, etc. are for PRACE. PRACE 1PM + IS-ENES 4PM. It is not at all certain that this can be done within 1+4 PM. Monitoring and fault tolerance of such a mega-model (which is, in itself, desirable) must be addressed. If it turns out that this cannot be implemented, there should be a description of the functionality of such a mega-model and a plan for how it can be implemented.
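For the ensemble action above, the basic mechanism would be to partition a single MPI application into one sub-communicator per ensemble member; the sketch below shows this with mpi4py and a dummy per-member computation. The member count, the assignment of ranks to members, and the reduction are illustrative assumptions; the real difficulties (coupled-model launch, I/O, monitoring and fault tolerance) are exactly those listed in the action.

# Minimal sketch (toy "member" computation; not the EC-Earth launcher).
# Split MPI_COMM_WORLD into one sub-communicator per ensemble member and
# let each member run (almost) independently.
from mpi4py import MPI

N_MEMBERS = 4                            # illustrative ensemble size

world = MPI.COMM_WORLD
rank = world.Get_rank()

member = rank % N_MEMBERS                # assign ranks to ensemble members
member_comm = world.Split(color=member, key=rank)

# Each member perturbs its own "simulation"; here just a dummy reduction.
local_value = float(rank) + 0.01 * member
member_mean = member_comm.allreduce(local_value, op=MPI.SUM) / member_comm.Get_size()

if member_comm.Get_rank() == 0:
    print(f"member {member}: {member_comm.Get_size()} ranks, mean = {member_mean:.3f}")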
To run the NEMO ocean model at higher resolution (0.1 degrees or
finer): the POP model has been shown to scale up to 32k cores when
used at that resolution. Although there is a NEMO setup at this
resolution, it has not yet been tested at such a scale. The ocean
community is interested in learning more about the impact of truly
eddy-resolving models on the ocean circulation and heat transports.
The NEMO model is used by hundreds of scientists and 40 groups
worldwide.
The ocean physics and new components of the ocean model are not really
the focus of EC-EARTH development. Even though some of the points in
the last three paragraphs could positively influence the scalability,
they open up a rather large field of activities, which might be well
beyond what can be done in this context. However, as the NEMO model is
already part of the PRACE benchmark system, it will be upgraded to a
recent release and a high resolution as part of another PRACE task.
Actions:
* Implement dynamical memory allocation and a load-balanced domain decomposition. STFC 6PM. (Note that this is work done outside the PRACE context, but relevant for EC-EARTH; it is listed here to keep track of it.)
* Integrate a recent release of NEMO with a high-resolution configuration (<0.1 degrees) into the PRACE benchmark suite and port it to Jugene and Curie. PRACE Task 7.4 1PM.
* Look at the scalability of reading the forcing files for ocean-only runs. Note that this is not relevant for coupled runs with EC-EARTH, but it is very relevant for the PRACE benchmark suite (part of PRACE T7.4). PRACE 1PM.
Project contacts (Max. 3):
John Donners
SARA
[email protected]
Uwe Fladrich
SMHI
[email protected]
Alastair McKinstry
ICHEC
[email protected]
Additional contacts relevant to this project:
For NEMO: Claire Levy and Sebastien Masson
For OASIS: Sophie Valcke
For IS-ENES: Eric Maisonnave, in interaction with the ENES HPC Task
Force
Any other relevant comment:
………….
