1.1. AL - Access Layer

  • API providing access methods (read/write) to an ITER physics database, based on the ITER Physics Data Model
  • Provided in Fortran, C++, Matlab, Java and Python
  • The only effort required to use the Data Model is to map your code's input/output to the Data Model and add the corresponding GET/PUT calls (see the sketch after this list)
  • The access methods write to a local database stored in your user account
  • These local databases can be shared among users (read-only) and can be accessed remotely
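
As an illustration, a minimal sketch of the GET/PUT pattern in Python is shown below. It assumes the IMAS Python flavour of the AL with a DBEntry-style interface; the backend, database name, pulse and run numbers are placeholders for a local database in the user's account.

```python
# Minimal sketch of the GET/PUT pattern, assuming the IMAS Python flavour
# of the Access Layer (imas.DBEntry); database name, pulse and run numbers
# are placeholders.
import imas
from imas import imasdef

# Open an existing entry in the local database
entry = imas.DBEntry(imasdef.MDSPLUS_BACKEND, 'test', 1, 1)
entry.open()

# GET: read a whole IDS (here the equilibrium IDS)
equilibrium = entry.get('equilibrium')

# ... map the IDS content to the code's native inputs,
# run the code, map the results back to an output IDS ...

# PUT: write the resulting IDS back to the local database
core_profiles = imas.core_profiles()
core_profiles.ids_properties.homogeneous_time = imasdef.IDS_TIME_MODE_HOMOGENEOUS
entry.put(core_profiles)

entry.close()
```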

In order to cope with multiple languages while maintaining a unique structure definition, the AL architecture defines two layers. The top layer provides the external Application Programming Interface (API), and its code is automatically produced from the XML description of the ITM database structure. For each supported programming language, a high-level layer is generated in the target language. The following sections describe the language-specific APIs and provide all the information required by simulation program developers.

The lower layer is implemented in C and provides unstructured data access to the underlying database. It defines an API which is used by all the high-level layer implementations. Knowledge of this API (presented in a later section) is not necessary for end users; it is only required by developers of new language-specific high-level implementations of the AL and by developers of support tools for ITM management.


1.2. First use case: a user or a code accessing the database through the Access Layer

1.3. Second use case: codes coupled together directly (same language) or through the AL (different languages)
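
As an illustration of this use case, the hedged sketch below shows two codes exchanging data through the AL rather than directly: code A writes an equilibrium IDS to the local database and code B, possibly written in another language with its own AL API, reads it back. It assumes the IMAS Python AL; backend, database name, pulse and run numbers are placeholders.

```python
import imas
import numpy as np
from imas import imasdef

# Code A (producer): creates the database entry and PUTs an equilibrium IDS
entry = imas.DBEntry(imasdef.MDSPLUS_BACKEND, 'test', 1, 2)
entry.create()
eq = imas.equilibrium()
eq.ids_properties.homogeneous_time = imasdef.IDS_TIME_MODE_HOMOGENEOUS
eq.time = np.array([0.0])
entry.put(eq)
entry.close()

# Code B (consumer): could equally be a Fortran or C++ code using its own
# language-specific AL API; here it simply GETs the same IDS back
entry = imas.DBEntry(imasdef.MDSPLUS_BACKEND, 'test', 1, 2)
entry.open()
eq_in = entry.get('equilibrium')
entry.close()
```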

1.4. Workflow Engine and Component Generator to facilitate the development of Integrated Modelling workflows

  • An Integrated Modelling simulation is described as a workflow with physics codes as components (modules)
  • The workflow engine allows users to 
    • Design the workflow
    • Choose its components and tune their code-specific parameters
    • Execute the workflow

  • The workflow engine will be used to help design sophisticated workflows (e.g. Plasma Reconstruction chain, fully modularised Transport Solver, …)
    • It is intuitive enough to allow “mere users” to develop their own workflows
    • It hides the complexity of code coupling, data transfer, remote job submission, …
    • It allows sharing codes and workflows
    • It allows coupling to the PCS Simulation Platform

  • Component generator: a user tool that turns an IDS-compliant physics code into a component of the workflow (see the sketch below)
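
As an illustration, the sketch below shows what “IDS-compliant” means from the component generator's point of view: a routine whose inputs and outputs are IDSs (plus code-specific parameters), so that the generator can wrap it into a workflow component without knowing its internals. The routine name and signature are hypothetical, not the actual generator interface.

```python
import imas


def my_transport_solver(equilibrium_in, core_profiles_in, code_parameters):
    """Hypothetical IDS-compliant physics routine: IDSs in, IDS out.

    code_parameters is the code-specific parameter set (typically an XML
    string) that the workflow engine lets the user tune.
    """
    core_profiles_out = imas.core_profiles()
    core_profiles_out.ids_properties.homogeneous_time = 1
    # ... physics computation using the input IDSs goes here ...
    return core_profiles_out
```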

1.5. Physics codes + Data Access wrapped into a workflow component

1.6. Workflow components coupled and executed within a workflow engine

1.7. The layered structure: from the physics solver to the launcher

  • The structure is layered so that functionalities are clearly separated
  • It is generic and independent of e.g. the launching script/workflow engine

  • The Physics solver part (dark green) is not changed and is not linked to the ITER Data Model; it may or may not use the ITER Data Model internally.
  • The architecture is identical to the case of a component called within a workflow engine
  • The Physics Subroutine can be directly reused to generate a workflow component: the IMAS Infrastructure provides a tool that generates the component (pink part) automatically (see the sketch after this list)

  • Exception: for codes handling massive amounts of data, Data Access is usually parallelised and must be done inside the physics_solver (no single processor has enough memory to gather all the data)
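
A hedged sketch of this layering, with hypothetical wrapper and solver names: the physics_solver works only on its native data structures and is unaware of the Data Model, while a thin wrapper around it performs the Data Access (GET/PUT) and the mapping between IDSs and native inputs/outputs. It assumes the IMAS Python AL, and the IDS field used for the mapping is illustrative only.

```python
import imas
from imas import imasdef


def physics_solver(native_inputs):
    """Unchanged physics code: knows nothing about IDSs or the AL."""
    # ... actual numerics on native data structures ...
    return {"te": native_inputs["te"]}  # placeholder result


def physics_wrapper(pulse, run):
    """Hypothetical Data Access wrapper around the unchanged solver."""
    entry = imas.DBEntry(imasdef.MDSPLUS_BACKEND, 'test', pulse, run)
    entry.open()
    core_profiles = entry.get('core_profiles')        # GET input IDS

    # Map the Data Model to the solver's native inputs, run the solver,
    # then map its native outputs back to the Data Model
    native_inputs = {"te": core_profiles.profiles_1d[0].electrons.temperature}
    native_outputs = physics_solver(native_inputs)

    core_profiles.profiles_1d[0].electrons.temperature = native_outputs["te"]
    entry.put(core_profiles)                           # PUT output IDS
    entry.close()
```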

