...
Its plug-in based, modular design with a clear separation of concerns allows iWrap to generate various types of actors and to easily change the data access paradigm (for instance, from a dataset descriptor for the AL to direct HDC data).
iWrap goals:
- iWrap creates a Python script (aka an actor) that:
- calls the user code
- provides error handling
- calls a debugger (if run in "debug" mode)
- runs MPI code
- iWrap generates a Fortran/CPP wrapper, which mediates between the Python script (workflow) and the user code by:
- reading/writing of in/out physical data (IDS)
- passing other arguments to/from the actor
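Taken together, the goals above describe what a generated actor does at run time. The following is a minimal, purely illustrative Python sketch of those responsibilities (the class name, constructor arguments and the dictionary-based "IDS" stand-ins are all assumptions for illustration, not the code iWrap actually generates):

```python
# Illustrative sketch of the responsibilities of a generated actor script.
# All names here are hypothetical -- iWrap's real generated code differs.

class SketchActor:
    def __init__(self, debug=False, mpi_processes=1):
        self.debug = debug                  # whether to launch under a debugger
        self.mpi_processes = mpi_processes  # whether/how to run the code via MPI

    def __call__(self, input_ids):
        try:
            if self.debug:
                # in "debug" mode the actor would attach a debugger first
                print('would attach debugger before running user code')
            if self.mpi_processes > 1:
                # MPI codes would be launched through mpirun/srun here
                print(f'would launch {self.mpi_processes} MPI processes')
            # call the user code (here just a stand-in computation)
            return {'output_ids': input_ids['value'] * 2}
        except Exception as err:
            # error handling: report and re-raise so the workflow can react
            print(f'actor failed: {err}')
            raise

actor = SketchActor()
result = actor({'value': 21})
print(result['output_ids'])  # -> 42
```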
iWrap interfaces:
For user convenience, iWrap provides two kinds of interfaces:
- a user-friendly graphical interface that allows non-experienced users to define an actor in an intuitive way, and
- a command-line interface intended for more advanced users who may want to, e.g., automate the actor generation process using scripts.
Preparation of code
The signature of the user code must follow strict rules to allow interaction between it and the wrapping actor. Please use the following >>link<< to get detailed guidelines for integrating native code into workflows using iWrap.
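As a rough illustration of the kind of calling convention such rules impose, the sketch below renders it in Python: IDS-like inputs go in, outputs come back together with an error status. Every name and the exact argument layout here are assumptions for illustration only; the linked guidelines define the real, binding rules (including the actual Fortran/CPP signatures):

```python
# Purely illustrative Python rendering of a wrapper-friendly calling
# convention: IDS inputs in, IDS outputs plus an error status out.
# All names below are hypothetical, not iWrap's actual requirements.

def user_code(core_profiles_in):
    """Stand-in for native user code wrapped by an iWrap actor."""
    status = 0     # 0 = success, non-zero = error reported back to the actor
    message = ''
    try:
        # stand-in computation on a dictionary-based "IDS"
        distribution_sources_out = {'source': sum(core_profiles_in['profile'])}
    except (KeyError, TypeError) as err:
        # the wrapper would turn this status into actor-level error handling
        status, message = 1, str(err)
        distribution_sources_out = None
    return distribution_sources_out, status, message

out, status, msg = user_code({'profile': [1.0, 2.0, 3.0]})
print(out, status)  # -> {'source': 6.0} 0
```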
...
```
# Import of the actor class
from <actor name>.actor import <actor name>

# Creation of actor object
actor_object = <actor name>()

# Reading input data
...

# Setting up runtime properties (if necessary)
...

# Actor initialisation
actor_object.initialize()

# Native code run
<output IDS or list of IDSes> = actor_object(<input IDS/IDSes>)

# Actor finalisation
actor_object.finalize()

# Saving output data
...
```
Workflow example
```python
import sys
import imas, os
from core2dist.actor import core2dist
from core2dist.python_common.job_settings import RunMode, DebugMode

class ExampleWorkflowManager:
    def __init__(self):
        self.actor_cp2ds = core2dist()
        self.input_entry = None
        self.output_entry = None

    def init_workflow(self):
        # INPUT/OUTPUT CONFIGURATION
        shot = 134174
        run_in = 37
        input_user_or_path = 'public'
        input_database = 'iter'
        run_out = 10
        output_user_or_path = os.getenv('USER')
        output_database = input_database

        # OPEN INPUT DATAFILE TO GET DATA FROM IMAS SCENARIO DATABASE
        print('=> Open input datafile')
        self.input_entry = imas.DBEntry(imas.imasdef.MDSPLUS_BACKEND, input_database,
                                        shot, run_in, input_user_or_path)
        self.input_entry.open()

        # CREATE OUTPUT DATAFILE
        print('=> Create output datafile')
        self.output_entry = imas.DBEntry(imas.imasdef.MDSPLUS_BACKEND, output_database,
                                         shot, run_out, output_user_or_path)
        self.output_entry.create()

        # # # # # # # # Initialization of ALL actors # # # # # # # #
        #self.actor_cp2ds.runtime_settings.debug_mode = DebugMode.STANDALONE
        self.actor_cp2ds.initialize()

    def execute_workflow(self):
        # READ INPUT IDSS FROM LOCAL DATABASE
        print('=> Read input IDSs')
        input_core_profiles = self.input_entry.get('core_profiles')

        # EXECUTE PHYSICS CODE
        print('=> Execute physics code')
        output_distribution_sources = self.actor_cp2ds(input_core_profiles)

        # SAVE IDSS INTO OUTPUT FILE
        print('=> Export output IDSs to local database')
        self.output_entry.put(output_distribution_sources)
        print('Done exporting.')

    def end_workflow(self):
        # Finalise ALL actors
        self.actor_cp2ds.finalize()
        # other finalisation actions
        self.input_entry.close()
        self.output_entry.close()

manager = ExampleWorkflowManager()
manager.init_workflow()
manager.execute_workflow()
manager.end_workflow()
```
...