High Level Interfaces & their Application Programming Interface - PDF presentation

High Level Interfaces and their API (Application Programming Interface)

The IMAS Data Access Layer exposes a set of operations (the so-called API) for writing and reading IDS data.

These data access operations are available from user code through the so-called High Level Interfaces (HLIs). There are currently 5 HLIs, available from the following programming languages:

  • Fortran
  • C++
  • Java
  • Python
  • Matlab
Info
Only Python and Matlab provide an interactive session for accessing IMAS data.

The HLI API can be divided into 2 sets of operations:

  • operations applying on a Data Entry 
  • operations applying on an IDS
Info
A Data Entry is an IMAS concept which designates a collection of IDSs present in a local (pulse file) or remote data source. A Data Entry is associated with a shot and a run number.

The HLI API covers all available Access Layer features with the following exposed methods (a minimal Python sketch of the corresponding calls is given after the list):

  • Operations on a Data Entry:
    • CREATE (creation of a new Data Entry)
    • OPEN (opening an existing Data Entry)
    • CLOSE (closing a Data Entry)
  • Operations on an IDS:
    • PUT (writing data from an IDS to a Data Entry)
    • GET (reading data of an IDS from an existing Data Entry)
    • PUT_SLICE (writing an IDS time slice to a Data Entry)
    • GET_SLICE (reading a time slice of an IDS from an existing Data Entry)
    • DELETE (deleting an IDS from an existing Data Entry)
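
As a quick orientation, here is a minimal Python sketch of the calls behind each of these operations. It assumes the 'data_access_tutorial' database created in the next section already exists; each call is detailed in the sections below.

Code Block
import imas
import numpy as np
from imas import imasdef

data_entry = imas.DBEntry(imasdef.MDSPLUS_BACKEND, 'data_access_tutorial', 15000, 1)
data_entry.create()                            #CREATE (use data_entry.open() to OPEN an existing Data Entry)

magnetics_ids = imas.magnetics()               #an IDS object
magnetics_ids.ids_properties.homogeneous_time = 1
magnetics_ids.time = np.array([0.])
data_entry.put(magnetics_ids, 0)               #PUT (the second argument is the occurrence)
magnetics_ids = data_entry.get('magnetics', 0) #GET
#PUT_SLICE and GET_SLICE write/read single time slices and are covered in dedicated sections below
data_entry.delete_data('magnetics', 0)         #DELETE (called delete_data() in the Python HLI)
data_entry.close()                             #CLOSE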


In this tutorial, we will describe each method of the HLI API (section 1.1. HLI API). We will use the Python HLI.  

API function names and their signatures may differ from one HLI to another. Documentation of all other HLIs is available in the User Guide, available from this page: https://confluence.iter.org/display/IMP/Integrated+Modelling+Home+Page

HLI API

create/close

The creation of a new Data Entry using the MDS+ or HDF5 backend:

  • consists in creating a new (MDS+/HDF5) pulse file on disk
  • requires an existing 'database' in which the pulse file(s) will be hosted

So, let's first create a new database named 'data_access_tutorial' which will belong to the current user.

From a new shell, execute the following command:

Code Block
module load IMAS
imasdb data_access_tutorial

Let's check that the database (for MDS+, it is simply a directory with some subdirectories) has been successfully created:

Code Block
<g2lfleur@s52 ~>ls -alh ~/public/imasdb/data_access_tutorial
total 6.0K
drwxr-xr-x  3 g2lfleur g2itmdev 2.0K Sep 16 13:29 .
drwxr-xr-x  5 g2lfleur g2itmdev 2.0K Sep 16 13:29 ..
drwxr-xr-x 12 g2lfleur g2itmdev 2.0K Sep 16 13:29 3

Some sub-directories have been created; they are used by MDS+ to organize the pulse files.

Now, the following code will create a new MDS+ pulse file for shot=15000, run=1 in the 'data_access_tutorial' database of the current user:

Code Block
import imas
import getpass
from imas import imasdef

#creating the Data Entry object 'data_entry' which handles the pulse file with shot=15000, run=1, belonging to database 'data_access_tutorial' of the current user, using the MDS+ backend
data_entry = imas.DBEntry(imasdef.MDSPLUS_BACKEND, 'data_access_tutorial', 15000, 1)

#creating the pulse file handled by the Data Entry object previously created
data_entry.create()

#we could now perform some write operations using the put() operation
#...This will be dealt with later

#closing the Data Entry
data_entry.close()

Execution of the code above will create the pulse file at location ~/public/imasdb/data_access_tutorial/3/0:

Code Block
<g2lfleur@s52 ~>ls -alh ~/public/imasdb/data_access_tutorial/3/0/
total 78M
drwxr-xr-x  2 g2lfleur g2itmdev 2.0K Sep 16 15:28 .
drwxr-xr-x 12 g2lfleur g2itmdev 2.0K Sep 16 13:29 ..
-rw-r--r--  1 g2lfleur g2itmdev  42M Sep 16 15:28 ids_150000001.characteristics
-rw-r--r--  1 g2lfleur g2itmdev    0 Sep 16 15:28 ids_150000001.datafile
-rw-r--r--  1 g2lfleur g2itmdev  36M Sep 16 15:28 ids_150000001.tree


In the above example, the pulse file is created, then closed. However, no data have been saved to the pulse file yet (the ids_150000001.datafile file is empty).



Note

To use the HDF5 backend instead of the MDS+ backend, you simply need to replace imasdef.MDSPLUS_BACKEND with imasdef.HDF5_BACKEND in the code above.
In this case, the HDF5 master pulse file is located in ~/public/imasdb/data_access_tutorial/3/15000/1:

<g2lfleur@s52 ~>ls -alh ~/public/imasdb/data_access_tutorial/3/15000/1
total 6.0K
drwxr-xr-x 2 g2lfleur g2itmdev 2.0K Sep 20 10:15 .
drwxr-xr-x 3 g2lfleur g2itmdev 2.0K Sep 20 10:15 ..
-rw-r--r-- 1 g2lfleur g2itmdev 2.0K Sep 20 10:15 master.h5

Only master.h5 is present. No IDS file has been created yet since no IDS data have been written (the master.h5 file references the IDS files when present; please see the User Guide documentation on the HDF5 backend file organization for more details).
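
For reference, here is a minimal sketch of the HDF5 variant (same database, shot and run as above; only the backend constant changes):

Code Block
import imas
from imas import imasdef

#same Data Entry as above, but using the HDF5 backend instead of MDS+
data_entry = imas.DBEntry(imasdef.HDF5_BACKEND, 'data_access_tutorial', 15000, 1)
data_entry.create()   #creates ~/public/imasdb/data_access_tutorial/3/15000/1/master.h5
data_entry.close()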


open/close

The following code opens the existing MDS+ pulse file created previously for shot=15000, run=1, from the 'data_access_tutorial' database of the current user:

Code Block
import imas
import getpass
from imas import imasdef

#creating the Data Entry which handles pulse file with shot=15000, run=1, belonging to database 'data_access_tutorial' of the current user, using the MDS+ backend
data_entry = imas.DBEntry(imasdef.MDSPLUS_BACKEND, 'data_access_tutorial', 15000, 1)

#opening the pulse file handled by the Data Entry object previously created
data_entry.open()

#we could now perform some read/write operations using get/put() or get_slice()/put_slice() operations
#...This will be dealt with later

#closing the Data Entry
data_entry.close()

The existing pulse file is opened, then closed. However, no data have been read from or written to the pulse file yet.

get

IMAS data are contained in IDSs, which are containers described by the IMAS Data Dictionary (DD). An IDS represents either a tokamak subsystem (like 'camera_ir') or a concept, like the 'equilibrium' IDS representing a plasma equilibrium.

An IDS can contain 0D (scalar) data with integer, float or string type, and arrays of integers or floats with dimensions from 1 to 6. 

As an example, let's consider the 'magnetics' IDS. Its description is given by the Data Dictionary. Here is a snapshot:

[Snapshot of the 'magnetics' IDS description from the Data Dictionary]

Like other IDSs, the 'magnetics' IDS contains data structures (like 'ids_properties') and arrays of structures (like 'flux_loop', 'bpol_probe', 'b_field_pol_probe', ...) which can contain (float, integer) data arrays with 1 to 6 dimensions, or scalars (0D).

Let's focus on the 'flux_loop' array of structures (see the snapshot below). It contains an array of 'flux' structures.

Each 'flux' structure contains a FLT_1D data array named 'data', an INT_1D array named 'validity_timed', an INT_0D scalar named 'validity', and a FLT_1D array named 'time'.

FLT_1D designates a 1-dimensional array of floats, INT_1D a 1-dimensional array of integers, and INT_0D an integer scalar.

[Snapshot of the 'flux_loop' array of structures in the Data Dictionary]
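
As an illustration of these types in the Python HLI, here is a minimal sketch of how such fields are accessed and filled from user code (the values are arbitrary and chosen only for illustration):

Code Block
import imas
import numpy as np

magnetics_ids = imas.magnetics()
magnetics_ids.flux_loop.resize(1)                                    #one flux loop
magnetics_ids.flux_loop[0].flux.data = np.array([0.1, 0.2, 0.3])     #FLT_1D: 1D array of floats
magnetics_ids.flux_loop[0].flux.validity_timed = np.array([0, 0, 0]) #INT_1D: 1D array of integers
magnetics_ids.flux_loop[0].flux.validity = 0                         #INT_0D: integer scalar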


The get() operation reads all the data of an IDS from an opened Data Entry.

When calling the get() operation on an IDS, all scalars and data arrays contained in the IDS are read and loaded into memory.

The code below reads an existing 'magnetics' IDS from a WEST pulse file:

Code Block
import imas
import getpass
import numpy as np
from imas import imasdef

#opening the Data Entry which handles the pulse file with shot=54178, run=0, belonging to database 'west' of user 'g2lfleur', using the MDS+ backend
data_entry = imas.DBEntry(imasdef.MDSPLUS_BACKEND, 'west', 54178, 0, user_name='g2lfleur')

#opening the pulse file handled by the Data Entry object previously created
data_entry.open()

#reading the 'magnetics' IDS from the data_entry object previously opened
magnetics_ids = data_entry.get('magnetics', 0) #The second argument 0 is the so-called IDS occurrence.

#closing the Data Entry
data_entry.close()

#printing some IDS attributes
print('Number of flux loops = ', len(magnetics_ids.flux_loop))
print('Data of first flux loop = ', magnetics_ids.flux_loop[0].flux.data)
print('Homogeneous time basis = ', magnetics_ids.time)

Running the code above gives the following output:

Code Block
Number of flux loops =  17
Data of first flux loop =  [ 0.00065229  0.00163073  0.00489218 ... -0.01761185 -0.01663342
 -0.01500269]
Homogeneous time basis =  [ 1.83570397  1.86847198  1.90123999 ... 90.13289642 90.16566467
 90.19843292]


Warning
Only the MDS+ backend can be used for testing the code above since the WEST shot 54178 is not yet available in HDF5 format.  It will be provided as soon as possible.

put

Let's create and initialize a new 'magnetics' IDS and add it to the pulse file previously created in section 1.1.1.

Then, in order to write all the data (scalars and data arrays) contained in the newly created IDS to the pulse file created previously, we will use the put() operation, which writes all static (non time-dependent) AND dynamic data present in the IDS.

The code below:

  • opens the Data Entry created in section 1.1.1.
  • creates a 'magnetics' IDS object
  • initializes the 'magnetics' IDS object with some values (some are mandatory)
  • adds the IDS data to the Data Entry calling the put() operation


Code Block
import imas
import getpass
import numpy as np
from imas import imasdef

#creating the Data Entry object 'data_entry' associated  to the pulse file with shot=15000, run=1, belonging to database 'data_access_tutorial' of the current user, using the MDS+ backend
data_entry = imas.DBEntry(imasdef.MDSPLUS_BACKEND, 'data_access_tutorial', 15000, 1)

#opening the pulse file handled by the Data Entry object previously created
data_entry.open() 

#creating the 'magnetics' IDS and initializing it
magnetics_ids = imas.magnetics() #creates a 'magnetics' IDS
magnetics_ids.ids_properties.homogeneous_time=1 #setting the homogeneous time (mandatory)
magnetics_ids.ids_properties.comment='IDS created for testing the IMAS Data Access layer' #setting the ids_properties.comment attribute
magnetics_ids.time=np.array([0.]) #the time(vector) basis must be not empty if homogeneous_time==1 otherwise an error will occur at runtime

#writing the 'magnetics' IDS
data_entry.put(magnetics_ids, 0) #writes magnetics data to the data_entry associated to the pulse file. The second argument 0 is the so-called IDS occurrence.

#closing the Data Entry
data_entry.close() 	 



Note

If you are using the HDF5 backend, you can check the content of the magnetics.h5 file using h5dump:

h5dump ~/public/imasdb/data_access_tutorial/3/15000/1/magnetics.h5



Let's extend the above example by adding the WEST data of the first 10 flux loops to the newly created 'magnetics' IDS.

Code Block
import imas
import getpass
import numpy as np
from imas import imasdef

#creating the Data Entry which handles the pulse file with shot=15000, run=1, belonging to database 'data_access_tutorial' of the current user, using the MDS+ backend
data_entry = imas.DBEntry(imasdef.MDSPLUS_BACKEND, 'data_access_tutorial', 15000, 1)

#opening the pulse file handled by the Data Entry object previously created
data_entry.open() 

#creating the 'magnetics' IDS and initializing it
magnetics_ids = imas.magnetics() #creates a 'magnetics' IDS
magnetics_ids.ids_properties.comment='IDS created for testing the IMAS Data Access layer' #setting the ids_properties.comment attribute

#adding the WEST data of the 10 first flux loops
nb_flux_loops = 10
west_data_entry = imas.DBEntry(imasdef.MDSPLUS_BACKEND, 'west', 54178, 0, user_name='g2lfleur')
west_data_entry.open() #opening the WEST pulse file before reading from it
west_magnetics_ids = west_data_entry.get('magnetics', 0) #reading occurrence 0
magnetics_ids.ids_properties.homogeneous_time=west_magnetics_ids.ids_properties.homogeneous_time #setting the homogeneous time (mandatory)
magnetics_ids.flux_loop.resize(nb_flux_loops)
for i in range(nb_flux_loops):
  magnetics_ids.flux_loop[i].flux.data = west_magnetics_ids.flux_loop[i].flux.data #copies data
  if west_magnetics_ids.ids_properties.homogeneous_time==0:
    magnetics_ids.flux_loop[i].flux.time = west_magnetics_ids.flux_loop[i].flux.time #copies the time basis in case WEST IDS arrays don't accept a common time basis

if west_magnetics_ids.ids_properties.homogeneous_time==1:
   magnetics_ids.time = west_magnetics_ids.time #copies the 'root' time basis in case WEST IDS arrays accept a common time basis
   
west_data_entry.close() #closing the WEST pulse file

#writing the 'magnetics' IDS
data_entry.put(magnetics_ids, 0) #writes magnetics data to the data_entry associated to the pulse file. The second argument 0 is the so-called IDS occurrence.

#closing the Data Entry
data_entry.close() 	 


put_slice

An IDS containing dynamic data structures can be built progressively using data time slices.

A dynamic data structure:

  • is either an array with type INT_nD or FLT_nD where n=1 to 6, which accepts a time coordinate
  • or a so-called dynamic array of structures AOS[i] where i runs along a time index
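
Here is a minimal sketch of both kinds of dynamic data structures, reusing the 'magnetics' and 'camera_visible' IDSs from this tutorial (the values are arbitrary):

Code Block
import imas
import numpy as np

#dynamic FLT_1D: 'flux.data' of the 'magnetics' IDS has the time vector as its time coordinate
magnetics_ids = imas.magnetics()
magnetics_ids.ids_properties.homogeneous_time = 1
magnetics_ids.time = np.array([0., 1., 2.])
magnetics_ids.flux_loop.resize(1)
magnetics_ids.flux_loop[0].flux.data = np.array([0.1, 0.2, 0.3]) #one value per time point

#dynamic array of structures: 'frame(itime)' of the 'camera_visible' IDS grows along time
camera_visible_ids = imas.camera_visible()
camera_visible_ids.ids_properties.homogeneous_time = 1
camera_visible_ids.time = np.array([0., 1., 2.])
camera_visible_ids.channel.resize(1)
camera_visible_ids.channel[0].detector.resize(1)
camera_visible_ids.channel[0].detector[0].frame.resize(3) #one 'frame' element per time point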

In the example below, we illustrate the use of put_slice() on a 'camera_visible' IDS which contains the dynamic array of structures 'frame'.

Let's first have a look at the 'camera_visible' IDS description in the Data Dictionary:

[Snapshot of the 'camera_visible' IDS description in the Data Dictionary]


As indicated in the DD description, 'frame(itime)' is a dynamic data structure representing a set of frames.

Each element of 'frame' is a structure containing:

  • 'image_raw', an INT_2D array
  • 'radiance', a FLT_2D array
  • 'time', a time basis (used only in case the IDS has no common time basis)

Adding a time slice using put_slice() on the 'camera_visible' IDS has the effect of adding a new element to the dynamic array 'frame'; that is, a new frame for a given time is appended to the array of structures 'frame'.

The example below shows how to add a few time slices to the 'camera_visible' IDS, for channel 0 and detector 0:

  • The number of time slices to be appended is given by the variable 'nb_slices'.
  • Each time slice is represented by the data structure camera_visible_ids.channel[0].detector[0].frame[0].
  • In this example, only the data of the INT_2D 'image_raw' and the global time basis 'time' (since 'camera_visible' has homogeneous_time = 1) are populated in the time slice to be appended.


Warning
  • The first time slice needs to be appended using a call to put(), not put_slice(). Only subsequent time slices are appended using put_slice().
  • Time slices must be appended in strictly increasing time order.


Code Block
import imas
import getpass
import numpy as np
from imas import imasdef

#creating the Data Entry object which handles the pulse file with shot=15000, run=1, belonging to database 'data_access_tutorial' of the current user, using the MDS+ backend
data_entry = imas.DBEntry(imasdef.MDSPLUS_BACKEND, 'data_access_tutorial', 15000, 1)

#opening the pulse file handled by the Data Entry object previously created
data_entry.open()

#creating the 'camera_visible' IDS and initializing it
camera_visible_ids = imas.camera_visible()
camera_visible_ids.ids_properties.homogeneous_time = 1
camera_visible_ids.channel.resize(1) #using only 1 channel (channel 0) for this example
camera_visible_ids.channel[0].detector.resize(1) #using only 1 detector for channel 0
camera_visible_ids.channel[0].detector[0].frame.resize(1) #the array of structure 'frame' contains only 1 element, it is the slice to be appended to the IDS 

X = 3 #number of horizontal pixels of 2D 'image_raw' field
Y = 5 #number of vertical pixels of 2D 'image_raw' field

camera_visible_ids.channel[0].detector[0].frame[0].image_raw.resize(X,Y) #setting the size of the image
camera_visible_ids.time.resize(1) #the time vector contains only 1 element, it's the time of the slice

nb_slices=3 #number of time slices to be added

for i in range(nb_slices):
  camera_visible_ids.time[0] = float(i) #time of the slice
  for j in range(X):
    for k in range(Y):
       camera_visible_ids.channel[0].detector[0].frame[0].image_raw[j,k] = float(j + k +  i) #image_raw is a 2D array
  if i==0:
    data_entry.put(camera_visible_ids) #the first slice has to be added using put() in order to store static data as well
  else:
    data_entry.put_slice(camera_visible_ids)  #appending the current slice to the IDS 'frame' array of structures

#closing the Data Entry
data_entry.close() 	 

Let's check the time slices we have just appended to the 'camera_visible' IDS using the code below:

Code Block
import imas
import getpass
from imas import imasdef

#creating the Data Entry object which handles the pulse file with shot=15000, run=1, belonging to database 'data_access_tutorial' of the current user, using the MDS+ backend
data_entry = imas.DBEntry(imasdef.MDSPLUS_BACKEND, 'data_access_tutorial', 15000, 1)

#opening the pulse file handled by the Data Entry object previously created
data_entry.open()

#reading the 'camera_visible' IDS data using get()
camera_visible_ids = data_entry.get('camera_visible', 0)

#printing some IDS attributes
print('homogeneous_time = ', camera_visible_ids.ids_properties.homogeneous_time)

for i in range(3):
   print("Frame : ", i)
   print(camera_visible_ids.channel[0].detector[0].frame[i].image_raw) #prints the content of this 2D array
   print("-----")

#closing the Data Entry
data_entry.close() 	 

Running the code above gives the following output:

Code Block
homogeneous_time =  1
Time slice :  0.0
Image raw:
[[0 1 2 3 4]
 [1 2 3 4 5]
 [2 3 4 5 6]]
-----
Time slice :  1.0
Image raw:
[[1 2 3 4 5]
 [2 3 4 5 6]
 [3 4 5 6 7]]
-----
Time slice :  2.0
Image raw:
[[2 3 4 5 6]
 [3 4 5 6 7]
 [4 5 6 7 8]]
-----


get_slice

The get_slice() operation has the following signature:

Code Block
get_slice(<idsname>, time_requested, interpolation_method, occurrence = 0) 

Let's consider an IDS containing dynamic data as the 'camera_visible' IDS described above. 

Calling get_slice('camera_visible', t, interpolation_method, 0) will take a slice (at the given time 't' and for the given interpolation method) of each dynamic data structure contained in the IDS; static data structures are ignored.

Therefore, get_slice() returns a time slice over all IDS dynamic data structures.

The following code takes a slice of the IDS dynamic data at time=1s using the closest time slice interpolation:


Code Block
import imas
import getpass
import numpy as np
from imas import imasdef

#creating the Data Entry object which handles the pulse file with shot=15000, run=1, belonging to database 'data_access_tutorial' of the current user, using the MDS+ backend
data_entry = imas.DBEntry(imasdef.MDSPLUS_BACKEND, 'data_access_tutorial', 15000, 1)

#opening the pulse file handled by the Data Entry object previously created
data_entry.open()

#getting a slice at time=1s using the closest time slice interpolation
time_requested=1.
camera_visible_slice = data_entry.get_slice('camera_visible', time_requested, imasdef.CLOSEST_INTERP) #avoid naming the result 'slice', which would shadow the Python built-in

print("Slice time : ", time_requested)
print("Image raw:")
print(camera_visible_slice.channel[0].detector[0].frame[0].image_raw)
print("-----")

#closing the Data Entry
data_entry.close()

Running the code above gives the following output:

Code Block
Slice time :  1.0
Image raw:
[[1 2 3 4 5]
 [2 3 4 5 6]
 [3 4 5 6 7]]
-----
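
Since time=1s coincides exactly with a stored slice, the interpolation method has no visible effect above. The sketch below requests an intermediate time (0.4s, an arbitrary value chosen for illustration); with the closest time slice interpolation it is expected to return the slice stored at t=0s. It assumes the slices written in the put_slice section are present in the pulse file.

Code Block
import imas
from imas import imasdef

#creating and opening the Data Entry which handles the pulse file with shot=15000, run=1
data_entry = imas.DBEntry(imasdef.MDSPLUS_BACKEND, 'data_access_tutorial', 15000, 1)
data_entry.open()

#requesting a time lying between two stored slices (0s and 1s)
camera_visible_slice = data_entry.get_slice('camera_visible', 0.4, imasdef.CLOSEST_INTERP)
print(camera_visible_slice.channel[0].detector[0].frame[0].image_raw) #expected: the frame stored at t=0s

#closing the Data Entry
data_entry.close()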


delete_data

A specific occurrence of an IDS can be deleted from an existing Data Entry using the delete_data() operation.

The following code erases occurrence 0 of the 'magnetics' IDS created in the previous sections:

Code Block
import imas
import getpass
import numpy as np
from imas import imasdef

#creating the Data Entry object which handles the pulse file with shot=15000, run=1, belonging to database 'data_access_tutorial' of the current user, using the MDS+ backend
data_entry = imas.DBEntry(imasdef.MDSPLUS_BACKEND, 'data_access_tutorial', 15000, 1)

#removing occurrence 0 of the 'magnetics' IDS previously appended to the Data Entry
data_entry.delete_data('magnetics', 0) #opens the Data Entry first if it is closed, then deletes occurrence 0 of the 'magnetics' IDS

#closing the Data Entry 
data_entry.close()
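
To verify the deletion, one can try to re-read occurrence 0. The sketch below assumes (as is the usual Access Layer behaviour) that get() on a deleted occurrence returns an empty IDS rather than raising an error:

Code Block
import imas
from imas import imasdef

#creating and opening the Data Entry which handles the pulse file with shot=15000, run=1
data_entry = imas.DBEntry(imasdef.MDSPLUS_BACKEND, 'data_access_tutorial', 15000, 1)
data_entry.open()

magnetics_ids = data_entry.get('magnetics', 0) #occurrence 0 was deleted above
print('Number of flux loops = ', len(magnetics_ids.flux_loop)) #expected to be 0 if the IDS comes back empty

#closing the Data Entry
data_entry.close()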

...