SciFlo Job


Confidence Level High  This article has been formally reviewed and is signed off on by a relevant subject matter expert.


As briefly mentioned before, SciFlo is a workflow framework. It uses an XML workflow definition document to determine the sequence of steps. Each step of a workflow is known as a process. The workflow definition document defines the flow of the processes and how to run each process. More about workflow definition documents can be found here: SciFlo Workflow Definition

 

SciFlo Bash Script

Workflow jobs are executed by the run_sciflo.sh script (https://github.com/hysds/chimera/blob/develop/chimera/run_sciflo.sh).

It sets up the BASE_PATH, PYTHONPATH and other environment requirements for the SciFlo job.
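
As an illustration only, the BASE_PATH setup at the top of such a script might look roughly like the sketch below (check the repository for the authoritative lines):

# Rough sketch only - not a verbatim copy of run_sciflo.sh
export BASE_PATH=$(dirname "${BASH_SOURCE}")   # directory containing the script
export BASE_PATH=$(cd "${BASE_PATH}"; pwd)     # resolve to an absolute path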

It takes 3 positional arguments:

  1. MODULE_PATH – Path to the adaptation module

  2. WF_DIR – Path to the directory containing the workflow definition document

  3. WF_NAME – Name of the Algorithm/Job/PGE to run

MODULE_PATH="$1"
WF_DIR="$2"
WF_NAME="$3"
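
For example, a purely hypothetical invocation (the adaptation paths and PGE name below are illustrative, not real deployment values):

# Hypothetical example invocation of run_sciflo.sh
./run_sciflo.sh \
  /home/ops/verdi/ops/my-adaptation \
  /home/ops/verdi/ops/my-adaptation/workflows \
  L0A_PGE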

It prepends the adaptation module’s path to the PYTHONPATH so the scripts can resolve dependency imports.

export PYTHONPATH=${MODULE_PATH}:$PYTHONPATH

It constructs the location of the workflow definition document from WF_DIR and WF_NAME, i.e. $WF_DIR/$WF_NAME.sf.xml

It runs the run_sciflo.py script with the required inputs. See section “SciFlo Python Script”.

python $BASE_PATH/run_sciflo.py $WF_DIR/$WF_NAME.sf.xml _context.json output > run_sciflo_$WF_NAME.log 2>&1
STATUS=$?
echo -n "Finished running $PGE run_sciflo.py: " 1>&2
date 1>&2
if [ $STATUS -ne 0 ]; then
  echo "Failed to run $PGE run_sciflo.py" 1>&2
  cat run_sciflo_$WF_NAME.log 1>&2
  echo "{}"
  exit $STATUS
fi

SciFlo Python Script

run_sciflo.py is the Python script that executes the SciFlo job.

It takes 3 positional arguments:

  1. sfl_file – Workflow definition document; has the file extension .sf.xml

  2. context_file – Job’s context file aka _context.json

  3. output_folder – Directory to write outputs of the job i.e. output
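
For reference, the arguments map onto the command line as in the sketch below (run_sciflo.sh normally constructs this command; it is not typically run by hand):

# Illustrative argument mapping:
#   sfl_file      -> $WF_DIR/$WF_NAME.sf.xml
#   context_file  -> _context.json
#   output_folder -> output
python $BASE_PATH/run_sciflo.py $WF_DIR/$WF_NAME.sf.xml _context.json output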

This script imports sciflo_util and calls its run_sciflo function.

The command to run SciFlo is: $HOME/verdi/bin/sflExec.py -s -f -o output --args "sfl_context=_context.json" pge.sf.xml

In the sciflo_util.run_sciflo function:

This command is constructed and run.

Based on the status of the SciFlo run, it logs any errors and copies the files generated by the run into the specified output directory.
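
A minimal shell-level sketch of that behavior is shown below; the actual implementation is Python code in chimera's sciflo_util, and the directory names here are placeholders:

# Sketch only: the real logic lives in sciflo_util.run_sciflo (Python)
$HOME/verdi/bin/sflExec.py -s -f -o output --args "sfl_context=_context.json" pge.sf.xml
STATUS=$?
if [ $STATUS -ne 0 ]; then
  echo "SciFlo run failed; see the logs under the output directory" 1>&2
fi
# Copy the files generated by the SciFlo run into the specified output directory
cp -rp sciflo_work_dir/* output/   # placeholder directory name
exit $STATUS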

 


Related Articles:

Have Questions? Ask a HySDS Developer:

Anyone can join our public Slack channel to learn more about HySDS. JPL employees can join #HySDS-Community

JPLers can also ask HySDS questions at Stack Overflow Enterprise

Search HySDS Wiki

Page Information:


Contribution History:

Subject Matter Expert:

@Namrata Malarout

Find an Error?

Is this document outdated or inaccurate? Please contact the assigned Page Maintainer:

@Namrata Malarout
