Confidence Level: High. This article has been formally reviewed and is signed off on by a relevant subject matter expert.

We are working on porting our Python 2 PGEs to Python 3. Individual PGEs will be tested and validated against equivalent datasets on C-cluster. Once all PGEs have been validated, a system-level test of the pipeline will be performed to ensure all trigger rules are in place and all expected products are generated.

Pipeline

<update pipeline diagram>

PGEs Ported to Python 3

  •  AOI Ops Report
  •  ariamh
    •  AOI Merged Track Stitcher
    •  Interferogram Auditor
    •  Registered SLC Pair Auditor
    •  Feature Extractor
    •  Predictor
    •  Reprocess Interferogram POE Org
    •  S1 coherence and amplitude
    •  S1 log amplitude ratio 
    •  Interferogram stack evaluator
    •  timeseries stack evaluator
    •  dense offset product
    •  Interferogram product
    •  registered SLC pair
    •  MRPE registered SLC Pair
    •  standard product sciflo S1 IFG single scene
    •  topsapp interferogram
    •  topsapp registered SLC pair
    •  datasling extract for ASF
    •  datasling extract for Scihub
  •  Standard Product Completeness Evaluator
    •  AOI completeness evaluator
    •  S1_GUNW completeness evaluator
    •  S1_GUNW completeness evaluator (Greylist)
  •  SLC sling
  •  Standard Product
    •  AOI track acquisition enumerator
    •  acquisition evaluator
      •  SLC localizer
  •  SAR avail
  •  SLCP-COR
  •  SLCP-PM
    •  S1-LAR 
  •  COD
  •  evaluator
  •  localizer
  •  multi acquisition localizer
    •  acquisition localizer
  •  topsstack
  •  orphan datasets finder
  •  update_aoi
    •  add machine tag
    •  add user tag
    •  check AOI expiration
    •  update AOI time
    •  update AOI track
  •  Standard Product Report
    •  enumeration report
    •  AOI OPS report
    •  AOI OPS report email submission
  •  USGS-neic
    •  query usgs publish
    •  query usgs earthquake feed
    •  query earthquake feed time range
  •  USGS evaluator
    •  usgs event evaluator
  •  Scihub acquisition scraper
    •  acquisition ingest by AOI
    •  ingest acquisition from ASF
    •  ingest acquisition from Scihub
    •  Ingest acquisition from Scihub daily
    •  Ingest acquisition by ID from Scihub
    •  AOI based submission of acquisition jobs
    •  AOI based scraper
    •  AOI validate acquisitions
    •  IPF scraper ASF
    •  IPF scraper Scihub
  •  S1 qc ingest
    •  calibration crawler
    •  orbit crawler
    •  orbit ingest
  •  BOS SARCAT
    •  ingest acquisitions from BOS
    •  scrub outdated acquisitions from BOS
  •  AOI enumerator submitter
    •  enumerator submitter 
  •  leaflet ingester
    •  displacement timeseries ingest
  •  ASF SNS handshaking
    •  update dataset with CMR metadata
  •  SLCP2COD
    •  S1 COD facet selection
    •  S1 COD network selector
  •  Product delivery
    •  Product delivery
  •  Standard Product Validator
    •  blacklist GUNW from topsapp job
    •  greylist GUNW from topsapp job
    •  IFG tagger
    •  evaluate for blacklist
    •  submit enumeration job for blacklist product
  •  Create AOI rules
    •  Create COD rule 
    •  Create COR rule
    •  create LAR rule
    •  create SLCP rule

PGE I/O

Info: The PGE test list is incomplete. We are currently focusing on the main pipeline.


PGE Inputs & Outputs

AOI based submission of acq scraper jobs

  • build: develop

  • input:

  • output:

AOI based submission of ipf scraper jobs

  • build: develop

  • input:

  • output:

AOI Enumerator Submitter

  • build: develop

  • input:

  • output:

AOI Merged Track Stitcher

  • build: python3

  • input:

  • output:

AOI Validate Acquisitions

  • build: develop

  • input:

  • output:

AWS Get Script

  • build: v0.0.8

  • input:

  • output:

Data Sling Extract for asf

  • build: python3

  • input:

  • output:

Data Sling Extract for Scihub

  • build: python3

  • input:

  • output:

Test Procedures

AOI Enumerator Submitter

Test Description:

Run the enumerator submitter as an on-demand job to generate the acquisition lists. Compare output acquisition lists to those generated over the same AOI on C-cluster.

Github Repo: https://github.com/aria-jpl/aoi_enumerator_submitter

Pass Criteria:

  • All expected acquisition lists were successfully generated

Input dataset:  <link to dataset used> (a single AOI)

Output dataset: <link to slc's generated> (acquisition lists)

Summary:

<summary of test results; quick description of any validation done with datasets from c-cluster>
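The comparison against C-cluster can be scripted rather than eyeballed in tosca. Below is a minimal sketch, assuming the acquisition lists from each cluster have been exported to JSON files of acquisition IDs (the file names and flat-list schema are hypothetical; adjust to the actual dataset metadata):

    # Minimal sketch for comparing acquisition lists generated on e-cluster and
    # C-cluster. Assumes each export is a JSON list of acquisition IDs
    # (hypothetical file names and schema).
    import json

    def load_ids(path):
        with open(path) as f:
            return set(json.load(f))

    e_cluster = load_ids("e_cluster_acq_list.json")  # hypothetical export
    c_cluster = load_ids("c_cluster_acq_list.json")  # hypothetical export

    missing = c_cluster - e_cluster  # on C-cluster but not generated here
    extra = e_cluster - c_cluster    # generated here but not on C-cluster

    print("missing:", sorted(missing))
    print("extra:  ", sorted(extra))
    print("PASS" if not (missing or extra) else "FAIL")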



AOI Track Acquisition Enumerator

Test Description:

Run the AOI track acquisition enumerator as an on-demand job to generate the acquisition lists. Compare output acquisition lists to those generated over the same AOI on C-cluster.

Github Repo: https://github.com/aria-jpl/standard_product.git

Pass Criteria:

  • All expected acquisition lists were successfully generated

Input dataset:  <link to dataset used> (S1-AUX_POEORB)

Output dataset: <link to slc's generated> (acquisition lists)

Summary:

<summary of test results; quick description of any validation done with datasets from c-cluster>



Standard Product S1-GUNW - slc_localizer

Test Description:

Run the localizer as an on-demand job to generate the sling jobs for the missing SLCs.

Github Repo: https://github.com/aria-jpl/standard_product.git

Pass Criteria:

  •  All identified missing SLCs from the acquisition list input set have sling jobs spun up for them.

  •  No sling jobs are generated for SLCs already in the system.

Input dataset:  <link to dataset used> (acquisition lists)

Output dataset: <link to slc's generated> (sling jobs)

Summary:

<summary of test results; quick description of any validation done with datasets from c-cluster>
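The pass criteria reduce to a set check: sling jobs should be spun up exactly for the SLCs referenced by the acquisition lists that are not already in the system. A minimal sketch of that check, with hypothetical SLC IDs standing in for values pulled from the acquisition lists, from GRQ, and from the submitted jobs:

    # Minimal sketch of the slc_localizer pass criteria; all IDs below are
    # hypothetical placeholders.
    required_slcs = {"S1A_IW_SLC_0001", "S1A_IW_SLC_0002", "S1B_IW_SLC_0003"}  # from acquisition lists
    existing_slcs = {"S1A_IW_SLC_0001"}                                        # already ingested
    submitted_sling = {"S1A_IW_SLC_0002", "S1B_IW_SLC_0003"}                   # sling jobs spun up

    expected_sling = required_slcs - existing_slcs
    assert submitted_sling == expected_sling, "sling jobs do not match the missing SLCs"
    print("PASS: sling jobs submitted only for missing SLCs")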



Evaluator

Test Description:

Run the evaluator as an on-demand job to ensure all expected ifg-cfgs are generated.

Github Repo: https://github.com/aria-jpl/standard_product.git

Pass Criteria:

  •  Compare the generated ifg-cfg datasets to those generated on C-cluster over the same set of SLCs.

Input dataset:  <link to dataset used> (slc's)

Output dataset: <link to slc's generated> (ifg-cfgs)

Summary:

<summary of test results; quick description of any validation done with datasets from c-cluster>



TopsApp

Test Description:

Run the topsapp PGE as an on-demand job to generate GUNWs.

Github Repo: https://github.com/aria-jpl/ariamh.git

Pass Criteria:

  •  GUNWs for all input ifg-cfgs are generated and compared to those generated over the same ifg-cfg set.

Input dataset:  <link to dataset used> (ifg-cfg)

Output dataset: <link to slc's generated> (s1-gunw)

Summary:

<summary of test results; quick description of any validation done with datasets from c-cluster>



GUNW Completeness Evaluator

Test Description:

Run the completeness evaluator as an on-demand job to generate the aoi_track dataset if the track is complete.

Github Repo: https://github.com/aria-jpl/standard_product_completeness_evaluator.git

Pass Criteria:

  •  If any GUNW is missing over a single date-pair spatial extent of the AOI, the aoi_track dataset is not generated.

  •  If all GUNWs over a single date-pair spatial extent of the AOI exist, the aoi_track dataset is generated.

Input dataset:  <link to dataset used> (gunw)

Output dataset: <link to slc's generated> (aoi track)

Summary:

<summary of test results; quick description of any validation done with datasets from c-cluster>
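The pass criteria describe an all-or-nothing rule per date pair: the aoi_track dataset should only be produced when every expected GUNW over the AOI's spatial extent exists. A minimal sketch of that rule, with hypothetical GUNW IDs:

    # Minimal sketch of the completeness rule described above; the expected and
    # found GUNW IDs are hypothetical placeholders for values derived from the
    # AOI/date-pair enumeration and from GRQ.
    def should_generate_aoi_track(expected_gunws, found_gunws):
        """Return True only if every expected GUNW for the date pair exists."""
        return set(expected_gunws).issubset(set(found_gunws))

    expected = ["S1-GUNW-A", "S1-GUNW-B", "S1-GUNW-C"]
    found = ["S1-GUNW-A", "S1-GUNW-B"]

    if should_generate_aoi_track(expected, found):
        print("all GUNWs present: generate the aoi_track dataset")
    else:
        print("GUNW(s) missing: do not generate the aoi_track dataset")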



Data Sling Extract for asf

Test Description:

Ensure this PGE pulls SLC products from the ASF endpoint and ingests them into S3/ES when the PGE is run as an on-demand job.

Pass Criteria:

  • Job must complete successfully

  • SLCs must be registered in S3

  • SLCs must be found when faceting on them in tosca

Input dataset:  <link to dataset used>

Output dataset: <link to slc's generated>

Summary:

<summary of test results; quick description of any validation done with datasets from c-cluster>
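In addition to faceting in tosca, the ES registration can be spot-checked with a direct query against GRQ. A minimal sketch, assuming a GRQ Elasticsearch endpoint and SLC index pattern (the URL, index pattern, and SLC ID below are hypothetical placeholders):

    # Minimal sketch for spot-checking that a slung SLC was indexed in GRQ/ES.
    # The GRQ URL, index pattern, and SLC ID are hypothetical placeholders.
    import requests

    GRQ_ES = "http://grq.example.jpl.nasa.gov:9200"      # hypothetical endpoint
    SLC_ID = "S1A_IW_SLC__1SDV_20200101T000000_example"  # hypothetical SLC ID

    resp = requests.get(
        "%s/grq_*_s1-iw_slc/_search" % GRQ_ES,           # hypothetical index pattern
        json={"query": {"term": {"_id": SLC_ID}}},
        timeout=30,
    )
    resp.raise_for_status()
    hits = resp.json()["hits"]["hits"]
    print("SLC found in ES" if hits else "SLC NOT found in ES")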



Data Sling Extract for Scihub

Test Description:

Ensure this PGE pulls SLC products from the SciHub endpoint and ingests them into S3/ES when the PGE is run as an on-demand job.

Pass Criteria:

  • Job must complete successfully

  • SLCs must be registered in S3

  • SLCs must be found when faceting on them in tosca

Input dataset:  <link to dataset used>

Output dataset: <link to slc's generated>

Summary:

<summary of test results; quick description of any validation done with datasets from c-cluster>



Process

For this example, we are porting the AOI Ops Report.

  1. Create a Python 3 virtual environment:
     virtualenv env3 -p python3

  2. Activate the Python 3 virtual environment:
     source ~/env3/bin/activate

  3. Check out the contents of the repo on a new branch named python3.

  4. Run futurize over the contents of the repo (an illustrative before/after example follows this list):
     pip install future
     cd <repo>
     futurize -w -n -p .

     The output will show what has been changed.

  5. SSH into e-cluster Mozart to add the Python 3 converted PGE: sds ci add_job -k -b develop https://github.com/aria-jpl/standard_product_report.git s3

  6. Go to e-cluster Jenkins and click Configure. Specify the branch as python3 and build. Check the Dockerfile and change FROM to the latest branch. Wait for the build to complete successfully; this may take a few minutes. The job will be published to e-cluster automatically.

  7. Go to e-cluster and run the job. Step into the container if you need to debug.

  8. Once the job runs successfully, push the changes to dev.
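As a reference for what to expect from step 4, futurize mostly rewrites Python 2 only idioms (print statements, iteritems(), old import paths) into forms that also run under Python 3. The before/after below is illustrative only and is not taken from the actual AOI Ops Report code:

    # Illustrative example of the kind of change futurize makes (hypothetical
    # function, not from the actual repo).
    #
    # Before (Python 2 only):
    #   print "submitting job for %s" % aoi_id
    #   for key, val in params.iteritems():
    #       print key, val
    #
    # After futurize (runs on both Python 2 and 3):
    from __future__ import print_function

    def show_params(aoi_id, params):
        print("submitting job for %s" % aoi_id)
        for key, val in params.items():
            print(key, val)

    show_params("AOI_example", {"track": 121, "platform": "S1A"})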


