Confidence Level: High. This article has been formally reviewed and signed off on by a relevant subject matter expert.
We are working on porting our Python 2 PGEs to Python 3. Individual PGEs will be tested and validated against equivalent datasets on C-cluster. Once all PGEs have been validated, a system-level test of the pipeline will be performed to ensure all trigger rules are in place and all expected products are generated.
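The per-PGE validation described above amounts to comparing the product IDs a ported PGE produces against those produced over the same inputs on C-cluster. A minimal sketch of that comparison (the `compare_products` helper and the example IDs are hypothetical; in practice the ID lists would come from queries against each cluster's catalog):

```python
def compare_products(c_cluster_ids, test_ids):
    """Compare product IDs from C-cluster against the test cluster.

    Returns the IDs missing on the test cluster and any unexpected
    extras, so a PGE port can be judged pass/fail at a glance.
    """
    c_set, t_set = set(c_cluster_ids), set(test_ids)
    return {
        "missing": sorted(c_set - t_set),  # produced on C-cluster only
        "extra": sorted(t_set - c_set),    # produced on test cluster only
        "matched": len(c_set & t_set),
    }

# Hypothetical example product IDs:
result = compare_products(
    ["S1-GUNW-A-001", "S1-GUNW-A-002"],
    ["S1-GUNW-A-001", "S1-GUNW-A-003"],
)
```

A clean pass would be an empty `missing` and `extra` for every PGE under test.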
Pipeline
<update pipeline diagram>
PGEs Ported to Python 3
- AOI Ops Report
- ariamh
- AOI Merged Track Stitcher
- Interferogram Auditor
- Registered SLC Pair Auditor
- Feature Extractor
- Predictor
- Reprocess Interferogram POE Org
- S1 coherence and amplitude
- S1 log amplitude ratio
- Interferogram stack evaluator
- timeseries stack evaluator
- dense offset product
- Interferogram product
- registered SLC pair
- MRPE registered SLC Pair
- standard product sciflo S1 IFG single scene
- topsapp interferogram
- topsapp registered SLC pair
- datasling extract for ASF
- datasling extract for Scihub
- Standard Product Completeness Evaluator
- AOI completeness evaluator
- S1_GUNW completeness evaluator
- S1_GUNW completeness evaluator (Greylist)
- SLC sling
- Standard Product
- AOI track acquisition enumerator
- acquisition evaluator
- SLC localizer
- SAR avail
- SLCP-COR
- SLCP-PM
- S1-LAR
- COD
- evaluator
- localizer
- multi acquisition localizer
- acquisition localizer
- topsstack
- orphan datasets finder
- update_aoi
- add machine tag
- add user tag
- check AOI expiration
- update AOI time
- update AOI track
- Standard Product Report
- enumeration report
- AOI OPS report
- AOI OPS report email submission
- USGS-neic
- query usgs publish
- query usgs earthquake feed
- query earthquake feed time range
- USGS evaluator
- usgs event evaluator
- Scihub acquisition scraper
- acquisition ingest by AOI
- ingest acquisition from ASF
- ingest acquisition from Scihub
- Ingest acquisition from Scihub daily
- Ingest acquisition by ID from Scihub
- AOI based submission of acquisition jobs
- AOI based scraper
- AOI validate acquisitions
- IPF scraper ASF
- IPF scraper Scihub
- S1 qc ingest
- calibration crawler
- orbit crawler
- orbit ingest
- BOS SARCAT
- ingest acquisitions from BOS
- scrub outdated acquisitions from BOS
- AOI enumerator submitter
- enumerator submitter
- leaflet ingester
- displacement timeseries ingest
- ASF SNS handshaking
- update dataset with CMR metadata
- SLCP2COD
- S1 COD facet selection
- S1 COD network selector
- Product delivery
- Product delivery
- Standard Product Validator
- blacklist GUNW from topsapp job
- greylist GUNW from topsapp job
- IFG tagger
- evaluate for blacklist
- submit enumeration job for blacklist product
- Create AOI rules
- Create COD rule
- Create COR rule
- create LAR rule
- create SLCP rule
PGE I/O
Info: The PGE test list is incomplete. We are currently focusing on the main pipeline.
- AOI based submission of acq scraper jobs
- AOI based submission of ipf scraper jobs
- AOI Enumerator Submitter
- AOI Merged Track Stitcher
- AOI Validate Acquisitions
- AWS Get Script
- Data Sling Extract for asf
- Data Sling Extract for Scihub
Test Procedures
Test Description: Run the enumerator submitter as an on-demand job to generate the acquisition lists. Compare output acquisition lists to those generated over the same AOI on C-cluster.
Github Repo: https://github.com/aria-jpl/aoi_enumerator_submitter
Pass Criteria:
- Input dataset: <link to dataset used> (a single AOI)
- Output dataset: <link to slc's generated> (acquisition lists)
- Summary: <summary of test results; quick description of any validation done with datasets from c-cluster>
Test Description: Run the AOI track acquisition enumerator as an on-demand job to generate the acquisition lists. Compare output acquisition lists to those generated over the same AOI on C-cluster.
Github Repo: https://github.com/aria-jpl/standard_product.git
Pass Criteria:
- Input dataset: <link to dataset used> (S1-AUX_POEORB)
- Output dataset: <link to slc's generated> (acquisition lists)
- Summary: <summary of test results; quick description of any validation done with datasets from c-cluster>
Test Description: Run the localizer as an on-demand job to generate the sling jobs for the missing SLCs.
Github Repo: https://github.com/aria-jpl/standard_product.git
Pass Criteria:
- Input dataset: <link to dataset used> (acquisition lists)
- Output dataset: <link to slc's generated> (sling jobs)
- Summary: <summary of test results; quick description of any validation done with datasets from c-cluster>
Test Description: Run the evaluator as an on-demand job to ensure all expected ifg-cfgs are generated.
Github Repo: https://github.com/aria-jpl/standard_product.git
Pass Criteria:
- Input dataset: <link to dataset used> (slc's)
- Output dataset: <link to slc's generated> (ifg-cfgs)
- Summary: <summary of test results; quick description of any validation done with datasets from c-cluster>
Test Description: Run the topsapp PGE as an on-demand job to generate GUNWs.
Github Repo: https://github.com/aria-jpl/ariamh.git
Pass Criteria:
- Input dataset: <link to dataset used> (ifg-cfg)
- Output dataset: <link to slc's generated> (s1-gunw)
- Summary: <summary of test results; quick description of any validation done with datasets from c-cluster>
Test Description: Run the completeness evaluator as an on-demand job to generate the aoi_track dataset if the track is complete.
Github Repo: https://github.com/aria-jpl/standard_product_completeness_evaluator.git
Pass Criteria:
- Input dataset: <link to dataset used> (gunw)
- Output dataset: <link to slc's generated> (aoi track)
- Summary: <summary of test results; quick description of any validation done with datasets from c-cluster>
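The completeness decision exercised by this test can be sketched roughly as follows. This is a hypothetical stand-in for the evaluator's actual logic, not its API: it treats a track as complete once a GUNW exists for every expected date pair (`is_track_complete` and the example dates are illustrative):

```python
def is_track_complete(expected_pairs, generated_gunws):
    """Return True when every expected date pair has a generated GUNW.

    expected_pairs: iterable of (reference_date, secondary_date) tuples
    generated_gunws: mapping of (reference_date, secondary_date) -> product id
    """
    return all(pair in generated_gunws for pair in expected_pairs)

# Illustrative inputs: two expected pairs, only one GUNW produced so far.
pairs = [("20190101", "20190113"), ("20190113", "20190125")]
gunws = {("20190101", "20190113"): "S1-GUNW-001"}

complete = is_track_complete(pairs, gunws)  # one pair still missing
```

Only when the check passes would the evaluator emit the aoi_track dataset.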
Test Description: Ensures this PGE pulls SLC products from the ASF endpoint and ingests them into S3/ES when the PGE is run as an on-demand job.
Pass Criteria:
- Input dataset: <link to dataset used>
- Output dataset: <link to slc's generated>
- Summary: <summary of test results; quick description of any validation done with datasets from c-cluster>
Test Description: Ensures this PGE pulls SLC products from the SciHub endpoint and ingests them into S3/ES when the PGE is run as an on-demand job.
Pass Criteria:
- Input dataset: <link to dataset used>
- Output dataset: <link to slc's generated>
- Summary: <summary of test results; quick description of any validation done with datasets from c-cluster>
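A rough sketch of the ingest path the two sling tests above exercise: parse the SLC identifier into a metadata document, upload the file to S3, and index the metadata. `parse_slc_id` and the bucket/index names are hypothetical, and the S3/ES calls are shown in comments for shape only; the field layout follows the standard Sentinel-1 product naming convention:

```python
def parse_slc_id(slc_id):
    """Split a Sentinel-1 SLC identifier into the fields a metadata
    document would carry (per the Sentinel-1 product naming scheme:
    mission_beam_type__class_start_stop_orbit_datatake_crc)."""
    parts = slc_id.split("_")
    return {
        "id": slc_id,
        "mission": parts[0],       # e.g. S1A / S1B
        "beam_mode": parts[1],     # e.g. IW
        "product_type": parts[2],  # SLC
        "start_time": parts[5],
        "stop_time": parts[6],
        "orbit": int(parts[7]),
    }

meta = parse_slc_id(
    "S1A_IW_SLC__1SDV_20190101T000000_20190101T000027_025250_02CADB_1234"
)

# The upload + index steps would then look roughly like (not run here;
# bucket and index names are placeholders):
#   boto3.client("s3").upload_file(local_zip, "my-bucket", meta["id"] + ".zip")
#   requests.put(f"{es_url}/grq_slc/_doc/{meta['id']}", json=meta)
```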
Process
For this example we are porting the AOI Ops Report.
- Create a Python 3 virtual environment: virtualenv env3 -p python3
- Activate the virtual environment: source ~/env3/bin/activate
- Pull the contents of the repo on a new branch named python3.
- Run futurize over the contents of the repo:
  pip install future
  cd <repo>
  futurize -w -n -p .
  The output will show what has been changed.
- ssh into e-mozart to add the Python 3 converted PGE: sds ci add_job -k -b develop https://github.com/aria-jpl/standard_product_report.git s3
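As a rough illustration of what the futurize run rewrites, here is a hedged before/after sketch. futurize's real output also adds `from __future__` imports and compatibility shims from the `future` package; this only shows the shape of the common rewrites, with the Python 2 originals in comments:

```python
# Typical Python 2 -> 3 rewrites that futurize performs.

# Python 2: print "processing", dataset_id
dataset_id = "S1-GUNW-001"
print("processing", dataset_id)          # print statement -> function

# Python 2: for k, v in metadata.iteritems(): ...
metadata = {"track": 7, "orbit": 25250}
items = [(k, v) for k, v in metadata.items()]  # .iteritems() -> .items()

# Python 2 integer division: 7 / 2 == 3
# The old truncating behavior is made explicit with //:
half = 7 // 2
```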
- Go to e-jenkins and click Configure. Specify the branch as the python3 branch and build. Check the Dockerfile and change FROM to the latest branch. Wait for the build to complete successfully; it may take a few minutes, and the job will be published to e-cluster automatically.
- Go to e-cluster and run the job. Step into the container if you need to debug.
- Once the job runs successfully, push the changes to the dev branch.
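Before pushing to dev, a quick syntax-level smoke check can catch leftover Python 2 constructs that futurize missed; `compileall` is a standard-library tool, and the temp-dir module here just stands in for the real repo checkout:

```python
import compileall
import pathlib
import tempfile

# Write a tiny "ported" module to a temp dir and byte-compile it;
# compile_dir returns a truthy value only if every file in the tree
# compiles cleanly under Python 3.
repo = pathlib.Path(tempfile.mkdtemp())
(repo / "pge.py").write_text('print("ok")\n')
ok = compileall.compile_dir(str(repo), quiet=1)
```

Running the same check over the actual repo root flags any file with remaining Python 2 syntax before the Jenkins build does.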