
SNAP input data

Zhang Yunjun edited this page May 5, 2020 · 10 revisions

SNAP + MintPy workflow for InSAR time series analysis

SNAP is a popular application for processing SAR data and can generate all of the required input data for MintPy. If MintPy can read these data, it becomes possible to apply multi-temporal InSAR analysis and atmospheric correction to SNAP outputs. SNAP processing can be done interactively in the GUI, on the command line, or via the SNAP Python API (snappy), which allows for automated scripts.

An effort has been made in MintPy to ingest the outputs of SNAP directly for ease of processing. Unlike the SNAP export for PSI processing with StaMPS, MintPy reads the native SNAP .dim files directly. This makes it possible to generate input files in SNAP and ingest them directly into MintPy. The recipe below should work as long as the correct workflow has been followed.

This aims to make the world of open-source Multi-temporal InSAR just a little bit more user friendly.

Notes

  • All required sensor / baseline data are written to the .dim file during SNAP processing
  • Baseline attributes in particular are written during the co-registration (SNAP back-geocoding) step.
  • Each .dim file is parsed individually, so every input file should contain all required metadata - this is why the DEM file is exported after co-registration
  • .dim files should only have one band
  • prep_snap.py was built and tested on data generated through the following workflow - more testing is required.
  • Feel free to add/modify more details/notes. A better-documented note helps everyone!
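Since prep_snap.py reads metadata straight out of the .dim XML, a minimal sketch of that kind of parsing may help illustrate the notes above. The XML snippet and element/attribute names below are illustrative stand-ins, not the exact BEAM-DIMAP schema or prep_snap.py's actual code:

```python
import xml.etree.ElementTree as ET

# A tiny stand-in for the metadata section of a .dim file (illustrative only;
# real .dim files are much larger and follow the BEAM-DIMAP format).
DIM_SNIPPET = """
<Dimap_Document>
  <MDElem name="Baselines">
    <MDATTR name="Perp Baseline">42.7</MDATTR>
    <MDATTR name="Temp Baseline">12.0</MDATTR>
  </MDElem>
</Dimap_Document>
"""

def read_attrs(xml_text, elem_name):
    """Return {attribute name: value} for one named metadata element."""
    root = ET.fromstring(xml_text)
    attrs = {}
    for elem in root.iter("MDElem"):
        if elem.get("name") == elem_name:
            for attr in elem.iter("MDATTR"):
                attrs[attr.get("name")] = attr.text
    return attrs

baselines = read_attrs(DIM_SNIPPET, "Baselines")
print(baselines["Perp Baseline"])  # -> 42.7
```

This is why a workflow that rearranges or renames metadata elements can break the script: simple parsing like this finds attributes only where it expects them.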

SNAP Workflow

This workflow makes it possible to work with an area of interest covered by any combination of Sentinel-1 bursts / swaths / slices. It should be applied to the entire redundant interferogram network of master and slave scenes.

  1. Read master and slave data
  2. Master and slave slice assembly (if required for Sentinel-1)
  3. Apply orbit file to master and slave
  4. Split master and slave products (extract relevant polarisation and subswaths)
  5. The following is done per subswath [IW1, IW2, IW3]:
    • Back-geocoding co-registration
    • Enhanced spectral diversity (if more than one burst is present for Sentinel-1)
    • Interferogram generation
    • TOPSAR Deburst
    • Topo-phase removal
    • Goldstein phase filtering
  6. Merge subswaths of the flattened and filtered interferogram products (if more than one swath was processed)
  7. Subset by band and extract interferogram (This is the first MintPy input product - optional)
  8. Add elevation band to co-registered product (must be done here to capture baseline attribute data)
  9. Subset by band and extract only elevation band (This is the second MintPy input product)
  10. Generate Coherence
  11. Subset by band and extract coherence (This is the third MintPy input product)
  12. Snaphu export interferogram
  13. SNAPHU phase unwrapping (external program)
  14. Snaphu import and save unwrapped product (This is the fourth MintPy input product)
  15. Terrain correct all four MintPy products
  16. Subset all four MintPy products by region - this must be done after terrain correction to ensure identical extents.
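Each of the steps above corresponds to a SNAP operator that can also be driven from the command line with SNAP's Graph Processing Tool (gpt). As a rough sketch, the snippet below only assembles the argument list for one step (Apply-Orbit-File) without running it; the file names are placeholders, and parameter names/values should be checked against `gpt -h <operator>` for your SNAP version:

```python
def gpt_command(operator, source, target, **params):
    """Build a gpt argument list for a single SNAP operator (not executed here)."""
    cmd = ["gpt", operator]
    for key, value in params.items():
        cmd.append(f"-P{key}={value}")  # operator parameters as -Pname=value
    cmd += ["-t", target, source]      # -t sets the target product name
    return cmd

# Placeholder file names; orbitType value taken from the SNAP GUI dropdown.
cmd = gpt_command(
    "Apply-Orbit-File",
    source="master_split.dim",
    target="master_split_orb",
    orbitType="Sentinel Precise (Auto Download)",
)
print(cmd)
```

Building the command as a list (for use with `subprocess.run`, say) sidesteps shell quoting of parameter values that contain spaces, such as the orbit type above.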

The resulting directory should have a structure similar to the example here, which also includes an example template file. The rest (running smallbaselineApp.py) is the same as shown in the example dataset.
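For orientation, the loading section of a MintPy template for SNAP products might look like the fragment below. The `mintpy.load.*` keys come from MintPy's smallbaselineApp configuration; the file path patterns are placeholders and should be taken from the example template mentioned above:

```cfg
## Hypothetical paths - adapt to your own directory structure
mintpy.load.processor = snap
mintpy.load.unwFile   = ./interferograms/*/*/Unw_*.img
mintpy.load.corFile   = ./interferograms/*/*/coh_*.img
mintpy.load.demFile   = ./dem*/dem*.img
```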

Contributing

  • Only limited testing has been done on prep_snap.py. It is possible that a different SNAP workflow would create a .dim file that the script does not expect, because prep_snap.py extracts attributes from the .dim XML file using simple parsing, which may be sensitive to workflow variations. The script was developed on data prepared by the workflow detailed above using Python / SNAP / snappy.
  • So far it only accepts geocoded data (i.e. after the SNAP terrain-correction step), not data in radar geometry, although supporting the latter should be possible.
  • Only tested on Sentinel-1
  • No formal comparison / assessment has been done yet.
