Update README, clean up docs folder
dostuffthatmatters committed Feb 3, 2024
1 parent a0149ca commit 3638276
Showing 5 changed files with 21 additions and 98 deletions.
README.md (6 changes: 3 additions & 3 deletions)
@@ -4,13 +4,13 @@

We retrieve a lot of EM27/SUN data, produced mainly by [MUCCnet (Dietrich et al., 2021)](https://doi.org/10.5194/amt-14-1111-2021), and have used this pipeline since the end of 2021.

-This codebase provides an automated data pipeline for Proffast 1 and 2.X (https://www.imk-asf.kit.edu/english/3225.php). Under the hood, it uses the Proffast Pylot (https://gitlab.eudat.eu/coccon-kit/proffastpylot.git) to interact with Proffast 2 and an in-house connector to interact with Proffast 1. Whenever using this pipeline for Proffast retrievals, please make sure to also cite [Proffast](https://www.imk-asf.kit.edu/english/3225.php) and the [Proffast Pylot](https://gitlab.eudat.eu/coccon-kit/proffastpylot) (for Proffast 2.X retrievals).
+This codebase provides an automated data pipeline for [Proffast 1 and 2.X](https://www.imk-asf.kit.edu/english/3225.php). Under the hood, it uses the [Proffast Pylot](https://gitlab.eudat.eu/coccon-kit/proffastpylot.git) to interact with Proffast 2 and an in-house connector to interact with Proffast 1. Whenever using this pipeline for Proffast retrievals, please make sure to also cite [Proffast](https://www.imk-asf.kit.edu/english/3225.php) and the [Proffast Pylot](https://gitlab.eudat.eu/coccon-kit/proffastpylot) (for Proffast 2.X retrievals).

📚 Read the documentation at [em27-retrieval-pipeline.netlify.app](https://em27-retrieval-pipeline.netlify.app).<br/>
💾 Get the source code at [github.com/tum-esm/em27-retrieval-pipeline](https://github.com/tum-esm/em27-retrieval-pipeline).<br/>
🐝 Report Issues or discuss enhancements using [issues on GitHub](https://github.com/tum-esm/em27-retrieval-pipeline/issues).

**Related Projects:**

-⚙️ Many of our projects use much functionality from the `tum-esm-utils` package: [github.com/tum-esm/utils](https://github.com/tum-esm/utils).<br/>
-🤖 Our EM27/SUN systems are running autonomously with the help of Pyra: [github.com/tum-esm/pyra](https://github.com/tum-esm/pyra).
+⚙️ Many of our projects rely on functionality from the [`tum-esm-utils` package](https://github.com/tum-esm/utils).<br/>
+🤖 Our EM27/SUN systems run autonomously with the help of [Pyra](https://github.com/tum-esm/pyra).
docs/next.config.js (14 changes: 7 additions & 7 deletions)
@@ -1,13 +1,13 @@
-const withNextra = require('nextra')({
-  theme: 'nextra-theme-docs',
-  themeConfig: './theme.config.jsx',
+const withNextra = require("nextra")({
+  theme: "nextra-theme-docs",
+  themeConfig: "./theme.config.jsx",
});

module.exports = withNextra({
-  images: {
-    unoptimized: true,
-  },
-  output: 'export',
+  images: {
+    unoptimized: true,
+  },
+  output: "export",
});

// If you have other Next.js configurations, you can pass them as the parameter:
docs/pages/index.mdx (14 changes: 7 additions & 7 deletions)
@@ -4,16 +4,16 @@

We retrieve a lot of EM27/SUN data, produced mainly by [MUCCnet (Dietrich et al., 2021)](https://doi.org/10.5194/amt-14-1111-2021), and have used this pipeline since the end of 2021.

-This codebase provides an automated data pipeline for Proffast 1 and 2.X (https://www.imk-asf.kit.edu/english/3225.php). Under the hood, it uses the Proffast Pylot (https://gitlab.eudat.eu/coccon-kit/proffastpylot.git) to interact with Proffast 2 and an in-house connector to interact with Proffast 1. Whenever using this pipeline for Proffast retrievals, please make sure to also cite [Proffast](https://www.imk-asf.kit.edu/english/3225.php) and the [Proffast Pylot](https://gitlab.eudat.eu/coccon-kit/proffastpylot) (for Proffast 2.X retrievals).
+This codebase provides an automated data pipeline for [Proffast 1 and 2.X](https://www.imk-asf.kit.edu/english/3225.php). Under the hood, it uses the [Proffast Pylot](https://gitlab.eudat.eu/coccon-kit/proffastpylot.git) to interact with Proffast 2 and an in-house connector to interact with Proffast 1. Whenever using this pipeline for Proffast retrievals, please make sure to also cite [Proffast](https://www.imk-asf.kit.edu/english/3225.php) and the [Proffast Pylot](https://gitlab.eudat.eu/coccon-kit/proffastpylot) (for Proffast 2.X retrievals).

📚 Read the documentation at [em27-retrieval-pipeline.netlify.app](https://em27-retrieval-pipeline.netlify.app).<br/>
💾 Get the source code at [github.com/tum-esm/em27-retrieval-pipeline](https://github.com/tum-esm/em27-retrieval-pipeline).<br/>
🐝 Report Issues or discuss enhancements using [issues on GitHub](https://github.com/tum-esm/em27-retrieval-pipeline/issues).

**Related Projects:**

-⚙️ Many of our projects use much functionality from the `tum-esm-utils` package: [github.com/tum-esm/utils](https://github.com/tum-esm/utils).<br/>
-🤖 Our EM27/SUN systems are running autonomously with the help of Pyra: [github.com/tum-esm/pyra](https://github.com/tum-esm/pyra).
+⚙️ Many of our projects rely on functionality from the [`tum-esm-utils` package](https://github.com/tum-esm/utils).<br/>
+🤖 Our EM27/SUN systems run autonomously with the help of [Pyra](https://github.com/tum-esm/pyra).

## EM27 Retrieval Pipeline vs. Proffast Pylot

@@ -56,13 +56,13 @@ The data flow from input to merged outputs:
The pipeline offers:

- **Easy configuration using a validated `config.json` (and metadata files):** By "validated", we mean that before any processing starts, the config file's content is parsed and validated against a JSON schema. This way, you can be sure that the pipeline will not fail due to a misconfiguration, and you will immediately get precise error messages (a minimal validation sketch follows below this list).
-- **Opinionated management of station metadata:** We manage our EM27 metadata using JSON files instead of database tables, which has several benefits mentioned in the metadata repository https://github.com/tum-esm/em27-metadata
-- **Filtering of interferogram files that Proffast cannot process:** When some interferograms are corrupted, Proffast will fail during preprocessing for whole days of data even when only a few out of thousands of interferograms are bad. The pipeline filters out these interferograms and only passes the valid ones to Proffast. A standalone version of this filtering functionality is included in our utility library: https://tum-esm-utils.netlify.app/api-reference#tum_esm_utilsinterferograms
+- **Opinionated management of station metadata:** We manage our EM27 metadata using JSON files instead of database tables, which has several benefits mentioned in the [metadata repository](https://github.com/tum-esm/em27-metadata).
+- **Filtering of interferogram files that Proffast cannot process:** When some interferograms are corrupted, Proffast will fail during preprocessing for whole days of data, even when only a few out of thousands of interferograms are bad. The pipeline filters out these interferograms and only passes the valid ones to Proffast (see the filtering sketch below this list). A standalone version of this filtering functionality is included in our [utility library](https://tum-esm-utils.netlify.app/api-reference#tum_esm_utilsinterferograms).
- **Parallelization of the Proffast Pylot execution:** The Pylot already provides parallelization. However, we decided to isolate the Pylot's approach further and run the retrieval for each station and date individually inside a containerized environment (sketched below this list). This way, errors during one retrieval don't affect other days or stations, and outputs and logs stay separated.
-- **Fully automated interface to obtain Ginput Data:** The atmospheric profiles downloader of this pipeline automates the request for GGG2014 and GGG2020 data (supporting standard sites) from `ftp://ccycle.gps.caltech.edu`. The manual instructions can be found here: https://tccon-wiki.caltech.edu/Main/ObtainingGinputData.
+- **Fully automated interface to obtain Ginput Data:** The atmospheric profiles downloader of this pipeline automates the request for GGG2014 and GGG2020 data (supporting standard sites) from `ftp://ccycle.gps.caltech.edu`. The manual instructions can be found [here](https://tccon-wiki.caltech.edu/Main/ObtainingGinputData).
- **Comprehensive logs and output data management:** The pipeline stores both failed and successful containers. The output is the same as with the Pylot but additionally contains all config files the pipeline used to run the container, as well as the logs the container generated.
- **Postprocessing of raw time series:** Data of correlated stations can be smoothed and merged into daily output files. The output files contain header sections with everything needed to reproduce the output file based on the raw interferograms ([read about the format here](/guides/directories#exports)).
-- **Documentation and complete API reference:** hosted at https://em27-retrieval-pipeline.netlify.app/
+- **Documentation and complete API reference:** hosted at [em27-retrieval-pipeline.netlify.app](https://em27-retrieval-pipeline.netlify.app/)
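To make the validation step above more concrete, here is a minimal sketch of what fail-fast config parsing can look like. It assumes pydantic is used for validation, and the field names are purely illustrative; the actual pipeline defines its own, much larger schema.

```python
# Minimal sketch of fail-fast config validation (illustrative field names,
# not the pipeline's real schema).
from pathlib import Path

import pydantic


class RetrievalConfig(pydantic.BaseModel):
    sensor_ids: list[str]
    from_date: str
    to_date: str
    proffast_version: str


def load_config(path: str = "config/config.json") -> RetrievalConfig:
    try:
        return RetrievalConfig.model_validate_json(Path(path).read_text())
    except pydantic.ValidationError as e:
        # every validation error names the offending field and the reason
        raise SystemExit(f"Invalid config:\n{e}") from e
```

Run against a config that is missing `sensor_ids`, for example, this aborts immediately with a message naming that exact field instead of failing halfway through a retrieval.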
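The interferogram filtering can be pictured roughly like this; `looks_like_valid_ifg` is a hypothetical placeholder for the actual corruption check, which lives in the `tum-esm-utils` package and is considerably more thorough.

```python
# Rough sketch of pre-filtering interferograms before one day of data is
# handed to Proffast. `looks_like_valid_ifg` is a placeholder check only.
from pathlib import Path


def looks_like_valid_ifg(path: Path) -> bool:
    """Placeholder check: the file is readable and not empty."""
    try:
        return path.is_file() and path.stat().st_size > 0
    except OSError:
        return False


def split_ifgs(ifg_dir: Path) -> tuple[list[Path], list[Path]]:
    """Split one station-day of interferograms into (valid, corrupt) lists."""
    valid: list[Path] = []
    corrupt: list[Path] = []
    for ifg in sorted(ifg_dir.iterdir()):
        (valid if looks_like_valid_ifg(ifg) else corrupt).append(ifg)
    return valid, corrupt
```

Only the `valid` list is passed on to preprocessing, so a handful of corrupt files no longer aborts an entire day.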
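The per-station, per-date isolation can be sketched with a plain worker pool; `run_retrieval_container` is a hypothetical placeholder for setting up one containerized Proffast run, and the sensor IDs and dates are made up for illustration.

```python
# Conceptual sketch: run each (sensor, date) retrieval in its own worker so
# one failing day cannot take down the rest of the queue.
# `run_retrieval_container` is a hypothetical placeholder.
import datetime
import multiprocessing


def run_retrieval_container(job: tuple[str, datetime.date]) -> str:
    sensor_id, date = job
    # ... set up an isolated container directory, run Proffast, copy outputs ...
    return f"{sensor_id}/{date.isoformat()}: done"


if __name__ == "__main__":
    jobs = [
        ("ma", datetime.date(2024, 1, 15)),
        ("mb", datetime.date(2024, 1, 15)),
        ("ma", datetime.date(2024, 1, 16)),
    ]
    with multiprocessing.Pool(processes=2) as pool:
        for result in pool.imap_unordered(run_retrieval_container, jobs):
            print(result)
```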

## Getting Started

docs/postcss.config.js (8 changes: 4 additions & 4 deletions)
@@ -2,8 +2,8 @@
// https://tailwindcss.com/docs/using-with-preprocessors
/** @type {import('postcss').Postcss} */
module.exports = {
-  plugins: {
-    tailwindcss: {},
-    autoprefixer: {},
-  },
+  plugins: {
+    tailwindcss: {},
+    autoprefixer: {},
+  },
};
docs/style.css (77 changes: 0 additions & 77 deletions)
@@ -13,80 +13,3 @@
.nextra-banner-container {
  @apply !bg-amber-200 !text-amber-950 !border-b !border-amber-300 dark:!border-0 !bg-none;
}

-/*
-.background-paper-pattern {
-  opacity: 0.8;
-  background-size: 50px 50px, 50px 50px, 10px 10px, 10px 10px;
-  background-position: -1px -1px, -1px -1px, -1px -1px, -1px -1px;
-}
-.nextra-nav-container {
-  @apply border-b border-slate-300 dark:border-slate-700;
-}
-.light #__next {
-  .background-paper-pattern {
-    background-image: linear-gradient(#e2e8f0 1px, transparent 1px),
-      linear-gradient(90deg, #e2e8f0 1px, transparent 1px),
-      linear-gradient(#f1f5f9 1px, transparent 1px),
-      linear-gradient(90deg, #f1f5f9 1px, transparent 1px);
-  }
-}
-.dark #__next {
-  .background-paper-pattern {
-    background-image: linear-gradient(#374151 1px, transparent 1px),
-      linear-gradient(90deg, #374151 1px, transparent 1px),
-      linear-gradient(#1f2937 1px, transparent 1px),
-      linear-gradient(90deg, #1f2937 1px, transparent 1px);
-  }
-}
-.dark\:nx-bg-dark,
-#__next {
-  @apply dark:!bg-slate-900;
-}
-.nx-shadow-\[0_-12px_16px_\#fff\],
-.dark\:nx-shadow-\[0_-12px_16px_\#111\] {
-  @apply !shadow-none;
-}
-.dark\:nx-bg-neutral-900,
-:is(html[class~='dark']) .nextra-nav-container-blur {
-  @apply dark:!bg-slate-950;
-}
-.dark\:nx-bg-neutral-800 {
-  @apply dark:!bg-slate-900;
-}
-.nx-text-gray-500 {
-  @apply !text-slate-500 dark:!text-slate-400;
-}
-.nx-text-gray-700 {
-  @apply !text-slate-700 dark:!text-slate-300;
-}
-.nextra-sidebar-container {
-  @apply dark:!bg-slate-900;
-}
-.serif {
-  font-family: var(--next-font-google-crimson-pro), serif;
-}
-.nx-duration-500 {
-  transition-duration: 100ms;
-}
-.nx-transition-colors {
-  transition-duration: 50ms;
-}
-footer div.nx-py-12 {
-  @apply py-3 font-semibold text-sm text-gray-800 items-center justify-center;
-}
-*/
