We use Apache Airflow to demonstrate container orchestration for a bioinformatics pipeline built around the KubernetesPodOperator; a minimal DAG sketch appears after the list below.
NOTE: This code is unfinished
- Data is pulled from the rarecompute/airflow-data repository and placed into the workspace persistent volume
- Data is processed according to parameters defined as Airflow Variables
- Artifacts are generated and stored on the workspace persistent volume, from which they can be downloaded via SFTP
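Below is a minimal sketch of what such a DAG could look like, assuming the `cncf-kubernetes` provider, a persistent volume claim named `workspace-pvc` mounted at `/workspace`, a processing image and `run_pipeline.sh` entrypoint, and an Airflow Variable named `reference_genome`. All of these names are illustrative assumptions, not the actual project configuration.

```python
# Illustrative sketch only: image names, the PVC name, the Variable key, and
# the processing script are assumptions for demonstration purposes.
from datetime import datetime

from airflow import DAG
from airflow.providers.cncf.kubernetes.operators.pod import KubernetesPodOperator
from kubernetes.client import models as k8s

# Shared persistent volume that holds the pulled data and generated artifacts.
workspace_volume = k8s.V1Volume(
    name="workspace",
    persistent_volume_claim=k8s.V1PersistentVolumeClaimVolumeSource(
        claim_name="workspace-pvc"  # assumed PVC name
    ),
)
workspace_mount = k8s.V1VolumeMount(name="workspace", mount_path="/workspace")

with DAG(
    dag_id="bioinformatics_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule=None,
    catchup=False,
) as dag:
    # Step 1: clone the data repository into the persistent volume.
    pull_data = KubernetesPodOperator(
        task_id="pull_data",
        name="pull-data",
        image="alpine/git:latest",
        cmds=["git"],
        arguments=[
            "clone",
            "https://github.com/rarecompute/airflow-data.git",
            "/workspace/data",
        ],
        volumes=[workspace_volume],
        volume_mounts=[workspace_mount],
    )

    # Step 2: process the data in its own container. The reference_genome
    # value is pulled from an Airflow Variable via Jinja templating.
    process_data = KubernetesPodOperator(
        task_id="process_data",
        name="process-data",
        image="example/bioinformatics-tools:latest",  # assumed processing image
        cmds=["bash", "-c"],
        arguments=[
            "run_pipeline.sh --input /workspace/data "
            "--reference {{ var.value.reference_genome }} "
            "--output /workspace/artifacts"
        ],
        volumes=[workspace_volume],
        volume_mounts=[workspace_mount],
    )

    # Artifacts written to /workspace/artifacts stay on the persistent volume,
    # where they can later be retrieved over SFTP.
    pull_data >> process_data
```

Running each step as a separate pod keeps tool dependencies isolated in their own images, while the shared persistent volume is what lets the pods pass data and artifacts between stages.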