Kestra developers can use Terraform to develop and deploy Kestra flows. Doing so comes with great advantages:
- Scale the codebase by applying DRY best practices
- Manage multiple environments thanks to Terraform
- Put reproducibility at the core of your Kestra development
The file structure is as follows:
```
.
└── environment/
    ├── development
    ├── production/    # Contains subfolders defining Kestra flow resources
    │   ├── automation/
    │   ├── cicd/
    │   ├── dbt/
    │   ├── main.tf    # Instantiates each folder (automation, dbt, ...)
    │   └── ...
    ├── modules/       # Terraform modules to be used across environments
    │   ├── dbt_run/
    │   ├── postgres_query/
    │   └── ...
    └── subflows/      # Kestra subflows
        ├── main.tf
        ├── sub_query_my_postgres_database.yml
        └── ...
```
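For illustration, here is a minimal sketch of what `environment/production/main.tf` could look like when instantiating each use-case folder as a local module. The `namespace` input is an assumption for the sketch, not this repository's actual interface:

```hcl
# environment/production/main.tf -- sketch only.
# Each use-case folder from the tree above is instantiated as a local module.

module "automation" {
  source    = "./automation"
  namespace = "production" # hypothetical input shared by every use case
}

module "cicd" {
  source    = "./cicd"
  namespace = "production"
}

module "dbt" {
  source    = "./dbt"
  namespace = "production"
}
```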
- Each environment (i.e. the `development` folder) is linked to a Kestra namespace.
- We wrap flow definitions in folders to separate use cases.
- Each Terraform module created in an environment sub-folder is called in `main.tf` to be instantiated, as sketched above.
- Triggers are meant to be reusable to keep our code DRY. They should not be declared for a specific flow (except for flow triggers); see the sketch after this list.
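One hedged sketch of a reusable trigger, assuming the Kestra Terraform provider's `kestra_flow` resource and a schedule trigger `type` matching your Kestra version: keep the trigger YAML in a single `locals` block and append it to any flow that needs it, instead of re-declaring it in every flow file. `my_flow` is a hypothetical flow, not one from this repository:

```hcl
locals {
  # Shared daily schedule, declared once and reused across flows.
  daily_schedule = <<-EOT
    triggers:
      - id: daily
        type: io.kestra.plugin.core.trigger.Schedule
        cron: "0 6 * * *"
  EOT
}

resource "kestra_flow" "my_flow" {
  namespace = "production"
  flow_id   = "my_flow" # hypothetical flow
  content   = "${file("${path.module}/my_flow.yml")}\n${local.daily_schedule}"
}
```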
- `development` & `production` are deployed on the same Kestra instance.
- `development` allows you to test your flows before they go to production. Make sure several people are not working on it at the same time.
- `production` requires you to create a PR to validate your changes before being able to apply them.
Note: using namespaces to separate environments within the same Kestra instance is not recommended for production. You can read more about best practices for Kestra environment and deployment management here.

TODO: improve the developer experience for concurrent access (not needed now), and secure `terraform apply` in production using CI/CD.
- **Before creating your PR.** To ensure your changes are correct and to check how they impact the current state, run `terraform init` and then `terraform plan` in the `environment` folder.
- **Create your PR.** Fill in the pull_request template.
- **When your PR has been reviewed and accepted:**
  - Merge your PR.
  - Go on the `main` branch.
  - Run `terraform apply` with the `terraform_apply` Kestra flow to apply your changes.
To provide a modular development experience, we leverage Terraform modules and the subflow pattern.
- `modules` are used as an abstraction layer for a whole use case (e.g. triggering an Airbyte sync and running dbt) without having to worry about Kestra syntax, authentication, or connection details. They let you use all native Terraform features for input validation and for passing outputs from other resources (e.g. dbt Cloud job or Airbyte Terraform resource outputs) to seamlessly create dependencies between components. See the module sketch after this list.
- `subflows` with Terraform are best suited for generic, standalone tasks. See the subflow sketch after this list.
  - They contain direct YAML, are declared within `subflows/main.tf`, and define their inputs and outputs clearly.
  - Subflows can depend on each other; this dependency should be materialized by the `depends_on` field in `subflows/main.tf`.
  - They can be used by `modules` to hide unnecessary complexity and to DRY redundant tasks (GCP secret retrieval, for example).
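To make the `modules` idea concrete, here is a hedged sketch of calling the `dbt_run` module from an environment sub-folder. The input names are illustrative, not this repository's actual interface; Terraform validates them against the module's declared variables:

```hcl
module "nightly_dbt_run" {
  # Relative path from environment/production/dbt/; adjust to the caller.
  source = "../../modules/dbt_run"

  namespace   = "production"
  dbt_command = "dbt build"

  # Outputs of other Terraform resources can be passed straight in, which
  # also creates an explicit dependency between components, e.g.:
  # dbt_cloud_job_id = dbtcloud_job.daily.id
}
```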
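And a hedged sketch of `subflows/main.tf`, assuming the Kestra Terraform provider's `kestra_flow` resource. `sub_publish_results` is a hypothetical second subflow, added only to show the `depends_on` pattern:

```hcl
resource "kestra_flow" "sub_query_my_postgres_database" {
  namespace = "company.subflows" # assumed subflow namespace
  flow_id   = "sub_query_my_postgres_database"
  content   = file("${path.module}/sub_query_my_postgres_database.yml")
}

resource "kestra_flow" "sub_publish_results" {
  namespace = "company.subflows"
  flow_id   = "sub_publish_results" # hypothetical subflow that calls the one above
  content   = file("${path.module}/sub_publish_results.yml")

  # Materialize the dependency between subflows, as described above.
  depends_on = [kestra_flow.sub_query_my_postgres_database]
}
```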