-
I've built a version of the lovely plugin which will take Hera Python and render it. It's barely tested, as I'm not a Hera user. To give it a go in ArgoCD, install the plugin and put some Hera Python in a git directory. It should take the directory, render all the objects, and put them into your cluster for you. I'm happy to help with wherever you get to doing this and make this a bit more formal. PRs also welcome!
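To make the idea concrete, here's a minimal sketch of what "take a directory of Hera Python and render all the objects" can look like - this is illustrative only, not the plugin's actual code. It leans on the convention of each file printing its manifest when run directly:

```python
import subprocess
import sys
from pathlib import Path

def render_directory(src: Path) -> str:
    """Run every Python file in `src` and concatenate whatever YAML
    each one prints. Assumes each file prints its own manifest when
    invoked directly."""
    docs = []
    for script in sorted(src.glob("*.py")):
        result = subprocess.run(
            [sys.executable, str(script)],
            capture_output=True, text=True, check=True,
        )
        docs.append(result.stdout.strip())
    # A config-management plugin hands this multi-document YAML to ArgoCD.
    return "\n---\n".join(docs)

if __name__ == "__main__":
    print(render_directory(Path(sys.argv[1])))
```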
-
Hi @menzenski, I just started developing a blueprint Hera repository which can be used by our teams. The basic idea is very simple: the project's workflows are rendered as YAML (WorkflowTemplates, actually) by a GitHub pipeline. Would something like this work for you as well?
-
We adopted Argo Workflows for our ELT processes at the beginning of the year.
We're defining those workflows in Python using Hera, and these Python files live in our ELT repository alongside the definitions of the jobs that they invoke. All of our Kubernetes resources are managed with ArgoCD - I've set up a process to get the Hera-defined workflows into our Kubernetes clusters via ArgoCD, but it's not a great process, as it involves manual steps I'd like to eliminate.
I'm interested in connecting with anyone else who's using both Hera to define workflows and a GitOps tool like ArgoCD or Flux to deploy those workflows. I'd like to know what other people are doing in this area - I want to build a more automated solution for our needs and am trying to do some research before I begin that work.
Here's an example of what we're doing:
We define each Argo Workflows resource (CronWorkflow or WorkflowTemplate) in its own Python file. Each file has an `if __name__ == "__main__":` block that prints the YAML manifest for that resource (using the Hera `to_yaml` method) when that Python script is invoked directly. These Python scripts are located in an `argo/src/` directory in our git repository. We have a helper script defined (we are using PDM to manage Python dependencies, and these scripts are defined as PDM shell scripts) to generate the YAML manifests in a gitignored `argo/out/` directory; building the workflows is just `pdm run build-workflows`.
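For anyone unfamiliar with the pattern, one of those files looks roughly like this - the template itself is made up for illustration (our real resources differ), but the shape follows Hera v5's context-manager style:

```python
from hera.workflows import Container, WorkflowTemplate

# A made-up WorkflowTemplate purely for illustration.
with WorkflowTemplate(
    name="hello-hera",
    entrypoint="hello",
    namespace="argo",
) as wt:
    Container(
        name="hello",
        image="alpine:3.19",
        command=["echo"],
        args=["hello from hera"],
    )

if __name__ == "__main__":
    # When invoked directly (e.g. by the build script), print the manifest.
    print(wt.to_yaml())
```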
The repository where these Hera scripts live defines a GitHub Actions job that runs on pull requests - it generates the YAML from all the scripts and validates the resulting Kubernetes manifests against the Argo Workflows schemas using kubeconform. Aside from the obvious benefit of validating against the schemas, this check errors out if one of the Python scripts does not define that `if __name__ == "__main__":` block, so it helps enforce that convention as well.
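In spirit, that check amounts to something like the following - a sketch only, assuming the `argo/src` / `argo/out` layout from above; the datree CRDs-catalog schema location is a common choice for validating CRDs with kubeconform, not necessarily what we use:

```python
import subprocess
import sys
from pathlib import Path

SRC = Path("argo/src")
OUT = Path("argo/out")

def main() -> None:
    OUT.mkdir(parents=True, exist_ok=True)
    for script in sorted(SRC.glob("*.py")):
        result = subprocess.run(
            [sys.executable, str(script)],
            capture_output=True, text=True, check=True,
        )
        if not result.stdout.strip():
            # A script that prints nothing almost certainly lacks the
            # `if __name__ == "__main__":` block - fail loudly.
            sys.exit(f"{script} produced no YAML output")
        (OUT / script.with_suffix(".yaml").name).write_text(result.stdout)
    # Validate the generated manifests against the CRD schemas.
    subprocess.run(
        [
            "kubeconform", "-strict", "-summary",
            "-schema-location", "default",
            "-schema-location",
            "https://raw.githubusercontent.com/datreeio/CRDs-catalog/"
            "main/{{.Group}}/{{.ResourceKind}}_{{.ResourceAPIVersion}}.json",
            str(OUT),
        ],
        check=True,
    )

if __name__ == "__main__":
    main()
```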
This is where the process becomes manual 😬

After a PR is merged, I pull the latest `main` branch of this repository and run the `pdm run build-workflows` command to generate all the YAML manifests. Then I copy them all into a separate repository where our ArgoCD-managed Argo Workflows manifests are located. We are running Argo Workflows in six environments (six Kubernetes clusters), so I copy the output files to each environment separately (a sketch of the general shape follows below). We also have some per-environment complications (for example, some CronWorkflows shouldn't be defined in all environments), so after I run those copy commands I do that cleanup manually.
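For concreteness, the copy step looks roughly like this - the destination layout, environment names, and the exclusion list are all illustrative, not our actual repo structure:

```python
import shutil
from pathlib import Path

OUT = Path("argo/out")
# Hypothetical layout of the ArgoCD-managed repo; ours differs.
DEPLOY_REPO = Path("../argocd-manifests/argo-workflows")
ENVIRONMENTS = ["dev", "test", "stage", "prod-us", "prod-eu", "prod-ap"]
# Per-environment complications: manifests that should NOT exist there.
EXCLUDE = {
    "dev": {"nightly-full-refresh.yaml"},
}

for env in ENVIRONMENTS:
    dest = DEPLOY_REPO / env
    dest.mkdir(parents=True, exist_ok=True)
    for manifest in sorted(OUT.glob("*.yaml")):
        if manifest.name in EXCLUDE.get(env, set()):
            continue
        shutil.copy2(manifest, dest / manifest.name)
```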
Then I open a pull request in that ArgoCD-managed repository, and when that pull request is merged, ArgoCD gets my changes out into the appropriate clusters.
I think there are several opportunities to improve our current process - chiefly, automating everything between merging a PR in this repository and landing the generated manifests on `main` in that other repository (a rough sketch of that automation follows).
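Something like the following, run from a CI job on merges to `main`, is the direction I'm imagining - the repo URL, branch name, the `copy_manifests.py` helper, and the use of the `gh` CLI are all assumptions, not anything we have today:

```python
import subprocess
from datetime import date

DEPLOY_REPO = "https://github.com/example-org/argocd-manifests"  # hypothetical
BRANCH = f"update-hera-workflows-{date.today().isoformat()}"

def sh(*args: str, cwd: str | None = None) -> None:
    subprocess.run(args, check=True, cwd=cwd)

# 1. Render all manifests from argo/src into argo/out.
sh("pdm", "run", "build-workflows")
# 2. Clone the ArgoCD-managed repo and copy the output in
#    (the copy script from the previous sketch would run here).
sh("git", "clone", DEPLOY_REPO, "deploy")
sh("git", "checkout", "-b", BRANCH, cwd="deploy")
sh("python", "copy_manifests.py")  # hypothetical helper, as sketched above
# 3. Commit and open a PR with the GitHub CLI.
sh("git", "add", ".", cwd="deploy")
sh("git", "commit", "-m", "Update Hera-generated workflow manifests", cwd="deploy")
sh("git", "push", "-u", "origin", BRANCH, cwd="deploy")
sh("gh", "pr", "create", "--fill", cwd="deploy")
```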
Is anyone else using Hera with ArgoCD? What does your process look like?