Argo Workflows
Argo Workflows is an open-source, container-native workflow engine for orchestrating parallel jobs on Kubernetes. A CNCF graduated project governed under the Linux Foundation, it lets you define workflows where each step is a container, model multi-step workflows as sequences of tasks or as directed acyclic graphs (DAGs), and run compute-intensive jobs for machine learning, data processing, and CI/CD pipelines natively on Kubernetes.
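As a minimal sketch of the "each step is a container" model, a hello-world Workflow manifest might look like the following (the `hello-world-` name and `busybox` image are illustrative):

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: hello-world-   # the controller appends a random suffix
spec:
  entrypoint: main
  templates:
    - name: main
      container:
        image: busybox         # any container image can serve as a step
        command: [echo]
        args: ["hello from Argo Workflows"]
```

A manifest like this is typically submitted with `argo submit` or `kubectl create -f`.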
APIs
Argo Workflows API
The Argo Workflows REST API provides programmatic access to workflow lifecycle management, workflow templates, cron scheduling, archived workflow history, events, and cluster workflow templates.
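As a sketch of how a client might target the REST API, the snippet below builds (but does not send) a workflow-submission request. The path follows the documented `POST /api/v1/workflows/{namespace}` pattern; the host, namespace, and manifest contents are illustrative:

```python
import json

def build_submit_request(host: str, namespace: str, manifest: dict) -> tuple[str, bytes]:
    # Workflow submission endpoint: POST /api/v1/workflows/{namespace}
    url = f"{host}/api/v1/workflows/{namespace}"
    # The API expects the manifest wrapped under a "workflow" key
    body = json.dumps({"workflow": manifest}).encode()
    return url, body

# Illustrative manifest: one container step echoing a message
manifest = {
    "apiVersion": "argoproj.io/v1alpha1",
    "kind": "Workflow",
    "metadata": {"generateName": "demo-"},
    "spec": {
        "entrypoint": "main",
        "templates": [{
            "name": "main",
            "container": {"image": "busybox", "command": ["echo"], "args": ["hi"]},
        }],
    },
}

url, body = build_submit_request("https://argo-server.example.com:2746", "argo", manifest)
```

An actual submission would POST this body with a bearer token; the sketch stops short of the network call.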
Capabilities
Argo Workflows Orchestration
Unified capability for container-native workflow orchestration on Kubernetes using Argo Workflows. Combines workflow lifecycle management, template reuse, cron scheduling, and workflow history archiving.
Features
Every workflow step runs as a Kubernetes container, providing complete isolation and reproducibility.
Define multi-step workflows as sequential steps or directed acyclic graphs (DAGs) with dependencies.
Run multiple workflow steps in parallel to maximize compute utilization and reduce execution time.
Store and reuse workflow definitions as templates across the cluster.
Schedule workflows to run on cron schedules directly on Kubernetes.
Pass artifacts between workflow steps via S3, GCS, Azure Blob, Artifactory, and more.
Persist workflow history to a database for long-term retention and querying.
Monitor and manage workflows through a rich graphical interface.
Namespace-based isolation with RBAC for multi-team environments.
Trigger workflows from Kubernetes events, webhooks, and custom event sources.
Define workflows in Python using Hera, the Python SDK for Argo Workflows.
Extend with custom executor plugins and artifact driver plugins.
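The DAG and parallel-execution features above can be sketched in a single manifest: a diamond-shaped DAG where tasks B and C run in parallel after A, and D waits on both (all task and template names are illustrative):

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: dag-demo-
spec:
  entrypoint: diamond
  templates:
    - name: diamond
      dag:
        tasks:
          - name: A
            template: step
          - name: B
            dependencies: [A]    # B and C both depend only on A,
            template: step       # so the controller runs them in parallel
          - name: C
            dependencies: [A]
            template: step
          - name: D
            dependencies: [B, C] # D starts only after both B and C finish
            template: step
    - name: step
      container:
        image: busybox
        command: [echo]
        args: ["step done"]
```

Listing only each task's dependencies, rather than an explicit order, is what lets the controller maximize parallelism automatically.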
Use Cases
Orchestrate data preparation, model training, evaluation, and deployment as containerized steps.
Run parallel data transformation and ETL jobs at scale on Kubernetes.
Run CI/CD pipelines natively on Kubernetes without external CI tools.
Process large datasets in parallel with automatic resource management.
Automate infrastructure provisioning, testing, and validation workflows.
Orchestrate complex scientific computation and simulation jobs with dependencies.
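Recurring automation like the infrastructure-validation use case above is typically expressed as a CronWorkflow resource; a hedged sketch, with the name and schedule purely illustrative:

```yaml
apiVersion: argoproj.io/v1alpha1
kind: CronWorkflow
metadata:
  name: nightly-check          # illustrative name
spec:
  schedule: "0 2 * * *"        # standard cron syntax: 02:00 every night
  concurrencyPolicy: Replace   # replace a still-running run instead of stacking them
  workflowSpec:
    entrypoint: main
    templates:
      - name: main
        container:
          image: busybox
          command: [echo]
          args: ["running scheduled validation"]
```

The embedded `workflowSpec` is an ordinary workflow spec, so anything a Workflow can do (DAGs, artifacts, parameters) can also run on a schedule.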
Integrations
Official Python SDK for defining and submitting workflows programmatically.
Use Argo CD to deploy and manage Argo Workflows resources via GitOps.
Expose workflow metrics for Prometheus monitoring and alerting.
Visualize workflow performance metrics in Grafana dashboards.
Inject secrets into workflow containers securely via Vault integration.
Use S3 as artifact storage for passing data between workflow steps.
Use Google Cloud Storage as artifact backend.
Use Azure Blob Storage for artifact persistence.
Run Kubeflow ML pipelines using Argo Workflows as the underlying engine.
Orchestrate Apache Spark jobs as Argo Workflow steps.
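Tying several of the integrations above together, the S3 artifact backend is configured in a ConfigMap that workflows resolve by name; a sketch, with the bucket, Secret, and key names all illustrative:

```yaml
apiVersion: v1
kind: ConfigMap
metadata:
  name: artifact-repositories
  namespace: argo
  annotations:
    # marks which entry is the default repository for the namespace
    workflows.argoproj.io/default-artifact-repository: default-v1
data:
  default-v1: |
    s3:
      bucket: my-artifacts          # illustrative bucket name
      endpoint: s3.amazonaws.com
      accessKeySecret:
        name: s3-credentials        # Kubernetes Secret holding the credentials
        key: accessKey
      secretKeySecret:
        name: s3-credentials
        key: secretKey
```

Swapping the `s3:` block for a `gcs:` or `azure:` block is how the same mechanism covers the other artifact backends listed above.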