Getting Started with Argo Events
Earlier, we introduced how to install Argo Workflows and trigger tasks with it. This article introduces a companion tool: Argo Events.
What is Argo Events?
Argo Events is an event-driven workflow automation framework for Kubernetes. It supports more than 20 kinds of event sources, such as webhooks, S3 bucket notifications, cron schedules, and message queues including Kafka, GCP Pub/Sub, SNS, and SQS.
Features:
- Supports 20+ event sources and more than 10 kinds of triggers.
- Allows customizing business-level constraint logic for workflow automation.
- Manages everything from simple, linear, real-time pipelines to complex, multi-source event dependencies.
- Compliant with the CloudEvents specification.
Components:
- EventSource (similar to a gateway: listens for external events and publishes them to the EventBus)
- EventBus (the event transport layer, backed by the high-performance distributed messaging system NATS; note that, according to the NATS website, the NATS Streaming flavor will no longer be maintained after 2023, so this part of the architecture is likely to change)
- EventSensor (subscribes to the EventBus, filters events, and parameterizes triggers from them)
Argo Events deployment and installation
argo-events deployment:
argo-eventbus deployment:
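A minimal sketch of the two deployments; the manifest URLs assume the stable release files published in the argoproj/argo-events repository:

```bash
# Create the namespace and install the Argo Events controllers
kubectl create namespace argo-events
kubectl apply -f https://raw.githubusercontent.com/argoproj/argo-events/stable/manifests/install.yaml

# Deploy a native NATS eventbus into the argo-events namespace
kubectl apply -n argo-events -f https://raw.githubusercontent.com/argoproj/argo-events/stable/examples/eventbus/native.yaml
```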
RBAC account authorization
Create an operate-workflow-sa account
Grant operate-workflow-sa permission to create Argo Workflow resources in the argo-events namespace; the EventSensor will later use this account to create workflows automatically.
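A sketch of the service account and its RBAC, modeled on the Argo Events quick-start examples; the Role and RoleBinding names are illustrative:

```yaml
apiVersion: v1
kind: ServiceAccount
metadata:
  name: operate-workflow-sa
  namespace: argo-events
---
apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
  name: operate-workflow-role   # illustrative name
  namespace: argo-events
rules:
  - apiGroups: ["argoproj.io"]
    resources: ["workflows", "workflowtemplates", "cronworkflows"]
    verbs: ["*"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: operate-workflow-role-binding   # illustrative name
  namespace: argo-events
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: Role
  name: operate-workflow-role
subjects:
  - kind: ServiceAccount
    name: operate-workflow-sa
    namespace: argo-events
```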
Create a workflow-pods-sa account
Grant workflow-pods-sa permission to manage the pods that workflows spawn in the argo-events namespace.
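A sketch of the second account, again with illustrative Role and RoleBinding names; it only needs access to pods and pod logs:

```yaml
apiVersion: v1
kind: ServiceAccount
metadata:
  name: workflow-pods-sa
  namespace: argo-events
---
apiVersion: rbac.authorization.k8s.io/v1
kind: Role
metadata:
  name: workflow-pods-role   # illustrative name
  namespace: argo-events
rules:
  - apiGroups: [""]
    resources: ["pods", "pods/log"]
    verbs: ["*"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: RoleBinding
metadata:
  name: workflow-pods-role-binding   # illustrative name
  namespace: argo-events
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: Role
  name: workflow-pods-role
subjects:
  - kind: ServiceAccount
    name: workflow-pods-sa
    namespace: argo-events
```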
Automatically triggering tasks with Argo Events
Start an EventSource to accept requests:
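A sketch of a webhook EventSource with a single entry named example, modeled on the upstream webhook example; port 12000 and the /example endpoint are the conventional values from that example:

```yaml
apiVersion: argoproj.io/v1alpha1
kind: EventSource
metadata:
  name: webhook
  namespace: argo-events
spec:
  service:
    ports:
      - port: 12000
        targetPort: 12000
  webhook:
    example:          # this key is the event name a Sensor must reference
      port: "12000"
      endpoint: /example
      method: POST
```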
Note: the webhook entry in this EventSource is named example. In a real production environment, use your own name here; the Sensor you create must reference it as its event name.
Create a webhook Sensor to consume the requests:
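A sketch of the Sensor, following the referenced webhook.yaml example: it depends on the example event of the webhook EventSource and, on each event, creates a Workflow that runs under workflow-pods-sa and receives the request body as a parameter:

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Sensor
metadata:
  name: webhook
  namespace: argo-events
spec:
  template:
    serviceAccountName: operate-workflow-sa
  dependencies:
    - name: test-dep
      eventSourceName: webhook
      eventName: example
  triggers:
    - template:
        name: webhook-workflow-trigger
        k8s:
          operation: create
          source:
            resource:
              apiVersion: argoproj.io/v1alpha1
              kind: Workflow
              metadata:
                generateName: webhook-
              spec:
                serviceAccountName: workflow-pods-sa
                entrypoint: whalesay
                arguments:
                  parameters:
                    - name: message
                      value: hello world
                templates:
                  - name: whalesay
                    inputs:
                      parameters:
                        - name: message
                    container:
                      image: docker/whalesay:latest
                      command: [cowsay]
                      args: ["{{inputs.parameters.message}}"]
          parameters:
            # replace the default message with the webhook request body
            - src:
                dependencyName: test-dep
                dataKey: body.message
              dest: spec.arguments.parameters.0.value
```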
Reference: https://raw.githubusercontent.com/argoproj/argo-events/stable/examples/sensors/webhook.yaml
Forward local requests to the remote cluster:
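One way to do this is kubectl port-forward against the pod the EventSource controller creates; selecting it by the eventsource-name label follows the upstream quick start:

```bash
# Forward local port 12000 to the webhook EventSource pod
kubectl -n argo-events port-forward \
  "$(kubectl -n argo-events get pod -l eventsource-name=webhook -o name)" \
  12000:12000
```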
Send test data to the EventSource:
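With the port-forward in place, a POST to the /example endpoint fires the event; the JSON body below is arbitrary test data:

```bash
# Send a test event to the webhook EventSource
curl -d '{"message":"this is my first webhook"}' \
  -H "Content-Type: application/json" \
  -X POST http://localhost:12000/example

# Verify that the Sensor created a workflow
kubectl -n argo-events get workflows
```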
At this point, Argo Events can be used to create workflow tasks automatically.