How to use Dataflow with Argo Workflows.
Use an Argo Workflows resource template:
```yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: create-pipeline-
spec:
  entrypoint: main
  templates:
    - name: main
      resource:
        action: create
        successCondition: status.phase == Succeeded
        failureCondition: status.phase in (Failed, Error)
        manifest: |
          apiVersion: dataflow.argoproj.io/v1alpha1
          kind: Pipeline
          metadata:
            name: main
          spec:
            steps:
              - cat: {}
                name: main
```
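For instance, you can submit the workflow with `kubectl` and then watch the pipeline it creates (a sketch; the filename `create-pipeline-workflow.yaml` is a placeholder for wherever you saved the manifest above):

```shell
# Submit the workflow; the resource template creates the Pipeline, and the
# workflow succeeds or fails based on the Pipeline's status.phase conditions.
kubectl create -f create-pipeline-workflow.yaml

# Watch the pipeline created by the workflow (assumes the Dataflow CRDs are installed).
kubectl get pipeline main --watch
```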
Use an HTTP source:
```yaml
apiVersion: dataflow.argoproj.io/v1alpha1
kind: Pipeline
metadata:
  name: http
spec:
  steps:
    - cat: {}
      name: main
      sources:
        - name: http-0
          http:
            serviceName: http-svc
```
Use `curl` to send messages:
```yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: pump-pipeline-
spec:
  entrypoint: main
  templates:
    - name: main
      container:
        command: [ curl ]
        args: [ http://http-svc/sources/http-0, -d, my-msg ]
        image: ubuntu
```
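You can also send messages from your own machine instead of from a workflow pod, for example by port-forwarding the source's service (a sketch; the namespace `my-namespace` is a placeholder, and this assumes the service listens on port 80, consistent with the in-cluster URL above):

```shell
# Forward local port 8080 to the pipeline's HTTP source service.
kubectl -n my-namespace port-forward svc/http-svc 8080:80 &

# Post a message to the http-0 source through the forwarded port.
curl http://localhost:8080/sources/http-0 -d my-msg
```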
Use the Argo Server webhook endpoint. Follow this guide first to set up an access token, and then you can create a pipeline like this:
```yaml
apiVersion: dataflow.argoproj.io/v1alpha1
kind: Pipeline
metadata:
  name: http
spec:
  steps:
    - cat: {}
      name: main
      sinks:
        - http:
            url: https://localhost:2746/api/v1/events/my-namespace/-
            headers:
              - name: Authorization
                valueFrom:
                  # this secret should contain the access token with "Bearer " prefix
                  secretKeyRef:
                    name: my-secret-name
                    key: my-secret-key
```
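The secret referenced by `secretKeyRef` can be created with `kubectl`, for example (a sketch; the token value is a placeholder, and note that the stored value must include the `Bearer ` prefix):

```shell
# Create the secret the sink's Authorization header reads from.
# Replace <access-token> with the token obtained from the guide above.
kubectl create secret generic my-secret-name \
  --from-literal=my-secret-key='Bearer <access-token>'
```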