Distributed task queue with Argo Workflows? #13450
Unanswered
stevefan1999-personal
asked this question in Q&A
I want to build a video transcoding pipeline using Workflows, but I'm not sure if this is the right tool for the job.

Basically, I let the user upload a video to a presigned S3 location, which then triggers the Argo Workflow to do all the transcoding with ffmpeg and re-upload the result to the same S3 bucket (albeit under a different prefix). The user can then check the transcoding progress, and gets notified by email when the video is ready to publish. This means I may need to write the transcoding status back to a Postgres database, so I'm not sure if I should use a sidecar container to monitor the status and so on. I also need to think about how to split the workflows, because there are things like score calculations that can be run partially.

I have decided to go fully Kubernetes, mostly serverless and containerized, and right now I'm sitting between Knative Eventing + TriggerMesh and just Argo Workflows. I find Argo Workflows interesting because it is way simpler to architect for, but I want to know if anyone has similar experience; feel free to share it as well.
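To make this concrete, here is a rough sketch of what I have in mind as a plain Argo Workflow. The bucket layout, image names, and the score-calculation step are placeholders, and it assumes a default S3 artifact repository is already configured for the cluster:

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: transcode-
spec:
  entrypoint: main
  arguments:
    parameters:
      - name: video-key            # S3 key of the uploaded source video (placeholder)
  templates:
    - name: main
      dag:
        tasks:
          - name: transcode
            template: ffmpeg-transcode
          # the score calculation only needs the transcoded output, so it
          # could run as a separate task here or be split into its own workflow
          - name: score
            template: score-calc
            dependencies: [transcode]
    - name: ffmpeg-transcode
      inputs:
        artifacts:
          - name: source
            path: /work/source.mp4
            s3:
              key: "uploads/{{workflow.parameters.video-key}}"
      container:
        image: my-registry/ffmpeg:latest   # hypothetical image
        command: [sh, -c]
        args: ["ffmpeg -i /work/source.mp4 -c:v libx264 -preset veryfast /work/out.mp4"]
      outputs:
        artifacts:
          - name: transcoded
            path: /work/out.mp4
            s3:
              key: "transcoded/{{workflow.parameters.video-key}}"
    - name: score-calc
      container:
        image: my-registry/score:latest    # hypothetical scoring step
        command: [sh, -c]
        args: ["echo 'compute quality score on the transcoded output here'"]
```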
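As for the Postgres write-back, instead of a sidecar I'm wondering whether an exit handler is enough: it runs once when the workflow finishes and can see `{{workflow.status}}`. A sketch, assuming a hypothetical `videos` table and a `db-creds` secret holding the connection URL:

```yaml
apiVersion: argoproj.io/v1alpha1
kind: Workflow
metadata:
  generateName: transcode-
spec:
  entrypoint: main
  onExit: record-status            # runs once the workflow finishes, success or failure
  templates:
    # ...the templates from the sketch above, plus:
    - name: record-status
      container:
        image: postgres:16         # only used for the psql client
        env:
          - name: DATABASE_URL
            valueFrom:
              secretKeyRef:
                name: db-creds     # hypothetical secret with a Postgres connection URL
                key: url
        command: [sh, -c]
        args:
          - >-
            psql "$DATABASE_URL" -c
            "UPDATE videos SET status = '{{workflow.status}}'
            WHERE s3_key = '{{workflow.parameters.video-key}}';"
```

That would avoid a long-lived sidecar entirely, since the workflow records its own terminal state on the way out.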
Replies: 1 comment

We use Argo Workflows in this scenario to trigger workflows through cloud products (EventBridge, FC) or Argo Events: upload files to OSS and the workflow is triggered automatically. The whole setup is very simple and smooth.
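For the Argo Events route mentioned above, a minimal sketch of the trigger wiring, using the S3-compatible `minio` event source; the bucket, secret, and WorkflowTemplate names are hypothetical:

```yaml
apiVersion: argoproj.io/v1alpha1
kind: EventSource
metadata:
  name: video-uploads
spec:
  minio:                           # works for any S3-compatible object store
    video-uploaded:
      bucket:
        name: my-bucket            # hypothetical bucket
      endpoint: s3.example.com
      events:
        - s3:ObjectCreated:Put
      filter:
        prefix: uploads/           # only fire on new source uploads
      accessKey:
        name: s3-creds
        key: accessKey
      secretKey:
        name: s3-creds
        key: secretKey
---
apiVersion: argoproj.io/v1alpha1
kind: Sensor
metadata:
  name: transcode-on-upload
spec:
  dependencies:
    - name: upload
      eventSourceName: video-uploads
      eventName: video-uploaded
  triggers:
    - template:
        name: submit-transcode
        argoWorkflow:
          operation: submit
          source:
            resource:
              apiVersion: argoproj.io/v1alpha1
              kind: Workflow
              metadata:
                generateName: transcode-
              spec:
                workflowTemplateRef:
                  name: transcode        # hypothetical WorkflowTemplate
                arguments:
                  parameters:
                    - name: video-key
                      value: overridden-below
          parameters:
            # copy the uploaded object's key from the bucket notification
            # into the workflow's video-key parameter
            - src:
                dependencyName: upload
                dataKey: notification.0.s3.object.key
              dest: spec.arguments.parameters.0.value
```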