
[ACTION] Create Falco benchmark workflow #121

Open · 6 tasks · Tracked by #117
rossf7 opened this issue Aug 28, 2024 · 6 comments

Comments

@rossf7
Contributor

rossf7 commented Aug 28, 2024

Part of Proposal-002 - Run the CNCF project benchmark tests as part of the automated pipeline

Task Description

This issue tracks the implementation part of proposal 2: create a GitHub Actions workflow in this repo that runs the 3 benchmark tests created by the Falco team.

The 3 manifest files can be found here: https://github.com/falcosecurity/cncf-green-review-testing/tree/main/benchmark-tests

See the Design Details section of proposal 2 for more detail on the benchmark workflow and jobs.

#118 is related: it implements the pipeline changes that call this benchmark workflow.

Tasks

  • Add Falco benchmark workflow
  • Set the [benchmark_path](https://github.com/cncf-tags/green-reviews-tooling/blob/main/projects/projects.json#L6)
  • Add test steps to run 3 jobs that apply each manifest and wait 15 minutes (see the workflow sketch after this list)
  • Ensure the workflow also deletes the Kubernetes resources at the end of the pipeline run
  • Update Proposal 2 with any changes
  • Add documentation
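As a rough illustration of the tasks above, a minimal sketch of such a workflow might look like this (the manifest names and raw URLs are assumptions based on the Falco benchmark-tests directory, and the kubeconfig/cluster access setup steps are omitted):

```yaml
name: falco-benchmark

on:
  workflow_call:

jobs:
  benchmark:
    runs-on: ubuntu-latest
    strategy:
      max-parallel: 1  # run the 3 tests one after another, not in parallel
      matrix:
        # Assumed manifest names from the Falco benchmark-tests directory
        manifest: [stress-ng.yaml, redis.yaml, falco-event-generator.yaml]
    steps:
      - name: Apply benchmark manifest
        run: kubectl apply -f https://raw.githubusercontent.com/falcosecurity/cncf-green-review-testing/main/benchmark-tests/${{ matrix.manifest }}
      - name: Wait 15 minutes
        run: sleep 900
      - name: Delete Kubernetes resources
        if: always()  # clean up even if an earlier step failed
        run: kubectl delete -f https://raw.githubusercontent.com/falcosecurity/cncf-green-review-testing/main/benchmark-tests/${{ matrix.manifest }}
```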
@rossf7 rossf7 changed the title Create Falco benchmark workflow [ACTION] Create Falco benchmark workflow Aug 28, 2024
@nikimanoledaki nikimanoledaki self-assigned this Sep 25, 2024
@locomundo
Contributor

Let's work together on this @nikimanoledaki

@nikimanoledaki
Contributor

Completing this should resolve this error: #122 (comment)

@nikimanoledaki
Contributor

nikimanoledaki commented Oct 8, 2024

TODO:

@nikimanoledaki
Contributor

nikimanoledaki commented Oct 11, 2024

Hi folks, @locomundo, @rossf7 👋

We've come across a missing feature on GitHub Actions 😬

Problem

A reusable workflow that lives in a separate repository can only be referenced via the top-level `uses`, defined right after the job name:

```yaml
benchmark-job:
  uses: nikimanoledaki/cncf-green-review-testing/.github/workflows/workflow.yml@nm/add-bench-workflow
```

However, this top-level `uses` does not evaluate expressions, so it is not possible to set the benchmark path as `${{ inputs.benchmark_path }}`. Error: `the 'uses' attribute must be a path, a Docker image, or owner/repo@ref` - failed workflow
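For reference, this is the shape that gets rejected at parse time:

```yaml
benchmark-job:
  # Rejected: the top-level `uses` must be a literal path or
  # owner/repo/.github/workflows/file.yml@ref, not an expression.
  uses: ${{ inputs.benchmark_path }}
```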

It is possible to evaluate expressions with a nested `uses`, like below:

```yaml
benchmark-job:
  uses: jenseng/dynamic-uses@v1
  with:
    uses: ${{ inputs.benchmark_path }}
```

However, a nested `uses` unfortunately cannot reference a workflow that is in a different repository. It fails with the error: `invalid value workflow reference: references to workflows must be rooted in '.github/workflows'` - failed workflow

I tried to set only part of the reference as an expression, but this is not possible either; see the error:

```
error parsing called workflow
".github/workflows/benchmark-pipeline.yaml"
-> "nikimanoledaki/cncf-green-review-testing/.github/workflows/workflow.yml@${{inputs.benchmark_path}}"
: failed to fetch workflow: reference to workflow should be either a valid branch, tag, or commit
```

I found multiple issues referencing this problem, so it seems to be a missing feature, e.g. https://github.com/orgs/community/discussions/9050


I'm going to look into some elegant solutions to work around this and report back.

@locomundo
Contributor

What if we just keep the benchmark workflows in the green reviews repository? The CNCF project repo would hold the k8s manifests that need to be applied for a benchmark test (in the case of Falco: stress-ng.yaml, redis.yaml, and falco-event-generator.yaml), and the GR repo workflow would `kubectl apply -f` them and delete them.
It would definitely simplify things and solve this, right?

Maybe the downside would be that the CNCF project maintainers would have less flexibility and would have to ask us to include new tests. But we could decide that each file in the directory they specify is a test, and then make the GitHub Action iterate over the files and run one test per file, as sketched below. Not sure how easy that is to do in GitHub Actions?

Does this make any sense?
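For illustration, a single run step that treats each file in an agreed directory as one test could look like this (the `benchmark-tests` directory name is an assumption, and the 15-minute wait follows the task list above):

```yaml
      - name: Run one test per manifest file
        run: |
          # Treat every YAML file in the agreed directory as a test.
          for manifest in benchmark-tests/*.yaml; do
            kubectl apply -f "$manifest"
            sleep 900  # wait 15 minutes per test
            kubectl delete -f "$manifest"
          done
```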

@nikimanoledaki
Contributor

nikimanoledaki commented Oct 21, 2024

This makes sense! I think that we can keep the benchmark workflows in the Green Reviews repository and point to a script in the run step. This script can live in the Falco repository and apply, wait, and delete each of the tests. That would give Falco maintainers the ability to do additional checks, for example, to validate that they have reached the optimal kernel rate, etc. What do you think?
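Concretely, the benchmark job in this repo could check out the Falco repo and call that script, along these lines (a sketch; `run-tests.sh` is a hypothetical name for the script the Falco team would own):

```yaml
    steps:
      - name: Check out the Falco benchmark tests
        uses: actions/checkout@v4
        with:
          repository: falcosecurity/cncf-green-review-testing
          path: falco-tests

      - name: Apply, wait, validate, and delete each test
        # run-tests.sh is hypothetical; its contents stay under the
        # Falco maintainers' control.
        run: ./falco-tests/benchmark-tests/run-tests.sh
```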

POC here: #124 -- @locomundo could you review and merge it if you agree?
