Add new fixture cache_result param (default True) to ignore fixture cache and re-execute on each usage of the fixture #12814
Comments
What you are asking for was coined as "invocation scope" back in 2016, when we tried to work on enabling it at the sprint in Freiburg. Back then we failed due to tech debt. While some of it is resolved, it's not clear whether invocation scope can be done easily.
I have implemented the feature in the fork below, at the very least as a POC with tests. I am currently in a state where I have a very large infrastructure of many fixtures which are coupled and hard to maintain, and my team is really feeling the pain on this one, so I am determined to do what I can to advance this :)
Can you open a draft pull request for this? Seeing a diff and potentially some tests will make this much easier to discuss, and those discussions will be the starting point for more.
Sure! Opened: #12816
Hey, any news on this? 🙏
What's the problem this feature will solve?
Reduce code duplication. Allow leveraging the setup/teardown features of pytest fixtures while making fixtures a bit more re-usable, like functions.
Describe the solution you'd like
An ability to declare that a fixture should not use its stored cache, and should instead run and re-calculate on every reference.
i.e.
Say I'm testing my backend server, which manages a database with two object definitions: Projects and ProjectConfigurations.
When testing basic get functionality, I would declare a fixture which creates a project, and another which creates a project configuration.
This is already a lot of code, but fair: these are functionalities we must declare.
Now, when moving on to testing basic list functionality, I would want more than one object of each type in the db during the test. I have to duplicate the fixtures above, or create a brand-new fixture that creates X objects in my db as setup.
i.e., now my list test will use project_config2 and project2.
Here the code starts getting cluttered and confusing. As tests pile on and my fixtures grow, I have to be careful that two fixtures I am using are not coupled to each other, as this could lead to unexpected behaviors. This makes for a very large test code base that is hard to manage.
In my opinion, there could be a better way!
At the end of the day, I want to create objects in the DB. I care about them being there, and I care about them being removed when my test finishes so that other tests won't be affected. In my case, I do not care that all references resolve to the same object, as I usually have a linear usage for each fixture.
If I could declare a fixture as not using its cache, my issue would be solved: I could reference a fixture as many times as I want, without duplicating code, and still get new objects, while still enjoying the benefits of pytest's reliable setup/teardown flows. I would not need to change my existing fixture infrastructure when testing new components of the same objects in my BE.
It would look something like this:
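The proposed syntax below is hypothetical (cache_result is not a real pytest parameter today); the runnable part is a tiny pytest-free simulation of the two caching behaviors.

```python
# Proposed syntax (hypothetical, not accepted by current pytest):
#
#   @pytest.fixture(cache_result=False)
#   def project():
#       obj = create_project()
#       yield obj
#       delete_project(obj)
#
# Every reference to "project" would then re-run the fixture and produce a
# fresh object, while each teardown would still run at the end of the test.

# Tiny simulation of the two behaviors, independent of pytest:
results = []

def project_factory():
    obj = {"id": len(results) + 1}
    results.append(obj)
    return obj

# cache_result=True (current behavior): one cached object per test,
# every reference sees the same instance.
cached = project_factory()
first_ref, second_ref = cached, cached
assert first_ref is second_ref

# cache_result=False (proposed): every reference re-executes the fixture
# and gets a distinct instance.
fresh_a, fresh_b = project_factory(), project_factory()
assert fresh_a is not fresh_b
```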
There are of course many more capabilities that come with this feature; this is just one useful usage.