Snapshottest needs more tests #140

Open
medmunds opened this issue Oct 2, 2020 · 4 comments

@medmunds
Collaborator

medmunds commented Oct 2, 2020

It seems like there are a lot of areas where snapshottest's own test coverage could (and should) be improved. Examples:

  • Although we test the examples against their old snapshots, it would be great to also rerun each example with --snapshot-update—and then verify that the updated snapshots didn't change or do anything unexpected. (This would also ensure the example snapshots always match the latest code, since nothing in the current workflow ever updates the example snapshots.)
  • Aside from the examples, there's currently no testing of the unittest, nose, and Django wrappers. (And the examples are pretty limited in their usage of each framework. E.g., only examples/pytest creates more than one snapshot per file, and only examples/django generates more than one snapshot file.)
  • Tests in the tests dir largely cover only the things that have gone wrong in the past, so there's almost certainly some basic behavior going untested. (I'm the author of a bunch of those tests, so it's fair to blame me for their shortcomings.) E.g., module.py implements a big chunk of snapshottest, but test_module.py only covers the three specific bugs I ran into back in 2018. test_formatters.py has a little better coverage now, but is still missing things like GenericRepr. FileSnapshot is only tested through examples. Etc.
  • I'm sure there are others.

I don't have time to look at all of these, but wanted to at least start the discussion.

@paulmelnikow
Collaborator

All these sound like great things to test!

The only other thing I'd like to see is 100% code coverage. It's not a substitute for covering all the use cases (nor is 100% even sufficient: you can easily have 100% coverage without having all the use cases covered). Still, working toward 100% is a useful way to find untested code, and once we get there it's great for making sure we don't inadvertently add new untested code.

@paulmelnikow
Collaborator

I wonder if there’s a way we can break out individual tasks that contributors can pick up. There’s clearly a lot of work to do on this repo and the more hands we can bring to it, the better.

@medmunds
Collaborator Author

medmunds commented Oct 3, 2020

> I’d like to see is 100% code coverage

Agreed. Note that the current 28% understates actual coverage, because only the py.test run currently collects coverage data. (And as limited as the example testing on the other frameworks may be, it's more than zero.)
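One way to fold the other framework runs into that number would be to run each example suite under coverage.py in parallel mode and combine the data files afterward. A hedged tox-style sketch; the exact commands for the unittest, nose, and Django examples are assumptions about how those suites are invoked:

```ini
[testenv]
commands =
    coverage run --parallel-mode -m pytest tests
    coverage run --parallel-mode -m pytest examples/pytest
    ; ...equivalent runs for the unittest, nose, and Django examples...
    coverage combine
    coverage report
```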

> break out individual tasks that contributors can pick up

Yeah, definitely. I didn't want to spam the repository with a bunch of similar issues like "add tests for unittest," "add tests for nose," "add tests for reporting summaries," etc. Also wasn't really sure what to prioritize, thus this meta issue.

I don't know if you've had any experience with Hacktoberfest. (I haven't.) In years past, I've noticed some other projects where it seemed to attract new contributors. (But I guess there are quality issues this year?)

One thing I will try to do is add some documentation to the existing tests, to hopefully make it easier for others to expand them. pytest can be a little... intimidating if you don't work in it regularly. (OK, what I mean is: I looked at test_snapshot_test yesterday, had trouble figuring out what it was trying to do, and then was really surprised to discover who wrote that code.)

@paulmelnikow
Collaborator

> Yeah, definitely. I didn't want to spam the repository with a bunch of similar issues like "add tests for unittest," "add tests for nose," "add tests for reporting summaries," etc. Also wasn't really sure what to prioritize, thus this meta issue.

Yea, agreed, I think a meta issue is a good place to start. Your first bullet is a really well defined task already. Maybe we could try to turn the second and third into something similarly well scoped.

I'm not super convinced that Hacktoberfest is a good way to recruit quality contributions. We could try asking on Twitter, though I think issues here are probably still the best way to reach people.
