Generators that generate unique values #173
There's a big difference between the two. What you are looking for is a way for a generator to generate unique values across all of the values it generates. That's a completely different implementation, as we need to keep a set of "seen" values while generating. Since the generators inherently generate random data, this is not too easy to achieve: we can't deterministically generate a next unique value that hasn't been generated before. We need to generate the next value, check whether it's in the "seen" set, regenerate it if it's there, and so on.

I'd love for you to take a stab at this if you want, and I'm happy to provide assistance. When it comes to shrinkability, you're probably right: this would end up being an unshrinkable generator. Making sure that the whole lazy tree (the data structure we use for shrinking) is unique might be a challenge. 🙃
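The generate-check-regenerate loop described above can be sketched in plain Elixir; this is rejection sampling against a "seen" set, not StreamData's actual implementation, and `RejectUniq` and `max_tries` are hypothetical names chosen for illustration:

```elixir
defmodule RejectUniq do
  # Draw from gen_fun until we get a value not in `seen`, giving up
  # after max_tries attempts (the "too many filtered values" failure mode).
  def next(gen_fun, seen, max_tries \\ 100) do
    Enum.reduce_while(1..max_tries, :error, fn _try, _acc ->
      value = gen_fun.()

      if MapSet.member?(seen, value),
        do: {:cont, :error},
        else: {:halt, {:ok, value, MapSet.put(seen, value)}}
    end)
  end
end

# With a tiny domain (1..3), each unique value takes more tries than the
# last, and a fourth request can only fail:
gen = fn -> Enum.random(1..3) end
{:ok, a, seen} = RejectUniq.next(gen, MapSet.new())
{:ok, b, seen} = RejectUniq.next(gen, seen)
{:ok, c, seen} = RejectUniq.next(gen, seen)
:error = RejectUniq.next(gen, seen)
Enum.sort([a, b, c])
# => [1, 2, 3]
```

The shrinking concern falls out of this shape: once a value is accepted, shrunken variants of it may collide with the seen set, which is why the resulting generator is hard to keep shrinkable.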
Yeah, there are definitely some interesting issues here. So here is what I'm wondering: could we potentially build in some concept of a "unique generator"? Primarily because generating unique values for different types could involve different things, and they still might be shrinkable in some way. Ash could use this under the hood by just ensuring, in the above example, that the values it generates are unique.

Here is the kind of thing I have in mind:

```elixir
def unique_integer(left..right = _range) do
  # An implementation that does something like using
  # `System.unique_integer([:monotonic])`, to only produce increasing values
  new(..., unique: true)
end
```

We could store on the stream that it is capable of producing unique values each time, and then we could have unique streams only compose with other unique streams, i.e.

```elixir
unique_map(%{id: unique_integer()})
```

As I'm typing this, I'm honestly not sure if it's just a crazy-bad idea or what 😆. I'm not suggesting this is the solution I need/want, but in my (somewhat limited) experience with property testing, avoiding filters will drastically increase your quality of life, and using a filter to get unique results would, I imagine, be a pretty big performance hit and would make the rate of "too many filtered values" significantly higher.
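The monotonic idea above can be demonstrated with plain Elixir streams. `System.unique_integer/1` with the `:monotonic` modifier really does guarantee strictly increasing (hence unique) integers; `unique_integers` here is just an illustrative stream, not a StreamData generator:

```elixir
# An infinite stream whose elements are guaranteed distinct and increasing,
# because System.unique_integer/1 is backed by a global monotonic counter.
unique_integers =
  Stream.repeatedly(fn -> System.unique_integer([:positive, :monotonic]) end)

values = Enum.take(unique_integers, 5)
length(Enum.uniq(values)) == 5  # always true: every element is distinct
values == Enum.sort(values)    # also true: :monotonic means increasing
```

The trade-off this sketch makes visible: such a source is unique "for free" (no seen set, no filtering), but the values are not random in any useful distributional sense, and shrinking an always-increasing counter has no obvious meaning.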
At the moment, I see `uniq_list_of/2`, which takes a generator and produces a list of unique items from that generator. But is there anything like

```elixir
maps |> uniq_by(&Map.take(&1, [:a, :b])) |> ...
```

that ensures that no duplicate values are ever produced? It would also need to not reuse any values when shrinking, so it would perhaps be unshrinkable (I'm out of my depth there).

What I'm looking for: a generator that never produces the same value twice across everything it generates. The best I can find at the moment is `uniq_list_of/2`, but it does not compose the way that I'm looking to compose the generator, as it produces lists.
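The `uniq_by` shape asked about above can be sketched lazily over any enumerable in plain Elixir: keep a set of keys seen so far and drop elements whose key repeats. This `uniq_by` is a hypothetical helper written for illustration, not part of StreamData:

```elixir
# Lazily filter an enumerable so no two elements share the same key,
# threading a MapSet of seen keys through Stream.transform/3.
uniq_by = fn enum, key_fun ->
  Stream.transform(enum, MapSet.new(), fn value, seen ->
    key = key_fun.(value)

    if MapSet.member?(seen, key),
      do: {[], seen},
      else: {[value], MapSet.put(seen, key)}
  end)
end

maps = [%{a: 1, b: 1, c: :x}, %{a: 1, b: 1, c: :y}, %{a: 2, b: 1, c: :z}]

result =
  maps
  |> uniq_by.(&Map.take(&1, [:a, :b]))
  |> Enum.to_list()

# => [%{a: 1, b: 1, c: :x}, %{a: 2, b: 1, c: :z}]
```

Because it is a stream transformation rather than a list generator, it composes pointwise the way the issue asks for; what it does not address is shrinking, since a shrunken element's key may collide with the seen set.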