
Improve robustness of curve fitting #289

Merged · 3 commits · Jun 27, 2023
Conversation

raphlinus (Contributor)

The main improvement is to try to fit a line when the chord is very short. Very short segments can occur when cusp finding reports imprecise results, and the main cubic fit logic is not robust when the chord is short.

I believe the simple subdivision case is now fairly robust in that it won't keep recursing. This is despite there being no explicit bound on recursion depth; the theory is that each subdivision halves the parameter interval, so at some point the chord becomes shorter than the given tolerance. This holds even for an adversarial break_cusp, as break_cusp is bypassed in the short-chord case.

The logic for the optimized case is not as careful; in particular, break_cusp is not bypassed. I'm curious whether there are still failure cases.

This is likely not the end of the numerical robustness work. In particular, break_cusp can be better tuned to not over-report cusps, and the tangent detection can be improved for cusp and near-cusp source curves. But hopefully this commit gets us to where we at least return a valid fit.

Also adds a test lightly adapted from #269.

Progress toward #269 and #279
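The termination argument above (halving the parameter interval until the chord drops below tolerance, then accepting a line) can be sketched as follows. This is a minimal illustration, not kurbo's actual fitting code: `Pt`, `chord_len`, and `fit_rec` are hypothetical names, and the cubic-fit attempt is elided so only the short-chord fallback and the recursion structure are shown.

```rust
// Illustrative sketch only; not kurbo's API. Shows why recursion is
// bounded without an explicit depth cap: each level halves [t0, t1],
// so the chord eventually falls below `accuracy` and a line is emitted.
#[derive(Clone, Copy)]
struct Pt { x: f64, y: f64 }

fn chord_len(p0: Pt, p1: Pt) -> f64 {
    (p1.x - p0.x).hypot(p1.y - p0.y)
}

fn fit_rec(eval: &impl Fn(f64) -> Pt, t0: f64, t1: f64, accuracy: f64, out: &mut Vec<(Pt, Pt)>) {
    let (p0, p1) = (eval(t0), eval(t1));
    if chord_len(p0, p1) <= accuracy {
        // Short chord: accept a straight line, bypassing cusp analysis.
        out.push((p0, p1));
    } else {
        // Otherwise (cubic fit attempt elided here), subdivide at the midpoint.
        let tm = 0.5 * (t0 + t1);
        fit_rec(eval, t0, tm, accuracy, out);
        fit_rec(eval, tm, t1, accuracy, out);
    }
}

fn main() {
    // Fit a parabola with a coarse tolerance; all emitted chords are short.
    let eval = |t: f64| Pt { x: t, y: t * t };
    let mut segs = Vec::new();
    fit_rec(&eval, 0.0, 1.0, 0.1, &mut segs);
    assert!(segs.iter().all(|&(a, b)| chord_len(a, b) <= 0.1));
    println!("{} segments, all chords within tolerance", segs.len());
}
```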

dfrg (Contributor) commented Jun 8, 2023

This is a substantial improvement. We're still hitting a stack overflow with tiger, caused by hypot == 0 in CubicOffset::eval_offset. If I hack in a fix for that, this succeeds.
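The kind of guard being described can be sketched as follows. This is a hedged illustration under assumed names: `offset_point` is a stand-in, not the actual signature of CubicOffset::eval_offset, and the fallback behavior shown here is one plausible fix, not necessarily the one used.

```rust
// Sketch: when the derivative of the source cubic vanishes, the unit
// normal is undefined, and dividing by hypot yields NaN, which can drive
// unbounded subdivision downstream. Guarding the zero case avoids that.
fn offset_point(p: (f64, f64), deriv: (f64, f64), d: f64) -> (f64, f64) {
    let hypot = deriv.0.hypot(deriv.1);
    if hypot == 0.0 {
        // Degenerate tangent: return the source point unchanged rather
        // than propagating NaN into the fitting recursion.
        return p;
    }
    // Offset along the unit normal (-dy, dx) / hypot.
    (p.0 - d * deriv.1 / hypot, p.1 + d * deriv.0 / hypot)
}

fn main() {
    // Zero derivative: the point passes through untouched, no NaN.
    assert_eq!(offset_point((1.0, 2.0), (0.0, 0.0), 5.0), (1.0, 2.0));
    // Horizontal tangent: offset is purely vertical.
    let q = offset_point((0.0, 0.0), (1.0, 0.0), 2.0);
    assert!(q.0.abs() < 1e-12 && (q.1 - 2.0).abs() < 1e-12);
    println!("hypot guard ok");
}
```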

Further, if I remove all the point equality comparisons guarding cubics in the stroke converter, I get almost perfect strokes on tiger! There are a few very minor artifacts remaining that I haven’t investigated yet.

raphlinus (Contributor, Author)

Good to hear. I haven't run through tiger myself, but it sounds like that's a good set of test data, and we should try to nail down all the remaining artifacts.

This is one possible technique for improving the robustness of parallel curve generation: a method to detect cusps and "regularize" by perturbing the control points.

The implementation handles cases where the control points are coincident with the endpoints, and also simple cusps, but hasn't been explicitly tuned to handle cases where the control points are nearly collinear.

It is likely that additional tuning of the "dimension" parameter would be worthwhile. As a first approximation, it can just be the accuracy parameter.
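The perturbation idea can be sketched roughly as follows, for the coincident-control-point case only. All names here (`regularize_ctrl`, the chord-direction nudge) are illustrative assumptions, not kurbo's regularize implementation; the "dimension" parameter plays the role described above.

```rust
// Sketch: if a control point lies within `dimension` of its endpoint,
// the endpoint tangent is ill-defined, so nudge the control point
// `dimension` along the chord toward the far endpoint. Illustrative only.
fn regularize_ctrl(end: (f64, f64), ctrl: (f64, f64), far: (f64, f64), dimension: f64) -> (f64, f64) {
    let (dx, dy) = (ctrl.0 - end.0, ctrl.1 - end.1);
    if dx.hypot(dy) >= dimension {
        return ctrl; // already well separated, nothing to do
    }
    let (cx, cy) = (far.0 - end.0, far.1 - end.1);
    let chord = cx.hypot(cy);
    if chord == 0.0 {
        return ctrl; // fully degenerate segment; leave it alone
    }
    // Perturb along the chord so the derivative at the endpoint is nonzero.
    (end.0 + dimension * cx / chord, end.1 + dimension * cy / chord)
}

fn main() {
    // Control point coincident with its endpoint gets nudged along the chord.
    let r = regularize_ctrl((0.0, 0.0), (0.0, 0.0), (10.0, 0.0), 0.5);
    assert!((r.0 - 0.5).abs() < 1e-12 && r.1.abs() < 1e-12);
    // A well-separated control point is returned unchanged.
    assert_eq!(regularize_ctrl((0.0, 0.0), (3.0, 0.0), (10.0, 0.0), 0.5), (3.0, 0.0));
    println!("regularization sketch ok");
}
```

As a first approximation, `dimension` can simply be the accuracy parameter, per the comment above.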
dfrg (Contributor) left a review comment

I think we still have more work to do but this improves the current state of things dramatically and I’m happy to merge as is and address further issues in a future PR.

@raphlinus raphlinus merged commit f8ba01e into master Jun 27, 2023
@raphlinus raphlinus deleted the fit_robust branch June 27, 2023 01:23
2 participants