Fix rasterizing artifacts: offset vector operations inside InternalPath so they center around zero #5
Comments
Note: looks like there is valuable input in the discussion under #15.
Can someone explain why the issue in #15 only occurs with anti-aliasing on? I would assume (naively) that calculating intersections happens before we draw the anti-aliased line?
Synopsis of the issue: for each subpixel row we perform an intersection check against the path and determine, for each subpixel on that scanline, whether it is inside or outside the path. We then aggregate the subpixels per pixel, giving each pixel a coverage value for how much of it is inside versus outside the shape. When anti-aliasing is turned on, that coverage is converted into a blend percentage. Because the anti-aliased path has no threshold, even a single subpixel hitting the intersection bug produces visible output; without anti-aliasing, more than 50% of the scanlines in a pixel row must be affected before anything shows. In conclusion, non-anti-aliased rendering is technically affected too; it just has a much higher probability of hiding the artifact.
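The coverage-to-output behaviour described above can be sketched as follows (illustrative Python, not ImageSharp.Drawing code; the 16-sample count and 50% threshold are assumptions taken from the comment):

```python
# Sketch: how per-pixel subpixel coverage becomes either an anti-aliased
# alpha value or a hard on/off decision, and why one bad subpixel sample
# is visible only in the anti-aliased case.

def pixel_output(subpixel_hits, antialias, subsamples=16):
    """subpixel_hits: number of subpixel samples inside the shape (0..subsamples)."""
    coverage = subpixel_hits / subsamples
    if antialias:
        # No threshold: any nonzero coverage produces some output, so a
        # single spurious intersection result shows up as a faint artifact.
        return coverage
    # Without anti-aliasing the pixel is only drawn once coverage reaches
    # 50%, which hides most single-sample errors.
    return 1.0 if coverage >= 0.5 else 0.0

# One bad subpixel sample out of 16:
print(pixel_output(1, antialias=True))   # 0.0625 -> faint but visible
print(pixel_output(1, antialias=False))  # 0.0    -> artifact hidden
```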
Man... I don't get rasterizing at all... Currently reading this: http://nothings.org/gamedev/rasterize/
I think we should have a serious look at this implementation. It's well documented and appears to have excellent performance and quality characteristics. It's interesting enough that the team behind Blend2D is looking at it too: https://gitter.im/blend2d/blend2d?at=5ea0228e71a34b0149013568 It also holds up well under extreme scrutiny here.
@tocsoft I believe the only real solution to achieve robustness here is to simplify polygons before rendering, removing sub-pixel details which are not visible and only result in numerical artifacts. There is a simple (but …) An example implementation: …
For example, zooming into the US-map artifacts and the original polygon shows that the island on the left contains unnecessary detail.
@antonfirsov Did you see the optimized implementation of the Peucker algorithm linked at the bottom of the Wiki? http://www.bowdoin.edu/~ltoma/teaching/cs350/spring06/Lecture-Handouts/hershberger92speeding.pdf |
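For reference, the basic Ramer–Douglas–Peucker simplification under discussion can be sketched as below — this is the naive recursive form, not the optimized variant from the linked Hershberger/Snoeyink paper, and it is illustrative Python rather than anything from the SixLabors codebase:

```python
import math

def _point_line_distance(p, a, b):
    """Perpendicular distance from point p to the line through a and b."""
    (px, py), (ax, ay), (bx, by) = p, a, b
    dx, dy = bx - ax, by - ay
    norm = math.hypot(dx, dy)
    if norm == 0.0:  # degenerate chord: fall back to point distance
        return math.hypot(px - ax, py - ay)
    return abs(dx * (ay - py) - dy * (ax - px)) / norm

def simplify(points, epsilon):
    """Ramer-Douglas-Peucker: drop points closer than epsilon to the chord."""
    if len(points) < 3:
        return list(points)
    # Find the interior point farthest from the chord joining the endpoints.
    index, dmax = 0, 0.0
    for i in range(1, len(points) - 1):
        d = _point_line_distance(points[i], points[0], points[-1])
        if d > dmax:
            index, dmax = i, d
    if dmax <= epsilon:
        # Everything is within tolerance of the chord: keep only endpoints.
        return [points[0], points[-1]]
    # Keep the farthest point and recurse on both halves.
    left = simplify(points[:index + 1], epsilon)
    right = simplify(points[index:], epsilon)
    return left[:-1] + right  # avoid duplicating the split point
```

With a tolerance of a fraction of a pixel, near-collinear sub-pixel wiggles like the island detail above collapse to their endpoints, while genuine corners survive.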
@JimBobSquarePants good point, I hadn't noticed it, thanks!
New issue to track in the same topic: #106
To prevent issues around imprecise floating-point maths on large numbers, we should offset all points as they are processed into the InternalPath precalculation data so that all values are as close to zero as possible. Then, after calling InternalPath.FindIntersection and similar methods, we apply the reverse offset to map results back into the space the original points occupied. This will reduce the potential issues that arise from the increased imprecision of floating-point values the further they are from zero.
SixLabors/Shapes#43 provided a band-aid fix by using doubles instead of floats for some operations, a simple fix for large shapes, but offsetting will give us an algorithm that supports not just large shapes but also shapes with large offsets from zero.
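The precision loss the offset is meant to avoid is easy to demonstrate (a hedged sketch using numpy to emulate single-precision floats; nothing here is InternalPath code):

```python
import numpy as np

# float32 spacing grows with magnitude: around 16 million pixels from the
# origin, adjacent representable float32 values are a whole unit apart, so
# quarter-pixel detail in the intersection maths simply disappears.
x = 16_000_000.0
assert np.float32(x + 0.25) == np.float32(x)   # sub-pixel offset is lost
assert np.spacing(np.float32(x)) == 1.0        # representable gap is 1.0

# Translating the same coordinate so it centres around zero (the proposed
# offset) before doing the maths preserves the quarter-pixel detail exactly;
# the reverse offset is applied afterwards to return to the original space.
offset = x
assert np.float32((x + 0.25) - offset) == np.float32(0.25)
```

This also shows why the #43 switch to doubles works for large shapes: float64 still resolves sub-pixel detail at those magnitudes. But centering around zero keeps single-precision maths accurate regardless of how far the shape sits from the origin.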