From 9dc0eb31d79bf819c97420aaf2f6fc5cf3a52c10 Mon Sep 17 00:00:00 2001
From: Robin Watts <Robin.Watts@artifex.com>
Date: Wed, 18 Mar 2020 15:11:01 +0000
Subject: [PATCH] Bug 702196: Fix incorrect detection of "thin" lines while
stroking.
When stroking lines, we spot whether a line is 'thin' (i.e. the
perpendicular width of the line is less than 1 pixel), and handle
those cases specially by using custom 'thin line' routines. This
gives more pleasing results than slavishly following the 'any part
of a pixel' rule.
The current code makes this determination in two places. Firstly,
we calculate 'always_thin' by scaling the perpendicular vector
and seeing if all possible lines will be 'thin'. Secondly, when
we aren't 'always_thin', we calculate it for each line segment
in turn by calling 'width_is_thin'.
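To illustrate the 'always_thin' idea, here is a minimal sketch,
assuming a pure-scale CTM diag(sx, sy) and a user-space line
width w (the helper name and all values are invented for the
example; this is not the gxstroke code):

    #include <math.h>

    /* Hypothetical helper: under a pure-scale CTM diag(sx, sy), a
     * stroke of user-space width w can never be wider than
     * (w / 2) * max(sx, sy) perpendicular to the line, whichever
     * direction a segment runs in.  If even that worst case is
     * below half a pixel, every segment of the stroke is "thin". */
    static int always_thin_scale_only(double w, double sx, double sy)
    {
        return (w / 2.0) * fmax(sx, sy) < 0.5;
    }

The real code works on the transformed width vector rather than
on the raw scales, but the worst-case-over-all-directions idea
is the same.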
Unfortunately, the threshold used for the trivial early
acceptance test in 'width_is_thin' is wrong. Rather than
checking against 0.5, we should be checking against sqrt(1/8);
the larger threshold causes lines near 45 degrees to be
improperly treated as thin. The fix itself is simple.
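The reason sqrt(1/8) is the right per-component bound: 'thin'
means the perpendicular half-width vector is shorter than half
a pixel, i.e. x^2 + y^2 < 0.5^2, and in the worst case (45
degrees, where x == y) that needs each component to be below
sqrt(1/8) ~= 0.354. A minimal standalone C sketch of the
arithmetic (the vector (0.4, 0.4) is invented for the example;
this is not the gxstroke code):

    #include <math.h>
    #include <stdio.h>

    int main(void)
    {
        /* Perpendicular half-width vector of a 45 degree line, in
         * pixels (illustrative values only). */
        double wx = 0.4, wy = 0.4;
        double len = hypot(wx, wy);          /* true half-width, ~0.566 */
        double old_thresh = 0.5;             /* old per-component bound */
        double new_thresh = sqrt(1.0 / 8.0); /* ~0.354, guarantees len < 0.5 */

        /* The old test wrongly reports "thin"; the length test and
         * the sqrt(1/8) test agree that it is not. */
        printf("old: %d  len < 0.5: %d  new: %d\n",
               fabs(wx) < old_thresh && fabs(wy) < old_thresh,
               len < 0.5,
               fabs(wx) < new_thresh && fabs(wy) < new_thresh);
        return 0;
    }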
The fix gives lots of progressions - so many that you wonder
how we never spotted this before.
Unfortunately, buried in these progressions, there are a few files
which, while improved, are still imperfect.
In some files that use 'non-uniform' scales (such as
(53 0 0 21 0 0)), a stroke of constant width can 'pop' between
thick and thin as we move around the path; a near horizontal
line segment may be thin, whereas a near vertical one might be
thick (or vice versa), due to the difference in scale. This is
visually jarring.
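To put rough numbers to that effect, here is a small standalone
sketch (the helper, the 0.03 line width and the pure-scale
assumption are invented for the illustration; this is not the
gxstroke code). Under a (53 0 0 21) scale, the same user-space
stroke width gives a device-space perpendicular half-width of
about 0.315 pixels for a horizontal segment but about 0.795
pixels for a vertical one - thin in one direction, thick in the
other:

    #include <math.h>
    #include <stdio.h>

    /* Device-space perpendicular half-width of a stroke of user-space
     * width w, drawn along the user-space unit direction (ux, uy),
     * under a pure-scale CTM diag(sx, sy). */
    static double perp_half_width(double w, double ux, double uy,
                                  double sx, double sy)
    {
        return (w / 2.0) * sx * sy / hypot(sx * ux, sy * uy);
    }

    int main(void)
    {
        double sx = 53.0, sy = 21.0; /* the non-uniform scale above */
        double w = 0.03;             /* illustrative user-space width */

        printf("horizontal: %.3f\n", perp_half_width(w, 1.0, 0.0, sx, sy));
        printf("vertical:   %.3f\n", perp_half_width(w, 0.0, 1.0, sx, sy));
        return 0;
    }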
To fix this, therefore, we'd need to modify the width_is_thin
testing in non-uniform cases so that it gives the same result
all the way around the path. Doing this would be complex, and
would ultimately be equivalent to just relying on "always_thin"
(with the exception of strictly vertical and horizontal line
segments). We therefore disable the non-orthogonal test in
'width_is_thin' entirely.
---
base/gxstroke.c | 52 +++++++++++++++++++++++++++++++++++++++++++++++-----
1 file changed, 47 insertions(+), 5 deletions(-)
diff --git a/base/gxstroke.c b/base/gxstroke.c
index 52c1543..399c914 100644
--- a/base/gxstroke.c
+++ b/base/gxstroke.c
@@ -1093,12 +1093,51 @@ width_is_thin(pl_ptr plp)
if ((dx = plp->vector.x) == 0)
return any_abs(wx) < fixed_half;
- /*
- * If both horizontal and vertical widths are less than
- * 0.5, the line is thin.
+ /* For the longest time, we used to have a test here that
+ * attempted to trivially accept diagonal lines as being
+ * thin based on the components of the perpendicular
+ * width vector in device space as both being less than 0.5.
+ * Bug 702196 showed some examples where this was clearly
+ * wrong.
+ *
+ * The cause for this bug was that the 0.5 figure was wrong.
+ * For the point to be at a perpendicular distance of less
+ * than 1/2 a pixel from the line, we'd need x^2 + y^2 < .5^2.
+ * For a 45 degree line, that'd be 2(x^2) < 1/4, i.e. x^2 < 1/8,
+ * or x < sqrt(1/8). A 45 degree line is the "worst case", so
+ * if both horizontal and vertical widths are less than
+ * sqrt(1/8), the line is thin. sqrt(1/8) = 0.35355339059.
+ * So, we should be using sqrt(1/8) rather than 0.5.
+ *
+ * Fixing this did indeed produce many many progressions,
+ * but left just the odd file still showing problems.
+ *
+ * Further investigations show that those cases were due to
+ * the use of "non-uniform" scaling matrices, for example
+ * (83 0 0 51 0 0). With such matrices, it's possible for
+ * nearly horizontal lines to be thin, but nearly vertical
+ * ones to be thick (or vice versa). Having the style of
+ * line "pop" between thick and thin in a single stroke
+ * looks very noticeable.
+ *
+ * We could change the trivial optimisation below to only
+ * apply in the 'uniform' case, but that would never actually
+ * trigger (as tested on the cluster), because all such
+ * cases are caught by the "always_thin" condition in the
+ * caller.
+ *
+ * Just removing the trivial test and leaving the 'complicated'
+ * test below would leave us vulnerable to "popping",
+ * so we disable both. In practice this makes no difference
+ * to the number of tests showing diffs in the cluster.
*/
- if (any_abs(wx) < fixed_half && any_abs(wy) < fixed_half)
- return true;
+#if 0 /* DISABLED TEST, see above */
+ {
+ /* thin_threshold = fixed-point sqrt(1/8) - see above. */
+ const fixed thin_threshold = float2fixed(0.35355339059f);
+ if (any_abs(wx) < thin_threshold && any_abs(wy) < thin_threshold)
+ return true;
+ }
/*
* We have to do this the hard way, by actually computing the
@@ -1116,6 +1155,9 @@ width_is_thin(pl_ptr plp)
/* so we don't need to do any de-scaling for the test. */
return fabs(num) < denom * 0.5;
}
+#else
+ return false;
+#endif
}
/* Adjust the endpoints and width of a stroke segment along a specified axis */
--
1.8.3.1