pointfreeco/swift-snapshot-testing

Perceptual precision comparison does not appear to work correctly

jaanus opened this issue · 7 comments

Describe the bug

Testing with perceptual snapshot comparison fails because the perceptual precision computation returns a value that does not match my perception.

To Reproduce

import SnapshotTesting
import SwiftUI
import XCTest

struct ContentView: View {
  var body: some View {
    VStack {
      Image(systemName: "globe")
        .imageScale(.large)
        .foregroundColor(.accentColor)
      Text("Hello, world!")
    }
    .padding()
  }
}

final class PerceptualSnapshotBugTests: XCTestCase {
  func test_view() throws {
    let sut = ContentView()
    // 0.5 should be very permissive, yet the test still fails after a tiny edit.
    assertSnapshot(matching: sut, as: .image(perceptualPrecision: 0.5))
  }
}

Record a snapshot and then modify it very slightly with an image editing tool.
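
One way to record the reference image first is the library's global recording flag (a minimal sketch; set it in the test, run once, then remove it):

// Flip SnapshotTesting's global flag to write a new reference image,
// run the test once, then remove this line so subsequent runs compare.
isRecording = true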

Expected behavior

The above test should pass, since the perceptual difference between reference and test result is small.

Screenshots

The above test fails. When I step into the perceptual comparison code in the debugger, I see that actualPixelPrecision makes sense, but actualPerceptualPrecision is reported much lower than expected.

Here is a screenshot with Xcode and the reference image. You can see that I modified the reference image by slightly cutting away part of the “H” character.

[screenshot]

Environment

  • swift-snapshot-testing version 1.10.0
  • Xcode 14.1 (14B47b)
  • Swift: whatever is bundled with that Xcode
  • OS: macOS Ventura 13.0 (22A380)

I'm also having problems with perceptual precision. It's mostly related to shadows and corners.
In the example below, I set the perceptual precision to 0.98 and the pixel precision to 0.99.
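
Roughly this configuration (the exact call site is assumed, not copied from my project; precision is the per-pixel match requirement and perceptualPrecision the allowed Lab-ΔE-based difference per pixel):

// Assumed call site for illustration:
assertSnapshot(matching: sut, as: .image(precision: 0.99, perceptualPrecision: 0.98))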

actual
[image]

reference
[image]

diff
[image]

Even though these two images look identical to me (i.e. I can't easily see the difference), the test fails with the following output:

Actual image precision 0.89907056 is less than required 0.99
Actual perceptual precision 0.43562502 is less than required 0.98

I'm having similar issues. Any thoughts, @ejensen?

Any update on this? I am seeing issues as well. The images are identical (the difference map has every single pixel at 0, 0, 0) and it fails with:
Actual perceptual precision 0.82234377 is less than required 0.98

I'm also having the same problem: no apparent difference, but the perceptual precision is reported much lower.

FWIW Kaleidoscope 3 shows that these pixels are different in some way:
[image]

🤔 I think the difference that actually matters here is one stray pixel on the left side of the image, near the vertical middle. It looks like a speck of dust on your screen, so zoom in and scroll and you'll see it move around. You can spot the difference in the images below. Not sure what would cause it, though.

Extra pixel:

[image]

No extra pixel:

[image]
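
For anyone trying to locate a stray pixel like this, here's an illustrative sketch (not library code) that renders two CGImages into identical sRGB RGBA8 buffers and reports the coordinates where they differ:

import CoreGraphics

// Illustrative only: draws both images into matching RGBA8 buffers and
// returns the coordinates of every pixel whose bytes differ.
func differingPixels(_ a: CGImage, _ b: CGImage) -> [(x: Int, y: Int)] {
  guard a.width == b.width, a.height == b.height else { return [] }

  func rgbaBytes(_ image: CGImage) -> [UInt8] {
    var bytes = [UInt8](repeating: 0, count: image.width * image.height * 4)
    bytes.withUnsafeMutableBytes { buffer in
      let context = CGContext(
        data: buffer.baseAddress,
        width: image.width,
        height: image.height,
        bitsPerComponent: 8,
        bytesPerRow: image.width * 4,
        space: CGColorSpace(name: CGColorSpace.sRGB)!,
        bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue
      )
      context?.draw(image, in: CGRect(x: 0, y: 0, width: image.width, height: image.height))
    }
    return bytes
  }

  let bytesA = rgbaBytes(a)
  let bytesB = rgbaBytes(b)
  var diffs: [(x: Int, y: Int)] = []
  // Step 4 bytes at a time (one RGBA pixel) and compare the slices.
  for i in stride(from: 0, to: bytesA.count, by: 4) where bytesA[i..<i+4] != bytesB[i..<i+4] {
    diffs.append((x: (i / 4) % a.width, y: (i / 4) / a.width))
  }
  return diffs
}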

Part of the issue seems to be related to how CILabDeltaE handles transparency, specifically how transparent colors are evaluated as being more different than when they've been composited onto another color.

The two images seem to have slightly different shadow colors and opacities, rgba(0,22,44,0.0901961) vs. rgba(11,32,43,0.0941176), which leads to CILabDeltaE reporting them as fairly different.

If the two images are changed so they're on a solid background, the resulting colors become nearly identical, and the required perceptualPrecision goes from ~0.93 to ~0.99.
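
For anyone wanting to verify this locally, here's a minimal sketch of that flattening step. CILabDeltaE and its inputImage/inputImage2 keys are real Core Image API; the helper itself is hypothetical:

import CoreImage

// Hypothetical helper: composite both images over an opaque background so
// semi-transparent pixels are compared as their flattened colors, then run
// Core Image's CILabDeltaE filter, whose output pixels encode per-pixel ΔE.
func labDeltaE(_ a: CIImage, _ b: CIImage, over background: CIColor) -> CIImage? {
  let backdrop = CIImage(color: background)
  let flatA = a.composited(over: backdrop).cropped(to: a.extent)
  let flatB = b.composited(over: backdrop).cropped(to: b.extent)

  let filter = CIFilter(name: "CILabDeltaE")
  filter?.setValue(flatA, forKey: kCIInputImageKey)
  filter?.setValue(flatB, forKey: "inputImage2")
  return filter?.outputImage
}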


The other part of the issue seems to be around the messaging when it doesn't match:

Actual perceptual precision 0.43562502 is less than required 0.98

The value printed out here is 1 - (delta E of the worst pixel / 100), in this case the pixel that's blue in one image and white in the other, as mentioned above. This prints a value that makes the accuracy of the snapshot look far worse than it actually is. By comparison, the average across the whole image would be more like 0.9949010414, which isn't really that bad.
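
For concreteness (the ΔE value here is back-computed from the printed precision, so treat it as an assumption):

// Back-computed illustration: a single worst pixel with ΔE ≈ 56.44
// reproduces the printed value.
let worstPixelDeltaE = 56.437498
let printedPrecision = 1 - worstPixelDeltaE / 100  // ≈ 0.43562502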

I'd suggest removing that line entirely, since it doesn't provide any actionable information. The value is the perceptualPrecision that would be needed to make the test pass only if pixelPrecision were 1.0. Since perceptualPrecision is applied on a per-pixel basis, I'm not sure there's a number that could be printed out that would be useful.