xerial/snappy-java

macOS loading of snappy library failing

milanaleksic opened this issue · 4 comments

Background

I am using Spark, which in turn uses your library to load Snappy when saving Parquet files (Snappy compression is on by default). I am on the latest macOS Catalina version.

Problem

It seems that System.mapLibraryName("snappyjava") returns libsnappyjava.dylib, but since the native library bundled in the JAR resources is named libsnappyjava.jnilib, the load doesn't happen; it fails with:

Caused by: org.xerial.snappy.SnappyError: [FAILED_TO_LOAD_NATIVE_LIBRARY] null
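
For anyone who wants to verify the mapping locally, here is a minimal check (the class name is mine; the expected output is an assumption based on the JDK's documented macOS behavior):

    // Prints the platform-specific library file name the JVM expects.
    // On macOS with JDK 7+ this should print "libsnappyjava.dylib";
    // older Apple JDKs used the ".jnilib" suffix instead.
    public class MapLibraryNameCheck {
        public static void main(String[] args) {
            System.out.println(System.mapLibraryName("snappyjava"));
        }
    }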

Workaround

If I explicitly override the library name with -Dorg.xerial.snappy.lib.name=libsnappyjava.jnilib, then everything seems to work.
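
In case the JVM flag is awkward to pass through Spark, here is a hedged sketch of the same workaround applied programmatically (the property name is the one from the flag above; setting it before any Snappy class is touched is an assumption about the loader's initialization order):

    import org.xerial.snappy.Snappy;

    public class SnappyLibNameWorkaround {
        public static void main(String[] args) throws Exception {
            // Must run before the first Snappy call, since the loader
            // reads this system property when it initializes.
            System.setProperty("org.xerial.snappy.lib.name", "libsnappyjava.jnilib");
            byte[] compressed = Snappy.compress("hello snappy".getBytes("UTF-8"));
            System.out.println("compressed to " + compressed.length + " bytes");
        }
    }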

Question

Could your library perhaps ignore .dylib and force .jnilib in cases where it has to load the native library from JAR resources? I believe that would avoid the issue I am currently seeing.

Alternatively, please tell me what I might be doing wrong to cause this issue.

It seems we need to check the Mac OS X version for Catalina. The OSInfo class or the SnappyLoader class needs some workaround.
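
A rough sketch of the kind of OS check that could go into OSInfo or SnappyLoader (the property names are standard JVM ones; the version threshold and parsing are my assumptions, not the actual snappy-java code):

    // Detects macOS 10.15 (Catalina) or later from standard JVM properties.
    public class CatalinaCheck {
        public static void main(String[] args) {
            String osName = System.getProperty("os.name");               // e.g. "Mac OS X"
            String[] v = System.getProperty("os.version").split("\\.");  // e.g. "10.15.3"
            int major = Integer.parseInt(v[0]);
            int minor = v.length > 1 ? Integer.parseInt(v[1]) : 0;
            boolean catalinaOrLater = osName.startsWith("Mac")
                    && (major > 10 || (major == 10 && minor >= 15));
            System.out.println("Catalina or later: " + catalinaOrLater);
        }
    }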

It seems that System.mapLibraryName("snappyjava") returns libsnappyjava.dylib, but since the native library bundled in the JAR resources is named libsnappyjava.jnilib, the load doesn't happen

At least on macOS, we always fall back to .jnilib if the .dylib is not found: https://github.com/xerial/snappy-java/blob/master/src/main/java/org/xerial/snappy/SnappyLoader.java#L328-L338
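
For context, a simplified sketch of that fallback (not the actual SnappyLoader code; the resource path for macOS x86_64 is my assumption):

    // If the JVM's mapped .dylib name is not found among the JAR's
    // bundled natives, retry with the legacy .jnilib suffix.
    public class JnilibFallbackSketch {
        public static void main(String[] args) {
            String libName = System.mapLibraryName("snappyjava"); // libsnappyjava.dylib on macOS
            String resourcePath = "/org/xerial/snappy/native/Mac/x86_64/" + libName;
            if (JnilibFallbackSketch.class.getResource(resourcePath) == null
                    && libName.endsWith(".dylib")) {
                libName = libName.replaceAll("\\.dylib$", ".jnilib");
            }
            System.out.println("resolved native library name: " + libName);
        }
    }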

To reproduce the issue, could you share the release versions you used (e.g., snappy-java, Spark, Java, ...)?

Just to check, I ran the unit tests on my macOS Catalina machine, and almost all of them passed (a single test failed, though):

Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
[error] Test org.xerial.snappy.SnappyHadoopCompatibleOutputStreamTest.testXerialCompressionHadoopDecompressionCodec failed: java.lang.RuntimeException: native snappy library not available: this version of libhadoop was built without snappy support., took 0.253 sec
[error]     at org.apache.hadoop.io.compress.SnappyCodec.checkNativeCodeLoaded(SnappyCodec.java:65)
[error]     at org.apache.hadoop.io.compress.SnappyCodec.getDecompressorType(SnappyCodec.java:193)
[error]     at org.apache.hadoop.io.compress.CodecPool.getDecompressor(CodecPool.java:178)
[error]     at org.apache.hadoop.io.compress.CompressionCodec$Util.createInputStreamWithCodecPool(CompressionCodec.java:157)
[error]     at org.apache.hadoop.io.compress.SnappyCodec.createInputStream(SnappyCodec.java:164)
[error]     at org.xerial.snappy.SnappyHadoopCompatibleOutputStreamTest.testXerialCompressionHadoopDecompressionCodec(SnappyHadoopCompatibleOutputStreamTest.java:111)
[error]     ...
[info] Test run finished: 1 failed, 0 ignored, 1 total, 0.286s
[info] ScalaTest
[info] Run completed in 32 seconds, 552 milliseconds.
[info] Total number of tests run: 1
[info] Suites: completed 1, aborted 0
[info] Tests: succeeded 1, failed 0, canceled 0, ignored 0, pending 0
[info] All tests passed.
[error] Failed: Total 84, Failed 1, Errors 0, Passed 83
[error] Failed tests:
[error] 	org.xerial.snappy.SnappyHadoopCompatibleOutputStreamTest
[error] (Test / test) sbt.TestsFailedException: Tests unsuccessful
[error] Total time: 39 s, completed 2020/04/02 16:24:18

Somewhere between versions 1.0.4.1 and 1.1.7.1 this was fixed, as you said. I will close this issue and try to figure out how to manage our dependencies so that the newer version is picked up.
Thank you for your pointers.