Does spark-metrics work with Spark 2.3?
acstevens opened this issue · 2 comments
I am trying to use spark-metrics with Spark 2.3 but run into the following error.
```
2018-05-02 20:49:09 INFO BlockManager:54 - Initialized BlockManager: BlockManagerId(driver, spark-pi-9e38353c8a563fdfa274d7151aa3d8b3-driver-svc.spark.svc, 7079, None)
2018-05-02 20:49:09 ERROR MetricsSystem:70 - Sink class com.banzaicloud.spark.metrics.sink.PrometheusSink cannot be instantiated
2018-05-02 20:49:09 ERROR SparkContext:91 - Error initializing SparkContext.
java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	at java.lang.reflect.Constructor.newInstance(Constructor.java:423)
	at org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:200)
	at org.apache.spark.metrics.MetricsSystem$$anonfun$registerSinks$1.apply(MetricsSystem.scala:194)
	at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
	at scala.collection.mutable.HashMap$$anonfun$foreach$1.apply(HashMap.scala:99)
	at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:230)
	at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:40)
	at scala.collection.mutable.HashMap.foreach(HashMap.scala:99)
	at org.apache.spark.metrics.MetricsSystem.registerSinks(MetricsSystem.scala:194)
	at org.apache.spark.metrics.MetricsSystem.start(MetricsSystem.scala:102)
	at org.apache.spark.SparkContext.<init>(SparkContext.scala:513)
	at org.apache.spark.SparkContext$.getOrCreate(SparkContext.scala:2486)
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:930)
	at org.apache.spark.sql.SparkSession$Builder$$anonfun$7.apply(SparkSession.scala:921)
	at scala.Option.getOrElse(Option.scala:121)
	at org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:921)
	at org.apache.spark.examples.SparkPi$.main(SparkPi.scala:31)
	at org.apache.spark.examples.SparkPi.main(SparkPi.scala)
Caused by: java.lang.AbstractMethodError
	at org.apache.spark.internal.Logging$class.initializeLogIfNecessary(Logging.scala:99)
	at com.banzaicloud.spark.metrics.sink.PrometheusSink.initializeLogIfNecessary(PrometheusSink.scala:36)
	at org.apache.spark.internal.Logging$class.log(Logging.scala:46)
	at com.banzaicloud.spark.metrics.sink.PrometheusSink.log(PrometheusSink.scala:36)
	at org.apache.spark.internal.Logging$class.logInfo(Logging.scala:54)
	at com.banzaicloud.spark.metrics.sink.PrometheusSink.logInfo(PrometheusSink.scala:36)
	at com.banzaicloud.spark.metrics.sink.PrometheusSink.<init>(PrometheusSink.scala:136)
```
It seems like there is a compatibility issue. Would you agree?
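For reference, the sink is wired up through Spark's metrics configuration, which is what causes the `Constructor.newInstance` call in the trace above. My `metrics.properties` is a minimal sketch along these lines, with property names as I understand them from the spark-metrics README and the Pushgateway address as a placeholder:

```properties
# Register the Banzai Cloud Prometheus sink for all instances
# (class name matches the one in the error; the pushgateway-* keys and
# address are illustrative placeholders)
*.sink.prometheus.class=com.banzaicloud.spark.metrics.sink.PrometheusSink
*.sink.prometheus.pushgateway-address-protocol=http
*.sink.prometheus.pushgateway-address=prometheus-pushgateway:9091
```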
@acstevens this was built with Spark 2.2.1, so yes, the AbstractMethodError above is most likely a binary incompatibility with 2.3. I haven't had a chance to build against Spark 2.3 yet, but I will try to make some time next week to compile it for 2.3 as well.
The Sink interface that we have to implement is not public in the Spark codebase, so we first need to compile Spark 2.3 with the Sink interface made public. Once we have that 'patched' jar we can rebuild spark-metrics against it and make the necessary changes to bring it in line with the interface changes in 2.3. I don't expect any changes in spark-metrics apart from the ones required by the interface changes made in Spark 2.3.
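For reference, the trait in question looks roughly like this in the Spark source; the `private[spark]` modifier is what keeps third-party sinks from compiling against a stock Spark jar:

```scala
// org.apache.spark.metrics.sink.Sink, roughly as defined in Spark 2.x.
// The private[spark] visibility means a class outside the org.apache.spark
// package cannot extend it without a patched Spark build.
private[spark] trait Sink {
  def start(): Unit
  def stop(): Unit
  def report(): Unit
}
```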
Once the Sink interface is made public in Spark, building and releasing a new version of spark-metrics is going to be much simpler.
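To make that concrete, here is a minimal sketch of what any sink implementation has to provide once the trait is public. `MySink` is a hypothetical name; the three-argument constructor is the one `MetricsSystem.registerSinks` resolves reflectively (the `Constructor.newInstance` frame in the trace above):

```scala
import java.util.Properties

import com.codahale.metrics.MetricRegistry
import org.apache.spark.SecurityManager
import org.apache.spark.metrics.sink.Sink

// Hypothetical sink skeleton. Spark's MetricsSystem looks up a
// (Properties, MetricRegistry, SecurityManager) constructor via reflection,
// so the signature below has to match exactly.
class MySink(val property: Properties,
             val registry: MetricRegistry,
             securityMgr: SecurityManager) extends Sink {

  override def start(): Unit = {
    // start a reporter, e.g. schedule periodic pushes of `registry`
  }

  override def stop(): Unit = {
    // stop the reporter and flush anything pending
  }

  override def report(): Unit = {
    // push a one-off snapshot of `registry`
  }
}
```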
On a side note, if you're running long-running Java apps on k8s and are interested in how you can enable Prometheus monitoring, check out the JMX Exporter Operator. If you're keen to learn what the JMX Exporter Operator does behind the scenes, read the Kubernetes Operator SDK blog post.
@acstevens spark-metrics has been rebuilt for Spark 2.3 now. Please give it a try and let us know if you run into any issues.