tuplejump/calliope

issue when using calliope on spark 1.0

bobbych opened this issue · 8 comments

error: bad symbolic reference. A signature in RichByteBuffer.class refers to term time
in value org.joda which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling RichByteBuffer.class.
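This "bad symbolic reference … term time in value org.joda" error generally means joda-time is missing from the classpath that RichByteBuffer.class was compiled against. A minimal sketch of the fix, assuming an sbt build (the version numbers below are illustrative, not taken from this thread):

```scala
// build.sbt — hypothetical dependency declarations.
// Match the versions to whatever Calliope was actually compiled against.
libraryDependencies ++= Seq(
  "joda-time" % "joda-time"    % "2.3",  // provides org.joda.time
  "org.joda"  % "joda-convert" % "1.6"   // commonly needed alongside joda-time from Scala
)
```

If you are running in the Spark shell rather than an sbt project, the equivalent is adding the joda-time jar to the shell's classpath.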

Thanks Rohit, that worked, but I'm still having the issue below:

scala> val cas = CasBuilder.cql3.withColumnFamily("spark", "search").onHost("x.x.x.x.x")
cas: com.tuplejump.calliope.Cql3CasBuilder = com.tuplejump.calliope.Cql3CasBuilder@3b4d67

scala> val rdd = sc.cql3Cassandra[Map[String, String], Map[String, String]]
<console>:22: error: erroneous or inaccessible type
       val rdd = sc.cql3Cassandra[Map[String, String], Map[String, String]]

Have you imported RichByteBuffer._?

If yes, can you give me the script that is resulting in this error?

Hi Rohit,
I am just using the Spark shell and have all the required jars on the Spark classpath:
import com.tuplejump.calliope.utils.RichByteBuffer._
import com.tuplejump.calliope.Implicits._
import com.tuplejump.calliope.CasBuilder
val cas = CasBuilder.cql3.withColumnFamily("spark", "search").onHost("cass1")
val rdd = sc.cql3Cassandra[Map[String, String], Map[String, String]]

Shouldn't this,

val rdd = sc.cql3Cassandra[Map[String, String], Map[String, String]]

in fact be this? You are not passing cas to the method:

val rdd = sc.cql3Cassandra[Map[String, String], Map[String, String]](cas)
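Putting the pieces from the snippets above together, the full shell session would look something like this. This is a sketch assuming the keyspace ("spark"), table ("search"), and host ("cass1") from the thread; it needs a running Cassandra node and the Calliope jars on the classpath, so it cannot run standalone:

```scala
import com.tuplejump.calliope.utils.RichByteBuffer._  // implicit ByteBuffer <-> Scala conversions
import com.tuplejump.calliope.Implicits._             // adds the cql3Cassandra method to SparkContext
import com.tuplejump.calliope.CasBuilder

// Describe the source: keyspace "spark", column family "search", on host "cass1".
val cas = CasBuilder.cql3.withColumnFamily("spark", "search").onHost("cass1")

// Note that the builder is passed as an argument, as Rohit points out above.
val rdd = sc.cql3Cassandra[Map[String, String], Map[String, String]](cas)
rdd.count()
```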

Actually I did do that; it's just missing from the snippet. This is only happening on Spark 1.0; there is no issue on 0.9.

Just to confirm: are you using the snapshot build (0.9.4-EA-SNAPSHOT) with Spark 1.0.0? And do you have the calliope-macros jar on the classpath too?

These are the jars you should have -

And which build of Spark are you using? Is it built against Hadoop 1.x or 2.x?
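For reference, a Spark shell invocation that puts the Calliope jars (including the macros jar) on the classpath might look like the sketch below. The jar names and versions here are illustrative guesses based on the snapshot version mentioned above, not confirmed by this thread:

```
# Hypothetical paths and versions — substitute the jars from your own build.
spark-shell --jars \
  calliope_2.10-0.9.4-EA-SNAPSHOT.jar,\
  calliope-macros_2.10-0.9.4-EA-SNAPSHOT.jar,\
  joda-time-2.3.jar
```

Forgetting the macros jar leaves the Calliope implicit machinery unresolvable in the shell, which can surface as the "erroneous or inaccessible type" error above.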

Ah, that might be it: I am missing the macros jar. I will try with it.
Thank you!!