swoop-inc/spark-alchemy

Specifying custom precision

Opened this issue · 0 comments

Hi,

I am not able to specify a custom precision with the code below; it errors out. Could someone please let me know the right way to pass a custom precision? Note that I am using Databricks as the Spark runtime.

```scala
import com.swoop.alchemy.spark.expressions.hll.functions._

val df1 = spark.table("hive_metastore.rwd_databricks.table_test")

df1.select("PATIENT_ID", "CLAIM_ID", "CODE")
  .withColumn("patient_id_hll", hll_init("PATIENT_ID", 0.02))
  .select(hll_merge("patient_id_hll", 0.02).as("patient_id_hll_m"))
  .write.mode("overwrite")
  .format("delta")
  .saveAsTable("patient_id_hll_merge")
```
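For comparison, here is a minimal sketch of the pattern I would expect to work, based on the spark-alchemy README: `hll_init_agg` initializes and merges the sketches in a single aggregation, so the `relativeSD` precision is passed only once (the same value must be used consistently everywhere, since sketches built with different precisions cannot be merged). The table and column names are just the ones from my snippet above; this assumes a `SparkSession` named `spark` with spark-alchemy on the classpath.

```scala
import com.swoop.alchemy.spark.expressions.hll.functions._

// Error tolerance for the HLL sketches; must be identical for every
// function that touches the same sketch column.
val relativeSD = 0.02

val df = spark.table("hive_metastore.rwd_databricks.table_test")

// hll_init_agg replaces the separate hll_init + hll_merge steps:
// it builds per-row sketches and merges them in one aggregation.
df.select("PATIENT_ID")
  .agg(hll_init_agg("PATIENT_ID", relativeSD).as("patient_id_hll_m"))
  .write.mode("overwrite")
  .format("delta")
  .saveAsTable("patient_id_hll_merge")
```

If the two-step `hll_init` / `hll_merge` form is required, my understanding is that the same `relativeSD` must be passed to both calls, as in the snippet above, so I am unsure why it errors.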