b96705008/custom-spark-pipeline

ImportError: cannot import name 'DefaultParamsReadable'


Hello,

I'm using Spark 2.2 (and also tested on Spark 2.3.0), and I get this error when importing DefaultParamsReadable and DefaultParamsWritable:


ImportError Traceback (most recent call last)
<ipython-input> in <module>()
----> 1 from pyspark.ml.util import JavaMLReadable, JavaMLWritable, DefaultParamsReadable, DefaultParamsWritable

ImportError: cannot import name 'DefaultParamsReadable'

Hi, DefaultParamsReadable and DefaultParamsWritable are available since Spark 2.3, so you cannot use them in Spark 2.2.
I have a branch called spark_2.1, which you may try on Spark 2.2.
However, DefaultParamsReadable and DefaultParamsWritable are key to serializing custom models.
If you want that functionality, you should probably switch to Spark 2.3.
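
For context, here is a minimal sketch of what those two mixins enable (my own illustration, not code from this repo; it assumes pyspark >= 2.3, and UpperCaseTransformer is a made-up example):

from pyspark import keyword_only
from pyspark.ml import Transformer
from pyspark.ml.param.shared import HasInputCol, HasOutputCol
from pyspark.ml.util import DefaultParamsReadable, DefaultParamsWritable
import pyspark.sql.functions as F

# Mixing in DefaultParamsReadable/DefaultParamsWritable gives a
# pure-Python transformer working save()/load() with no Java counterpart.
class UpperCaseTransformer(Transformer, HasInputCol, HasOutputCol,
                           DefaultParamsReadable, DefaultParamsWritable):
    @keyword_only
    def __init__(self, inputCol=None, outputCol=None):
        super(UpperCaseTransformer, self).__init__()
        kwargs = self._input_kwargs
        self.setParams(**kwargs)

    @keyword_only
    def setParams(self, inputCol=None, outputCol=None):
        kwargs = self._input_kwargs
        return self._set(**kwargs)

    def _transform(self, df):
        # Upper-case the input column into the output column.
        return df.withColumn(self.getOutputCol(),
                             F.upper(F.col(self.getInputCol())))

t = UpperCaseTransformer(inputCol="text", outputCol="text_upper")
t.write().overwrite().save("/tmp/upper_model")     # persists the params
t2 = UpperCaseTransformer.load("/tmp/upper_model")

On Spark 2.2 the import at the top fails exactly as reported above, which is why the whole pattern needs 2.3.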

Hello, I also tried with Spark 2.3.0, see below:

[screenshot: the same ImportError raised under Spark 2.3.0]

So I still cannot import DefaultParamsReadable/Writable.

You can check start-notebook.sh, especially how it sets PYTHONPATH:

#!/bin/sh
# Resolve the directory this script lives in.
export SERVICE_HOME="$(cd "`dirname "$0"`"; pwd)"
# Put the app package on PYTHONPATH (SERVICE_HOME is already an absolute path).
export PYTHONPATH=${PYTHONPATH}:${SERVICE_HOME}/app
# Run the PySpark driver inside Jupyter Notebook on port 8080.
export PYSPARK_DRIVER_PYTHON=jupyter
export PYSPARK_DRIVER_PYTHON_OPTS="notebook --port 8080"

pyspark \
	--name Jupyter_Spark \
	--master local[*]

Did you start your app with the command below?

./start-notebook.sh
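
If it still fails after starting via that script, one quick sanity check (a standard check, not something from this thread) is to print which PySpark the notebook actually picked up, since a stale entry on PYTHONPATH can shadow the 2.3 installation:

import pyspark
print(pyspark.__version__)   # should report 2.3.x
print(pyspark.__file__)      # shows which installation was imported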

Hello, I figured it out and it works now. Thanks, Raluca

Great!