
Spark NLP Workshop


Public, runnable notebooks and code examples showing how to use Spark NLP in Python and Scala.


Python setup

$ java -version
# should be Java 8 (Oracle or OpenJDK)
$ conda create -n sparknlp python=3.6 -y
$ conda activate sparknlp
$ pip install spark-nlp pyspark==2.4.4
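
Once the environment is ready, you can verify the installation by starting a Spark NLP session and annotating a sentence with a pretrained pipeline. The snippet below is only a minimal sketch; explain_document_dl is one of the English pretrained pipelines, and any other pipeline name works the same way.

import sparknlp
from sparknlp.pretrained import PretrainedPipeline

# Start a SparkSession with Spark NLP loaded
spark = sparknlp.start()
print("Spark NLP version:", sparknlp.version())

# Download a pretrained pipeline and annotate a single sentence
pipeline = PretrainedPipeline("explain_document_dl", lang="en")
result = pipeline.annotate("Spark NLP is an NLP library built on top of Apache Spark.")
print(result["token"])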

Colab setup

import os

# Install OpenJDK 8 (required by Spark)
! apt-get update -qq
! apt-get install -y openjdk-8-jdk-headless -qq > /dev/null

os.environ["JAVA_HOME"] = "/usr/lib/jvm/java-8-openjdk-amd64"
os.environ["PATH"] = os.environ["JAVA_HOME"] + "/bin:" + os.environ["PATH"]
! java -version

# Install PySpark and Spark NLP
! pip install -q pyspark==2.4.6
! pip install -q spark-nlp
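
After the cells above finish, a quick sanity check (a minimal sketch, assuming the installs succeeded) is to start a Spark NLP session in a new cell and print the versions:

import sparknlp

# Start a SparkSession with Spark NLP on the Colab runtime
spark = sparknlp.start()

print("Spark NLP version:", sparknlp.version())
print("Apache Spark version:", spark.version)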

Main repository

https://github.com/JohnSnowLabs/spark-nlp

Project's website

Take a look at our official Spark NLP page, http://nlp.johnsnowlabs.com/, for user documentation and examples.

Slack community channel

Join Slack

Contributing

If you find any example that is no longer working, please create an issue.

License

Apache License 2.0