Teng Fu
Email: teng.fu@teleware.com
This is the Docker image for NLP tasks. Its base image is:
FROM tftwdockerhub/nlp_devel_docker_github:latest
PyTorch installed:
- PyTorch: 1.0
NLP packages:
- StanfordNLP 0.1.1
- Oracle Java 8
- CoreNLP 2018-10-05
The CoreNLP installation is inspired by konradstrack/corenlp. The Java installation is based on Java's official Dockerfile.
All the additional packages are installed using the Python script stanfordnlp_model_download.py in this directory. Developers can customise the required resources in its main() function.
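A minimal sketch of what such a download script's main() might contain is shown below, assuming the standard stanfordnlp download API; the 'en' label and the default resource directory are illustrative assumptions, not the exact contents of stanfordnlp_model_download.py.

```python
# Minimal sketch, assuming the standard stanfordnlp download API;
# the 'en' label and default resource directory are illustrative only.
import stanfordnlp


def main():
    # Download the default English models for the neural pipeline.
    # Add further stanfordnlp.download(...) calls here for any other
    # languages or treebanks the image should ship with.
    stanfordnlp.download('en')


if __name__ == '__main__':
    main()
```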
There are two Jupyter Notebooks in the repo that provide demo code for using the CoreNLP server with StanfordNLP (a combined sketch follows the list):
- coreNLP_server_start.ipynb for CoreNLP server startup. If developers wish to try the CoreNLP client feature, the commands in this notebook must be run prior to the client code.
- stanfordNLP_demo.ipynb for a StanfordNLP code demo.
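The following is a minimal sketch of the workflow the two notebooks demonstrate, assuming the English models have already been downloaded and that the CORENLP_HOME environment variable points at the CoreNLP 2018-10-05 directory inside the image; the example sentences and annotator list are illustrative assumptions, not taken from the notebooks themselves.

```python
# Minimal sketch of the notebook workflow; sentences, annotators and
# environment assumptions are illustrative, not from the notebooks.
import stanfordnlp
from stanfordnlp.server import CoreNLPClient

# stanfordNLP_demo.ipynb style: run the neural pipeline on a sentence.
nlp = stanfordnlp.Pipeline(lang='en')
doc = nlp("Barack Obama was born in Hawaii.")
doc.sentences[0].print_dependencies()

# coreNLP_server_start.ipynb style: CoreNLPClient launches a CoreNLP
# server in the background (it needs CORENLP_HOME set) and shuts it
# down again when the with-block exits.
with CoreNLPClient(annotators=['tokenize', 'ssplit', 'pos', 'lemma', 'ner'],
                   timeout=30000, memory='4G') as client:
    ann = client.annotate("Chris Manning teaches at Stanford University.")
    for token in ann.sentence[0].token:
        print(token.word, token.pos, token.ner)
```

If the server is instead started manually from coreNLP_server_start.ipynb, the client can be pointed at that running endpoint rather than launching its own.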
The built image is published as tftwdockerhub/stanfordNLP-gpu:latest. On dsvm-gpu virtual machines, pull the image with:
sudo docker pull tftwdockerhub/stanfordNLP-gpu:latest
Run the container, mapping host port 8889 to the container's Jupyter port 8888 and mounting the project directory:
sudo nvidia-docker run -it -p 8889:8888 -v \<project-dir-path\>:/app tftwdockerhub/stanfordNLP-gpu:latest
In a local browser, open the notebook URL using target port 8889 and the token string shown on the CLI:
http://\<vm-ipaddress-or-dns\>:8889/?token=21c5e12xxxxxx