LING 5801 project component - ML-based question answering system
- Download CoreNLP from the Stanford CoreNLP website and place it inside a `models` directory in the project.
- From the CoreNLP directory (e.g. `stanford-corenlp-full-2018-10-05`), start the server with:

  ```
  java -mx4g -cp "*" edu.stanford.nlp.pipeline.StanfordCoreNLPServer -port 9000 -timeout 300000
  ```

- Run the model evaluator with `python3 eval.py`.
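The evaluator presumably talks to the CoreNLP server started in the steps above via its HTTP API. As a minimal sketch (the helper names here are hypothetical, not from `eval.py`; it assumes the server is listening on `localhost:9000`), annotating text amounts to POSTing it to the server with a JSON `properties` query parameter:

```python
import json
import urllib.parse
import urllib.request


def build_annotate_url(host="localhost", port=9000,
                       annotators="tokenize,ssplit,pos"):
    """Build the CoreNLP server URL, with annotators passed as a
    JSON 'properties' query parameter (per the CoreNLP server API)."""
    props = json.dumps({"annotators": annotators, "outputFormat": "json"})
    return f"http://{host}:{port}/?properties={urllib.parse.quote(props)}"


def annotate(text, url=None):
    """POST raw UTF-8 text to the server and return the parsed
    JSON annotation. Requires the server from the step above to
    be running."""
    url = url or build_annotate_url()
    req = urllib.request.Request(url, data=text.encode("utf-8"))
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

For example, `annotate("Who wrote Hamlet?")` would return a JSON structure with per-sentence token and part-of-speech annotations, which a question answering pipeline can then consume.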