Using the pretrained model programmatically
gkiril opened this issue · 2 comments
Can you please provide clean code where, given a text as input, I can use the pre-trained model and get entity links as output?
Solved it.
I started gerbil first and then ran the experiments (it was not clear to me that I should do this).
More specifically, these were the steps:
Step 1: start GERBIL:
    cd gerbil/
    ./start.sh
Step 2: start the GERBIL wrapper webservice:
    cd end2end_neural_el/gerbil-SpotWrapNifWS4Test/
    mvn clean -Dmaven.tomcat.port=1235 tomcat:run
Step 3: start the model server:
    cd end2end_neural_el/code
    For EL:
    python -m gerbil.server --training_name=base_att_global --experiment_name=paper_models --persons_coreference_merge=True --all_spans_training=True --entity_extension=extension_entities
    For ED:
    python -m gerbil.server --training_name=base_att_global --experiment_name=paper_models --persons_coreference_merge=True --ed_mode --entity_extension=extension_entities
Then, send requests using the JSON format:
    { "text": "Obama will visit Germany and have a meeting with Merkel tomorrow.", "spans": [] }
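Building that payload in code might look like the following sketch; `build_payload` is a hypothetical helper name, but the `(start, length)` span format follows the example above:

```python
def build_payload(text, mentions=None):
    """Build the {"text": ..., "spans": ...} payload the server expects.

    mentions: optional list of (start, length) character offsets; leave it
    empty to let the model detect mentions itself (EL mode).
    """
    spans = [{"start": s, "length": l} for (s, l) in (mentions or [])]
    return {"text": text, "spans": spans}

# EL-style payload with no pre-specified spans:
payload = build_payload("Obama will visit Germany and have a meeting with Merkel tomorrow.")
print(payload["spans"])  # []
```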
import json
import requests

# For ED, provide the mention spans explicitly:
# myjson = { "text": "Obama will visit Germany and have a meeting with Merkel tomorrow.", "spans": [{"start": 0, "length": 5}, {"start": 17, "length": 7}, {"start": 49, "length": 6}] }
# For EL, leave "spans" empty so the model detects mentions itself:
# myjson = { "text": "Obama will visit Germany and have a meeting with Merkel tomorrow.", "spans": [] }
with open('../data/test.json') as json_file:
    myjson = json.load(json_file)

result = requests.post("http://localhost:5555", json=myjson)
print(result.text)
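The shape of the server's reply isn't shown above; assuming it is a JSON list of [start, length, entity] triples (an assumption — check what your server actually returns), a small helper could map the offsets back onto the input text:

```python
def spans_to_mentions(text, triples):
    """Pair each (start, length, entity) triple with its mention surface form.

    Assumes `triples` looks like [[0, 5, "Barack_Obama"], ...]; adjust if the
    server returns a different structure.
    """
    return [(text[start:start + length], entity)
            for start, length, entity in triples]

text = "Obama will visit Germany and have a meeting with Merkel tomorrow."
# Hypothetical reply, for illustration only:
reply = [[0, 5, "Barack_Obama"], [17, 7, "Germany"], [49, 6, "Angela_Merkel"]]
print(spans_to_mentions(text, reply))
```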
I have made a Jupyter notebook for it, which is already in a pull request.
Meanwhile, you can get it here: notebook