emorynlp/nlp4j-old

only POS tagger

Closed this issue · 4 comments

Hi,

The command-line tagging takes a lot of time and is not suitable for real-time systems. Please provide an API for running single components, e.g., only the POS tagger.

What is your input format? I believe NLPDecoder can be used for this, but if you let me know the format, I can provide a more fine-grained API.

We are calling the decoder with a text file; it is taking around 2 minutes 30 seconds to process.


I think most of the time is probably spent on model loading. Here is the API you can use for this purpose:

https://github.com/emorynlp/nlp4j-demo/blob/master/src/main/java/edu/emory/mathcs/nlp/demo/NLPDecodeRaw.java

Here is the configuration file you can use for POS tagging only:
https://github.com/emorynlp/nlp4j-demo/blob/master/src/main/resources/configuration/config-decode-en-pos.xml

The following line loads all the models, which should be done only once:

NLPDecoder decoder = new NLPDecoder(IOUtils.getInputStream(configurationFile));
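
As a rough sketch, a long-running POS tagging setup would load the decoder once at startup and reuse it for every request. The decode(String) call, the NLPNode accessors, and the import paths other than NLPDecoder below are assumptions on my part; see NLPDecodeRaw linked above for the exact API.

import edu.emory.mathcs.nlp.common.util.IOUtils;               // package path assumed
import edu.emory.mathcs.nlp.component.template.node.NLPNode;   // package path assumed
import edu.emory.mathcs.nlp.decode.NLPDecoder;

public class PosTaggerExample
{
    public static void main(String[] args)
    {
        // Slow part: load the models once, at startup, using the POS-only configuration above.
        NLPDecoder decoder = new NLPDecoder(IOUtils.getInputStream("config-decode-en-pos.xml"));

        // Fast part: reuse the same decoder for every sentence/request.
        // The decode(String) call is an assumption; check NLPDecodeRaw for the exact method.
        NLPNode[] nodes = decoder.decode("The decoder should only be loaded once.");

        for (NLPNode node : nodes)
            System.out.println(node.getWordForm() + "\t" + node.getPartOfSpeechTag());
    }
}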

Thanks.

Thanks a lot for the reply. I will check and let you know if we have further issues.

Regards,
Debanjan
