Franck-Dernoncourt/NeuroNER

ValueError: Tried to convert 'value' to a tensor and failed. Error: Cannot create a tensor proto whose content is larger than 2GB.

bluesea0 opened this issue · 0 comments

Hello, I am trying to run the baseline of NeuroNER with token_pretrained_embedding_filepath = ./data/word_vectors/wikipedia-pubmed-and-PMC-w2v.txt. The word embedding text file is 12.25 GB, and I get the following error:
File "/local/lib/python3.5/site-packages/neuroner/entity_lstm.py", line 361, in load_pretrained_token_embeddings sess.run(self.token_embedding_weights.assign(initial_weights)) File "/local/lib/python3.5/site-packages/tensorflow/python/ops/variables.py", line 1762, in assign name=name) ValueError: Tried to convert 'value' to a tensor and failed. Error: Cannot create a tensor proto whose content is larger than 2GB.

I found a possible solution here: https://stackoverflow.com/questions/35394103/initializing-tensorflow-variable-with-an-array-larger-than-2gb.
However, I am not sure whether I should modify the source code in entity_lstm.py of NeuroNER. How can I solve this problem?
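For reference, this is a minimal sketch of the placeholder-based workaround described in the linked Stack Overflow answer, assuming TensorFlow 1.x (the version NeuroNER is built on). The variable name mirrors the one in the traceback, while the sizes and the zero-filled `initial_weights` array are hypothetical stand-ins for the real pretrained embedding matrix:

```python
# Sketch of the placeholder-based initialization that avoids the 2 GB
# protobuf limit (assumes TensorFlow 1.x).
import numpy as np
import tensorflow as tf

vocabulary_size = 100000       # hypothetical vocabulary size
embedding_dimension = 200      # hypothetical embedding dimension
# Hypothetical stand-in for the pretrained embedding matrix loaded from disk.
initial_weights = np.zeros((vocabulary_size, embedding_dimension), dtype=np.float32)

token_embedding_weights = tf.get_variable(
    "token_embedding_weights",
    shape=[vocabulary_size, embedding_dimension],
    trainable=False)

# Feed the large array through a placeholder instead of baking it into the
# graph as a constant; the array never becomes part of the graph proto.
embedding_placeholder = tf.placeholder(
    tf.float32, shape=[vocabulary_size, embedding_dimension])
embedding_init = token_embedding_weights.assign(embedding_placeholder)

with tf.Session() as sess:
    sess.run(embedding_init,
             feed_dict={embedding_placeholder: initial_weights})
```

If applied to NeuroNER, the analogous change would presumably replace the direct `sess.run(self.token_embedding_weights.assign(initial_weights))` call in load_pretrained_token_embeddings with a placeholder feed like the one above, but I have not verified this against the current entity_lstm.py.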