
ValueError: Tried to convert 'value' to a tensor and failed. Error: Cannot create a tensor proto whose content is larger than 2GB. #145

bluesea0 opened this issue Apr 16, 2019 · 0 comments
Hello, I am trying to run the baseline of NeuroNER with token_pretrained_embedding_filepath = ./data/word_vectors/wikipedia-pubmed-and-PMC-w2v.txt. The word embedding text file is 12.25 GB, and I get the following error:
File "/local/lib/python3.5/site-packages/neuroner/entity_lstm.py", line 361, in load_pretrained_token_embeddings
    sess.run(self.token_embedding_weights.assign(initial_weights))
File "/local/lib/python3.5/site-packages/tensorflow/python/ops/variables.py", line 1762, in assign
    name=name)
ValueError: Tried to convert 'value' to a tensor and failed. Error: Cannot create a tensor proto whose content is larger than 2GB.

I found a possible solution at https://stackoverflow.com/questions/35394103/initializing-tensorflow-variable-with-an-array-larger-than-2gb. However, I am not sure whether I should modify the source code in entity_lstm.py of NeuroNER to apply it. How can I solve this problem?
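For reference, the approach from that Stack Overflow answer is to feed the embedding matrix through a placeholder at session-run time instead of baking it into the graph as a constant, since the `assign(initial_weights)` call converts the NumPy array into a single tensor proto, which is capped at 2 GB by protobuf. A minimal sketch of the pattern (written against the TF1-style API via `tf.compat.v1`; the variable name `token_embedding_weights`, the small shape, and the random data are stand-ins, not NeuroNER's actual code):

```python
import numpy as np
import tensorflow.compat.v1 as tf  # TF1-style graph/session API

tf.disable_eager_execution()

# Small stand-in matrix; the real pretrained embeddings are too large
# to embed in the graph as a constant (2 GB tensor proto limit).
vocab_size, dim = 1000, 50
initial_weights = np.random.rand(vocab_size, dim).astype(np.float32)

token_embedding_weights = tf.get_variable(
    "token_embedding_weights",
    shape=[vocab_size, dim],
    initializer=tf.zeros_initializer(),
)

# Key idea: assign from a placeholder, so the big array is passed via
# feed_dict at run time and never serialized into the graph definition.
embedding_placeholder = tf.placeholder(tf.float32, shape=[vocab_size, dim])
assign_op = token_embedding_weights.assign(embedding_placeholder)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())
    sess.run(assign_op, feed_dict={embedding_placeholder: initial_weights})
    loaded = sess.run(token_embedding_weights)
```

Applying this to NeuroNER would presumably mean changing the `sess.run(self.token_embedding_weights.assign(initial_weights))` line in entity_lstm.py to this placeholder-and-feed pattern, but I have not verified that against the NeuroNER codebase.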
