Title: ModuleNotFoundError: No module named 'gluonnlp' in Google Colab
Description:
While working in Google Colab, I encountered a ModuleNotFoundError with the message "No module named 'gluonnlp'", despite installing the module with the command !pip install gluonnlp==0.10.0.
Steps to Reproduce:
Open Google Colab.
Execute the command !pip install gluonnlp==0.10.0.
Import the 'gluonnlp' module using import gluonnlp.
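For reference, these steps amount to a single Colab cell like the sketch below (assuming a fresh runtime; the print line is only there to confirm which build was imported):
!pip install gluonnlp==0.10.0
import gluonnlp
print(gluonnlp.__version__)  # confirms the import succeeded and shows the installed version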
Expected Behavior:
The module 'gluonnlp' should be successfully installed and imported without any errors.
Actual Behavior:
The code fails to import the 'gluonnlp' module and raises the following error:
ModuleNotFoundError: No module named 'gluonnlp'
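A ModuleNotFoundError right after a seemingly successful pip install in a notebook is often an interpreter mismatch or a restarted runtime rather than a failed download. One way to rule that out (a suggestion, not part of the original report) is to install via the kernel's own interpreter and retry the import in the same session:
import sys
print(sys.executable)                              # the interpreter the Colab kernel is running on
!{sys.executable} -m pip install gluonnlp==0.10.0  # install into that exact interpreter
import gluonnlp                                    # retry without restarting the runtime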
Description
gluonnlp fails to import in Google Colab even after installation.
!pip install mxnet gluonnlp pandas tqdm
!pip install sentencepiece
!pip install transformers
!pip install torch
!pip install git+https://[email protected]/SKTBrain/KoBERT.git@master
import gluonnlp as nlp
Error Message
ImportError Traceback (most recent call last)
in <cell line: 6>()
4 import torch.optim as optim
5 from torch.utils.data import Dataset, DataLoader
----> 6 import gluonnlp as nlp
7 import numpy as np
8 from tqdm import tqdm, tqdm_notebook
6 frames
/usr/local/lib/python3.10/dist-packages/gluonnlp/__init__.py in
23
24 from . import loss
---> 25 from . import data
26 from . import embedding
27 from . import model
/usr/local/lib/python3.10/dist-packages/gluonnlp/data/__init__.py in
21 import os
22
---> 23 from . import (batchify, candidate_sampler, conll, corpora, dataloader,
24 dataset, question_answering, registry, sampler, sentiment,
25 stream, super_glue, transforms, translation, utils,
/usr/local/lib/python3.10/dist-packages/gluonnlp/data/corpora/__init__.py in
19 """Corpora."""
20
---> 21 from . import (google_billion_word, large_text_compression_benchmark, wikitext)
22
23 from .google_billion_word import *
/usr/local/lib/python3.10/dist-packages/gluonnlp/data/corpora/google_billion_word.py in
32 from ..._constants import EOS_TOKEN
33 from ...base import get_home_dir
---> 34 from ...vocab import Vocab
35 from ..dataset import CorpusDataset
36 from ..stream import SimpleDatasetStream
/usr/local/lib/python3.10/dist-packages/gluonnlp/vocab/__init__.py in
19 """Vocabulary."""
20
---> 21 from . import bert, elmo, subwords, vocab
22 from .bert import *
23 from .elmo import *
/usr/local/lib/python3.10/dist-packages/gluonnlp/vocab/bert.py in
22 import os
23
---> 24 from ..data.transforms import SentencepieceTokenizer
25 from ..data.utils import count_tokens
26 from .vocab import Vocab
/usr/local/lib/python3.10/dist-packages/gluonnlp/data/transforms.py in
46 from ..vocab.vocab import Vocab
47 from .utils import _extract_archive
---> 48 from .fast_bert_tokenizer import is_control, is_punctuation, is_whitespace
49 from .fast_bert_tokenizer import BasicTokenizer, WordpieceTokenizer
50
ImportError: /usr/local/lib/python3.10/dist-packages/gluonnlp/data/fast_bert_tokenizer.cpython-310-x86_64-linux-gnu.so: undefined symbol: _PyGen_Send
NOTE: If your import is failing due to a missing package, you can
manually install dependencies using either !pip or !apt.
To view examples of installing some common dependencies, click the
"Open Examples" button below.
To Reproduce
Run the install commands listed under Description above in Google Colab, then import gluonnlp.
What have you tried to solve it?
Environment
Google Colab, Python 3.10 (per the /usr/local/lib/python3.10/ paths in the traceback).
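To fill this section in more completely, a small cell like the one below (suggested here, not taken from the report) prints the details maintainers usually ask for:
import sys
print(sys.version)  # kernel Python version
!pip show gluonnlp mxnet torch transformers | grep -E "^(Name|Version)"  # installed package versions, if present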