
All Questions

0 votes
1 answer
125 views

torch.OutOfMemoryError: CUDA out of memory. (Google Colab)

I tried to adapt the mBERT model to an existing codebase. However, I get the following error even though I tried different solutions: torch.OutOfMemoryError: CUDA out of memory. Tried to allocate 20....
MarMarhoun
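Since the excerpt cuts off before any fix, here is a minimal, hedged sketch of the adjustments that usually relieve CUDA OOM when fine-tuning mBERT on Colab with the Hugging Face Trainer; the argument values are assumptions, not the asker's settings.

```python
from transformers import TrainingArguments

# Smaller batches, gradient accumulation and mixed precision are the usual
# first levers for "CUDA out of memory" on a single Colab GPU.
training_args = TrainingArguments(
    output_dir="mbert_out",           # hypothetical output directory
    per_device_train_batch_size=8,    # smaller per-step batch uses less GPU memory
    gradient_accumulation_steps=4,    # keeps the effective batch size at 32
    fp16=True,                        # mixed precision roughly halves activation memory
)
```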
0 votes
2 answers
1k views

Google Colab unable to run Hugging Face model

I would like to tag parts of speech using the BERT model and used the Hugging Face library for this purpose. When I run the model via the Hugging Face API I get the expected output. However, when I run the code on Google ...
Encipher • 3,168
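For context, a minimal sketch of part-of-speech tagging through the Hugging Face token-classification pipeline; the checkpoint name is an assumed example, not necessarily the one used in the question.

```python
from transformers import pipeline

# Any BERT checkpoint fine-tuned for POS tagging can be substituted here.
pos_tagger = pipeline(
    "token-classification",
    model="vblagoje/bert-english-uncased-finetuned-pos",  # assumed example checkpoint
    aggregation_strategy="simple",
)
print(pos_tagger("The quick brown fox jumps over the lazy dog."))
```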
2 votes
1 answer
119 views

BERTopic: add legend to term score decline

I plotted the term score decline for a topic model I created on Google Colab with BERTopic. Great function, works neatly! But I need to add a legend. This parameter is not specified in the topic_model....
Simone • 625
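One possible workaround, assuming visualize_term_rank() is the method behind the plot and that it returns a Plotly figure (as it does in recent BERTopic releases): toggle the legend on the figure itself rather than through a BERTopic parameter.

```python
# topic_model is the fitted BERTopic model from the question.
fig = topic_model.visualize_term_rank()
fig.update_layout(showlegend=True)   # Plotly-level toggle, not a BERTopic argument
fig.show()
```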
0 votes
1 answer
109 views

CUDA out of memory when running validation test for SciBERT

I am trying to train SciBERT with a not-too-big dataset (roughly 10,000 rows). I have partitioned the dataset into train, validation and test sets in proportions of 0.6, 0.2 and 0.2 respectively. Then I ...
Hoang Cuong Nguyen
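A minimal sketch of the two usual validation-time fixes, assuming a plain PyTorch loop with model and val_dataset defined elsewhere: a small evaluation batch size and no gradient tracking.

```python
import torch
from torch.utils.data import DataLoader

val_loader = DataLoader(val_dataset, batch_size=8)   # small eval batches

model.eval()
with torch.no_grad():                                # no activations kept for backprop
    for batch in val_loader:
        batch = {k: v.to(model.device) for k, v in batch.items()}
        outputs = model(**batch)
```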
9 votes
1 answer
36k views

ERROR: file:///content does not appear to be a Python project: neither 'setup.py' nor 'pyproject.toml' found

https://colab.research.google.com/drive/11u6leEKvqE0CCbvDHHKmCxmW5GxyjlBm?usp=sharing The setup.py file is in the transformers folder (root directory), but this error occurs when I run !git clone https://...
Paul • 159
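This error usually means pip install -e . was run from /content instead of from inside the cloned repository. A sketch of the Colab cell sequence, assuming the standard transformers repository URL:

```python
!git clone https://github.com/huggingface/transformers.git
%cd transformers          # setup.py / pyproject.toml live here, not in /content
!pip install -e .
```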
1 vote
0 answers
183 views

How to obtain the [CLS] sentence embedding of multiple sentences successively without facing a RAM crash?

I would like to obtain the [CLS] token's sentence embedding (as it represents the whole sentence's meaning) using BERT. I have many sentences (about 40) that belong to a Document, and 246 such ...
Aadithya Seshadri
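A minimal sketch of collecting [CLS] embeddings in small batches under torch.no_grad(), moving each batch to the CPU immediately so memory does not accumulate; sentences is assumed to be a list of strings.

```python
import torch
from transformers import AutoTokenizer, AutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased").eval()

cls_embeddings = []
with torch.no_grad():
    for i in range(0, len(sentences), 32):                 # batches of 32 sentences
        batch = tokenizer(sentences[i:i + 32], padding=True,
                          truncation=True, return_tensors="pt")
        out = model(**batch)
        cls_embeddings.append(out.last_hidden_state[:, 0, :].cpu())  # [CLS] is token 0
cls_embeddings = torch.cat(cls_embeddings)
```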
0 votes
0 answers
322 views

Got the error TypeError: __call__() got an unexpected keyword argument 'max_length' when automatically generating meta descriptions using BERT

I am trying to automatically generate meta descriptions using Python and BERT in a Google Colab notebook, following this tutorial: https://lazarinastoy.com/automatically-generate-meta-descriptions-...
nybergan
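A hedged sketch of a summarization-pipeline call that accepts max_length in current transformers releases; if the installed version rejects the keyword, upgrading the library is the usual remedy. page_text and the length values are assumptions.

```python
from transformers import pipeline

summarizer = pipeline("summarization")          # default summarization checkpoint
meta = summarizer(page_text, max_length=160, min_length=60, do_sample=False)
print(meta[0]["summary_text"])
```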
1 vote
1 answer
4k views

from summarizer import Summarizer is not running in Colab

I'm trying to use the BERT text summarizer inside Colab, but when I run from summarizer import Summarizer I get the error below: -------------------------------------...
bruno_barbosa
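Assuming the question targets the bert-extractive-summarizer package (which provides the summarizer module), a sketch of installing it explicitly in the Colab session before importing:

```python
!pip install -q bert-extractive-summarizer

from summarizer import Summarizer

model = Summarizer()
print(model("Text to summarize goes here. Add a few sentences for a useful result."))
```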
2 votes
0 answers
1k views

BrokenProcessPool: A task has failed to un-serialize. Please ensure that the arguments of the function are all picklable

I am using BERTopic to perform topic modelling on 174,827 rows using the following commands: from bertopic import BERTopic topic_model = BERTopic(language="english", ...
AneesBaqir
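A hedged sketch of a common workaround, not a confirmed fix for this traceback: compute the sentence embeddings up front and pass them to fit_transform, so less work goes through the multiprocessing layer that raises BrokenProcessPool. docs is the list of 174,827 documents.

```python
from bertopic import BERTopic
from sentence_transformers import SentenceTransformer

embedder = SentenceTransformer("all-MiniLM-L6-v2")
embeddings = embedder.encode(docs, show_progress_bar=True)

topic_model = BERTopic(language="english", low_memory=True)
topics, probs = topic_model.fit_transform(docs, embeddings)
```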
0 votes
1 answer
465 views

BERT Text Classification Tasks for Beginners

Can anyone list, in simple terms, the tasks involved in building a BERT text classifier for someone new to CS working on their first project? Mine involves taking a list of paragraph-length humanitarian aid ...
brentxphillips
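A compact sketch of the usual steps for a first BERT classifier (tokenize, wrap in a Trainer, train); the checkpoint, label count and output directory are placeholders, and train_ds / val_ds are assumed to be Hugging Face Datasets with text and label columns.

```python
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          TrainingArguments, Trainer)

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained("bert-base-uncased",
                                                           num_labels=2)

def tokenize(batch):
    return tokenizer(batch["text"], padding=True, truncation=True)

train_ds = train_ds.map(tokenize, batched=True)
val_ds = val_ds.map(tokenize, batched=True)

trainer = Trainer(
    model=model,
    args=TrainingArguments(output_dir="clf_out", num_train_epochs=3),
    train_dataset=train_ds,
    eval_dataset=val_ds,
)
trainer.train()
```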
1 vote
2 answers
10k views

_batch_encode_plus() got an unexpected keyword argument 'return_attention_masks'

I am studying the RoBERTa model to detect emotions in tweets, on Google Colab, following this notebook from Kaggle: https://www.kaggle.com/ishivinal/tweet-emotions-analysis-using-lstm-glove-roberta?...
Shubhasmita Roy
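The keyword was renamed in later transformers releases: current tokenizers expect return_attention_mask (singular). A minimal sketch of the updated call:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("roberta-base")
encoded = tokenizer(
    ["I love this!", "This is awful."],
    padding=True,
    truncation=True,
    return_attention_mask=True,   # singular in current versions
    return_tensors="pt",
)
```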
0 votes
1 answer
2k views

Slow training of BERT model with Hugging Face

I am training a binary classifier using the BERT model implemented in the Hugging Face library: training_args = TrainingArguments( "deleted_tweets_trainer", num_train_epochs = ...
Alex Kujur
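A sketch of TrainingArguments settings that typically speed up BERT training on Colab; the values are assumptions, and the first line simply confirms the notebook is actually using a GPU runtime.

```python
import torch
from transformers import TrainingArguments

print(torch.cuda.is_available())       # should print True on a GPU runtime

training_args = TrainingArguments(
    "deleted_tweets_trainer",
    num_train_epochs=3,
    per_device_train_batch_size=32,    # larger batches mean fewer optimizer steps
    fp16=True,                         # mixed precision on the Colab GPU
    dataloader_num_workers=2,          # overlap data loading with compute
)
```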
0 votes
0 answers
864 views

Training a BERT model and running out of memory - Google Colab

I keep running out of memory even after I bought Google Colab Pro, which allows 25 GB of RAM. I have no idea why this is happening. I tried every kernel possible (Google Colab, Google Colab Pro, Kaggle ...
barcaman • 137
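When host RAM (rather than GPU memory) is the limit, tokenizing on the fly inside the Dataset keeps only raw strings in memory instead of a fully tokenized copy of the corpus. A sketch assuming plain PyTorch with lists texts and labels:

```python
import torch
from torch.utils.data import Dataset
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

class LazyTextDataset(Dataset):
    def __init__(self, texts, labels):
        self.texts, self.labels = texts, labels

    def __len__(self):
        return len(self.texts)

    def __getitem__(self, idx):
        # Tokenize one example at a time instead of pre-tokenizing everything.
        enc = tokenizer(self.texts[idx], truncation=True, max_length=128,
                        padding="max_length", return_tensors="pt")
        item = {k: v.squeeze(0) for k, v in enc.items()}
        item["labels"] = torch.tensor(self.labels[idx])
        return item
```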
17 votes
5 answers
90k views

Transformer: Error importing packages. "ImportError: cannot import name 'SAVE_STATE_WARNING' from 'torch.optim.lr_scheduler'"

I am working on a machine learning project on Google Colab; it seems there has recently been an issue when trying to import packages from transformers. The error message says: ImportError: cannot import ...
Spartan 332
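The usual remedy (an assumption based on the error, not on the full question): SAVE_STATE_WARNING was dropped from newer PyTorch releases while older transformers versions still import it, so updating transformers and restarting the Colab runtime clears the error.

```python
!pip install -U transformers

import transformers, torch
print(transformers.__version__, torch.__version__)   # verify versions after the restart
```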
3 votes
3 answers
2k views

List index out of range when saving fine-tuned TensorFlow model

I'm trying to fine-tune a pre-trained BERT model from Hugging Face using TensorFlow. Everything runs smoothly and the model builds and trains without error, but when I try to save the model it stops ...
Haag • 47
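A hedged workaround sketch, not a confirmed fix for this exact traceback: saving the fine-tuned weights with save_pretrained() sidesteps many Keras model.save() issues with Hugging Face TF models. model and tokenizer are the objects from the fine-tuning code.

```python
model.save_pretrained("finetuned_bert")       # writes tf_model.h5 + config.json
tokenizer.save_pretrained("finetuned_bert")

# Reload later with:
# from transformers import TFAutoModelForSequenceClassification, AutoTokenizer
# model = TFAutoModelForSequenceClassification.from_pretrained("finetuned_bert")
```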
