natural language processing
inalan, 2021-04-24 20:19:37

Running out of Memory on RuBERT?

I'm trying to solve a problem similar to this tutorial:
dp_tutorials/Tutorial_3_RU_Fine_tuning_BERT_classifier.ipynb at master deepmipt/dp_tutorials GitHub
(a multi-class sentiment classifier). The only difference is that my dataset is in Russian, so I downloaded RuBERT and changed a few parts of the config, such as the path to the dataset and the separator.
The dataset looks like this:

   content                                                        emotions
0  An ordinary family: a husband is at sea, a wife is a baby…     smile
1  I am familiar with this office and bypass it,…                 angry
2  I needed an apartment for a long time (about a year) until…    smile
3  My friend: - Don't feed my dog cheese he has a…                smile
4  A tweet by Musk: “If life is a computer game…                  smile
I do everything exactly the same as in the tutorial, but when I load this dataset together with RuBERT, it runs out of RAM. When I use BERT-base, English, uncased, 12-layer with the English dataset, everything loads and trains fine. I can't figure out what the problem could be. Could you suggest what I'm doing wrong? Here's the code if you want to take a look: BERT/Ru_Bert.ipynb at main MuhammedTech/BERT GitHub
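Roughly, the config edits look like this (a simplified sketch, not my exact notebook: the starting config name, data path, column names, and separator below are placeholders):

# Simplified sketch of the config edits, assuming the DeepPavlov Python API
# from the tutorial; data_path, column names and sep are placeholders.
from deeppavlov import configs, train_model
from deeppavlov.core.common.file import read_json

# Start from one of the BERT-based classifier configs shipped with DeepPavlov
model_config = read_json(configs.classifiers.rusentiment_bert)

# Point the dataset reader at the Russian data and describe its columns
model_config["dataset_reader"] = {
    "class_name": "basic_classification_reader",
    "data_path": "./data/",   # placeholder path to the csv with the dataset
    "x": "content",           # text column
    "y": "emotions",          # label column
    "sep": "\t",              # assumed separator, passed to the csv reader
}

# Train the classifier; download=True fetches the pretrained weights if missing
model = train_model(model_config, download=True)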
Thanks in advance
