Neural networks
Vitalionus, 2019-10-10 12:36:56

What is the minimum amount of material needed to study a neural network for text generation?

There is a fork of the project: the GPT-2 neural network from OpenAI.
Here it is on GitHub, from nshepperd.
You can train your own model with it; it is enough to prepare the input data.
I want to run experiments: I need the network to generate logically coherent texts on a single topic, for example "fishing on the river".
I'm thinking of buying a few books on this topic and feeding their text in for training (a rough sketch of how I would prepare the corpus is below). But the question is...
Is there an average minimum? How much text is needed?
Also, are there perhaps other open-source projects with support for the Russian language?
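For reference, a minimal sketch of how I would prepare the corpus, assuming each book is already exported to a plain-text file (the file names here are placeholders, not real files). GPT-2 uses <|endoftext|> as a document separator, so each book is delimited with it; the resulting single file would then be given to the fork's training script.

```python
# Minimal sketch: merge several plain-text book files into one training
# corpus, separating documents with GPT-2's <|endoftext|> token.
# The input file names are hypothetical placeholders.
from pathlib import Path

BOOKS = ["fishing_book_1.txt", "fishing_book_2.txt"]  # placeholder inputs
OUTPUT = Path("fishing_corpus.txt")                    # file to pass to the trainer

with OUTPUT.open("w", encoding="utf-8") as out:
    for name in BOOKS:
        text = Path(name).read_text(encoding="utf-8").strip()
        out.write(text)
        out.write("\n<|endoftext|>\n")  # mark the end of each document

print(f"Wrote {OUTPUT} ({OUTPUT.stat().st_size} bytes)")
```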


1 answer
mayton2019, 2019-10-12

For neural networks, people usually read Simon Haykin. But I'm not sure he wrote anything about text generation; that is very specific. As for an "average minimum", that is a very strange question; it is unlikely that such a criterion exists.
