How to limit the available alphabet in a recurrent network for text generation?
There is a classic task: train an RNN to generate text. Suppose I want each next character in the sequence to be drawn not from the entire alphabet but from some subset of it, and that subset changes at every step. The rule for choosing the subset can be given as a bitmask for each initial character sequence in the training sample.
How can this constraint be taken into account in the neural network architecture?
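One common approach (a sketch of one possible solution, not the only one) is to leave the architecture itself unchanged and apply the bitmask at the output layer: set the logits of disallowed characters to negative infinity before the softmax, so they get zero probability and can never be sampled. The example below assumes PyTorch and a simple GRU character model; the names (`CharRNN`, `sample_step`, `VOCAB_SIZE`) and the example mask are hypothetical.

```python
import torch
import torch.nn as nn

VOCAB_SIZE = 64  # assumed alphabet size

class CharRNN(nn.Module):
    """Ordinary character-level RNN; the mask is applied outside the model."""
    def __init__(self, vocab_size, hidden_size=128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden_size)
        self.rnn = nn.GRU(hidden_size, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, vocab_size)

    def forward(self, x, h=None):
        out, h = self.rnn(self.embed(x), h)
        return self.head(out), h

def sample_step(model, prev_char, hidden, mask):
    """One generation step.

    mask: bool tensor of shape (VOCAB_SIZE,), True = character allowed
    at this step. It can be different on every call.
    """
    logits, hidden = model(prev_char.view(1, 1), hidden)
    logits = logits[0, -1]
    # Forbid characters outside the current subset before the softmax:
    # -inf logits become exactly zero probability.
    logits = logits.masked_fill(~mask, float("-inf"))
    probs = torch.softmax(logits, dim=-1)
    next_char = torch.multinomial(probs, 1)
    return next_char, hidden

# Usage: each step supplies its own bitmask.
model = CharRNN(VOCAB_SIZE)
mask = torch.zeros(VOCAB_SIZE, dtype=torch.bool)
mask[[3, 7, 12]] = True  # hypothetical allowed subset for this step
char, h = sample_step(model, torch.tensor(0), None, mask)
```

The same masking can also be applied to the logits during training, so that the cross-entropy loss is computed only over the allowed subset and the network never spends probability mass on characters the mask will forbid at generation time.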