Neural networks
ivodopyanov, 2016-02-15 10:00:37

How to limit available alphabet in recurrent network for text generation?

There is a classic task: train an RNN to generate text. Suppose I want each next character in the sequence to be selected not from the entire alphabet, but from some subset of it that changes at every step. The rule defining this subset can be given as a bitmask for each initial character sequence in the training sample.
How can this constraint be incorporated into the neural network architecture?
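One common way to impose such a constraint (an assumption here, not from the question itself) is to leave the network unchanged and apply the bitmask at sampling time: set the logits of disallowed characters to negative infinity before the softmax, so they receive zero probability. A minimal NumPy sketch, where `masked_sample` and its arguments are hypothetical names:

```python
import numpy as np

def masked_sample(logits, mask, rng=None):
    """Sample a character index from logits, restricted to positions
    where mask is 1. Disallowed positions get probability exactly 0."""
    rng = rng or np.random.default_rng()
    # push disallowed logits to -inf so softmax assigns them zero mass
    masked = np.where(mask.astype(bool), logits, -np.inf)
    exps = np.exp(masked - masked.max())  # subtract max for stability
    probs = exps / exps.sum()
    return rng.choice(len(logits), p=probs)

# toy example: alphabet of 4 characters, only indices 0 and 2 allowed
logits = np.array([2.0, 1.0, 0.5, 3.0])
mask = np.array([1, 0, 1, 0])
idx = masked_sample(logits, mask)
assert mask[idx] == 1  # sampled character is always from the allowed subset
```

The same masking can also be applied during training (e.g. added to the logits before the loss), so the network never spends probability mass on characters the mask forbids at that step.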
