A TensorFlow-based LSTM model that mimics a sequence of any form through a common integer input/output interface for all data types.
In essence, this is Andrej Karpathy's well-known char-rnn, modified to allow non-character inputs.
The RNN implementation is based on char-rnn-tensorflow, but packaged into a class and modified to accept any sequence of values that can be converted to integers.
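The conversion to integers can be done with a simple vocabulary mapping. The sketch below is a hypothetical illustration of the idea (the project's actual helper names may differ): each distinct value in the sequence, whether a character, word, or note, gets a unique integer ID, and decoding inverts the map.

```python
# Hypothetical sketch: map arbitrary hashable sequence values to integer IDs.
def build_vocab(sequence):
    """Assign a unique integer to each distinct value in the sequence."""
    values = sorted(set(sequence))
    to_int = {v: i for i, v in enumerate(values)}
    to_val = {i: v for v, i in to_int.items()}
    return to_int, to_val

def encode(sequence, to_int):
    """Convert a sequence of values into a list of integer IDs."""
    return [to_int[v] for v in sequence]

def decode(ids, to_val):
    """Convert a list of integer IDs back into the original values."""
    return [to_val[i] for i in ids]

# Example with characters; the same code works for any hashable values.
to_int, to_val = build_vocab("hello")
ids = encode("hello", to_int)
```

Because the model only ever sees the integer IDs, the same training and sampling code serves characters, words, or any other tokenizable data.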
This implementation uses a modified LSTM class from TensorFlow to output the trained model in a simple text format, making cross-platform usage of the model simpler.
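A plain-text weight dump can look like the following sketch. This is a hypothetical format for illustration (the project's actual text format is not specified here): each weight array is written as its name, its shape, and its flattened values, which any platform can parse with basic string handling.

```python
import numpy as np

# Hypothetical text format: for each array, write three lines --
# the name, the space-separated shape, and the flattened values.
def save_weights_text(path, named_weights):
    """Dump a dict of name -> array to a simple cross-platform text file."""
    with open(path, "w") as f:
        for name, w in named_weights.items():
            w = np.asarray(w, dtype=float)
            f.write(name + "\n")
            f.write(" ".join(str(d) for d in w.shape) + "\n")
            f.write(" ".join(repr(float(x)) for x in w.ravel()) + "\n")

def load_weights_text(path):
    """Read the file written by save_weights_text back into a dict."""
    weights = {}
    with open(path) as f:
        lines = [ln.rstrip("\n") for ln in f]
    for i in range(0, len(lines), 3):
        name = lines[i]
        shape = tuple(int(d) for d in lines[i + 1].split())
        vals = np.array([float(x) for x in lines[i + 2].split()])
        weights[name] = vals.reshape(shape)
    return weights
```

The trade-off of a text dump is file size and precision handling, but it removes any dependency on TensorFlow's checkpoint format on the consuming platform.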
My goal with this project was to better understand LSTMs and neural networks in general, and to learn how TensorFlow works.
Code: github.com/cqdinh/SequenceGenerator
Demo: Pre-trained character-by-character model
The demo takes three options (Training Data, Sample Size, and Sample Seed) and displays the generated sample.
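The generation loop behind such a demo can be sketched as follows. This is a hypothetical illustration, not the project's actual sampling code: the seed is fed through the model one ID at a time to prime the hidden state, then the next ID is repeatedly drawn from the model's output distribution until the requested sample size is reached.

```python
import numpy as np

def sample(step_fn, seed_ids, sample_size, rng):
    """Generate sample_size new IDs after priming on seed_ids.

    step_fn(prev_id) -> probability vector over the vocabulary.
    (Hypothetical interface; a real step_fn would also carry LSTM state.)
    """
    out = list(seed_ids)
    probs = None
    for i in seed_ids:
        probs = step_fn(i)            # prime on the seed
    for _ in range(sample_size):
        next_id = int(rng.choice(len(probs), p=probs))
        out.append(next_id)
        probs = step_fn(next_id)      # feed the choice back in
    return out
```

Sampling from the distribution, rather than always taking the argmax, is what keeps the generated text varied between runs with different seeds.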