Generative Fairy Tales

Using Recurrent Neural Networks

I trained a text-generating recurrent neural network (RNN) on a corpus of classic fairy tale stories. The generated fairy tales were both funny and weird, and even a little bit dark...


I used Google Colab, a notebook environment hosted on a short-term virtual machine that allowed me to train on a GPU for free (rather than training on CPUs locally from my terminal). Google Colab comes with common machine learning Python packages preinstalled and also lets you install your own directly in the notebook, which is how I installed textgenrnn.

The aim was to generate text that follows certain rules, so that the algorithm could be judged by how well it reproduces an inherent structure. Because I trained it on a corpus of fairy tales, there were rules the algorithm needed to follow beyond just grammatical ones. The obvious ones were generating a title for each story and starting with "Once upon a time..."; another was ending each story with "And they lived happily ever after...". For the algorithm to learn and follow these rules, the dataset itself had to follow them, meaning each story I fed it needed a title, an opening "Once upon a time...", and a closing "And they lived happily ever after...". As with learning anything, the more these rules were repeated in the training data, the more the model mimicked them in its own generated stories.
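Enforcing those corpus rules can be done with a small preprocessing pass before training. The sketch below is illustrative (the function name and the exact formula strings are my own choices, not from the original pipeline): it makes sure every training example has a title line, the opening formula, and the closing formula.

```python
OPENING = "Once upon a time"
CLOSING = "And they lived happily ever after."

def normalize_story(title, body):
    """Make one training example follow the corpus rules:
    a title line, an opening formula, and a closing formula."""
    body = body.strip()
    # Prepend the opening formula if the story doesn't already start with it.
    if not body.lower().startswith(OPENING.lower()):
        body = f"{OPENING}, {body[0].lower()}{body[1:]}"
    # Append the closing formula if the story doesn't already end with it.
    if not body.rstrip(' ."!').lower().endswith("happily ever after"):
        body = f"{body} {CLOSING}"
    return f"{title.upper()}\n\n{body}\n"
```

Repeating these fixed phrases consistently across every story is what lets the model pick them up as rules rather than coincidences.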

More generally, stories have an explicit structure of beginning, middle, and end, with characters such as a protagonist and an antagonist pursuing a goal throughout. This structure was the hardest for my model to capture: most of the outputs switch characters, genders, and story lines at random, which makes them hard to follow but also makes for funny reading.


List of Texts I Used to Train My Model:

  1. Grimm's Fairy Tales by Jacob Grimm and Wilhelm Grimm

  2. Andersen's Fairy Tales by Hans Christian Andersen

  3. The Fairy Tales of Charles Perrault by Charles Perrault

  4. American Fairy Tales by L. Frank Baum

  5. Children's Hour with Red Riding Hood and Other Stories

  6. Favorite Fairy Tales by Logan Marshall


Training & Model Tweaking:

After every ten epochs (one epoch being one full pass through the input data), the network generated sample text at three temperatures: 0.2, 0.5, and 1.0. Temperature controls the "creativity" of the text: 0.2 is the least creative and stays closest to the patterns of the initial input text, while 1.0 is the most creative (and the most error-prone).
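Temperature sampling itself is simple: the model's probabilities are re-weighted by the temperature before the next token is drawn. This stdlib sketch (function names are mine, not textgenrnn's) shows the effect:

```python
import math
import random

def apply_temperature(probs, temperature):
    """Re-weight a probability distribution by temperature.
    Low temperature sharpens it (the model plays it safe);
    high temperature flattens it (more surprising choices)."""
    scaled = [math.log(p) / temperature for p in probs]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def sample_index(probs, temperature, rng=random):
    """Sample the next token index from the adjusted distribution."""
    weights = apply_temperature(probs, temperature)
    r = rng.random()
    cum = 0.0
    for i, w in enumerate(weights):
        cum += w
        if r <= cum:
            return i
    return len(weights) - 1
```

At temperature 1.0 the distribution is unchanged; at 0.2 the most likely token dominates almost completely, which is why low-temperature samples repeat the corpus's most common phrasings.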

Normally, networks use a single 128-cell or 256-cell layer, but textgenrnn's architecture is slightly different: it has an attention layer that incorporates all of the preceding model layers, which is why it is better to go deeper rather than wider. I used a 128-cell bidirectional LSTM with 4 RNN layers. Making the RNN bidirectional meant that it processed the preceding characters both forward and backward, which I wanted because my text followed specific rules (like titles for each story). I also trained my model at the word level (possible because of Keras's tokenizers) rather than at the character level, so the model used the n previous words and punctuation marks to predict the next word or punctuation mark. This prevented crazy typos, because the model emits whole words from the corpus vocabulary instead of building them up character by character.
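Put together, the setup above could be sketched roughly as follows. The parameter names come from textgenrnn's `train_from_file` keyword arguments; the corpus filename and the function wrapper are placeholders of my own, not the exact notebook code.

```python
# Settings mirroring the post: a 128-cell bidirectional LSTM
# with 4 RNN layers, trained at the word level.
MODEL_CFG = {
    "rnn_size": 128,
    "rnn_layers": 4,
    "rnn_bidirectional": True,
    "word_level": True,
}

def train_fairy_tale_model(corpus_path="fairy_tales.txt", epochs=50):
    """Train a new textgenrnn model on the fairy tale corpus.
    Requires `pip install textgenrnn` (run in the Colab notebook)."""
    from textgenrnn import textgenrnn

    textgen = textgenrnn()
    textgen.train_from_file(
        corpus_path,
        new_model=True,   # train from scratch rather than fine-tuning
        num_epochs=epochs,
        gen_epochs=10,    # print sample output every ten epochs
        **MODEL_CFG,
    )
    return textgen
```

Training on a Colab GPU, each pass over a corpus of this size takes on the order of minutes rather than hours.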

First Few Epochs:

These outputs were far from perfect, but I had to appreciate the amusement generated by the nonsense of words. I found these random word generations naive and childlike, which maybe was perfect for something like fairy tales. Watching sentence and story structure slowly improve from epoch to epoch was like witnessing a child's progression of learning. So while the outputs were not perfect, there was something almost poetic about them, even in the way the text was structured and laid out.

Below are more output snippets from the early epochs (first ten), kept in the same format:





once – eyes

once who had a man who had a little daughter who had a man who

had a man who had a little



a little daughter who had a king


once once

once who had a daughter who had true daughters - - a 


"i will not go to the ball, " said the princess . " i will never go to the ball . "

from its

dream at last ; so you may not come home again . "

More Training, More Epochs:

By now, the model was creating titles for each of its stories and using "Once upon a time" for most of the outputs, learning the things that were repeated most often in the corpus it was fed.

the sleeping beauty

there was once a man who had three sons , and the youngest of whom

was so beautiful, that he was his mother .

the old woman

once upon a time there was once a daughter who had a splendid feathers of a husband ,

and a daughter of a daughter , and most beautiful and very beautiful cottage was a girl .

the king and the daughter

once upon a long apples , and by day , the woman , who was

very beautiful and fair to set out evil , and they never a good daughter . 

the wolf

once upon a time there was a poor man who had a mother who had no one , and

when the little girl had three sons , who had a good meal , but one day , the

old woman , thanking was so ugly that she had made her to whom she had given

her most attention . 

" i will have a smile

Darker Outputs:

Surprisingly, the model produced a few darker outputs, which made me question the origin and meaning of some of the fairy tale stories I was feeding it. Some misused words such as "dead," which made them weird and funny, while others (like the first one below) were eerily poetic.

the dead , “ and so they thought

they were in heaven .

the shoemaker took the beautiful maiden – yard – woman , and told her that she was so dead .




jack pulled his head under his head , and fell on it with his tongue .

the little singing , the poor little thing thought it could not be so happy as before . 


 then they laughed and said , " we will not let the light fall out of

the fire , for they shall not be dead . "

he never lived happily ever after .

Creative Fairy Tale Titles:

Some of the best parts of the outputs were the creative and funny titles the model generated:


festival of the jam


the metal pig

the donkey - skin


the wolf and the beanstalk


cinderalla and the beast


the king and the porcupine


the seven crows and his friends


the lions new clothes


The data you train your model on affects the quality of your outputs, so training data is key! It's also really hard to get outputs that make not only grammatical sense but overall sense at the story level. I want to continue exploring text generation within a classical structured story; figuring out how to implement the rules of structure that stories possess would be my next step. Despite the outputs not being perfect, it was fun to see the silly and even creative things the model generated, and it made me look at fairy tales in a new and refreshing way.



Code & Data Corpus:

I trained my model using textgenrnn by Max Woolf (@minimaxir), whose original code is based on Andrej Karpathy's char-rnn model.

textgenrnn is a Python module built on Keras/TensorFlow that trains character-level or word-level recurrent networks on a text corpus and generates new text from them.
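Once a model is trained, it can be reloaded from its saved files and used for generation. A minimal sketch, assuming textgenrnn's default save filenames (adjust the paths to your own run):

```python
def generate_stories(weights_path="textgenrnn_weights.hdf5",
                     vocab_path="textgenrnn_vocab.json",
                     config_path="textgenrnn_config.json",
                     n=3, temperature=0.5):
    """Reload a trained textgenrnn model and generate n stories."""
    from textgenrnn import textgenrnn

    textgen = textgenrnn(weights_path=weights_path,
                         vocab_path=vocab_path,
                         config_path=config_path)
    # return_as_list=True gives the stories back as Python strings
    # instead of printing them.
    return textgen.generate(n, temperature=temperature,
                            return_as_list=True)
```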

My corpus of fairy tale texts was taken from Project Gutenberg (see the list above of all texts used in training).
