
NaNoGenMo Fairytales




I have been interested in computationally generated fairytales for some time, so I chose to explore this theme in my NaNoGenMo project for my computational approaches to narrative class. Fairytales follow a straightforward narrative structure and have clear character archetypes (there is always an evil character, a good character, and so on).


All good stories have a purpose and intention. Stories that resonate with us provoke emotional reactions through the values they convey in their narratives. Thus, when it comes to computationally generated narrative, an important question we must keep in mind is:


How can the design of computational systems reflect these values and intentions?


In the case of a machine learning (ML) model such as GPT-2, it is harder to instill narrative rules. GPT-2 simply predicts the word that is statistically most likely to come next. This differs from tools such as Tracery or Markov chains, where the author can still define grammatical rules for the computer to follow, and the process feels more like a collaboration, with some elements of unexpected surprise. GPT-2 and similar ML models were made with the intention of replicating human writing so that a reader is not able to tell the difference between the two (passing the Turing test). This attempts to trick the reader into thinking there is one sole author when there are really two (the computer and the human). Even in the case of GPT-2, the source text is drawn from hundreds of thousands of human writings, so to say no human was part of the output would be an oversight. Writers who use a computational writing system while embracing this collaborative human-computer aspect can most often make interesting work.
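To make the contrast concrete, here is a minimal sketch of the kind of author-steerable generation a Markov chain offers. The tiny corpus and function names below are illustrative stand-ins, not part of the project itself, which used a much larger fairytale corpus:

```python
import random
from collections import defaultdict

def build_chain(text):
    """Map each word to the list of words observed directly after it."""
    words = text.split()
    chain = defaultdict(list)
    for current, nxt in zip(words, words[1:]):
        chain[current].append(nxt)
    return chain

def generate(chain, start, length=10, seed=None):
    """Walk the chain, picking each next word at random from observed successors."""
    rng = random.Random(seed)
    word = start
    output = [word]
    for _ in range(length - 1):
        followers = chain.get(word)
        if not followers:  # dead end: this word never had a successor
            break
        word = rng.choice(followers)
        output.append(word)
    return " ".join(output)

# A toy corpus; the author decides exactly what transitions are possible.
corpus = ("once upon a time there lived a king "
          "and the king had a daughter and the daughter was kind")
chain = build_chain(corpus)
print(generate(chain, "the", length=8, seed=42))
```

Because the author controls both the corpus and the sampling, the output space is bounded and inspectable in a way a 124M-parameter language model's is not.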


This was my first time using GPT-2, and I treated this assignment as a technical exploration of new and different tools rather than an attempt to produce anything meaningful in a narrative context. I was curious about GPT-2, having read about it everywhere, and wanted a chance to experiment with it myself. In a more general sense, I also wanted to explore what happens to narrative structure when it is left to statistical probability. How does the structure get lost when writing becomes purely about predicting the next word? Even when characters and narration are present, how does the sense of a story disappear?


My source text contained 381,545 words of fairytales by the Brothers Grimm and Hans Christian Andersen, pulled from Project Gutenberg. The basic aim of NaNoGenMo is to create a computationally generated output of at least 50,000 words (roughly the length of a short novel). I used gpt-2-simple, created by Max Woolf, on Google Colaboratory with the 124M model. The size of the model should be kept in mind when reading the outputs: I was not able to use the larger models, partly because my source text was not as big as they would have needed. I first fine-tuned the model on my fairytale corpus and then used it to generate outputs for my NaNoGenMo.
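The fine-tune-then-generate workflow described above can be sketched with gpt-2-simple roughly as follows. This is an assumption-laden sketch, not the project's actual notebook: the file names, step count, and sampling parameters are illustrative guesses, and the heavy calls are left inside a function so nothing downloads or trains on import:

```python
def finetune_and_generate():
    """Sketch: fine-tune GPT-2 124M on a fairytale corpus, then generate.

    Intended for Google Colaboratory with gpt-2-simple installed
    (pip install gpt-2-simple). Call this function there; it is not
    executed here because it downloads the model and trains.
    """
    import gpt_2_simple as gpt2  # Max Woolf's wrapper library

    gpt2.download_gpt2(model_name="124M")  # the small model used in this project
    sess = gpt2.start_tf_sess()
    gpt2.finetune(sess,
                  dataset="fairytales.txt",  # illustrative name for the corpus file
                  model_name="124M",
                  steps=1000)                # step count is a guess, not the author's
    # Sample repeatedly into one file, aiming past the 50,000-word target.
    gpt2.generate_to_file(sess,
                          destination_path="nanogenmo.txt",
                          length=1023,
                          nsamples=60,
                          temperature=0.7)

def word_count(path):
    """Check a generated file against NaNoGenMo's 50,000-word minimum."""
    with open(path) as f:
        return len(f.read().split())
```

On Colab, one would call `finetune_and_generate()` and then loop until `word_count("nanogenmo.txt") >= 50000`.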


As a whole, the outputs are gibberish. They fail to maintain any coherent narrative or structure. Grammatical errors aside, the text reads like someone with commitment issues who can't stick to one idea. The outputs still made me laugh and were entertaining, but in a narrative context the model does not understand its outputs beyond the fact that these words statistically make sense together. Even if I were to use the larger models, I doubt they would perform any better in terms of narrative structure, because the approach lacks a key requirement for developing meaningful narratives: understanding a concept and holding onto it long enough to develop a point or meaning.


Overall, I think this project has shown the limitations of ML models such as GPT-2 in narrative and storytelling contexts. The outputs make sense when reduced to one or two sentences, but the inability to sustain a thought and develop it from the previous one is where GPT-2 fails to carry a meaningful narrative, especially one as long as 50,000 words.


Below are some outputs paired with visuals created by running them through AttnGAN (text-to-image) in Runway.


-------------


I had such a great thirst for milk that I would have died

if I did not get something to eat, for I had such little

sight that it was not possible for me to know when the next drop

would come. So I got up early in the morning, got up at once

when I found I had such great difficulty in keeping myself alive;

and although I had a good understanding, yet I was so stupid that

no one could make me understand myself. I went to school and went to

religious classes. At the latter I went to the university; but I left

out because I had such a bad appetite.




It was late in the morning, so that I went to bed; but as I was

sleeping, the moon came shining out of the window, and as it

was shining very bright I thought to myself:


"How beautiful that pretty little Princess is, is she not?" And I lay

over in the bed, and it seemed to me that the moon shone on me, and

on the eyes that I had seen on the bread that I had been eating, and on the

cloth that lay over me in the bed. This seemed the most beautiful

child's bed to me.


"Oh, now I shall soon be a Princess too; but I do not know that."



And now the little mermaid kissed his cheeks, and wept for him, and

wept for him, and wept, and wept, and all at the same time

wept, and wept, and the narrowest, deepest corner wept. And

when he was quite at peace, she wept and wept, and at last wept

with a terrible rage, and thought that perhaps the whole sea

would sink together, and that he must swim to the bottom of the

water, and drop dead beneath the waves. And when he rose, he looked

so tall that the little mermaid could scarcely be mistaken for him.




It is true that on Sunday evening, when the shoemaker alone stood outside

the splendid city walls, the woman was talking to herself, and she did

not know whether she had done so out of love or spite; or else she had

forgotten that she had to go to the great metropolis to buy the very

moldiest little carpet in the world.


Whatever may be the cause of this, I dare say, the woman was very much

humbled about it, and so was she at first.


"I will shortly buy a new carpet for you," she said to the old woman.



