Dramatron 70B Script Writer


Alan Thompson gives an introduction to Dramatron, a scriptwriting system developed at DeepMind. It is built on Chinchilla, a 70-billion-parameter transformer-based language model. A script created with the system was presented as a live play at the Edmonton Fringe Festival to positive reviews.

The paper 'Co-Writing Screenplays and Theatre Scripts with Language Models: An Evaluation by Industry Professionals' is available here.  Dramatron incorporates a recursive, hierarchical generation process that augments the language model so it can track what is going on in a script over time: a logline is expanded into characters and a plot outline, which in turn condition the generation of each scene's dialogue.
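The paper's exact prompt templates aren't reproduced here, but the hierarchical idea can be sketched in a few lines of Python. In this illustrative sketch, `complete` is a hypothetical stand-in for any large language model completion call, and the prompts and the `generate_script` helper are assumptions for illustration, not Dramatron's actual code.

```python
# A minimal sketch of hierarchical story generation in the spirit of
# Dramatron. `complete` is a hypothetical stand-in for an LLM text
# completion API; the prompt wording is illustrative only.

def complete(prompt: str) -> str:
    """Placeholder for a language model completion call."""
    raise NotImplementedError("wire this to a real language model")

def generate_script(logline: str) -> dict:
    # Step 1: expand the human-supplied logline into a character list.
    characters = complete(
        f"Logline: {logline}\n"
        "List the main characters with one-line descriptions:\n"
    )
    # Step 2: generate a plot outline conditioned on logline + characters.
    outline = complete(
        f"Logline: {logline}\nCharacters:\n{characters}\n"
        "Write a beat-by-beat plot outline, one beat per line:\n"
    )
    # Step 3: generate each scene's dialogue conditioned only on the
    # compact artifacts above, not on the full script so far.
    scenes = []
    for beat in outline.strip().splitlines():
        dialogue = complete(
            f"Logline: {logline}\nCharacters:\n{characters}\n"
            f"Scene beat: {beat}\nWrite the dialogue for this scene:\n"
        )
        scenes.append({"beat": beat, "dialogue": dialogue})
    return {"characters": characters, "outline": outline, "scenes": scenes}
```

The key design choice is that each scene is conditioned on compact summaries (logline, characters, plot beat) rather than on all previously generated text, which is what lets a context-limited model stay coherent across a long script.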


Chinchilla, the 70-billion-parameter model that Dramatron is built on, is described in this publication and an associated blog post.  The goal of the Chinchilla work was to help answer the question: "What is the optimal model size and number of training tokens for a given compute budget?"  The short answer is smaller models trained on more data.  Despite its smaller parameter count, Chinchilla outperforms GPT-3 and other larger models.
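As a back-of-envelope illustration of that answer, the paper's findings are often summarized with two rough rules: training compute C ≈ 6·N·D (for N parameters and D training tokens) and a compute-optimal ratio of roughly 20 tokens per parameter. The sketch below combines them; the constants are rounded approximations, not the paper's exact fitted scaling laws.

```python
# Back-of-envelope sketch of the Chinchilla result, using two commonly
# cited approximations: training FLOPs C ~= 6 * N * D, and a
# compute-optimal ratio of roughly 20 training tokens per parameter.
# Constants are rounded; this is illustrative, not the paper's exact fit.

import math

def compute_optimal(flops_budget: float) -> tuple[float, float]:
    """Return (parameters N, training tokens D) for a FLOPs budget C.

    Solves C = 6 * N * D with D = 20 * N, so N = sqrt(C / 120).
    """
    n_params = math.sqrt(flops_budget / 120.0)
    n_tokens = 20.0 * n_params
    return n_params, n_tokens

# Chinchilla used roughly the same training budget as Gopher, ~5.8e23 FLOPs.
n, d = compute_optimal(5.8e23)
print(f"params ~ {n / 1e9:.0f}B, tokens ~ {d / 1e12:.1f}T")
# -> params ~ 70B, tokens ~ 1.4T, matching Chinchilla's actual configuration.
```

Plugging in a Gopher-scale compute budget recovers Chinchilla's actual setup of about 70B parameters trained on about 1.4T tokens, which is the sense in which it is the "smaller model trained on more data."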


Observations:

1: Maintaining context in AI-generated multi-modal animation is a big issue, because current text-to-image models have no real understanding of it.  The way Dramatron works recursively with hierarchical story generation is interesting, and could probably be repurposed for multi-modal text-to-image story and animation generation, as sketched below.
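Purely as a speculative sketch of that repurposing idea: carry a compact, model-generated "style bible" through every image prompt, the way Dramatron conditions each scene on the outline. Both `complete` (a language model call) and `render_image` (a text-to-image call) are hypothetical stand-ins here, not any particular system's API.

```python
# Speculative sketch: reuse Dramatron-style hierarchical generation to
# keep visual context across an image sequence. `complete` and
# `render_image` are hypothetical stand-ins for an LLM and a
# text-to-image model respectively.

def complete(prompt: str) -> str:
    raise NotImplementedError("wire this to a language model")

def render_image(prompt: str) -> bytes:
    raise NotImplementedError("wire this to a text-to-image model")

def storyboard(logline: str) -> list[bytes]:
    # A persistent "style bible" gives every frame the same characters,
    # palette, and setting -- the context the image model itself lacks.
    style_bible = complete(
        f"Logline: {logline}\n"
        "Describe the recurring characters, setting, and visual style "
        "in a few compact sentences:\n"
    )
    shot_list = complete(
        f"Logline: {logline}\nStyle notes:\n{style_bible}\n"
        "List the story as one visual shot description per line:\n"
    )
    frames = []
    for shot in shot_list.strip().splitlines():
        # Every image prompt is conditioned on the shared style bible,
        # mirroring how Dramatron conditions each scene on the outline.
        frames.append(render_image(f"{style_bible}\n{shot}"))
    return frames
```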
