Dramatron 70B Script Writer
Alan Thompson gives an introduction to a scriptwriting system put together at DeepMind called Dramatron. It is built on Chinchilla, a 70-billion-parameter transformer-based language model. A script created by the model was presented as a live play at the Edmonton Fringe Festival to positive reviews. The paper, 'Co-Writing Screenplays and Theatre Scripts with Language Models: An Evaluation by Industry Professionals', is available here. Dramatron incorporates a recursive, hierarchical prompting process that helps the language model keep track of what is going on in a script over time (a rough sketch of that idea appears below).

Chinchilla itself, the 70-billion-parameter model that Dramatron is built on, is described in this publication and associated blog post. The goal of Chinchilla was to help answer the question: "What is the optimal model size and number of training tokens for a given compute budget?" The short answer: smaller models trained on more data. Chinchilla outperforms GPT-3 and several other models with more parameters.
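To make that recursive idea concrete, here is a minimal sketch of hierarchical prompt chaining in the spirit described for Dramatron: a logline expands into characters and a plot outline, and each scene's dialogue is then generated from that compact outline rather than from the whole script so far. The `generate` function is a placeholder for whatever language model you have access to; the real system is built on the 70B Chinchilla model (which is not publicly available), and its actual prompts and structure differ from this simplified version.

```python
def generate(prompt: str) -> str:
    """Placeholder: swap in a call to whichever large language model you use."""
    raise NotImplementedError

def write_script(logline: str, num_scenes: int = 3) -> str:
    # Top level: characters and a scene-by-scene outline, both conditioned on
    # the logline so every later step shares one compact source of truth.
    characters = generate(f"Logline: {logline}\nList the main characters:")
    outline = generate(
        f"Logline: {logline}\nCharacters: {characters}\n"
        f"Write a {num_scenes}-beat plot outline, one beat per line:"
    )
    beats = [b.strip() for b in outline.splitlines() if b.strip()][:num_scenes]

    # Bottom level: dialogue for each scene is prompted from the short outline
    # rather than the full script written so far, which is what lets a long
    # script stay coherent without overflowing the model's context window.
    scenes = []
    for i, beat in enumerate(beats, start=1):
        scenes.append(generate(
            f"Characters: {characters}\n"
            f"Outline: {' / '.join(beats)}\n"
            f"Write the dialogue for scene {i}, which covers: {beat}"
        ))
    return "\n\n".join(scenes)
```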
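As a back-of-the-envelope illustration of that compute trade-off (my own arithmetic, not code from the paper), the widely used approximation is that training cost is roughly 6 × parameters × tokens FLOPs, and the Chinchilla-style rule of thumb puts the compute-optimal data size near 20 tokens per parameter:

```python
# Rough illustration of compute-optimal scaling, using the common
# C ~ 6 * N * D FLOPs approximation for N parameters and D training tokens.

def training_flops(params: float, tokens: float) -> float:
    """Approximate training compute via the 6 * N * D rule of thumb."""
    return 6.0 * params * tokens

# Chinchilla itself: 70B parameters on ~1.4T tokens (about 20 tokens/parameter).
budget = training_flops(70e9, 1.4e12)
print(f"Chinchilla-scale budget: {budget:.2e} FLOPs")

# Spending the same budget on a 4x larger (280B-parameter) model leaves room
# for only about one token per parameter, the under-trained regime the paper
# argues against.
tokens_for_280b = budget / (6.0 * 280e9)
print(f"280B model, same budget: {tokens_for_280b / 280e9:.1f} tokens/parameter")
```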