The GPT-2

In order to write the dialogue of the piece, we used various forms of AI, mostly GPT-2. The acronym stands for:

Generative: The AI predicts the next word from the words before it, building up sentences without supervision.

Pretrained: The AI starts from a language model already trained on large amounts of text. That model has observed how sentences are formed and uses that structure to predict the next word, and the one after it, from a prompt.

Transformer: A neural network architecture with three basic components: sequence-to-sequence learning, an encoder and a decoder, and an attention mechanism. These break language into a format the AI can work with through an internal translation, and let the AI focus on the important parts of a sentence, much as we do while skimming a book.
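The "generative" idea above can be sketched in miniature. The toy program below is not GPT-2 (which uses a neural network with millions of parameters); it is a deliberately simple stand-in that learns, from a tiny sample text, which word tends to follow which, and then generates a sentence by repeatedly predicting the next word from the previous one. The corpus and function names are our own invention for illustration.

```python
import random
from collections import defaultdict

# A tiny sample text to "train" on (GPT-2 trained on millions of web pages).
corpus = "to be or not to be that is the question".split()

# Learn which words have been observed to follow each word.
following = defaultdict(list)
for prev, nxt in zip(corpus, corpus[1:]):
    following[prev].append(nxt)

def generate(start, length=8, seed=0):
    """Generate text by repeatedly predicting the next word from the last one."""
    random.seed(seed)
    words = [start]
    for _ in range(length):
        options = following.get(words[-1])
        if not options:  # no continuation was ever observed: stop here
            break
        words.append(random.choice(options))
    return " ".join(words)

print(generate("to"))
```

Every word the sketch produces is chosen only by looking at the word before it; GPT-2 does the same kind of next-word prediction, but it weighs the whole prompt at once through its attention mechanism, which is why its sentences hold together over much longer stretches.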