Language and Technology Computer Timeline

From the AI Creative Writing Anthology edited by Geoff Davis.

4th century AD: ‘Carmen XXV’ by Optatianus Porphyrius, a combinatorial poem whose words can be permuted to sample and reinvent earlier writing.

1561: ‘Poetices libri septem’ (Seven Books of Poetics) by Julius Caesar Scaliger, which describes the permutational ‘Proteus verse’.

1651: ‘Fünffacher Denckring der Teutschen Sprache’ (Fivefold Thought Ring of the German Language) by Georg Philipp Harsdörffer: five nested, rotating paper discs that generate German syllables and words combinatorially.

1726: The ‘Knowledge Engine’ is an engraving of a sketch from the notebook of Lemuel Gulliver in ‘Gulliver’s Travels’ by Jonathan Swift: a fictional machine at the Grand Academy of Lagado that produces texts by mechanically permuting words. Some later histories treated the device as if it had been real.

1869: Lautréamont’s ‘Les Chants de Maldoror’ contains the image “…the chance encounter of a sewing machine and an umbrella on an operating table”, which the experimental music group Nurse With Wound later used as an album title in 1979 on their United Dairies label. Lautréamont was the pen name of Isidore-Lucien Ducasse, and an influence on Dada and on Surrealist automatic writing.

Early machine learning language systems would often produce nonsensical or weirdly juxtaposed streams of text, making them popular with a certain type of artist.

1876: American engineer Joseph A. David invented the Plaque Découpée Universelle (PDU), a stencil plate with a complex patterned grid from which a great variety of writing symbols could be traced by hand. Part of the history of typefaces.

1921: Berlin Dadaists celebrated Tatlin’s ‘Maschinenkunst’ (machine art), and Surrealist artists used ‘automatic’ techniques of unconscious writing, drawing and painting. The juxtaposition of unrelated concepts is a Surrealist trope, as in the Exquisite Corpse procedure, in which separately created drawings or phrases are combined to produce a strange composite.

1943: Warren McCulloch and Walter Pitts publish the first mathematical model of an artificial neuron, the basis of the Perceptron. Frank Rosenblatt built the first implementation at the Cornell Aeronautical Laboratory, funded by the United States Office of Naval Research: first as software for the IBM 704, then in 1958 as dedicated hardware, the ‘Mark I Perceptron’. The Perceptron was designed for image recognition, but was assumed at the time to be the precursor of walking and talking, even thinking and reproducing, robots.
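
A minimal sketch of the perceptron learning rule in its standard textbook form, here learning the logical AND function (the code, data and learning rate are illustrative; this is not Rosenblatt’s original implementation):

```python
# Perceptron learning rule on the logical AND function (illustrative sketch).

def train_perceptron(samples, epochs=10, lr=0.1):
    w = [0.0, 0.0]  # one weight per input
    b = 0.0         # bias
    for _ in range(epochs):
        for (x1, x2), target in samples:
            # Step activation: fire (1) if the weighted sum exceeds zero.
            out = 1 if (w[0] * x1 + w[1] * x2 + b) > 0 else 0
            err = target - out
            # Nudge the weights in the direction that reduces the error.
            w[0] += lr * err * x1
            w[1] += lr * err * x2
            b += lr * err
    return w, b

data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]  # AND gate
print(train_perceptron(data))
```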

1953: The Manchester University Ferranti Mark 1, the first commercially available computer, was hand-programmed by Christopher Strachey to generate love letters from word lists based on Roget’s Thesaurus. These were tests (possibly quite useful outside of the lab); the printouts were scattered around the laboratory and later collected.
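
Strachey’s generator worked by slotting randomly chosen words into fixed sentence templates. A minimal sketch of that template-and-wordlist approach (the word lists and templates below are illustrative stand-ins, not Strachey’s originals from Roget’s Thesaurus):

```python
# Template-and-wordlist love letter generation in the style of Strachey's
# 1950s experiments (illustrative vocabulary, not the original lists).
import random

ADJS = ["beautiful", "darling", "dear", "loving", "precious", "tender"]
NOUNS = ["desire", "devotion", "fancy", "heart", "longing", "passion"]
ADVS = ["affectionately", "ardently", "fondly", "keenly", "wistfully"]
VERBS = ["adores", "cherishes", "desires", "treasures"]

def sentence():
    # Two sentence skeletons, filled in with random word choices.
    if random.random() < 0.5:
        return (f"My {random.choice(ADJS)} {random.choice(NOUNS)} "
                f"{random.choice(ADVS)} {random.choice(VERBS)} your "
                f"{random.choice(ADJS)} {random.choice(NOUNS)}.")
    return f"You are my {random.choice(ADJS)} {random.choice(NOUNS)}."

def love_letter(n_sentences=5):
    body = " ".join(sentence() for _ in range(n_sentences))
    # The Manchester letters were reportedly signed 'M.U.C.'
    # (Manchester University Computer).
    return f"Darling Sweetheart,\n\n{body}\n\nYours {random.choice(ADVS)},\nM.U.C."

print(love_letter())
```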

1965: The Nobel Prize-winning author J. M. Coetzee created 182 computer-generated poems using an Atlas 2 supercomputer, and used some of this generated material in published poetry.

1966: ELIZA, an early natural language processing program, created from 1964 to 1967 at MIT by Joseph Weizenbaum. Its famous DOCTOR script simulated a psychotherapist by matching keywords in the user’s input and reflecting them back as questions.
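
A minimal sketch of that keyword-and-reflection technique (Weizenbaum’s original was written in MAD-SLIP; the rules below are illustrative):

```python
# ELIZA-style response generation: match a keyword, then echo the rest of
# the user's input with first/second person swapped (illustrative rules).

REFLECT = {"i": "you", "me": "you", "my": "your", "am": "are",
           "you": "I", "your": "my", "are": "am"}

RULES = [
    ("i feel", "Why do you feel {rest}?"),
    ("i am", "How long have you been {rest}?"),
    ("because", "Is that the real reason?"),
]

def reflect(fragment):
    # Swap pronouns so the echoed fragment reads naturally.
    return " ".join(REFLECT.get(w, w) for w in fragment.split())

def respond(line):
    text = line.lower()
    for key, template in RULES:
        if key in text:
            rest = text.split(key, 1)[1].strip()
            return template.format(rest=reflect(rest))
    return "Please tell me more."  # default when no keyword matches

print(respond("I am sad about my computer"))
# -> How long have you been sad about your computer?
```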

1968: Robin Shirley wrote his first program for computer-assisted poetry and gave the first Computer Arts Society public talk, ‘How to Write a Computer Poem’, assisted by fellow poet and performer Spike Hawkins, at the ICA in London.

1976: Barbara Smith debuts ‘I Am Abandoned’, in which two conversational programs (a ‘therapist’ and a ‘mentally ill person’) communicate across a network in a social setting, with an audience watching. It used photography, performance and an early form of chatbot to signal dismay at technology’s upper hand in the attention economy.

1979: ‘Astropoeticon’, a poem by Herbert W. Franke, published with art by Andreas Nottebohm. Not computer-generated, but written by one of the inspiring pioneers of computer art.

1985: Geoff Davis of Micro Arts Group used procedural text generation in MA4 Story Generator, downloadable over the pre-internet Prestel viewdata service.
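
Procedural generators of this era typically expanded simple grammars or templates with random choices, as in this minimal sketch (the rules are invented for illustration; this is not the actual MA4 Story Generator):

```python
# Grammar-expansion text generation (illustrative rules only).
import random

GRAMMAR = {
    "STORY": ["SETTING HERO met VILLAIN. HERO ACTION."],
    "SETTING": ["One night in the city,", "Deep in the data banks,"],
    "HERO": ["the programmer", "a wandering poet"],
    "VILLAIN": ["a rogue mainframe", "a silent teleprinter"],
    "ACTION": ["ran", "typed furiously", "pulled the plug"],
}

def expand(token):
    word = token.rstrip(".,!")  # detach any trailing punctuation
    tail = token[len(word):]
    if word in GRAMMAR:         # non-terminal: expand a randomly chosen rule
        rule = random.choice(GRAMMAR[word])
        return " ".join(expand(t) for t in rule.split()) + tail
    return token                # terminal word: keep as-is

print(expand("STORY"))
```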

1986-1999: Ray Kurzweil’s Cybernetic Poet (RKCP): Version 1 was followed by V2 in 1994, with further development from 1995 to 1999. It used Markov chains to learn from a corpus of poems and generate new ones.
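
A Markov chain generator records which words follow which in a source text, then writes new text by repeatedly sampling a plausible next word. A minimal word-level sketch (illustrative, not RKCP’s code):

```python
# Word-level Markov chain text generation (illustrative sketch).
import random
from collections import defaultdict

def build_chain(text):
    # Map each word to every word observed to follow it in the corpus.
    chain = defaultdict(list)
    words = text.split()
    for current, nxt in zip(words, words[1:]):
        chain[current].append(nxt)
    return chain

def generate(chain, start, length=12):
    word, out = start, [start]
    for _ in range(length - 1):
        followers = chain.get(word)
        if not followers:  # dead end: no observed successor
            break
        word = random.choice(followers)  # duplicates make this frequency-weighted
        out.append(word)
    return " ".join(out)

corpus = "the rose is red the violet is blue the rose is sweet"
print(generate(build_chain(corpus), "the"))
```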

2017: In the field of Natural Language Processing (NLP), the Transformer deep learning architecture is introduced by a team at Google Brain in the paper ‘Attention Is All You Need’.
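
The core operation of the Transformer is scaled dot-product attention, in which each position in a sequence mixes information from all the others, weighted by query-key similarity. A minimal numpy sketch of the published formula (not Google’s implementation):

```python
# Scaled dot-product attention: softmax(QK^T / sqrt(d_k)) V.
import numpy as np

def attention(Q, K, V):
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # query-key similarity
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over keys
    return weights @ V  # weighted mixture of the value vectors

x = np.random.rand(3, 4)  # three token vectors of dimension 4
print(attention(x, x, x).shape)  # (3, 4): each token now mixes all three
```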

2018: The first version of OpenAI’s GPT was released, with 117 million parameters. GPT stands for Generative Pre-trained Transformer.

2019: GPT-2 released with 1.5 billion parameters. Parameters are the values in a machine learning (neural) model that are adjusted as it learns; they are also called weights, or configuration variables. Broadly, the more parameters a model has, the more statistical capacity and flexibility it has for generating output text.
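
To make ‘parameters’ concrete: a single fully connected layer mapping n inputs to m outputs contributes n × m weights plus m biases. A toy count (the layer sizes are hypothetical, loosely resembling one small transformer block):

```python
# Counting the parameters (weights and biases) in two fully connected
# layers; the sizes are hypothetical, chosen only for illustration.
layers = [(768, 3072), (3072, 768)]  # (inputs, outputs) per layer

total = 0
for n_in, n_out in layers:
    weights = n_in * n_out  # one weight per input-output connection
    biases = n_out          # one bias per output unit
    total += weights + biases

print(total)  # 4,722,432 parameters for just these two layers
```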

2019: Janelle Shane writes a best-selling book about AI, ‘You Look Like a Thing and I Love You’, using creative writing examples. She is famous for AI-generated weird recipes, love poems, etc.

2020: GPT-3 released with 175 billion parameters. There are many versions for different uses; GPT-3 Small has 125 million parameters and fewer internal layers.

2021: Voice Gems, audio data art made from readings of Astropoeticon and sold as NFTs, created by Herbert W. Franke, Harry Yeff, Trung Bao and AI.

2022: The November launch of ChatGPT becomes a sensation. It is a ‘conversational’ application of GPT-3, making the model much easier to use. It has also been called GPT-3.5, as it adds many enhancements, such as ‘guardrails’ that steer it away from certain areas and topics. InstructGPT, tuned to follow written instructions, is a related model.

2022: Sasha Stiles sells an NFT of generated writing at Christie’s Auction House, London.

2023: ChatGPT and text-to-image generation systems like Stable Diffusion become commonplace. GPT-4 is a big improvement, and many other models from competitors start to appear.

2023: Kalen Iwamoto and AI: ‘Romeo and Juliape’, a new play created with ChatGPT, performed at Proof of People, ZeroSpace, New York, April 2023.