Personal project

Artificial Unintelligence

Machine learning by doing

GIF of a robotic head mashing its face against a keyboard

To learn a bit about neural networks and the practical applications of AI, I played around with OpenAI’s machine learning model GPT-2. What I quickly realised is how unpredictable it can be. While grammatically correct, its outputs are often utterly absurd, and I. AM. OBSESSED.

Working with the streaming service All 4 at the time, I naturally decided to feed the network a bunch of programme descriptions and let the AI generate a series of shows from an alternate reality. I present to you, TV from a computer’s fever dream.


Seinfeld. Over the course of the series, George has been forced to make major life decisions, and his inner child becomes increasingly more violent and violent. The series has been released on DVD twice; the first and most recently on December 18, 2005.
The Inbetweeners. Gritty, about children living in a red light district and things getting a bit too intense.
Hollyoaks. In season four the llama went on an adventure with Rocky and Bullwinkle, who were attempting to break the record for the length of time it takes to enter a room with a key and enter without it. The llama comes across a stray rock. Now most of the episodes takes place in a garden shed.
Scrubs. Dr. Phil is the new head of the hospital, and everything must work perfectly for him or it's over. Featuring between seven and nine back-alley laser surgery.
The Great British Bake Off. Jessie J introduces a savoury dish and quizzes the chefs while they make over 3000 cakes in a challenging hour. Gordon Ramsay makes mashed potatoes.
Peep Show. It is the quintessential British sitcom, characterised by its burlesque humour. Show is set in the late 1960s and features great voice acting, strong writing, great characters, and great music.
The IT Crowd. After a tense moment in which Jen insists that Moss is still a high schooler, Jen reveals an arsenal of bold new schemes. Her power? When she or anyone her staff touches something that doesn't belong to them, they have to spend a few hours tumbling through the dark. 147 camera are broken this week.
The End of the F***ing World. Everything goes awry when Alyssa is framed for murder, their fingerprints all over the crime scene. Now, armed with a new weapon - their own mother - their only hope is to bring this to the TARDIS.
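Under the hood, the workflow behind these blurbs is small: fine-tune the pretrained model on a plain-text file of programme descriptions, then sample from it. Here's a rough sketch using the gpt-2-simple library – the file name, model size and step count are illustrative, not my exact setup.

```python
# Rough sketch of the fine-tune-and-sample workflow with the gpt-2-simple library.
# "descriptions.txt" (one programme description per line), the model size and the
# step count are illustrative values, not the exact settings used for this project.
import gpt_2_simple as gpt2

gpt2.download_gpt2(model_name="124M")      # fetch the small pretrained GPT-2 model

sess = gpt2.start_tf_sess()
gpt2.finetune(
    sess,
    dataset="descriptions.txt",            # plain-text file of programme descriptions
    model_name="124M",
    steps=500,                             # a few hundred steps is plenty for a small corpus
)

# Sample new, made-up programme descriptions from the fine-tuned model.
gpt2.generate(sess, length=80, temperature=0.9, nsamples=5)
```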

GPT-2 is a text generation tool that produces a continuation of any text you give it, using a fully trained neural network. It's configurable to produce any length of text on practically any topic. Although it was only trained to predict the next word in a passage, it unexpectedly picked up other skills along the way, in some cases translating between languages and answering questions, without ever being told it would be evaluated on those tasks. Creepy.
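To give a feel for how little is needed to get a continuation out of it, here's a minimal sketch using the Hugging Face transformers library – not the tooling behind this project, just an illustration, with one of the generated blurbs above as the prompt.

```python
# Minimal sketch: ask GPT-2 to continue a prompt, via the Hugging Face transformers library.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")

prompt = "Scrubs. Dr. Phil is the new head of the hospital,"
# max_length includes the prompt tokens; sampling lets us get several different continuations.
for out in generator(prompt, max_length=60, do_sample=True, num_return_sequences=3):
    print(out["generated_text"])
    print("---")
```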

The network was trained on about 40GB of text scraped from 8 million web pages. Instead of using a typical crawler, OpenAI wanted somewhat relevant, decent-quality content – so they collected pages that had been linked from Reddit posts with a healthy number of ‘upvotes’.

The diversity of the training material behind GPT-2 means it performs a lot better than language models trained on narrower domains (like Wikipedia, news, or books), and it set new state-of-the-art results on a number of language-modelling and reading-comprehension benchmarks.

GPT-2 has some great applications, like better speech recognition and translation – but it also has some dangerous ones, such as making it cheaper to create malicious content and run phishing campaigns at scale. Earlier work OpenAI did on synthetic imagery, audio, and video shows that this kind of technology can play a big part in fuelling a new wave of disinformation campaigns. Because of this, OpenAI initially held back the full-size model, and the full training dataset has not been released.



Anders Kristoffersen

Designer

Making cool shit in:

Lofoten Vesterålen Trondheim Southampton Stockholm London

© 2022