Finetune GPT-3 to write an entire coherent novel (part 1) | Summary and Q&A

8.9K views
May 12, 2022
by
David Shapiro

TL;DR

OpenAI forum user seeks advice on fine-tuning GPT-3 for generating fiction; David Shapiro shares insights and conducts experiments in generating novel-length fiction from single story premises.


Key Insights

  • ✍️ Fine-tuning GPT-3 for fiction writing remains a challenging task, and achieving coherent fiction is elusive.
  • 😃 Fine-tuning on document-style examples, where a whole story or a large chunk of it is used as the completion, can teach the model to match the writer's style and tone.
  • 📔 The lore book mechanism, as used by AI Dungeon, can assist in maintaining consistency and referencing major story details in each completion.
  • ✍️ Experimenting with different prompts and completion formats can provide valuable insights into the capabilities and limitations of GPT-3 for fiction writing.
  • 😒 The AutoMuse 2 experiment aims to generate novel-length fiction by feeding story premises, outlines, and running summaries into each prompt to keep the output coherent and engaging.
  • ✍️ The generation of fan fiction and the ability to write in different styles demonstrate the flexibility and adaptability of GPT-3 in fiction writing.

Transcript

Hey everybody, David Shapiro here. So there was a post on the OpenAI forum that was asking about... well, here, let me just show you. What was it? The guy was asking about... where was it? He was asking how to fine-tune from scratch. Oh, it was this one, okay. So let me show you guys this. This guy asked, "I'm using short stories that I wrote to b...

Questions & Answers

Q: How can GPT-3 be fine-tuned for generating fiction?

Shapiro suggests fine-tuning on document-style examples, or using a lore book mechanism, so that a prompt containing the story so far serves as input for the model to generate the next paragraphs of the story.
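As a concrete illustration, document-style fine-tuning data for the (2022-era) OpenAI fine-tuning API was a JSONL file of `{"prompt", "completion"}` pairs. A minimal sketch, where the story so far is the prompt and the next paragraph is the completion (the separator and stop-sequence conventions follow OpenAI's guide; the story text is invented):

```python
import json

def make_record(previous_paragraphs: str, next_paragraph: str) -> str:
    """One JSONL training record: story so far -> next paragraph."""
    return json.dumps({
        # "\n\n###\n\n" is a common separator so the model knows the prompt ended.
        "prompt": previous_paragraphs.strip() + "\n\n###\n\n",
        # Leading space and an explicit stop token follow OpenAI's conventions.
        "completion": " " + next_paragraph.strip() + " END",
    })

story = [
    "The ship drifted past the last beacon.",
    "Mara checked the fuel gauge and frowned.",
    "There was no way home on what remained.",
]

# Each paragraph after the first becomes one training example.
lines = [make_record("\n\n".join(story[:i]), story[i]) for i in range(1, len(story))]
```

Each line of the resulting file pairs a growing context with the paragraph that follows it, which is how a long story can be turned into many small training examples.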

Q: What is the lore book mechanism used by AI Dungeon?

The lore book mechanism ensures that major story details can be referenced at all times for each completion, allowing for the inclusion of previous paragraphs and the generation of subsequent paragraphs.
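A lore book can be sketched as a keyword-triggered fact store: entries whose keywords appear in the recent story text get injected into every prompt. This is an illustrative sketch only (the names and structure are invented, not AI Dungeon's actual implementation):

```python
# Invented example lore entries: keyword -> fact to keep in context.
LORE_BOOK = {
    "Mara": "Mara is the ship's pilot; she lost her left hand in the war.",
    "beacon": "Beacons mark the edge of charted space.",
}

def build_prompt(recent_paragraphs: list[str], max_recent: int = 3) -> str:
    """Assemble a prompt from triggered lore facts plus the last few paragraphs."""
    recent = "\n\n".join(recent_paragraphs[-max_recent:])
    # Only inject facts whose keyword is mentioned in the recent text.
    facts = [fact for key, fact in LORE_BOOK.items() if key.lower() in recent.lower()]
    lore = "\n".join(facts)
    return f"Story facts:\n{lore}\n\nStory so far:\n{recent}\n\nContinue the story:"

prompt = build_prompt(["Mara checked the gauge.", "The beacon faded behind them."])
```

Because the facts are re-injected on every completion, details established hundreds of paragraphs ago stay available even though the model only sees a short window of recent text.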

Q: What are some tips for generating coherent fiction with GPT-3?

Shapiro recommends experimenting with different prompts and completion formats, such as using an outline, summary, or previous paragraphs as part of the input, to fine-tune the model to match your desired style and tone.

Q: What is the purpose of the AutoMuse 2 experiment?

The AutoMuse 2 experiment aims to generate novel-length fiction from a single story premise, exploring the capabilities and limitations of GPT-3 for fiction writing.
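The premise → outline → chapter loop described above can be sketched as follows. This is a hypothetical skeleton, not Shapiro's actual code: `generate` stands in for a GPT-3 completion call and is stubbed so the structure runs without an API key.

```python
def generate(prompt: str) -> str:
    """Placeholder for a GPT-3 completion call (stubbed for illustration)."""
    return f"<completion for: {prompt[:40]}...>"

def write_novel(premise: str, chapters: int = 3) -> list[str]:
    """Premise -> outline, then each chapter sees premise, outline, and a rolling summary."""
    outline = generate(f"Write a chapter outline for this premise:\n{premise}")
    summary_so_far = ""
    prose = []
    for n in range(1, chapters + 1):
        prompt = (
            f"Premise: {premise}\n"
            f"Outline: {outline}\n"
            f"Summary so far: {summary_so_far}\n"
            f"Write chapter {n}:"
        )
        chapter = generate(prompt)
        prose.append(chapter)
        # Compress everything written so far so the next prompt stays short.
        summary_so_far = generate(f"Summarize:\n{summary_so_far}\n{chapter}")
    return prose

novel = write_novel("A pilot stranded beyond the last beacon.")
```

The rolling summary is the key design choice: since the model's context window cannot hold a whole novel, each chapter prompt carries a compressed memory of everything before it.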

Summary & Key Takeaways

  • A user on the OpenAI forum asks for tips on fine-tuning GPT-3 for generating fiction.

  • David Shapiro recommends document-style fine-tuning or the lore book mechanism used by AI Dungeon.

  • Shapiro conducts experiments to generate fiction, using different story premises and writing styles.
