OpenAI’s GPT-2 Is Now Available - It Is Wise as a Scholar! 🎓 | Summary and Q&A

94.2K views
October 1, 2019
by
Two Minute Papers

TL;DR

OpenAI GPT-2 is a powerful technique that learns natural language processing tasks with minimal supervision, using 40 GB of internet text as training data.

Questions & Answers

Q: What is OpenAI GPT-2?

OpenAI GPT-2 is a learning-based technique that can perform natural language processing tasks, such as text completion and summarization, with minimal supervision. It learns language by reading 40 GB of internet text.
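The "minimal supervision" here is self-supervision: the training signal is simply "predict the next token given the tokens before it," extracted automatically from raw text. The toy sketch below (not GPT-2 itself — a bigram counter on a tiny made-up corpus) illustrates that same objective at the smallest possible scale; GPT-2 replaces the lookup table with a large transformer over far longer contexts.

```python
from collections import Counter, defaultdict

# Tiny made-up corpus standing in for the 40 GB of internet text.
corpus = "the cat sat on the mat and the cat slept on the mat".split()

# Count which word follows each word — the bigram version of
# "learn to predict the next token from the previous ones".
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def complete(word, length=4):
    """Greedily extend a prompt by always picking the most frequent successor."""
    out = [word]
    for _ in range(length):
        if out[-1] not in follows:
            break
        out.append(follows[out[-1]].most_common(1)[0][0])
    return " ".join(out)

print(complete("the"))  # → "the cat sat on the"
```

Text completion, summarization, and question answering all fall out of this one objective once the model is large enough — no task-specific labels are needed.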

Q: How much training data does GPT-2 require?

GPT-2 was trained on roughly 40 GB of plain text collected from the internet.

Q: Can GPT-2 cheat?

Yes, GPT-2 can "cheat" in a sense: it has been caught red-handed producing continuations that are coherent and on topic but not actually grounded in the given text.

Q: What is the size of the GPT-2 model?

The model used in the video has about 750 million parameters, roughly half the size of the full GPT-2 model, which has about 1.5 billion parameters.
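As a rough sanity check on those numbers: the hyperparameters below are the published figures for the 774M ("large") GPT-2 variant, and the formula is a standard back-of-the-envelope estimate (≈12·d² parameters per transformer layer plus the embedding tables), not an exact accounting.

```python
# Back-of-the-envelope parameter count for the 774M GPT-2 variant.
# Published hyperparameters: 36 transformer layers, hidden size 1280,
# vocabulary of 50257 BPE tokens, context length of 1024 tokens.
n_layer, d_model, vocab, n_ctx = 36, 1280, 50257, 1024

embeddings = vocab * d_model + n_ctx * d_model  # token + position embeddings
per_layer = 12 * d_model ** 2                   # attention (~4 d^2) + MLP (~8 d^2)
total = embeddings + n_layer * per_layer

print(f"{total / 1e6:.0f}M parameters")  # prints "773M parameters", close to the reported 774M
```

The full model roughly doubles this by using 48 layers with hidden size 1600, landing near 1.5 billion parameters.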

Summary & Key Takeaways

  • OpenAI GPT-2 is a learning-based technique that can perform various natural language processing operations.

  • GPT-2 can answer questions, complete text, summarize, and more, with minimal supervision.

  • It learns our language by itself, reading 40 GB of plain text collected from the internet.
