This content originally appeared on DEV Community and was authored by James Briggs
Language generation is one of those natural language tasks that can inspire genuine awe at how far the fields of machine learning and artificial intelligence have come.
GPT-1, GPT-2, and GPT-3 are OpenAI's flagship language models, well known for their ability to produce remarkably natural, coherent, and genuinely interesting text.
In this video, we will take a small snippet of text and learn how to feed it into a pre-trained GPT-2 model using PyTorch and Transformers to produce high-quality language generation in just eight lines of code. We cover:
- PyTorch and Transformers
  - Data
- Building the Model
  - Initialization
  - Tokenization
  - Generation
  - Decoding
- Results
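
To make those steps concrete, here is a minimal sketch of the initialization, tokenization, generation, and decoding flow, assuming the Hugging Face `transformers` library with a PyTorch backend and the base `gpt2` checkpoint. The prompt text and sampling parameters (`max_length`, `top_k`) are illustrative placeholders, not necessarily the exact values used in the video.

```python
from transformers import GPT2LMHeadModel, GPT2Tokenizer

# Initialization: load the pre-trained GPT-2 tokenizer and model weights
tokenizer = GPT2Tokenizer.from_pretrained('gpt2')
model = GPT2LMHeadModel.from_pretrained('gpt2')

# Tokenization: encode a text snippet into a PyTorch tensor of token IDs
inputs = tokenizer.encode('Machine learning is', return_tensors='pt')

# Generation: sample a continuation from the model
outputs = model.generate(inputs, max_length=60, do_sample=True, top_k=50,
                         pad_token_id=tokenizer.eos_token_id)

# Decoding: convert the generated token IDs back into readable text
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Setting `do_sample=True` switches from greedy decoding to sampling, which generally produces more varied and interesting continuations; dropping it gives deterministic output for the same prompt.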