A simple implementation of GPT-2, with some other stuff.


NanoGPT

Inspired by and learned from Andrej Karpathy.

GPT-2

The GPT-2 this project reproduces is the 124M-parameter variant, with 12 Transformer layers and a d_model of 768.
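As a sanity check on those numbers, the configuration above can be sketched as a small dataclass together with a rough parameter count. This is an illustrative sketch, not this repository's actual code: the field names (`block_size`, `vocab_size`, `n_layer`, `n_head`, `n_embd`) follow common nanoGPT-style conventions, and the count assumes standard GPT-2 choices (tied input/output embeddings, learned positional embeddings, biases on all linear layers, a 4x MLP expansion, and LayerNorm with weight and bias).

```python
from dataclasses import dataclass

@dataclass
class GPTConfig:
    # Hypothetical config for the 124M GPT-2 variant described above.
    block_size: int = 1024   # context length
    vocab_size: int = 50257  # GPT-2 BPE vocabulary size
    n_layer: int = 12        # Transformer blocks
    n_head: int = 12         # attention heads
    n_embd: int = 768        # model width (d_model)

def param_count(cfg: GPTConfig) -> int:
    """Rough parameter count under standard GPT-2 conventions."""
    d = cfg.n_embd
    per_layer = (
        2 * d                  # ln_1 (weight + bias)
        + d * 3 * d + 3 * d    # attention qkv projection
        + d * d + d            # attention output projection
        + 2 * d                # ln_2
        + d * 4 * d + 4 * d    # MLP up-projection (4x expansion)
        + 4 * d * d + d        # MLP down-projection
    )
    return (
        cfg.vocab_size * d     # token embeddings (tied with the LM head)
        + cfg.block_size * d   # learned positional embeddings
        + cfg.n_layer * per_layer
        + 2 * d                # final LayerNorm
    )

print(param_count(GPTConfig()))  # 124439808, i.e. the "124M" in the name
```

With 12 layers and a width of 768 this comes out to about 124.4M parameters, which is where the common "124M" label for the smallest GPT-2 comes from.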

The model is implemented with PyTorch and the Hugging Face Transformers library.
