Baby Llama

Train and run a small Llama 2 model from scratch on the TinyStories dataset.

Baby Llama code example, training a 256-token vocabulary (the leading `!` runs the shell command from a Jupyter notebook cell):

```
!cd llama2.c && python tinystories.py train_vocab --vocab_size=256
```

Sample output (SentencePiece log excerpt):

```
trainer_interface.cc(558) LOG(INFO) Alphabet size=102
Vocabulary size is smaller than required_chars. 256 vs 361.
```
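The warning above comes from SentencePiece: a vocabulary must contain at least one entry for every character it is required to keep, plus reserved special tokens, and this corpus needs 361 entries while only 256 were requested. A minimal pure-Python sketch of that lower bound (illustrative only; `min_vocab_size` is a hypothetical helper, not part of SentencePiece or llama2.c):

```python
def min_vocab_size(corpus: str, num_special_tokens: int = 3) -> int:
    """Smallest vocab size that can cover every distinct character
    in the corpus, plus special tokens (SentencePiece reserves
    <unk>, <s>, and </s> by default)."""
    return len(set(corpus)) + num_special_tokens

corpus = "Once upon a time there was a tiny llama."
required = min_vocab_size(corpus)
# Requesting a vocab_size below `required` triggers a warning like
# "Vocabulary size is smaller than required_chars".
print(f"minimum usable vocab_size for this corpus: {required}")
```

Per the log above, a `--vocab_size` of at least 361 would satisfy the check for this dataset.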
