A large-scale 7B pretraining language model developed by BaiChuan-Inc.
A series of large language models developed by Baichuan Intelligent Technology
A 13B large language model developed by Baichuan Intelligent Technology
A Contamination-free Multi-task Language Understanding Benchmark
[NeurIPS 2023 Spotlight] In-Context Impersonation Reveals Large Language Models' Strengths and Biases
Analysis of LLM performance on CPU and GPU, covering execution time and energy usage