Dolly 2.0 is a 12B parameter large language model (LLM) from Databricks. It has been trained and then instruction fine-tuned, making it better suited for human interactivity. Crucially, Databricks released all of the code, the model weights, and the fine-tuning dataset under an open-source license that permits commercial use. This makes Dolly 2.0 the world's first truly open-source instruction-tuned LLM, ready for you to run and test with your own prompts via a Paperspace notebook.
| Notebook | Framework | Type | Try for Free |
| --- | --- | --- | --- |
| Dolly 2.0 – The World’s First, Truly Open Instruction-Tuned LLM on IPUs – Inference | Hugging Face | Inference | |
In this Paperspace Gradient notebook, you will learn how to run Dolly 2.0 with your own prompts using IPUs. The IPU (Intelligence Processing Unit) is a completely new kind of massively parallel processor designed to accelerate machine intelligence. You will create and configure a Dolly inference pipeline, then run inference on a text prompt to generate answers to user-specified questions.
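To give a feel for the prompt-to-response flow before opening the notebook, here is a minimal sketch using the standard Hugging Face `transformers` pipeline, following the public `databricks/dolly-v2-12b` model card. Note that the Paperspace notebook itself uses Graphcore's IPU-specific pipeline, so the setup steps there differ; this snippet is illustrative only.

```python
# Minimal sketch: Dolly 2.0 inference via the standard Hugging Face pipeline
# (CPU/GPU path). The IPU notebook uses Graphcore-specific tooling instead.
import torch
from transformers import pipeline

generate_text = pipeline(
    model="databricks/dolly-v2-12b",
    torch_dtype=torch.bfloat16,   # reduce memory footprint where supported
    trust_remote_code=True,       # loads Dolly's custom instruction pipeline
    device_map="auto",            # place the model on available accelerators
)

# Ask a question and print the generated answer.
result = generate_text("Explain the difference between nuclear fission and fusion.")
print(result[0]["generated_text"])
```

Bear in mind that the 12B parameter checkpoint requires a substantial amount of accelerator memory, which is one reason the notebook runs it on IPUs.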
To do more with Dolly on IPUs, or to speak to an expert, please feel free to contact us.
Join our growing community and interact with AI experts, IPU developers and researchers. Hear the latest IPU news and get access to our newest models.
The contents of this repository are made available according to the terms of the MIT license. See the included LICENSE file for details.