- Python bindings for GPT-J ggml language models.
- The API is almost the same as pyllamacpp's.
- The easiest way is to install the prebuilt wheels:

```shell
pip install pygptj
```
- Alternatively, build from source:

```shell
git clone https://github.com/abdeladim-s/pygptj.git
```
Once the package is installed, you can test it with the following simple command line interface:

```shell
pygptj path/to/ggml/model
```
```python
from pygptj.model import Model

model = Model(model_path='path/to/gptj/ggml/model')
for token in model.generate("Tell me a joke ?"):
    print(token, end='', flush=True)
```
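Since `generate` yields tokens one at a time, you can also collect the whole completion into a string instead of streaming it to the terminal. A minimal sketch of that pattern, using a hypothetical stand-in generator in place of a real model so it runs without a model file:

```python
def fake_generate(prompt):
    # Hypothetical stand-in for Model.generate: yields tokens one
    # at a time, just like the streaming API above.
    for token in ["Why ", "did ", "the ", "chicken ", "cross ", "the ", "road?"]:
        yield token

# Join the streamed tokens into a single string.
answer = ''.join(fake_generate("Tell me a joke ?"))
print(answer)  # Why did the chicken cross the road?
```

With a real model, the same `''.join(model.generate(prompt))` works, since `generate` returns an iterator of tokens.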
You can set up an interactive dialogue by simply keeping the `model` variable alive:
```python
from pygptj.model import Model

model = Model(model_path='/path/to/ggml/model')

while True:
    try:
        prompt = input("You: ")  # input() takes no flush argument
        if prompt == '':
            continue
        print("AI: ", end='')
        for token in model.generate(prompt):
            print(f"{token}", end='', flush=True)
        print()
    except KeyboardInterrupt:
        break
```
The following example shows how to attribute a persona to the language model:
```python
from pygptj.model import Model

prompt_context = """Act as Bob. Bob is helpful, kind, honest,
and never fails to answer the User's requests immediately and with precision.

User: Nice to meet you Bob!
Bob: Welcome! I'm here to assist you with anything you need. What can I do for you today?
"""

prompt_prefix = "\nUser:"
prompt_suffix = "\nBob:"

model = Model(model_path='/path/to/ggml/model',
              prompt_context=prompt_context,
              prompt_prefix=prompt_prefix,
              prompt_suffix=prompt_suffix)

while True:
    try:
        prompt = input("User: ")
        if prompt == '':
            continue
        print("Bob: ", end='')
        for token in model.generate(prompt, antiprompt='User:'):
            print(f"{token}", end='', flush=True)
        print()
    except KeyboardInterrupt:
        break
```
- You can always refer to the API reference documentation for more details.
This project is licensed under the MIT License.