- Update example
- Give LLM-generated code access to ! commands
- Fix multiline indentation
- Make system calls with a !
- Add an initial code block argument to the console
- Update release script to create GitHub releases
- Cleanup
- Remove pydantic
- Add a pypi version badge to README
- Add ability to reset REPL state from within the REPL
- Make llama-cpp-python an optional dependency
- Update README; some cleanup
- Refactor so pai functions can be called as normal Python functions from within the REPL
- Fix python 3.8 compatibility
- Prompt updates
- Updated README and modified assets
- Put code execution in its own module
- Refactor to better separate the REPL; simplify console input handling
- Condense the OpenAI prompt; some refactoring
- Updated README and added assets
- README updates
- Prompt updates
- Add initial prompt argument
- Update README
- Update README with new Goals section
- Update prompt and README
- Update the system prompt for OpenAI
- Refactor and decouple the REPL and the console state
- Add support for streaming
- Keep the version in a source file
- Print LLM description when the REPL starts
- Rename the 'chat-gpt' flag to 'openai'
- Add AI agents
- Update README
- Update README with an agent example
- Add a license
- Add support for llama.cpp