NVIDIA Beta Tester #2
Comments
Hello Andrei,

Thank you for your interest in testing Belullama with your RTX A2000 GPU. We appreciate your enthusiasm and willingness to contribute to the project!

We've noted your interest in beta testing the upcoming NVIDIA GPU-compatible version of Belullama. Your RTX A2000 with 8 GB of memory should be well suited for testing once the GPU version is released.

We're currently in the final stages of development for NVIDIA GPU support. While we don't have an exact release date yet, we're working to make it available as soon as possible, and we'll keep this issue updated with our progress. If you have any questions or need further information, please don't hesitate to ask.

We look forward to your involvement in testing the GPU version of Belullama!

Best regards
Please try the GPU-supported version. To install the GPU version of Belullama, which includes Ollama, Open WebUI, and Automatic1111, run the following command:

curl -s https://raw.githubusercontent.com/ai-joe-git/Belullama/main/belullama_installer_gpu.sh | sudo bash

This script sets up all the components and configures them to work together. We look forward to your feedback, and thanks for participating!
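Before piping the installer into sudo bash, it can help to confirm that the tools a GPU setup typically needs are present on the host, and to download the script for review first. This is a minimal sketch: the actual requirements of belullama_installer_gpu.sh are defined by the script itself, and the assumption that it relies on Docker and the NVIDIA driver is ours, not stated in the thread.

```shell
#!/usr/bin/env bash
# Pre-flight sketch before running belullama_installer_gpu.sh.
# Assumption (not confirmed by the installer docs): the setup relies on
# curl, Docker, and a working NVIDIA driver (nvidia-smi).

have() { command -v "$1" >/dev/null 2>&1; }   # true if a command is on PATH

for tool in curl docker nvidia-smi; do
  if have "$tool"; then
    echo "ok: $tool found"
  else
    echo "missing: $tool"
  fi
done

# Safer than piping straight into sudo bash: download, inspect, then run.
url="https://raw.githubusercontent.com/ai-joe-git/Belullama/main/belullama_installer_gpu.sh"
# curl -fsSL "$url" -o belullama_installer_gpu.sh
# less belullama_installer_gpu.sh    # review the script before executing it
# sudo bash belullama_installer_gpu.sh
```

The download and execution steps are left commented so each one can be run deliberately after reviewing the script's contents.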
Hey team,
I would like to test this on my laptop with RTX A2000 with 8GB of RAM. What do I need to do in order to get access to this?
Thanks,
Andrei