
NVIDIA Beta Tester #2

Open
erghe opened this issue Jul 26, 2024 · 2 comments
erghe commented Jul 26, 2024

Hey team,

I would like to test this on my laptop with RTX A2000 with 8GB of RAM. What do I need to do in order to get access to this?

Thanks,
Andrei

@ai-joe-git
Owner

Hello Andrei,

Thank you for your interest in testing Belullama with your RTX A2000 GPU. We appreciate your enthusiasm and willingness to contribute to the project!

We're excited to let you know that we've noted your interest in beta testing the upcoming NVIDIA GPU-compatible version of Belullama. Your RTX A2000 with 8 GB of VRAM should be well suited for testing once the GPU version is released.

At the moment, we're in the final stages of development for NVIDIA GPU support. While we don't have an exact release date yet, we're working diligently to make it available as soon as possible.

Here's what you can expect:

- We'll keep this issue updated with our progress on GPU support.
- Once we're ready for beta testing, we'll reach out to you directly through this issue with instructions on how to proceed.
- In the meantime, if you'd like to familiarize yourself with Belullama, you can try the current CPU-based version by following the installation instructions in our README.

We greatly appreciate your patience and support as we work on bringing GPU acceleration to Belullama. Your feedback during the beta testing phase will be invaluable in ensuring a smooth experience for all users.

If you have any questions or need further information, please don't hesitate to ask. We'll keep you updated on our progress, and we look forward to your involvement in testing the GPU version of Belullama!

Best regards,
The Belullama Team

@ai-joe-git
Owner

Please try the GPU-supported version.

To install the GPU version of Belullama, which includes Ollama, Open WebUI, and Automatic1111, use the following command:

curl -s https://raw.githubusercontent.com/ai-joe-git/Belullama/main/belullama_installer_gpu.sh | sudo bash

This script will set up all components and configure them to work together seamlessly.
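Since the command above pipes the installer straight into sudo bash, a cautious tester may want to confirm the NVIDIA driver is actually visible before running it. Here is a minimal pre-flight sketch; it assumes the standard nvidia-smi tool that ships with the NVIDIA driver, and is not part of the installer itself:

```shell
# Pre-flight sketch: confirm an NVIDIA driver is installed before running
# the Belullama GPU installer (assumes nvidia-smi ships with the driver).
if command -v nvidia-smi >/dev/null 2>&1; then
    # Print the GPU model and total VRAM as seen by the driver
    nvidia-smi --query-gpu=name,memory.total --format=csv,noheader
else
    echo "nvidia-smi not found: install the NVIDIA driver first" >&2
fi
```

If the check fails, install the NVIDIA driver before re-running the curl command above; if the installer deploys the components as Docker containers, the NVIDIA Container Toolkit may also be required for the containers to see the GPU.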

Looking forward to your feedback. Thanks for participating!
