This issue was moved to a discussion.

You can continue the conversation there. Go to discussion →


Segmentation fault when starting backend on intel cpu. #3238

Closed · 4 tasks

w-A-L-L-e opened this issue Jun 17, 2024 · 0 comments

Comments

w-A-L-L-e commented Jun 17, 2024

Bug Report

When starting the backend with ./start.sh, there appears to be a problem with the BertTokenizerFast tokenizer, causing a segfault:

Description

Bug Summary:
The backend segfaults on startup while loading the BertTokenizerFast tokenizer for the sentence-transformers/all-MiniLM-L6-v2 embedding model from the local Hugging Face cache.

Steps to Reproduce:
1. pip install -r requirements.txt
2. ./start.sh

Expected Behavior:
Backend starts up and listens to a port

Actual Behavior:
Backend gives segmentation fault:

OSError: Can't load tokenizer for '/Users/wschrep/.cache/huggingface/hub/models--sentence-transformers--all-MiniLM-L6-v2/snapshots/8b3219a92973c328a8e22fadcfa821b5dc75636a'. If you were trying to load it from 'https://huggingface.co/models', make sure you don't have a local directory with the same name. Otherwise, make sure '/Users/wschrep/.cache/huggingface/hub/models--sentence-transformers--all-MiniLM-L6-v2/snapshots/8b3219a92973c328a8e22fadcfa821b5dc75636a' is the correct path to a directory containing all relevant files for a BertTokenizerFast tokenizer.
[1]    33382 segmentation fault  ./start.sh
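A possible workaround (an assumption, not something verified in this report): the OSError points at an incomplete snapshot in the local Hugging Face cache, and removing that model's cache folder makes sentence-transformers re-download it the next time the backend starts. A minimal sketch:

```python
# Sketch: clear the cached all-MiniLM-L6-v2 snapshot so it is re-downloaded.
# Assumes the default Hugging Face cache location shown in the error above.
import os
import shutil

cache_root = os.path.expanduser("~/.cache/huggingface/hub")
model_dir = os.path.join(
    cache_root, "models--sentence-transformers--all-MiniLM-L6-v2"
)

if os.path.isdir(model_dir):
    shutil.rmtree(model_dir)  # the next ./start.sh triggers a fresh download
```

After clearing the cache, re-running ./start.sh should fetch the model again; if the segfault persists, the cache was likely not the cause.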

Environment

  • Open WebUI Version: current git version

  • Ollama (if applicable): 0.1.44

  • Operating System: macOS Sonoma

Reproduction Details

Confirmation:

  • I have read and followed all the instructions provided in the README.md.
  • I am on the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.

Logs and Screenshots

 File "/Users/wschrep/llmWork/open-webui/backend/main.py", line 43, in <module>
    from apps.rag.main import app as rag_app
  File "/Users/wschrep/llmWork/open-webui/backend/apps/rag/main.py", line 210, in <module>
    update_embedding_model(
  File "/Users/wschrep/llmWork/open-webui/backend/apps/rag/main.py", line 187, in update_embedding_model
    app.state.sentence_transformer_ef = sentence_transformers.SentenceTransformer(
  File "/Users/wschrep/llmWork/open-webui/backend/python_env/lib/python3.9/site-packages/sentence_transformers/SentenceTransformer.py", line 197, in __init__
    modules = self._load_sbert_model(
  File "/Users/wschrep/llmWork/open-webui/backend/python_env/lib/python3.9/site-packages/sentence_transformers/SentenceTransformer.py", line 1296, in _load_sbert_model
    module = Transformer(model_name_or_path, cache_dir=cache_folder, **kwargs)
  File "/Users/wschrep/llmWork/open-webui/backend/python_env/lib/python3.9/site-packages/sentence_transformers/models/Transformer.py", line 38, in __init__
    self.tokenizer = AutoTokenizer.from_pretrained(
  File "/Users/wschrep/llmWork/open-webui/backend/python_env/lib/python3.9/site-packages/transformers/models/auto/tokenization_auto.py", line 899, in from_pretrained
    return tokenizer_class_fast.from_pretrained(pretrained_model_name_or_path, *inputs, **kwargs)
  File "/Users/wschrep/llmWork/open-webui/backend/python_env/lib/python3.9/site-packages/transformers/tokenization_utils_base.py", line 2094, in from_pretrained

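The traceback shows the crash happens while AutoTokenizer.from_pretrained reads the cached snapshot directory. A small diagnostic sketch to check whether the snapshot is missing tokenizer files (the file list is an assumption about what a fast BERT tokenizer typically needs, not taken from this report):

```python
# Sketch: report tokenizer files missing from a cached snapshot directory.
# Assumption: a fast BERT tokenizer needs tokenizer_config.json plus either
# tokenizer.json or vocab.txt.
import os

def missing_tokenizer_files(snapshot_dir: str) -> list:
    """Return descriptions of required tokenizer files absent from snapshot_dir."""
    missing = []
    if not os.path.isfile(os.path.join(snapshot_dir, "tokenizer_config.json")):
        missing.append("tokenizer_config.json")
    if not any(
        os.path.isfile(os.path.join(snapshot_dir, name))
        for name in ("tokenizer.json", "vocab.txt")
    ):
        missing.append("tokenizer.json or vocab.txt")
    return missing
```

Running this against the snapshot path from the error message would show whether the download was incomplete.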
Browser Console Logs:
[Include relevant browser console logs, if applicable]

Docker Container Logs:
[Include relevant Docker container logs, if applicable]

Screenshots (if applicable):
[Attach any relevant screenshots to help illustrate the issue]

Installation Method

[Describe the method you used to install the project, e.g., manual installation, Docker, package manager, etc.]

Additional Information

[Include any additional details that may help in understanding and reproducing the issue. This could include specific configurations, error messages, or anything else relevant to the bug.]

Note

If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!

@open-webui open-webui locked and limited conversation to collaborators Jun 17, 2024
@tjbck tjbck converted this issue into discussion #3241 Jun 17, 2024

