
bug: Uncaught (in promise) TypeErrors in Open WebUI, causes page freeze & browser console errors until refresh #2208

Closed
silentoplayz opened this issue May 12, 2024 · 13 comments
Labels: bug (Something isn't working), core (core feature), help wanted (Extra attention is needed)

Comments

@silentoplayz (Collaborator) commented May 12, 2024

Bug Report

Description

Bug Summary:

  • I've found several Uncaught (in promise) TypeErrors in Open WebUI that cause the page to freeze. The browser console errors included below should help debug the issues.

Expected Behavior:

  • The page should not freeze due to Uncaught (in promise) TypeErrors, and the user should not need a manual refresh to unfreeze the page and clear the browser console errors.

Actual Behavior:

  • Several different Uncaught (in promise) TypeErrors occur, followed by a series of error messages related to Immutable; the page freezes, and only a manual refresh restores it.
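
For anyone debugging similar freezes, a global unhandledrejection listener can surface these rejections with their stacks before the UI locks up. This is a minimal, generic sketch using the standard browser API — not Open WebUI code:

    // Log every unhandled promise rejection with its reason and stack.
    // Standard DOM API; can be pasted into the console or a debug build.
    window.addEventListener('unhandledrejection', (event: PromiseRejectionEvent) => {
      console.error('Unhandled rejection:', event.reason);
      // event.preventDefault(); // would hide the default console entry, but
      // would NOT repair whatever state the failed component update left behind
    });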

Environment

  • Open WebUI Version: v0.1.124-v0.3.4 (latest)
  • Ollama: the bug is present on versions v0.1.35-v0.1.44, based on my testing.
  • Operating System: Windows 11 Pro Insider Preview (Edition) - Version: 24H2 - Installed on: May 19, 2024 - OS build: 26120.770 - Experience: Windows Feature Experience Pack 1000.26100.6.0
  • Browser: Firefox v127.0 (64-bit)

Confirmation:

  • I have read and followed all the instructions provided in the README.md.
  • I am on the latest version of both Open WebUI and Ollama.
  • I have included the browser console logs.
  • I have included the Docker container logs.

Logs and Screenshots

Browser Console Logs and Reproduction Details:

1st error (browser console log):

23:06:07.634 Uncaught (in promise) TypeError: t is null
    $ dom.js:252
    m Tooltip.svelte:41
    _t Component.js:44
    m ResponseMessage.svelte:978
    p ResponseMessage.svelte:945
    p ResponseMessage.svelte:642
    p ResponseMessage.svelte:584
    p ResponseMessage.svelte:423
    p ResponseMessage.svelte:383
    at scheduler.js:119
    ut scheduler.js:79
    promise callback*lt scheduler.js:20
    ht Component.js:81
    ctx Component.js:139
    jt Chat.svelte:1351
    M Chat.svelte:1334
    ctx Component.js:138
    K Messages.svelte:215
    K Messages.svelte:212
    J Messages.svelte:295
    n Messages.svelte:311
    t lifecycle.js:105
    t lifecycle.js:104
    w UserMessage.svelte:55
    te UserMessage.svelte:345
    c rocket-loader.min.js:1
dom.js:252:62
    ut scheduler.js:86
    (Async: promise callback)
    lt scheduler.js:20
    ht Component.js:81
    ctx Component.js:139
    jt Chat.svelte:1351
    M Chat.svelte:1334
    ctx Component.js:138
    K Messages.svelte:215
    forEach self-hosted:157
    K Messages.svelte:212
    J Messages.svelte:295
    n Messages.svelte:311
    t lifecycle.js:105
    forEach self-hosted:157
    t lifecycle.js:104
    w UserMessage.svelte:55
    te UserMessage.svelte:345
    c rocket-loader.min.js:1

Steps to Reproduce 1st Error:

  1. Log into Open WebUI via one of my personal domains or directly via localhost.
  2. Choose a local or external large language model to chat with from the model selector dropdown.
  3. Send a query and wait for the response to finish.
  4. Send another query and wait for the response to finish.
  5. Delete your most recent message (the 2nd query) sent to the LLM.
  6. Press F12 to open the browser console and switch to the Console tab to observe the error.
  • The bugs reported above are reproducible with both local and external models.
  • Every step above is needed to reliably replicate these (or related) errors in your browser console.
  7. Press the Edit button of the first message you sent in the chat (before the LLM's response in this chat) and observe a new error:
23:07:01.736 Uncaught (in promise) TypeError: R is undefined
    I UserMessage.svelte:36
    ee UserMessage.svelte:295
    c rocket-loader.min.js:1
    addEventListener rocket-loader.min.js:1
    St dom.js:361
    m UserMessage.svelte:312
    m Tooltip.svelte:41
    _t Component.js:44
    m UserMessage.svelte:291
    m UserMessage.svelte:419
    m UserMessage.svelte:422
    _t Component.js:44
    m Messages.svelte:308
    m Messages.svelte:370
    m Messages.svelte:374
    p Messages.svelte:285
    p Messages.svelte:380
    at scheduler.js:119
    ut scheduler.js:79
    promise callback*lt scheduler.js:20
    ht Component.js:81
    ctx Component.js:139
    Zi Sidebar.svelte:37
    o index.js:56
    Z Chat.svelte:442
    Ie Chat.svelte:373
    Re MessageInput.svelte:478
    jt dom.js:371
    c rocket-loader.min.js:1
    addEventListener rocket-loader.min.js:1
    St dom.js:361
    m MessageInput.svelte:1001
    m MessageInput.svelte:448
    _t Component.js:44
    m Chat.svelte:1386
    m Chat.svelte:1306
    _t Component.js:44
    m Help.svelte:40
    _t Component.js:44
    p root.svelte:54
    p root.svelte:49
    wt utils.js:203
    p  layout.svelte:190
    p  layout.svelte:187
    at scheduler.js:119
UserMessage.svelte:36:2
    I UserMessage.svelte:36
    InterpretGeneratorResume self-hosted:1412
    AsyncFunctionNext self-hosted:799
    (Async: async)
    ee UserMessage.svelte:295
    c rocket-loader.min.js:1
    (Async: EventListener.handleEvent)
    addEventListener rocket-loader.min.js:1
    St dom.js:361
    m UserMessage.svelte:312
    m Tooltip.svelte:41
    _t Component.js:44
    m UserMessage.svelte:291
    m UserMessage.svelte:419
    m UserMessage.svelte:422
    _t Component.js:44
    m Messages.svelte:308
    m Messages.svelte:370
    m Messages.svelte:374
    p Messages.svelte:285
    p Messages.svelte:380
    at scheduler.js:119
    ut scheduler.js:79
    (Async: promise callback)
    lt scheduler.js:20
    ht Component.js:81
    ctx Component.js:139
    Zi Sidebar.svelte:37
    o index.js:56
    Z Chat.svelte:442
    InterpretGeneratorResume self-hosted:1412
    AsyncFunctionNext self-hosted:799
    (Async: async)
    Ie Chat.svelte:373
    InterpretGeneratorResume self-hosted:1412
    AsyncFunctionNext self-hosted:799
    (Async: async)
    Re MessageInput.svelte:478
    jt dom.js:371
    c rocket-loader.min.js:1
    (Async: EventListener.handleEvent)
    addEventListener rocket-loader.min.js:1
    St dom.js:361
    m MessageInput.svelte:1001
    m MessageInput.svelte:448
    _t Component.js:44
    m Chat.svelte:1386
    m Chat.svelte:1306
    _t Component.js:44
    m Help.svelte:40
    _t Component.js:44
    p root.svelte:54
    p root.svelte:49
    wt utils.js:203
    p  layout.svelte:190
    p  layout.svelte:187
    at scheduler.js:119
  8. Press the Edit button of the LLM's response to your message and observe another new error:
23:07:56.721 Uncaught (in promise) TypeError: U is undefined
    Ye ResponseMessage.svelte:316
    Re ResponseMessage.svelte:650
    c rocket-loader.min.js:1
ResponseMessage.svelte:316:2
    Ye ResponseMessage.svelte:316
    InterpretGeneratorResume self-hosted:1412
    AsyncFunctionNext self-hosted:799
    (Async: async)
    Re ResponseMessage.svelte:650
    c rocket-loader.min.js:1
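
All three traces appear to share one failure mode: a queued Svelte update runs after the message it refers to has been deleted from the chat history, so a property access on a now-missing object rejects a promise that nobody catches. A rough TypeScript sketch of that race, using hypothetical state shapes rather than the actual Open WebUI code:

    // Hypothetical shapes for illustration only — not the actual Open WebUI code.
    interface Message {
      id: string;
      parentId: string | null;
      content: string;
    }

    const history: { messages: Record<string, Message> } = {
      messages: { a: { id: 'a', parentId: null, content: 'hi' } },
    };

    async function renderMessage(id: string): Promise<void> {
      const message = history.messages[id];
      // If the message was deleted while this update was pending, `message` is
      // undefined and the next line throws inside the async function. Nothing
      // awaits or catches it, so it surfaces as "Uncaught (in promise) TypeError".
      console.log(message.content);
    }

    delete history.messages['a']; // user deletes the message
    void renderMessage('a');      // queued update fires anyway -> uncaught rejection

    // A guarded lookup sidesteps the rejection entirely:
    const safeContent = history.messages['a']?.content ?? '';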

2nd error (browser console log) — likely minor: it doesn't appear to cause any problems and is unrelated to the main bug reported here:

This error occurred upon signing out of my Open WebUI account:

Error: Promised response from onMessage listener went out of scope 6 background.js:505:27

This error occurred upon signing into my Open WebUI account:

Error: Promised response from onMessage listener went out of scope

Docker Container Logs:

  • Docker Desktop/container logs are irrelevant here; unlike the browser logs, they show nothing that indicates an issue.

Installation Method

Docker Desktop via a custom-built docker-compose.yml file for my Open WebUI instance and domains. Ollama is running natively on Windows via the Windows (Preview) version.

Note

If the bug report is incomplete or does not follow the provided instructions, it may not be addressed. Please ensure that you have followed the steps outlined in the README.md and troubleshooting.md documents, and provide all necessary information for us to reproduce and address the issue. Thank you!

Please let me know if I've missed anything or if there's any additional information needed!

silentoplayz changed the title from "Bug: Uncaught TypeError when (not) annotating language model responses in Open WebUI, causing page freeze until refresh" to "bug: Uncaught TypeError when (not) annotating language model responses in Open WebUI, causing page freeze until refresh" on May 12, 2024
silentoplayz changed the title from "bug: Uncaught TypeError when (not) annotating language model responses in Open WebUI, causing page freeze until refresh" to "bug: Uncaught TypeError when (not) annotating LLM responses in Open WebUI, causes page freeze until refresh" on May 12, 2024
@tjbck (Contributor) commented May 13, 2024

1st issue should be fixed on dev!

@Yanyutin753 (Contributor) commented:

Every time I delete a message, the page throws an error and can no longer be used normally; it only works again after a refresh.

Console error: [screenshot attached]

silentoplayz changed the title from "bug: Uncaught TypeError when (not) annotating LLM responses in Open WebUI, causes page freeze until refresh" to "bug: Uncaught TypeErrors in Open WebUI, causes page freeze & browser console errors until refresh" on May 22, 2024
silentoplayz changed the title from "bug: Uncaught TypeErrors in Open WebUI, causes page freeze & browser console errors until refresh" to "bug: Uncaught (in promise) TypeErrors in Open WebUI, causes page freeze & browser console errors until refresh" on May 22, 2024
tjbck added the bug, help wanted, and core labels on May 26, 2024
@kojdj0811 commented:

I noticed a similar error. I'm running a 70B model on a slow GPU (P40); at exactly 5 minutes I get an error and the text stops updating.

Even after this happened, the model was still generating its answer on the server. Once generation finished, VRAM was freed after an additional 5 minutes, which matches the default OLLAMA_KEEP_ALIVE setting.

[screenshot attached]
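
For reference, the 5-minute VRAM release matches Ollama's documented default keep-alive of 5 minutes; it can be raised globally via the OLLAMA_KEEP_ALIVE environment variable or per request. A minimal sketch against Ollama's REST API, where the model name and prompt are placeholders:

    // Ask Ollama to keep the model loaded for 30 minutes after this request.
    // Endpoint and keep_alive field are from Ollama's API; model is a placeholder.
    async function generateWithKeepAlive(): Promise<void> {
      const response = await fetch('http://localhost:11434/api/generate', {
        method: 'POST',
        headers: { 'Content-Type': 'application/json' },
        body: JSON.stringify({
          model: 'llama3',   // placeholder model name
          prompt: 'Hello',
          stream: false,
          keep_alive: '30m', // Ollama defaults to 5m when this is unset
        }),
      });
      console.log(await response.json());
    }

Note that this only controls how long the model stays loaded in VRAM; the mid-generation cutoff at exactly 5 minutes reads like a separate client-side timeout.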

@silentoplayz (Collaborator, Author) commented Jun 11, 2024

I'm glad I could help by reporting bugs that you've since fixed, making Open WebUI better for everyone. Thanks for your hard work, Tim. 🍻

@Algorithm5838 commented Jun 17, 2024

I'm seeing the same bug and getting this error in the console when I delete the second message:

dom.js:253 Uncaught (in promise) TypeError: Cannot read properties of null (reading 'insertBefore')
    at $ (dom.js:253:10)
    at Object.m (Tooltip.svelte:37:1)
    at _t (Component.js:44:23)
    at Object.m (ResponseMessage.svelte:978:50)
    at Object.p (ResponseMessage.svelte:945:42)
    at Object.p (ResponseMessage.svelte:642:23)
    at Object.p (ResponseMessage.svelte:584:47)
    at Object.p (ResponseMessage.svelte:423:96)
    at Object.p (ResponseMessage.svelte:383:17)
    at at (scheduler.js:119:30)
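
For context, the $ frame at dom.js:253 corresponds to Svelte's internal DOM insert helper, which (paraphrased, not the exact runtime source) boils down to a single insertBefore call:

    // Roughly Svelte 4's internal insert helper (paraphrased):
    function insert(target: Node, node: Node, anchor?: Node | null): void {
      // When the parent `target` has already been detached and is null, this
      // line produces exactly "Cannot read properties of null (reading 'insertBefore')".
      target.insertBefore(node, anchor ?? null);
    }

So the Tooltip is being mounted into a parent node that no longer exists — consistent with the deleted-message race described above.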

@tjbck (Contributor) commented Jun 17, 2024

The second issue should also be fixed on latest dev!

@Algorithm5838 commented:

Thanks! I can confirm that the issue has been fixed in the new update, v0.3.5.

@silentoplayz (Collaborator, Author) commented Jun 17, 2024

> The second issue should also be fixed on latest dev!

Hype! You've fixed the bug I've had for so long. It appears that all of the bugs I've reported over this issue's lifetime have now been fixed. Thanks for all your hard work @tjbck! ❤️

I believe this bug report can be closed as completed now.

@damajor commented Jul 21, 2024

I have a similar issue on my instance.

Uncaught (in promise) 
TypeError: Cannot read properties of undefined (reading 'model')
    at Object.p (CompareMessages.svelte:113:9)
    at Object.p (CompareMessages.svelte:105:94)
    at Object.p (CompareMessages.svelte:104:10)
    at Object.p (CompareMessages.svelte:103:25)
    at at (scheduler.js:119:30)
    at ut (scheduler.js:79:5)
p	@	CompareMessages.svelte:113
p	@	CompareMessages.svelte:105
p	@	CompareMessages.svelte:104
p	@	CompareMessages.svelte:103
at	@	scheduler.js:119
ut	@	scheduler.js:79
Promise.then (async)		
lt	@	scheduler.js:20
ht	@	Component.js:81
(anonymous)	@	Component.js:139
he	@	Chat.svelte:319
await in he (async)		
(anonymous)	@	Chat.svelte:130
Pa.s.$$.update	@	Chat.svelte:138
at	@	scheduler.js:115
ut	@	scheduler.js:79
Promise.then (async)		
lt	@	scheduler.js:20
ht	@	Component.js:81
(anonymous)	@	Component.js:139
s.$$set	@	root.svelte:62
$set	@	Component.js:507
ce	@	client.js:1129
await in ce (async)		
(anonymous)	@	client.js:1603

I think it can be easily reproduced following these steps:

  • create new chat
  • use 3 models at once in the chat
  • duplicate the chat or create another one with 3 models
  • open browser debug
  • navigate to first chat and then to the second one (do it again)
  • it should hang after switching once or twice between the two chats.

Well, I am not able to reproduce it on freshly created chats again; maybe it comes from old chats I haven't used for a while (older than one day). I will update tomorrow, or delete my comment if the issue doesn't recur.

@silentoplayz (Collaborator, Author) commented Jul 21, 2024

@damajor I am unable to reproduce this particular Uncaught (in promise) TypeError: Cannot read properties of undefined (reading 'model') error on the latest dev branch of Open WebUI, but I can reliably reproduce a couple of new errors I haven't come across before.

Steps to reproduce the new browser console errors I've found:

  1. Create a new chat
  2. Send a query to 2 models at once in the chat
  3. Wait for both models to respond completely.
  4. Clone the chat
  5. Regenerate the 1st model's response in the cloned chat (in the model dropdown selector, models listed top to bottom correspond to responses left to right).
  6. Open the browser's debug console (F12).
  7. Navigate back to the initial/main chat.
  8. The error should be logged to the console:
19:43:23.739 Uncaught (in promise) TypeError: t[0].messages[t[11]] is undefined
    p CompareMessages.svelte:111
    p CompareMessages.svelte:105
    p CompareMessages.svelte:104
    p CompareMessages.svelte:103
    at scheduler.js:119
    ut scheduler.js:79
    promise callback*lt scheduler.js:20
    ht Component.js:81
    ctx Component.js:139
    he Chat.svelte:319
    update Chat.svelte:130
    update Chat.svelte:138
    at scheduler.js:115
    ut scheduler.js:79
    promise callback*lt scheduler.js:20
    ht Component.js:81
    ctx Component.js:139
    $$set root.svelte:62
    $set Component.js:507
    ce client.js:1129
    _start_router client.js:1603
    c rocket-loader.min.js:1
CompareMessages.svelte:111:65

The page will freeze beyond this point and requires a page refresh to fix. After refreshing the page, you can click the initial chat that you cloned from and get the same error in the browser console.

Note: reproducing the initial error discussed in this comment also causes the page to freeze and requires a page refresh to resolve. These errors are reproducible with both local (Ollama) and external connection models.
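
The minified expression t[0].messages[t[11]] suggests the compiled CompareMessages.svelte is indexing the chat history with a message id that no longer exists after the clone-and-regenerate sequence. As a hedged sketch (hypothetical names and shapes, not the actual component code), the defensive lookup would look like this:

    // Hypothetical shapes and names; the real CompareMessages.svelte may differ.
    interface ChatMessage {
      id: string;
      model?: string;
      childrenIds?: string[];
    }
    type History = { messages: Record<string, ChatMessage> };

    function modelForMessage(history: History, messageId: string): string {
      const message: ChatMessage | undefined = history.messages[messageId];
      // Guard both the missing message and the missing model field instead of
      // letting "Cannot read properties of undefined (reading 'model')" escape.
      return message?.model ?? 'unknown';
    }

Plausibly the stale id comes from cloning: the clone keeps message ids that regeneration then rewrites in one chat but not the other.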

silentoplayz reopened this on Jul 21, 2024
@silentoplayz (Collaborator, Author) commented Jul 22, 2024

> I noticed a similar error. I'm running a 70B model on a slow GPU (P40); at exactly 5 minutes I get an error and the text stops updating.
>
> Even after this happened, the model was still generating its answer on the server. Once generation finished, VRAM was freed after an additional 5 minutes, which matches the default OLLAMA_KEEP_ALIVE setting.

@kojdj0811 A related merge that should fix this issue: #3107

@damajor commented Jul 22, 2024

Well, maybe the UI was in a bad state when I hit the issue, but I am not able to reproduce the problem at the moment.

@silentoplayz (Collaborator, Author) commented:

Closing in favor of #4408
