This repository has been archived by the owner on Aug 10, 2023. It is now read-only.

[BUG] When used for long periods of time, responses become truncated #519

Closed
Cheesy-Brik opened this issue Feb 2, 2023 · 12 comments
Labels
bug Something isn't working

Comments

@Cheesy-Brik

Description
When using a single ChatBot object instance for a while, I notice its responses become strangely cut off.

Steps to Reproduce
I don't know what the exact issue is. I'm running it through a Discord bot for users to chat through, but after a while its responses become cut off. I am using the "official" version with the non-browser-hosted version of the AI.

Expected behavior
It doesn't cut off its responses.

@Cheesy-Brik Cheesy-Brik added the bug label Feb 2, 2023
@acheong08
Owner

It's because of a 4000-token cutoff.

@acheong08
Owner

At that point, it only has an 800-token buffer.
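
For reference, here is a minimal sketch of that kind of token budget check, using tiktoken for counting; the encoder choice, constants, and helper name are assumptions for illustration, not the library's exact code:

# Sketch: check whether a prompt still leaves room for the model's reply.
import tiktoken

MAX_CONTEXT = 4000   # total context budget mentioned above
BUFFER = 800         # tokens reserved for the response

enc = tiktoken.get_encoding("gpt2")  # assumed encoder, illustration only

def fits_in_context(prompt: str) -> bool:
    # True if the encoded prompt leaves at least BUFFER tokens for the reply
    return len(enc.encode(prompt)) <= MAX_CONTEXT - BUFFER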

@acheong08
Owner

You can ask it to continue

@acheong08
Owner

I can add the ability to define the buffer space if necessary

@Cheesy-Brik
Author

I can add the ability to define the buffer space if necessary

Could you add a way to delete some of the previous tokens, or allow it to free the needed token space (I guess just deleting a token for every new one it writes)? A defined buffer space would also be a helpful setting, since right now I'm not using much RAM at all.
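
A rough sketch of that rolling-window idea (a hypothetical helper, not part of revChatGPT): trim the oldest history entries until the prompt fits within the budget.

# Hypothetical sketch: drop the oldest chat-history entries until the
# remaining history fits within the token budget.
def trim_history(chat_history, enc, max_context=4000, buffer=800):
    def total_tokens():
        return sum(len(enc.encode(entry)) for entry in chat_history)
    while chat_history and total_tokens() > max_context - buffer:
        chat_history.pop(0)  # oldest exchange goes first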

@Cheesy-Brik
Author

Cheesy-Brik commented Feb 2, 2023

Also, thanks for replying so quickly. It's 4 AM for me, didn't expect to get a response lmao

@acheong08
Owner

On the latest commit, you can define the buffer space (in tokens) when initializing the Chatbot class:

def __init__(self, api_key: str, buffer: int = None) -> None:

You can also remove history manually with
chatbot.prompt.chat_history.pop(0)
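
For example, something like this should work (assuming Chatbot is importable from revChatGPT.Official; the key is a placeholder and buffer=800 is just an illustrative value):

# Illustrative usage of the buffer parameter and manual history trimming
from revChatGPT.Official import Chatbot

chatbot = Chatbot(api_key="sk-...", buffer=800)  # reserve 800 tokens for replies

# drop the oldest history entry to free up context space
chatbot.prompt.chat_history.pop(0)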

@Cheesy-Brik
Author

Thx

@Cheesy-Brik
Author

Cheesy-Brik commented Feb 2, 2023

On the latest commit, you can define the buffer space (in tokens) when initializing the Chatbot class. def __init__(self, api_key: str, buffer: int = None) -> None:

You can also remove history manually with chatbot.prompt.chat_history.pop(0)

Problem with loading convos; for some reason it comes up with

File "C:\Users\User\AppData\Local\Programs\Python\Python310\lib\site-packages\revChatGPT\Official.py", line 339, in construct_prompt
    if len(self.enc.encode(prompt)) > (4000 - self.buffer or 3200):
TypeError: unsupported operand type(s) for -: 'int' and 'NoneType'

when trying to ask a question. A problem with prompt loading, maybe?

@Cheesy-Brik
Author

I loaded the conversation from a dumped JSON, so that's probably part of it.

@acheong08
Owner

It seems I have made some mistakes with saving/loading conversations. Haven't tested; will do tomorrow.

@acheong08
Owner

Edit: Never mind. I can't subtract if buffer is None; need more checking code. Fixing now...
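
The failing line parses as (4000 - self.buffer) or 3200, so the subtraction runs before the or fallback can apply when buffer is None. A minimal sketch of a None-safe check (not necessarily the exact fix that was committed):

# Sketch: compute the usable prompt budget with an explicit None check
def prompt_budget(buffer):
    # fall back to 3200 usable tokens when no buffer was supplied
    return 4000 - buffer if buffer is not None else 3200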

acheong08 pushed a commit that referenced this issue Feb 2, 2023
github-actions bot locked as resolved and limited conversation to collaborators May 20, 2023