When sending a message, two dialogue branches are created #278

Open
Zotic opened this issue Jan 25, 2025 · 3 comments
Labels: bug

Comments


Zotic commented Jan 25, 2025

hugchat version: 0.4.18

When sending a message using self.chatbot.chat(send_text).wait_until_done() (the error also occurs in stream mode), two dialogue branches are created: the first one is returned as the response, while the second one appears not to have been generated yet.
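
For context, here is a minimal reproduction sketch. The login and cookie handling follow the usual hugchat setup; EMAIL and PASSWD are placeholders, and this is not my actual bot code.

# Minimal reproduction sketch. EMAIL/PASSWD are placeholders; the login flow
# is the standard hugchat pattern, not my actual bot code.
from hugchat import hugchat
from hugchat.login import Login

sign = Login("EMAIL", "PASSWD")
cookies = sign.login()
chatbot = hugchat.ChatBot(cookies=cookies.get_dict())

# A single chat() call, yet two branches show up in the web UI afterwards.
response = chatbot.chat("test").wait_until_done()
print(response)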

Interface: [screenshot]

Hugging Face chat:
first branch: [screenshot]
second branch: [screenshot]

logs:

26.01.2025 01:18:23 - Soraka - DEBUG - [[User(first_name='', id=, is_bot=False, is_premium=, language_code='ru', username=''), 'test']]
26.01.2025 01:18:23 - root - DEBUG - message_id: da19281e-151f-4bc2-a0b9-3dc3a8e89bd4
26.01.2025 01:18:24 - urllib3.connectionpool - DEBUG - https://huggingface.co:443 "POST /chat/conversation/679563113dc126191915d01c HTTP/1.1" 200 None
26.01.2025 01:18:24 - urllib3.connectionpool - DEBUG - https://huggingface.co:443 "POST /chat/conversation/679563113dc126191915d01c HTTP/1.1" 200 None
26.01.2025 01:18:24 - urllib3.connectionpool - DEBUG - Resetting dropped connection: huggingface.co
26.01.2025 01:18:25 - urllib3.connectionpool - DEBUG - https://huggingface.co:443 "GET /chat/conversation/679563113dc126191915d01c/__data.json?x-sveltekit-invalidated=01 HTTP/1.1" 200 None
26.01.2025 01:18:25 - root - DEBUG - conversation 679563113dc126191915d01c history: [MessageNode(id='da19281e-151f-4bc2-a0b9-3dc3a8e89bd4', role='system', content='', ancestors=[], children=['616ce2ae-2e63-4cc1-b513-68a0744ab261', '3f6c5da2-e731-4efc-bd17-6ea58a8f0d51'], created_at=1737832673.019, updated_at=1737832673.019), MessageNode(id='616ce2ae-2e63-4cc1-b513-68a0744ab261', role='user', content='test', ancestors=['da19281e-151f-4bc2-a0b9-3dc3a8e89bd4'], children=['5c2d3bce-0168-4397-a819-0b7a8660ebc2'], created_at=1737832680.549, updated_at=1737832680.549), MessageNode(id='5c2d3bce-0168-4397-a819-0b7a8660ebc2', role='assistant', content='This is a test response.', ancestors=['da19281e-151f-4bc2-a0b9-3dc3a8e89bd4', '616ce2ae-2e63-4cc1-b513-68a0744ab261'], children=[], created_at=1737832680.549, updated_at=1737832680.556), MessageNode(id='3f6c5da2-e731-4efc-bd17-6ea58a8f0d51', role='user', content='test', ancestors=['da19281e-151f-4bc2-a0b9-3dc3a8e89bd4'], children=['17006a75-21cd-4feb-801c-6d8f7af490c8'], created_at=1737832681.122, updated_at=1737832681.122), MessageNode(id='17006a75-21cd-4feb-801c-6d8f7af490c8', role='assistant', content='This', ancestors=['da19281e-151f-4bc2-a0b9-3dc3a8e89bd4', '3f6c5da2-e731-4efc-bd17-6ea58a8f0d51'], children=[], created_at=1737832681.122, updated_at=1737832681.129)]
26.01.2025 01:18:25 - Soraka - DEBUG - This is a test response.
26.01.2025 01:18:25 - Soraka - INFO - Sending try1: This is a test response.
26.01.2025 01:18:25 - Soraka - INFO - Sending successful

If I then send a message to the same chat, it continues from the second branch, and again two new branches are created: the answer is returned from the first one, while the ungenerated text ends up in the second.
Before sending, I always switch the conversation (I have to, since my bot runs in a group chat). It is switched as shown below, and the ID (conv_id) does not change between the first and second message in this test.

conv = hugchat.Conversation(id=conv_id)
self.chatbot.change_conversation(conv)
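
Put together, the per-message flow in my bot looks roughly like this (a sketch only; the ask helper name is illustrative, and conv_id / send_text come from the bot's own state):

# Sketch of the per-message flow (conv_id is stored per group chat).
def ask(chatbot, conv_id, send_text):
    conv = hugchat.Conversation(id=conv_id)
    chatbot.change_conversation(conv)  # switch to this group's conversation
    # one request per message, yet two branches are created on the server
    return chatbot.chat(send_text).wait_until_done()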

Interface: [screenshot]

Hugging Face chat:
first branch of the second branch: [screenshot]
second branch of the second branch: [screenshot]

logs:

26.01.2025 01:29:23 - Soraka - DEBUG - [[User(first_name='', id=, is_bot=False, is_premium=, language_code='ru', username=''), 'test 2']]
26.01.2025 01:29:23 - root - DEBUG - message_id: 17006a75-21cd-4feb-801c-6d8f7af490c8
26.01.2025 01:29:23 - urllib3.connectionpool - DEBUG - https://huggingface.co:443 "POST /chat/conversation/679563113dc126191915d01c HTTP/1.1" 200 None
26.01.2025 01:29:24 - urllib3.connectionpool - DEBUG - https://huggingface.co:443 "POST /chat/conversation/679563113dc126191915d01c HTTP/1.1" 200 None
26.01.2025 01:29:24 - urllib3.connectionpool - DEBUG - Resetting dropped connection: huggingface.co
26.01.2025 01:29:24 - urllib3.connectionpool - DEBUG - https://huggingface.co:443 "GET /chat/conversation/679563113dc126191915d01c/__data.json?x-sveltekit-invalidated=01 HTTP/1.1" 200 None
26.01.2025 01:29:24 - root - DEBUG - conversation 679563113dc126191915d01c history: [MessageNode(id='da19281e-151f-4bc2-a0b9-3dc3a8e89bd4', role='system', content='', ancestors=[], children=['616ce2ae-2e63-4cc1-b513-68a0744ab261', '3f6c5da2-e731-4efc-bd17-6ea58a8f0d51'], created_at=1737832673.019, updated_at=1737832673.019), MessageNode(id='616ce2ae-2e63-4cc1-b513-68a0744ab261', role='user', content='test', ancestors=['da19281e-151f-4bc2-a0b9-3dc3a8e89bd4'], children=['5c2d3bce-0168-4397-a819-0b7a8660ebc2'], created_at=1737832680.549, updated_at=1737832680.549), MessageNode(id='5c2d3bce-0168-4397-a819-0b7a8660ebc2', role='assistant', content='This is a test response.', ancestors=['da19281e-151f-4bc2-a0b9-3dc3a8e89bd4', '616ce2ae-2e63-4cc1-b513-68a0744ab261'], children=[], created_at=1737832680.549, updated_at=1737832680.556), MessageNode(id='3f6c5da2-e731-4efc-bd17-6ea58a8f0d51', role='user', content='test', ancestors=['da19281e-151f-4bc2-a0b9-3dc3a8e89bd4'], children=['17006a75-21cd-4feb-801c-6d8f7af490c8'], created_at=1737832681.122, updated_at=1737832681.122), MessageNode(id='17006a75-21cd-4feb-801c-6d8f7af490c8', role='assistant', content='This', ancestors=['da19281e-151f-4bc2-a0b9-3dc3a8e89bd4', '3f6c5da2-e731-4efc-bd17-6ea58a8f0d51'], children=['3f488e67-80b1-4c7b-8828-4573b9261322', '1541c013-8496-4acb-bdc9-b8cf1faa3bed'], created_at=1737832681.122, updated_at=1737832681.129), MessageNode(id='3f488e67-80b1-4c7b-8828-4573b9261322', role='user', content='test 2', ancestors=['da19281e-151f-4bc2-a0b9-3dc3a8e89bd4', '3f6c5da2-e731-4efc-bd17-6ea58a8f0d51', '17006a75-21cd-4feb-801c-6d8f7af490c8'], children=['3c3239ee-4039-403f-b9b9-f6cb916cad14'], created_at=1737833340.152, updated_at=1737833340.152), MessageNode(id='3c3239ee-4039-403f-b9b9-f6cb916cad14', role='assistant', content='Hello! How can I help you today?', ancestors=['da19281e-151f-4bc2-a0b9-3dc3a8e89bd4', '3f6c5da2-e731-4efc-bd17-6ea58a8f0d51', '17006a75-21cd-4feb-801c-6d8f7af490c8', '3f488e67-80b1-4c7b-8828-4573b9261322'], children=[], created_at=1737833340.152, updated_at=1737833340.16), MessageNode(id='1541c013-8496-4acb-bdc9-b8cf1faa3bed', role='user', content='test 2', ancestors=['da19281e-151f-4bc2-a0b9-3dc3a8e89bd4', '3f6c5da2-e731-4efc-bd17-6ea58a8f0d51', '17006a75-21cd-4feb-801c-6d8f7af490c8'], children=['d627b7cf-49ac-47ea-80c7-342706b563b5'], created_at=1737833340.842, updated_at=1737833340.842), MessageNode(id='d627b7cf-49ac-47ea-80c7-342706b563b5', role='assistant', content='This is a', ancestors=['da19281e-151f-4bc2-a0b9-3dc3a8e89bd4', '3f6c5da2-e731-4efc-bd17-6ea58a8f0d51', '17006a75-21cd-4feb-801c-6d8f7af490c8', '1541c013-8496-4acb-bdc9-b8cf1faa3bed'], children=[], created_at=1737833340.842, updated_at=1737833340.85)]
26.01.2025 01:29:24 - Soraka - DEBUG - Hello! How can I help you today?
26.01.2025 01:29:24 - Soraka - INFO - Sending try1: Hello! How can I help you today?
26.01.2025 01:29:25 - Soraka - INFO - Sending successful
Zotic added the bug label on Jan 25, 2025

Hi! Thanks for your issue, we will deal with your issue as soon as possible.


Zotic commented Jan 30, 2025

Update:
This happens with all models except the new one, DeepSeek.
Most likely this is related to the site's new "reasoning" structure, which describes how the model thinks. In other words, two response generations are created, and for some reason your API always triggers the creation of the second response for all models.


Zotic commented Feb 1, 2025

I found out what the problem is. In your code (hugchat.py, line 706) there is a loop that repeats the generation request if any errors occur. In it, generation is considered complete only when the server sends a response of the "finalAnswer" type. Judging by the responses that actually arrive (logs below), this type of response is never sent, so the program issues the generation request again. You do have a safeguard against this, so generation eventually ends, but the problem is that repeating the generation request creates a new dialogue branch, which causes problems later. I fixed this by simply disabling the retry attempts; I realize this is not very safe, but it will do as a temporary solution. I hope that after the holidays you will come up with a new way to detect the end of generation. Happy New Year!

logs:

01.02.2025 21:11:04 - root - DEBUG - message_id: 0e5e8052-33fe-4b07-adae-ed8691af5271
01.02.2025 21:11:04 - urllib3.connectionpool - DEBUG - https://huggingface.co:443 "POST /chat/conversation/679e63972fd2f02cdb7d254f HTTP/1.1" 200 None
01.02.2025 21:11:04 - root - DEBUG - {"type":"status","status":"started"}
01.02.2025 21:11:04 - root - DEBUG - {"type":"status","status":"keepAlive"}
01.02.2025 21:11:04 - root - DEBUG - {"type":"title","title":"New Chat"}
01.02.2025 21:11:04 - root - DEBUG - {"type":"status","status":"keepAlive"}
01.02.2025 21:11:04 - root - DEBUG - {"type":"stream","token":"[\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000"}
01.02.2025 21:11:04 - root - DEBUG - {"type":"stream","token":"Sor\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000"}
01.02.2025 21:11:04 - root - DEBUG - {"type":"stream","token":"aka\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000"}
01.02.2025 21:11:04 - root - DEBUG - {"type":"stream","token":"]:\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000"}
01.02.2025 21:11:04 - root - DEBUG - {"type":"status","status":"keepAlive"}
01.02.2025 21:11:04 - root - DEBUG - {"type":"stream","token":" Да\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000"}
01.02.2025 21:11:04 - root - DEBUG - {"type":"stream","token":".\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000\u0000"}
01.02.2025 21:11:05 - urllib3.connectionpool - DEBUG - https://huggingface.co:443 "POST /chat/conversation/679e63972fd2f02cdb7d254f HTTP/1.1" 200 None
01.02.2025 21:11:05 - root - DEBUG - {"type":"status","status":"started"}
01.02.2025 21:11:05 - urllib3.connectionpool - DEBUG - Resetting dropped connection: huggingface.co
01.02.2025 21:11:05 - urllib3.connectionpool - DEBUG - https://huggingface.co:443 "GET /chat/conversation/679e63972fd2f02cdb7d254f/__data.json?x-sveltekit-invalidated=01 HTTP/1.1" 200 None
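
To illustrate the logic described above (this is only a sketch of the pattern, not the actual hugchat source; query_with_retries and its parameters are made up for the example): the loop treats only a "finalAnswer" event as success, so a stream that ends without one triggers another attempt, and every attempt POSTs the prompt again and therefore adds a new branch.

# Illustrative sketch only, NOT the actual hugchat code: the shape of the
# retry logic described above and the temporary workaround.
def query_with_retries(send_request, max_retries=5):
    for _ in range(max_retries):
        finished = False
        for event in send_request():  # each call POSTs the prompt again,
            if event.get("type") == "finalAnswer":  # so every retry adds a branch
                finished = True
            yield event
        if finished:  # only a "finalAnswer" event counts as success
            return
    raise Exception("Failed to generate a response")

# Temporary workaround used on my side: disable the retries, i.e. return after
# the first pass even without a "finalAnswer" event instead of sending again.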
