LlavaNextForConditionalGeneration.forward() got an unexpected keyword argument 'token_idx' #1708
Comments
Hi @DavidAbrahamyan! This happens because [explanation snippet missing from the page capture]. And you can remove

```python
from optimum.habana.transformers.models.llava_next import GaudiLlavaNextForConditionalGeneration
```

After that, your script returns a new error related to the number of image tokens. I guess you took inspiration from this example, right? https://github.com/huggingface/optimum-habana/blob/main/examples/image-to-text/run_pipeline.py
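[Editorial note] The fix above points at how optimum-habana integrates with transformers: instead of importing Gaudi-specific model classes directly, the library patches the stock transformers classes in place so that their `forward()` methods accept Gaudi-only arguments such as `token_idx`. The following is a toy sketch of that monkey-patching idea — the class and function names here are illustrative assumptions, not the real optimum-habana API:

```python
# Toy illustration of in-place class adaptation (monkey-patching).
# ToyModel / toy_gaudi_forward / adapt_toy_model are hypothetical names,
# standing in for a stock transformers class and its Gaudi-adapted forward.

class ToyModel:
    """Stands in for a stock transformers model class."""
    def forward(self, input_ids, attention_mask=None):
        # No token_idx parameter: passing one raises TypeError.
        return {"input_ids": input_ids}

def toy_gaudi_forward(self, input_ids, attention_mask=None, token_idx=None):
    # The adapted forward accepts the Gaudi-specific token_idx argument.
    return {"input_ids": input_ids, "token_idx": token_idx}

def adapt_toy_model():
    # Patch the method on the class itself, so every existing and future
    # instance picks up the Gaudi-aware signature.
    ToyModel.forward = toy_gaudi_forward

model = ToyModel()
adapt_toy_model()
out = model.forward([1, 2, 3], token_idx=2)
print(out["token_idx"])  # 2
```

Because the patch is applied to the class, user code keeps instantiating the familiar transformers class name and never needs the `Gaudi...` import, which is why removing the direct import is part of the fix.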
Thanks a lot for your response! I tried doing as you said; however, I now encounter the following issue:

[error output missing from the page capture]

So it seems there is a memory allocation issue. However, considering that I am able to run inference using the pipeline class from transformers, I find it strange to hit such an issue. Do you happen to know the reason for that? Thanks a lot in advance.
@DavidAbrahamyan Can you try to install the lib from the main branch with [command missing from the page capture], please?
I am trying to run inference with Llava Next.
Here is my code:

[script missing from the page capture]
While running this, I get the following error:
```
TypeError: LlavaNextForConditionalGeneration.forward() got an unexpected keyword argument 'token_idx'
```
Any idea what is causing this problem? Thanks in advance.
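[Editorial note] The traceback itself is ordinary Python behavior: `generate()` forwards extra keyword arguments down to the model's `forward()`, and a `forward()` whose signature declares neither `token_idx` nor a `**kwargs` catch-all raises `TypeError`. A minimal reproduction, using a toy `forward` function (not the transformers API) and the standard-library `inspect.signature` as a generic way to check which keywords a callable accepts:

```python
import inspect

def forward(input_ids, attention_mask=None):
    # Toy stand-in: no token_idx parameter and no **kwargs catch-all,
    # mirroring the stock (non-Gaudi) forward() signature.
    return input_ids

kwargs = {"attention_mask": [1, 1], "token_idx": 3}

try:
    forward([1, 2], **kwargs)
except TypeError as exc:
    # forward() got an unexpected keyword argument 'token_idx'
    print(exc)

# Inspect which keywords the callable actually accepts,
# and drop the ones it does not.
accepted = set(inspect.signature(forward).parameters)
filtered = {k: v for k, v in kwargs.items() if k in accepted}
print(forward([1, 2], **filtered))  # [1, 2]
```

This is why the mismatch only shows up when a Gaudi-aware generation loop (which injects `token_idx`) is combined with an un-adapted model class.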