Fix for Falcon image-to-text crash #1760

Merged — 1 commit merged into transformers_future from schoi/hs4880 on Feb 12, 2025
Conversation

schoi-habana (Collaborator)

  File "/root/optimum-habana/examples/image-to-text/run_pipeline.py", line 414, in <module>
    main()
  File "/root/optimum-habana/examples/image-to-text/run_pipeline
  .py", line 375, in main
    generator(images, prompt=args.prompt, batch_size=args.batch_size, generate_kwargs=generate_kwargs)
  File "/usr/local/lib/python3.10/dist-packages/transformers/pipelines/image_to_text.py", line 137, in call
    return super().call(inputs, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/transformers/pipelines/base.py", line 1343, in call
    outputs = list(final_iterator)
  File "/usr/local/lib/python3.10/dist-packages/transformers/pipelines/pt_utils.py", line 124, in next
    item = next(self.iterator)
  File "/usr/local/lib/python3.10/dist-packages/transformers/pipelines/pt_utils.py", line 124, in next
    item = next(self.iterator)
  File "/usr/local/lib/python3.10/dist-packages/torch/utils/data/dataloader.py", line 708, in next
    data = self._next_data()
  File "/usr/local/lib/python3.10/dist-packages/torch/utils/data/dataloader.py", line 764, in _next_data
    data = self._dataset_fetcher.fetch(index)  # may raise StopIteration
  File "/usr/local/lib/python3.10/dist-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/usr/local/lib/python3.10/dist-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
    data = [self.dataset[idx] for idx in possibly_batched_index]
  File "/usr/local/lib/python3.10/dist-packages/transformers/pipelines/pt_utils.py", line 19, in getitem
    processed = self.process(item, **self.params)
  File "/root/optimum-habana/examples/image-to-text/run_pipeline.py", line 368, in preprocess
    model_inputs = processor(images=image, text=prompt, return_tensors=self.framework, **kwargs)
  File "/usr/local/lib/python3.10/dist-packages/transformers/models/llava_next/processing_llava_next.py", line 162, in call
    num_image_tokens = self._get_number_of_features(orig_height, orig_width, height, width)
  File "/usr/local/lib/python3.10/dist-packages/transformers/models/llava_next/processing_llava_next.py", line 181, in _get_number_of_features
    patches_height = height // self.patch_size
TypeError: unsupported operand type(s) for //: 'int' and 'NoneType' ```

This happens because of a change in LlavaNextProcessor in Transformers. In Transformers 4.45.2, the processor skipped the image-token computation when self.patch_size was None, but later versions no longer skip it. Other Llava-Next models don't hit this error, so the fix is limited to falcon-11b-vlm.
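For reference, below is a minimal sketch of the kind of guard that avoids the crash, assuming the falcon-11b-vlm processor is loaded without patch_size set; the model name and attribute lookups are illustrative and this is not necessarily the exact change made in this PR.

```python
# Newer LlavaNextProcessor versions compute the number of image tokens with
# height // self.patch_size, which raises a TypeError when patch_size is None.
# Filling the value from the model's vision config before running the pipeline
# avoids that division by None. Illustrative sketch only.
from transformers import AutoConfig, AutoProcessor

model_name = "tiiuae/falcon-11B-vlm"  # assumed checkpoint name
processor = AutoProcessor.from_pretrained(model_name)

if getattr(processor, "patch_size", None) is None:
    config = AutoConfig.from_pretrained(model_name)
    # LLaVA-NeXT style configs keep the ViT patch size under vision_config
    processor.patch_size = config.vision_config.patch_size
```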

## Before submitting
- [ ] This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
- [ ] Did you make sure to update the documentation with your changes?
- [ ] Did you write any new necessary tests?

@libinta added the run-test (Run CI for PRs from external contributors) and transformers_future labels on Feb 10, 2025
@regisss (Collaborator) left a comment

LGTM!

@regisss merged commit 595b816 into transformers_future on Feb 12, 2025
1 check passed
@regisss deleted the schoi/hs4880 branch on February 12, 2025 09:35