
Conversation

@st81 st81 (Contributor) commented Oct 20, 2025

What does this PR do?

This PR replaces Tensor.new_tensor() calls to suppress a warning that users cannot address. This lets users avoid being confused by warnings they cannot act on, making it easier for them to focus on meaningful ones.

More specifically, when calling the EncoderDecoderModel forward method, users see a warning like the following:

from transformers import EncoderDecoderModel, AutoTokenizer

tok = AutoTokenizer.from_pretrained("bert-base-uncased")
model = EncoderDecoderModel.from_encoder_decoder_pretrained("bert-base-uncased", "bert-base-uncased")
model.config.decoder_start_token_id = tok.cls_token_id
model.config.pad_token_id = tok.pad_token_id

src = tok("hello world", return_tensors="pt")
tgt = tok("hi there", return_tensors="pt").input_ids
labels = tgt.clone()
labels[labels == tok.pad_token_id] = -100  # mask pad tokens so they are ignored by the loss

out = model(
    input_ids=src["input_ids"],
    attention_mask=src["attention_mask"],
    labels=labels,
)
/home/shutotakahashi/projects/transformers-uv/transformers/src/transformers/models/encoder_decoder/modeling_encoder_decoder.py:453:
UserWarning: To copy construct from a tensor, it is recommended to use sourceTensor.detach().clone() or sourceTensor.detach().clone().requires_grad_(True), rather than tensor.new_tensor(sourceTensor).
  decoder_attention_mask = decoder_input_ids.new_tensor(decoder_input_ids != self.config.pad_token_id)

This warning is triggered by the use of Tensor.new_tensor() in the library's internal code and cannot be resolved by users.

The fix is functionally equivalent because new_tensor() creates a new tensor with the same dtype as the original tensor, which the replacement preserves with an explicit cast. Additionally, I think the fixed version is semantically clearer, as it explicitly shows that decoder_attention_mask is created through a tensor comparison followed by a type cast; see the sketch below.
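For illustration, here is a minimal, self-contained sketch of the old and new patterns (the pad_token_id value and decoder_input_ids tensor below are made up; the actual change lives in modeling_encoder_decoder.py):

import torch

# Hypothetical inputs, for illustration only.
pad_token_id = 0
decoder_input_ids = torch.tensor([[101, 7592, 102, 0, 0]])

# Old pattern: copy-constructs via new_tensor() and emits the UserWarning.
# decoder_attention_mask = decoder_input_ids.new_tensor(decoder_input_ids != pad_token_id)

# New pattern: boolean comparison followed by an explicit cast back to the
# input's dtype; same values and dtype, no copy-construct warning.
decoder_attention_mask = (decoder_input_ids != pad_token_id).to(decoder_input_ids.dtype)
print(decoder_attention_mask)  # tensor([[1, 1, 1, 0, 0]])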

Before submitting

  • This PR fixes a typo or improves the docs (you can dismiss the other checks if that's the case).
  • Did you read the contributor guideline, Pull Request section?
  • Was this discussed/approved via a GitHub issue or the forum? Please add a link to it if that's the case.
  • Did you make sure to update the documentation with your changes? Here are the documentation guidelines, and here are tips on formatting docstrings.
  • Did you write any new necessary tests?

Who can review?

@st81's comment here has been minimized.

@st81 st81 force-pushed the reduce_warning_noise_caused_by_new_tensor branch from ca20e5c to 89277c0 on October 21, 2025 at 00:14
@github-actions (Contributor) commented:

[For maintainers] Suggested jobs to run (before merge)

run-slow: encoder_decoder

@Rocketknight1 Rocketknight1 (Member) left a comment:

Yeah, this is a more idiomatic way to do it, too. Thank you!

@Rocketknight1 Rocketknight1 enabled auto-merge (squash) October 21, 2025 11:46
@Rocketknight1 Rocketknight1 merged commit c4e88f7 into huggingface:main Oct 21, 2025
18 checks passed
@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

ngazagna-qc pushed a commit to ngazagna-qc/transformers that referenced this pull request Oct 23, 2025
