Unsuppressable warning: "<model> will not detect padding tokens in inputs_embeds" #30871

Open
System Info

transformers version: 4.39.3

Who can help?

No response
Information

Tasks

Reproduction
Running this example prints the warning on every loop iteration, despite there being no padding.
Output:
Expected behavior
The warning should be printed only once, or there should at least be some way to disable it.
I was split between whether this should be a bug report or a feature request. It makes sense to display this warning, but in my project I need to run on embeddings often, and the warning really spams the logs.

For a while I was only running batches of size 1, so I was suppressing the warning by temporarily setting model.config.pad_token_id = None. The problem with this is that I then can't run batches of size > 1, even if I'm careful to make them the same length with no padding tokens.

I'm not sure of the best way to handle this, but either using the warnings library to make it only print once and/or allow it to be suppressed would help, or having some flag to disable the warning.

The earliest instance of the string "will not detect padding tokens in" in the codebase I could find was from #7501.
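The temporary pad_token_id workaround described above can be wrapped in a context manager so the config is always restored. This is a sketch, not a transformers API: the helper name `suppress_pad_check` is my own, and `model` is assumed to be any transformers model exposing `config.pad_token_id`.

```python
from contextlib import contextmanager


@contextmanager
def suppress_pad_check(model):
    """Temporarily unset pad_token_id so the padding-detection warning is skipped.

    Sketch of the workaround from this issue; `suppress_pad_check` is a
    hypothetical helper, not part of transformers.
    """
    saved = model.config.pad_token_id
    model.config.pad_token_id = None
    try:
        yield model
    finally:
        # Restore so later calls with genuinely padded batches still work.
        model.config.pad_token_id = saved
```

As noted above, this only helps for batches that truly contain no padding; with pad_token_id unset, a padded batch of size > 1 would be mishandled inside the `with` block.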
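Until there is a built-in flag, one way to approximate "print once" is a logging filter, sketched here under the assumption that transformers emits this message through the standard logging module and that its logger names follow the module path (the `transformers.models.gpt2.modeling_gpt2` name below is an example — substitute the modeling module of your model).

```python
import logging


class WarnOnceFilter(logging.Filter):
    """Pass records whose message contains `fragment` only the first time."""

    def __init__(self, fragment):
        super().__init__()
        self.fragment = fragment
        self.seen = False

    def filter(self, record):
        if self.fragment in record.getMessage():
            if self.seen:
                return False  # drop every repeat of the matching warning
            self.seen = True
        return True  # unrelated messages (and the first match) pass through


# Attach to the specific modeling-module logger: logger-level filters are not
# inherited by child loggers, so the exact name matters. The name here is an
# assumption for illustration.
logging.getLogger("transformers.models.gpt2.modeling_gpt2").addFilter(
    WarnOnceFilter("will not detect padding tokens in")
)
```

This keeps the first occurrence (so the warning still surfaces once) while silencing the per-iteration repeats described in the reproduction.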