System Info

transformers-cli env

Copy-and-paste the text below in your GitHub issue and FILL OUT the two last points.
- `transformers` version: 4.48.2
- Platform: Linux-5.15.167.4-microsoft-standard-WSL2-x86_64-with-glibc2.35
- Python version: 3.10.12
- Huggingface_hub version: 0.28.1
- Safetensors version: 0.5.2
- Accelerate version: 1.3.0
- Accelerate config: not found
- PyTorch version (GPU?): 2.5.1+cu124 (True)
- Tensorflow version (GPU?): not installed (NA)
- Flax version (CPU?/GPU?/TPU?): not installed (NA)
- Jax version: not installed
- JaxLib version: not installed
- Using distributed or parallel set-up in script?: <fill in>
- Using GPU in script?: <fill in>
- GPU type: NVIDIA GeForce RTX 4070 Laptop GPU
Who can help?

No response

Information

- The official example scripts
- My own modified scripts

Tasks

- An officially supported task in the examples folder (such as GLUE/SQuAD, ...)

Reproduction
I was broadly following the steps outlined in this project's example: https://github.com/aehrc/cxrmate/blob/main/examples/cxrmate.ipynb
pip install --force-reinstall transformers==4.48.2
I found the fix in my case was to change lines 384-386 of transformers/generation/utils.py.
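Assuming the change concerns whether past_key_values is always written into model_inputs (which the expected behaviour below suggests), here is a minimal, self-contained sketch of that difference using stand-in functions, not the actual transformers code:

```python
# Stand-in functions only -- NOT the real GenerationMixin.prepare_inputs_for_generation
# from transformers/generation/utils.py; they just mimic the two behaviours discussed.

def prepare_inputs_conditional(past_key_values=None):
    # Conditional assignment: the key is absent whenever there is no cache object.
    model_inputs = {}
    if past_key_values is not None:
        model_inputs["past_key_values"] = past_key_values
    return model_inputs

def prepare_inputs_unconditional(past_key_values=None):
    # Unconditional assignment: the key is always present, defaulting to None.
    return {"past_key_values": past_key_values}

print(prepare_inputs_conditional())    # {} -> downstream indexing raises KeyError
print(prepare_inputs_unconditional())  # {'past_key_values': None}
```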
Expected behavior
I would expect a default value to be assigned to the model_inputs["past_key_values"] variable, as is assumed by the cxrmate package.
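For illustration, here is a hypothetical downstream consumer in the style of a custom prepare_inputs_for_generation override; this is not the actual cxrmate code, just the failure mode that an always-present (even if None) key would avoid:

```python
# Hypothetical consumer standing in for cxrmate's generation-time code;
# not the package's actual implementation.
def downstream_step(model_inputs: dict):
    # Direct indexing assumes the key exists, even when its value is None.
    return model_inputs["past_key_values"]

print(downstream_step({"past_key_values": None}))  # fine: prints None

try:
    downstream_step({})  # key was never set
except KeyError as err:
    print("missing key:", err)  # missing key: 'past_key_values'
```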