[Bug Fix] fix qa pipeline tensor to numpy #31585
base: main
Conversation
@@ -118,7 +118,7 @@ def select_starts_ends(
         max_answer_len (`int`): Maximum size of the answer to extract from the model's output.
     """
     # Ensure padded tokens & question tokens cannot belong to the set of candidate answers.
-    undesired_tokens = np.abs(np.array(p_mask) - 1)
+    undesired_tokens = np.abs(p_mask.numpy() - 1)
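For context, this line inverts `p_mask` so that 1 marks positions that *may* belong to the answer. A minimal sketch of the computation with a hypothetical mask as a plain Python list (in the pipeline itself, `p_mask` is a framework tensor):

```python
import numpy as np

# Hypothetical p_mask: 1 marks tokens that cannot be part of the answer
# (question / padding tokens), 0 marks candidate context tokens.
p_mask = [1, 1, 1, 0, 0, 0, 1]

# Inverting the mask: 1 now marks positions allowed as answer candidates.
undesired_tokens = np.abs(np.array(p_mask) - 1)
print(undesired_tokens.tolist())  # [0, 0, 0, 1, 1, 1, 0]
```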
Does this still work if you run the pipeline in jax?
from transformers import pipeline
pipe = pipeline("question-answering", model="hf-internal-testing/tiny-random-bert", framework="flax")
question = "What's my name?"
context = "My Name is Sasha and I live in Lyon."
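If JAX compatibility is a concern, one way to make the conversion robust across frameworks is a small dispatch helper. This is a hypothetical sketch, not part of this PR: it assumes torch/TF eager tensors expose `.numpy()`, while other array-likes (JAX arrays, plain lists) can go through `np.asarray`, which uses the `__array__` protocol when available.

```python
import numpy as np

def to_numpy(x):
    """Hypothetical helper: convert a framework tensor or array-like to numpy.

    torch/tf eager tensors provide a .numpy() method; anything else
    (jax arrays, plain lists) falls back to np.asarray.
    """
    if hasattr(x, "numpy"):
        return x.numpy()
    return np.asarray(x)

# Works for a plain Python list as well as framework tensors.
mask = to_numpy([1, 0, 1])
```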
It will raise a ValueError:
ValueError: Pipeline cannot infer suitable model classes from hf-internal-testing/tiny-random-bert
Besides, tensor.numpy()
is already used in other pipelines, such as ASR
Hi @Narsil @amyeroberts
This PR fixes the error in the question-answering pipeline. The error can be reproduced with the snippet above.
Traceback: