I'm requesting support for chunk ranking in the file search tool when using openai-php/laravel. Currently, file search returns every result it deems relevant, which can degrade response quality when the model draws on low-relevance content. It would be useful to expose chunk ranking configuration in the `file_search` tool so that only highly relevant chunks are used.
The expected functionality would allow:

- Inspecting file search chunks: using parameters like `include` to retrieve the specific file chunks used during a response generation run.
- Configurable chunk ranking: adjusting settings such as:
  - `ranker`: which ranker to use, e.g. `auto` or `default_2024_08_21`.
  - `score_threshold`: a value between 0.0 and 1.0 that filters file chunks by their relevance score, improving the quality of responses.
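As a sketch of what the underlying OpenAI API accepts (shown here as a raw payload in Python, since this is not yet configurable through openai-php/laravel — the exact PHP interface would be up to the maintainers), a `file_search` tool definition with ranking options might look like this:

```python
# Sketch of a file_search tool definition with ranking options, as accepted
# by the OpenAI Assistants API. The threshold value of 0.5 is an arbitrary
# example; valid values range from 0.0 to 1.0.
file_search_tool = {
    "type": "file_search",
    "file_search": {
        "ranking_options": {
            "ranker": "default_2024_08_21",  # or "auto"
            "score_threshold": 0.5,          # drop chunks scoring below 0.5
        },
    },
}

# This definition would be passed in the "tools" array when creating or
# updating an assistant / run.
assert 0.0 <= file_search_tool["file_search"]["ranking_options"]["score_threshold"] <= 1.0
```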
For example, in the OpenAI API, you can inspect the file chunks during a run as follows:
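A minimal sketch of the request involved, shown as the raw HTTP endpoint and query parameters rather than any particular SDK call (the thread and run IDs below are placeholders):

```python
# Sketch of the request that retrieves run steps with the file chunks each
# file_search call used embedded in the response. IDs are placeholders.
thread_id = "thread_abc123"
run_id = "run_abc123"

url = f"https://api.openai.com/v1/threads/{thread_id}/runs/{run_id}/steps"
params = {
    # Ask the API to include the file search result content for each step.
    "include[]": "step_details.tool_calls[*].file_search.results[*].content",
}
```

Exposing the `include` parameter in the same way from openai-php/laravel would let users inspect which chunks actually influenced a response.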
This feature would significantly enhance the precision of responses generated from file searches. It would be great if this could be incorporated into future releases.
Thank you!