-
I'm trying to get the raw response from the model provider in both streaming and non-streaming use cases. An example use case would be implementing an API mirror for the provider. I've tried a few methods, but what I'd really like is a response object that matches whatever the provider's own response is. I could manually process the stream myself (a sketch of what that would involve follows the example below), but I'd rather not. So as an example, in an OpenAI stream, I might get a response like:
"id": "chatcmpl-xxxxxxxxx",
"object": "chat.completion.chunk",
"created": 1719518600,
"model": "gpt-3.5-turbo",
"system_fingerprint": null,
"choices": [
{
"index": 0,
"delta": {
"content": "?"
},
"finish_reason": null
}
]
} Using
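Manually processing the stream would mean parsing the provider's server-sent events myself, roughly like the sketch below. It assumes the standard OpenAI `data: ...` / `[DONE]` SSE framing; the chunk type just mirrors the example above and isn't tied to any SDK.

```ts
// Minimal sketch: parse raw OpenAI-style SSE text into chunk objects like
// the example above. Assumes "data: {json}" framing and a "[DONE]" sentinel.
interface ChatCompletionChunk {
  id: string;
  object: string;
  created: number;
  model: string;
  system_fingerprint: string | null;
  choices: Array<{
    index: number;
    delta: { role?: string; content?: string };
    finish_reason: string | null;
  }>;
}

function parseSseChunks(rawText: string): ChatCompletionChunk[] {
  const chunks: ChatCompletionChunk[] = [];
  for (const line of rawText.split("\n")) {
    const trimmed = line.trim();
    if (!trimmed.startsWith("data:")) continue;   // skip blank lines and comments
    const payload = trimmed.slice("data:".length).trim();
    if (payload === "[DONE]") break;              // end-of-stream sentinel
    chunks.push(JSON.parse(payload) as ChatCompletionChunk);
  }
  return chunks;
}
```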
Replies: 1 comment 1 reply
-
One option would be to intercept the fetch call that goes out to the provider and read the raw response there.
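A minimal sketch of that interception, using only web-standard `fetch` and `Response.clone()` so the original body still reaches the SDK untouched (the `onRawChunk` callback name is just illustrative):

```ts
// Sketch: wrap a fetch implementation so every raw response body is also
// streamed to a callback, while the untouched response is returned for the
// SDK to consume as usual.
type FetchLike = typeof fetch;

function withRawCapture(
  baseFetch: FetchLike,
  onRawChunk: (chunk: string) => void,
): FetchLike {
  return async (input, init) => {
    const response = await baseFetch(input, init);
    const copy = response.clone();            // clone so both sides can read the body
    void (async () => {
      const reader = copy.body?.getReader();
      if (!reader) return;
      const decoder = new TextDecoder();
      while (true) {
        const { done, value } = await reader.read();
        if (done) break;
        onRawChunk(decoder.decode(value, { stream: true }));
      }
    })();
    return response;                          // hand the original response back
  };
}
```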
I ended up doing something like the below. Unfortunately, I couldn't figure out how to use the fetch to actually access the response:
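A rough sketch of how such a capturing fetch could be wired up, assuming the provider factory accepts a custom `fetch` option; `createProvider` is a hypothetical stand-in for that factory, and `withRawCapture` / `parseSseChunks` are the helpers sketched earlier:

```ts
// Hypothetical wiring: createProvider stands in for an SDK factory that
// accepts a custom fetch implementation.
declare function createProvider(options: { fetch: typeof fetch }): unknown;

const rawChunks: string[] = [];

// Every request the provider makes also feeds its raw body text here.
const capturingFetch = withRawCapture(fetch, (chunk) => {
  rawChunks.push(chunk);
});

const provider = createProvider({ fetch: capturingFetch });

// After a streaming call finishes, the provider-format chunk objects can be
// rebuilt from the captured SSE text.
const providerChunks = parseSseChunks(rawChunks.join(""));
```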