
StreamData response Lags #1963

Open
lightify97 opened this issue Jun 15, 2024 · 7 comments
Assignees: lgrammel
Labels: ai/ui, bug


lightify97 commented Jun 15, 2024

Description

Hi! I'm experiencing two issues with the stream data response:

  • The stream data arrives in `sources`, but the same entry is repeated multiple times as new items in the returned array.

(screenshot)

  • The UI becomes unresponsive for a long time in both modes, as shown in the recording below. This is running a production build. Previously, with `chatMode` set to text and the endpoint not adding any streamData, it worked perfectly with no issues.

(screen recording)

Versions:
"ai": "^3.1.30",
"@ai-sdk/openai": "^0.0.9",
"next": "14.1.4",
"@langchain/community": "^0.0.53",
"@langchain/core": "^0.1.61",
"@langchain/openai": "^0.0.28",
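As a hypothetical client-side workaround for the duplicated entries (not part of the SDK; `dedupeStreamData` is a made-up helper name), the `sources` array could be deduplicated before rendering:

```typescript
// Hypothetical workaround: drop repeated StreamData entries by serializing
// each entry and keeping only the first occurrence.
export function dedupeStreamData<T>(entries: T[]): T[] {
  const seen = new Set<string>();
  const result: T[] = [];
  for (const entry of entries) {
    const key = JSON.stringify(entry);
    if (!seen.has(key)) {
      seen.add(key);
      result.push(entry);
    }
  }
  return result;
}
```

This only masks the symptom on the client; the duplicate appends themselves still happen server-side.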

Code example

export async function POST(req: NextRequest, res: NextResponse) {
  try {
    const body = await req.json();
    const { question, chatId } = body;
    const messages = body.messages ?? [];
    const previousMessages = messages.slice(0, -1);
    const currentMessageContent = messages[messages.length - 1].content;


    if (typeof currentMessageContent !== "string") {
      const result = await streamText({
        model: openai("gpt-4o"),
        messages: [
          {
            role: 'user',
            content: [
              { type: 'text', text: ANSWER_TEMPLATE + currentMessageContent[0].text },
              {
                type: 'image',
                image: currentMessageContent[1].image,
              },
            ],
          },
        ],
      });
      return new StreamingTextResponse(result.textStream, {
        headers: {
          "x-message-index": (previousMessages.length + 1).toString(),
        },
      });
    }

    const contextCall = await fetch(
      `https://getprompt${process.env.FUNCTIONS_BASE_URL}`,
      {
        method: "POST",
        body: JSON.stringify({ question: currentMessageContent, chatId }),
        headers: {
          "Content-Type": "application/json",
        },
      },
    );

    const outputParser = new BytesOutputParser()
    // const outputParser = new StringOutputParser()
    const model = new ChatOpenAI({
      apiKey: process.env.OPENAI_API_KEY,
      modelName: "gpt-4o",
      temperature: 0.5,
      streaming: true,
    });

    const context = (await contextCall.json());

    const chain = await answerPrompt.pipe(model).stream({
      question: currentMessageContent,
      context,
      chat_history: formatVercelMessages(previousMessages),
    })


    // Create a new StreamData
    const data = new StreamData();
    data.append(transformData(context));

    const aiStream = LangChainAdapter.toAIStream(chain, {
      onFinal(completion) {
        data.close();
      },
    });

    return new StreamingTextResponse(aiStream, {
      headers: {
        "x-message-index": (previousMessages.length + 1).toString(),
      },
    }, data);

  } catch (error: any) {
    console.error(error);
    return NextResponse.json({
      status: 500,
      error: "Something went wrong. Please try again later.",
    });
  }
}

Here's part of the client-side component:

"use client";
import { useAppContext } from "@/lib/hooks/useAppContext";
import { Spinner } from "@nextui-org/react";
import { Message } from "ai";
import { useChat } from "ai/react";
import { useEffect, useRef, useState } from "react";
import ChatInput from "./ChatInput";
import ChatMessage from "./ChatMessage";

const ChatWindow = ({ id }: { id: string }) => {
  // const [messages, setMessages] = useState<Message[]>([]);
  const [intermediateLoading, setIntermediateLoading] = useState(false);
  const [initializing, setInitializing] = useState(true);
  const messageInputRef = useRef<HTMLElement | null>(null);
  const [followUpQuestions, setFollowUpQuestions] = useState<string[]>([]);
  const { state, dispatch } = useAppContext();

  function getFollowUpQuestions(text: string) {
    fetch("/api/chat/followup", {
      method: "POST",
      body: JSON.stringify({ answer: text }),
      headers: {
        "Content-Type": "application/json",
      },
    })
      .then((res) => res.json())
      .then((data) => {
        setFollowUpQuestions(data.followupQuestions);
      });
  }

  async function syncMessage(message: any) {
    fetch("/api/chat/sync", {
      method: "POST",
      body: JSON.stringify({ chatId: id, message }),
      headers: {
        "Content-Type": "application/json",
      },
    })
      .then((res) => res.json())
      .then((data) => {});
  }

  const {
    messages,
    setMessages,
    handleSubmit,
    input,
    setInput,
    isLoading,
    error,
    handleInputChange,
    append,
    reload,
    data: sources,
  } = useChat({
    body: { chatId: id },
    streamMode: "stream-data",
    onResponse(response) {
      setIntermediateLoading(false);
      messageInputRef.current?.scrollIntoView({ behavior: "smooth" });
    },
    async onFinish(response) {
      console.log(sources);
      setIntermediateLoading(false);
      messageInputRef.current?.scrollIntoView({ behavior: "smooth" });
      syncMessage(response);
      getFollowUpQuestions(response.content);
    },
  });

  useEffect(() => {
    if (state.question) {
      append(state.question);
      syncMessage(state.question);
      dispatch({ type: "CLEAR_QUESTION" });
    }
  }, [state.question]);

  useEffect(() => {
    messageInputRef.current = document.getElementById("message-input");
    const fetchSummary = async () => {
      const res = await fetch(`/api/chat?id=${id}`);
      const data = await res.json();
      const initialMessage: Message = {
        id: "0",
        role: "assistant",
        content: data.data.summary,
        createdAt: new Date(),
      };
      if (data.data?.messages) {
        setMessages([initialMessage, ...data.data?.messages]);
        getFollowUpQuestions(
          data.data.messages[data.data.messages.length - 1].content,
        );
      } else {
        setMessages([initialMessage]);
        setFollowUpQuestions(data.data.followupQuestions);
      }
      setInitializing(false);
    };

    fetchSummary();
  }, []);

  const sendMessage = async (e: any) => {
    console.log("Sending message with event: ", e);
    // create a new message object
    let message = {
      content: input,
      role: "user",
      createdAt: new Date(),
    };
    setIntermediateLoading(true);
    setFollowUpQuestions([]);
    handleSubmit(e);
    setIntermediateLoading(false);
    syncMessage(message);
    // scroll to bottom
    messageInputRef.current?.scrollIntoView({ behavior: "smooth" });
  };

  if (initializing) {
    return (
      <div className="site-bg flex max-h-screen min-h-screen flex-col items-center justify-center overflow-y-auto">
        <div className="flex flex-col gap-2 p-4">
          <Spinner size="lg" />
          <p>Loading Messages...</p>
        </div>
      </div>
    );
  }

  if (error) {
    return (
      <div className="site-bg flex max-h-screen min-h-screen flex-col items-center justify-center overflow-y-auto">
        <div className="flex flex-col gap-2 p-4">
          <p>Error: {error.message}</p>
          <button
            onClick={() => {
              reload();
            }}
          >
            Retry
          </button>
        </div>
      </div>
    );
  }

  return (
    <div className="site-bg flex max-h-screen min-h-screen flex-col justify-between overflow-y-auto">
      <div className="flex flex-col gap-2 p-4">
        {messages.map((message, index) => (
          <>
            <ChatMessage
              key={index}
              sources={sources}
              message={message}
              loading={isLoading}
              first={index === 0}
              last={index === messages.length - 1}
              append={append}
              followUpQuestions={followUpQuestions}
              setFollowUpQuestions={setFollowUpQuestions}
              syncMessage={syncMessage}
            />
          </>
        ))}
      </div>
      {isLoading && (
        <Spinner
          size="lg"
          className="animate-spin"
          style={{ margin: "0 auto" }}
        />
      )}

      <div>
        <ChatInput
          onSendMessage={sendMessage}
          input={input}
          setInput={setInput}
          handleInputChange={handleInputChange}
        />
        <div className="flex flex-col items-center justify-center py-2">
          <p className="text-center text-xs text-gray-500 dark:text-gray-400">
            <strong>Note:</strong> I’m still learning. My answer can only be
            used as a reference.
          </p>
          <p className="text-center text-xs  text-gray-400 dark:text-gray-500">
            This is a free community plan, documents and associated chats will
            be removed after 7 days if inactivity.
          </p>
        </div>
      </div>
    </div>
  );
};

export default ChatWindow;

Update

Even when the stream data is removed from the response, as below, it still lags. So the problem might be in LangChainAdapter or in the stream-data mode of the useChat hook.

// // Create a new StreamData
    // const data = new StreamData();
    // data.append(transformData(context));

    // // Append additional data

    const aiStream = LangChainAdapter.toAIStream(chain, {
      onFinal(completion) {
        // data.close();
      },
    });

    return new StreamingTextResponse(aiStream, {
      headers: {
        "x-message-index": (previousMessages.length + 1).toString(),
      },
    });

Additional context

No response

@lightify97 lightify97 changed the title Stream Data response Lags StreamData response Lags Jun 15, 2024
@lightify97 (Author)

@lgrammel any pointers?

@lgrammel added the bug and ai/ui labels on Jun 18, 2024
@amadk

amadk commented Jun 25, 2024

We're facing this issue too. It happens with all models, even without LangChain. When we reverted the ai package from v3 to v2, streaming worked fine with no lag.

@lgrammel lgrammel self-assigned this Jun 25, 2024
@lgrammel (Collaborator)

@amadk do you have a minimal reproduction that you could share?

@amadk

amadk commented Jun 28, 2024

OK, it seems to work fine in a minimal reproduction: just plain Next.js (pages router) and the AI SDK, no other libraries. We're going to try upgrading our UI library and refactoring the components on the chat page to see if that helps. It's strange because AI SDK v2 worked fine with our current setup. Will investigate a bit more and post updates.

@jad-eljerdy

Any update on this? It's happening on our end as well; we can't seem to get it to work.

@amadk

amadk commented Aug 20, 2024

Hi, sorry for the very late response. The problem started for us when we used react-syntax-highlighter with the AI SDK. The AI SDK itself isn't laggy (it's actually very fast); the problem is that it causes the components to re-render so quickly (especially with fast models like Claude Sonnet) that the page begins to freeze and feel laggy. So we added a throttle to fix it:

import { useEffect, useState } from "react";
import throttle from "lodash/throttle";
import type { Message } from "ai";

export const useThrottleMessages = (messages: Message[]) => {
  const [throttledMessages, setThrottledMessages] = useState<Message[]>([]);

  useEffect(() => {
    // Propagate message updates at most once every 50 ms to limit re-renders
    // while tokens stream in.
    const updateThrottledMessages = throttle((newMessages: Message[]) => {
      setThrottledMessages(newMessages);
    }, 50);

    updateThrottledMessages(messages);

    return () => {
      updateThrottledMessages.cancel();
    };
  }, [messages?.[messages.length - 1]?.content]);

  return { throttledMessages };
};
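The hook above relies on lodash's `throttle`. For illustration, a minimal trailing-edge throttle with the same `cancel` shape could be sketched like this (an assumption-laden simplification, not lodash's actual implementation, which also fires on the leading edge by default):

```typescript
// Minimal sketch of a trailing-edge throttle with a cancel() method.
// Only covers the shape used by the hook above; lodash's throttle has
// more options (leading/trailing edges, flush, etc.).
type Throttled<A extends unknown[]> = ((...args: A) => void) & {
  cancel: () => void;
};

export function throttle<A extends unknown[]>(
  fn: (...args: A) => void,
  waitMs: number,
): Throttled<A> {
  let timer: ReturnType<typeof setTimeout> | null = null;
  let lastArgs: A | null = null;

  const throttled = ((...args: A) => {
    lastArgs = args; // always remember the most recent arguments
    if (timer === null) {
      timer = setTimeout(() => {
        timer = null;
        if (lastArgs !== null) fn(...lastArgs); // invoke with latest args
        lastArgs = null;
      }, waitMs);
    }
  }) as Throttled<A>;

  throttled.cancel = () => {
    if (timer !== null) clearTimeout(timer);
    timer = null;
    lastArgs = null;
  };

  return throttled;
}
```

Calling the throttled function repeatedly within the window results in a single invocation with the most recent arguments, which is why the fast token stream collapses into at most one state update per 50 ms.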

@jad-eljerdy

@amadk Great insight, thanks!
