
fix: streaming for Agent.createTask #788

Merged · 10 commits merged into run-llama:main from himself65/20240501/agent-response on May 2, 2024

Conversation

himself65 (Member) commented May 1, 2024

Fixes: #590

Feedback from users: they want to see a response for each task output as it is produced, because polling for the function call result can otherwise take a long time.

Roadmap

  • ReadableStream implementation
  • Early-return the stream response, then handle the function call (llm.call -> user time -> handle llm.call -> next round); see the sketch after this list
  • Give the user some ability to modify the current run? (not sure if this is a good idea)
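
Roughly what the early-return bullet could look like, as a minimal self-contained sketch (not this PR's actual implementation): the task is backed by a Web ReadableStream, text deltas are enqueued for the consumer as soon as the LLM produces them, and the function call is handled afterwards to seed the next round. `fakeLLM`, `runTool`, and the chunk shape are stand-ins, not library APIs.

```ts
type StepOutput = { delta: string; isLast: boolean };

// Stand-in for the real LLM call: streams a text delta and, on the first
// round only, asks for a tool call.
async function* fakeLLM(
  input: string,
): AsyncGenerator<{ delta: string; toolCall?: string }> {
  yield { delta: `thinking about "${input}" ... ` };
  if (!input.startsWith("tool:")) {
    yield { delta: "", toolCall: "getWeather" };
  }
}

// Stand-in for tool execution.
async function runTool(name: string): Promise<string> {
  return `tool:${name} -> sunny`;
}

function createTaskStream(input: string): ReadableStream<StepOutput> {
  return new ReadableStream<StepOutput>({
    async start(controller) {
      let next: string | null = input;
      while (next !== null) {
        let pendingTool: string | undefined;
        // Early return: enqueue deltas for the consumer as soon as they arrive.
        for await (const chunk of fakeLLM(next)) {
          if (chunk.delta) controller.enqueue({ delta: chunk.delta, isLast: false });
          if (chunk.toolCall) pendingTool = chunk.toolCall;
        }
        // Only after the text has been streamed do we handle the function call
        // and feed its result into the next round.
        next = pendingTool ? await runTool(pendingTool) : null;
      }
      controller.enqueue({ delta: "", isLast: true });
      controller.close();
    },
  });
}

// Consumer side: read chunks as they arrive instead of waiting for the whole task.
(async () => {
  const reader = createTaskStream("what's the weather?").getReader();
  for (let r = await reader.read(); !r.done; r = await reader.read()) {
    if (r.value.delta) console.log(r.value.delta);
  }
})();
```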

Update

  • Should we implement our own queuingStrategy whose chunk size reflects a log(function call + time) cost? (sketch below)
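
For reference, the Web Streams API already allows this kind of customization: the second ReadableStream constructor argument can supply a `size(chunk)` function, so chunks could be weighted by a log-style cost instead of being counted one by one. The chunk shape and cost formula below are illustrative assumptions, not something this PR ships.

```ts
type Chunk = { delta: string; toolCallCount?: number; elapsedMs?: number };

// Illustrative cost function: plain text is cheap, while function calls and
// long-running steps weigh more, growing logarithmically.
const costStrategy = {
  highWaterMark: 16,
  size(chunk: Chunk): number {
    const toolCalls = chunk.toolCallCount ?? 0;
    const elapsedMs = chunk.elapsedMs ?? 0;
    return chunk.delta.length / 64 + Math.log2(1 + toolCalls + elapsedMs / 1000) * 8;
  },
};

// The strategy is passed as the second constructor argument; once the queued
// cost exceeds highWaterMark, controller.desiredSize goes negative and the
// producer can apply backpressure.
const stream = new ReadableStream<Chunk>(
  {
    start(controller) {
      controller.enqueue({ delta: "hello " });
      controller.enqueue({ delta: "", toolCallCount: 1, elapsedMs: 3000 });
      console.log("desiredSize:", controller.desiredSize);
      controller.close();
    },
  },
  costStrategy,
);

// Drain the stream so the example is complete.
stream.getReader().read().then((r) => console.log("first chunk:", r.value));
```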


changeset-bot (bot) commented May 1, 2024

🦋 Changeset detected

Latest commit: 9b7c650

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 7 packages
Name                                       Type
llamaindex                                 Patch
@llamaindex/core-e2e                       Patch
docs                                       Patch
@llamaindex/experimental                   Patch
@llamaindex/cloudflare-worker-agent-test   Patch
@llamaindex/next-agent-test                Patch
@llamaindex/waku-query-engine-test         Patch



vercel bot commented May 1, 2024

The latest updates on your projects.

Name                  Status    Updated (UTC)
llama-index-ts-docs   ✅ Ready   May 2, 2024 0:29am

himself65 marked this pull request as ready for review on May 1, 2024 23:51
himself65 changed the title from "feat: fully streaming for Agent.createTask" to "fix: streaming for Agent.createTask" on May 2, 2024
himself65 merged commit 61103b6 into run-llama:main on May 2, 2024
14 of 15 checks passed
himself65 deleted the himself65/20240501/agent-response branch on May 2, 2024 04:28
Labels: none yet
Projects: none yet
Development: successfully merging this pull request may close "streaming response in createTask of OpenAIAgent" (#590)
Participants: 1