
feat(core): 0.3.0 release #775

Merged
merged 31 commits into from
May 1, 2024

Conversation

@himself65 (Member) commented Apr 26, 2024

Updates

  • Add support for Cloudflare Workers (other runtimes such as WinterJS may also work, but I didn't test them). Better support for Next.js (you no longer need to install @llamaindex/edge).
  • Set the default Node.js version to 20, as it's the current LTS; we may no longer consider Node.js 16, but we still support 18 through 22.

Checklist

  • Update blog
    • README.md
    • Agent
    • Function Tool
    • TypeScript Support
  • Test Production
    • Edge runtime
    • Cloudflare worker
  • Agent examples
    • with Cloudflare Worker
    • with Next.js + Vercel AI RSC
    • with Waku (vite)

Q&A

(asked by @marcusschiesser)

Q: Can an AsyncIterator be transformed to a ReadableStream easily?

Yes. ReadableStream is natively async iterable, and any async iterator can be converted to a ReadableStream in a few lines (for example, https://github.com/vercel/ai/blob/73356a9d46f1d271e3278cb92bd8ebe92a6a058d/packages/core/streams/ai-stream.ts#L284).

But this does not mean we should keep returning only an async iterator. ReadableStream brings more features to the developer side.
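As a rough sketch of that conversion (the `iteratorToStream` helper name here is hypothetical, not an llamaindex API), wrapping any AsyncIterable in a ReadableStream only takes a pull callback that forwards from the iterator:

```typescript
// Hypothetical helper (not an llamaindex API): wrap any AsyncIterable in a
// ReadableStream by pulling from its iterator on demand.
function iteratorToStream<T>(iterable: AsyncIterable<T>): ReadableStream<T> {
  const iterator = iterable[Symbol.asyncIterator]();
  return new ReadableStream<T>({
    async pull(controller) {
      const { value, done } = await iterator.next();
      if (done) {
        controller.close();
      } else {
        controller.enqueue(value);
      }
    },
  });
}

// Usage: convert an async generator, then read the resulting stream back.
async function* numbers() {
  yield 1;
  yield 2;
  yield 3;
}

const collected: number[] = [];
for await (const n of iteratorToStream(numbers())) {
  collected.push(n);
}
// collected is now [1, 2, 3]
```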

  1. tee, pipeTo, pipeThrough, and the stream helpers, which are already existing, useful APIs
  2. cloneable: a stream can be cloned (via structuredClone with transfer), which benefits anyone who wants to reuse a response
// Create a stream that emits three chunks, then split it with tee().
const r = new ReadableStream({
	start: (controller) => {
		controller.enqueue('1');
		controller.enqueue('2');
		controller.enqueue('3');
		controller.close();
	},
});

const [a, b] = r.tee();

// Consume the first branch directly (ReadableStream is async iterable).
for await (const v of a) {
	console.log('v', v);
}

// Transfer the second branch into a structured clone and consume that.
const cp = structuredClone(b, {
	transfer: [b],
});

for await (const v of cp) {
	console.log('v', v);
}
// outputs:
// v 1
// v 2
// v 3
// v 1
// v 2
// v 3

Q: Is this a breaking change?

No. ReadableStream is already async iterable, so code that consumed the old async iterator with for await...of keeps working. But we announce this as a new version to highlight that a ReadableStream can do more.
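For example, an existing consumer loop doesn't change at all (a minimal sketch, assuming Node.js 18+ or one of the edge runtimes this release targets, where ReadableStream implements Symbol.asyncIterator):

```typescript
// A consumer written for the old AsyncIterable return type keeps working
// unchanged when handed a ReadableStream, because ReadableStream is itself
// async iterable in Node.js 18+ and the targeted edge runtimes.
const stream = new ReadableStream<string>({
  start(controller) {
    controller.enqueue("hello ");
    controller.enqueue("world");
    controller.close();
  },
});

let text = "";
for await (const chunk of stream) {
  text += chunk; // same loop body as before the change
}
// text === "hello world"
```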

Q: What about other old APIs that use AsyncIterator?

Personally, I think we should change them too, but it's not urgent and has no extra benefit beyond a code-style change, which I think improves readability.

// old: returning an immediately-invoked generator is ugly
function f() {
  return (function* () {})();
}

// new: an (empty) ReadableStream reads more clearly
function g() {
  return new ReadableStream({
    start(controller) {
      controller.close();
    },
  });
}


changeset-bot bot commented Apr 26, 2024

🦋 Changeset detected

Latest commit: b901ab7

The changes in this PR will be included in the next version bump.

This PR includes changesets to release 9 packages
Name Type
llamaindex Minor
@llamaindex/edge Minor
@llamaindex/env Minor
docs Patch
@llamaindex/experimental Patch
@llamaindex/cloudflare-worker-agent-test Patch
@llamaindex/next-agent-test Patch
@llamaindex/waku-query-engine Patch
@llamaindex/nextjs-edge-runtime-test Patch



vercel bot commented Apr 26, 2024

The latest updates on your projects:

Name Status Updated (UTC)
llama-index-ts-docs ✅ Ready May 1, 2024 3:32am

@himself65 himself65 marked this pull request as ready for review April 29, 2024 01:23
@himself65 himself65 changed the title feat: llamaindex 0.3.0 release prepare feat(core): 0.3.0 release Apr 29, 2024
"wrangler": "^3.52.0"
},
"dependencies": {
"llamaindex": "workspace:*"
Collaborator:

It might be a bit confusing that Cloudflare is using llamaindex while Edge has a dedicated package.
Technically, we don't have a reason for a dedicated edge package; the idea was that we could track usage separately.

@himself65 (Member Author):

I think using the same package name would be better. Separating packages isn't the recommended approach.

@himself65 left a comment:

Tomorrow I will release a beta version and try to bump create-llama and chat-llamaindex.

@marcusschiesser (Collaborator) commented:
@himself65 for create-llama, you might want to use this PR: run-llama/create-llama#63, as @thucpn has already updated some code for your agents there.

@himself65 himself65 merged commit 5016f21 into run-llama:main May 1, 2024
15 checks passed
@himself65 himself65 deleted the himself65/20240426/next-example branch May 1, 2024 03:34