feat(core): 0.3.0 release #775

Conversation
🦋 Changeset detected. Latest commit: b901ab7. The changes in this PR will be included in the next version bump. This PR includes changesets to release 9 packages.
    },
  }),
);
// @ts-expect-error: see https://github.com/cloudflare/workerd/issues/2067
"wrangler": "^3.52.0" | ||
}, | ||
"dependencies": { | ||
"llamaindex": "workspace:*" |
It might be a bit confusing that the Cloudflare example uses llamaindex while edge gets a dedicated package.
Technically, we don't have a reason for a dedicated edge package; the idea was that we could track usage separately.
I think using the same package name would be better; splitting into separate packages isn't the suggested approach.
Tomorrow I will release a beta version and try to bump create-llama and chat-llamaindex.
@himself65 for create-llama you might want to use this PR: run-llama/create-llama#63, as @thucpn already updated some code for your agents there.
Updates

- @llamaindex/edge
- Node.js 20, as it's an LTS release; we might no longer consider Node 16 support but still support 18~22

Checklist
QA (asked by @marcusschiesser)
Q: Can an AsyncIterator be transformed to a ReadableStream easily?
Yes. A ReadableStream is natively async iterable, and any async iterator can be translated to a ReadableStream in a few lines (for example, https://github.com/vercel/ai/blob/73356a9d46f1d271e3278cb92bd8ebe92a6a058d/packages/core/streams/ai-stream.ts#L284). But that does not mean we should keep returning only an async iterator: a ReadableStream brings more features to the developer side. A sketch of such a conversion follows.
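For illustration, here is a minimal sketch of wrapping an async iterator in a ReadableStream. This is not the vercel/ai helper linked above, and `iteratorToStream` is a hypothetical name, not an actual llamaindex export:

```ts
// Minimal sketch: adapt an AsyncIterator into a web ReadableStream.
// `iteratorToStream` is a hypothetical helper, for illustration only.
function iteratorToStream<T>(iterator: AsyncIterator<T>): ReadableStream<T> {
  return new ReadableStream<T>({
    // Called whenever the consumer wants more data.
    async pull(controller) {
      const { value, done } = await iterator.next();
      if (done) {
        controller.close();
      } else {
        controller.enqueue(value);
      }
    },
    // Propagate consumer-side cancellation back to the producer.
    async cancel(reason) {
      await iterator.return?.(reason);
    },
  });
}
```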
Q: Is this a breaking change?
No. A ReadableStream is already async iterable, just as the previous return value was, so existing `for await` consumers keep working. But we announce this as a new version to signal that more can be done with a ReadableStream.
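To make the compatibility concrete, here is a sketch assuming a stream of string chunks (the function names are hypothetical). The old `for await` style and the new reader style both work on the same stream:

```ts
// Old style: async iteration still works, because ReadableStream
// implements Symbol.asyncIterator (in Node 18+; browser support may lag).
async function consumeOldStyle(stream: ReadableStream<string>) {
  for await (const chunk of stream) {
    console.log(chunk);
  }
}

// New style: an explicit reader, which also enables features like
// tee(), pipeTo(), and backpressure-aware consumption.
async function consumeNewStyle(stream: ReadableStream<string>) {
  const reader = stream.getReader();
  while (true) {
    const { value, done } = await reader.read();
    if (done) break;
    console.log(value);
  }
}
```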
Q: What about other old APIs that use AsyncIterator?
Personally, I think we should change them too, but it's not urgent: there's no functional benefit, only a code-style change, which I think helps readability.