feat(google-vertexai): Support Non-Google and Model Garden models in Vertex AI - Anthropic integration #6999
base: main
Conversation
Testing
… a way that makes sense. Add option for fetch() that determines how to handle missing MediaBlobs.
Change default action if invalid blob. Add "ignore" action for invalid blobs.
Bug fix for BlobStore.store with how it handles the key
Required making some functions and methods async
# Conflicts (resolved):
#   libs/langchain-google-common/src/chat_models.ts
#   libs/langchain-google-common/src/utils/gemini.ts
@bracesproul @jacoblee93 - I believe this is ready for review and, if things look good, integration and release. This is just the Anthropic updates. As the other supported models are ready, I'll be doing separate PRs for them. (I don't want to hold things up.)
Hmmm... I see the docs have failed, but I don't see why.
```typescript
  return this._location ?? this.computedLocation;
}

get computedLocation(): string {
```
Any point to having a base implementation here? One less level of indirection to just put it in `location` above.
Mostly this reflects how `location` and `endpoint` evolved; they come from a few requirements:
- The endpoint is usually, but not always, location based, so it may need to be specified separately from the location.
- Other elements of the path may include the location, so the location may need to be specified separately from the endpoint.
- In the simple cases, I didn't want people to have to specify both (usually just the location), but they should be able to if necessary.

Most resources are available at us-central1, but apparently not all, so I wanted subclasses to be able to set their default location if appropriate. I also wanted to be able to support default locations. Requiring a developer to know more magic values makes for less manageable code, especially if we can avoid it. (We support default values all over the place.)
```typescript
  return this._endpoint ?? this.computedEndpoint;
}

get computedEndpoint(): string {
```
See above
See above.
```typescript
get computedLocation(): string {
  switch (this.apiName) {
    case "google":
      return super.computedLocation;
```
Why isn't this just `super.location`?
`computedLocation` is typically only called from `location`. Keeping it calling the same method on the superclass keeps it clean and avoids additional logic processing.
```typescript
    case "google":
      return super.computedLocation;
    case "anthropic":
      return "us-east5";
```
Should we be hardcoding this here? If it's only available in one region for now, I'd prefer to have this configurable, or even have no default at all and just document it.
See above about making magic values as unnecessary as possible. Some of the Anthropic models are available in other regions, but all are available in us-east5, and none, mysteriously, are available in us-central1. I'd like to keep it as a default, but make it clearly settable.
```typescript
  description: tool.description ?? `A function available to call.`,
  parameters: jsonSchema,
};
```

```typescript
export class GoogleRequestLogger extends GoogleRequestCallbackHandler {
```
Is this intended for public use, or just your own debugging?
Initially for my debugging, but there have been cases where people have reported problems that I couldn't duplicate. Being able to easily have them add this logger to the callbacks to see what the request and response were would make it easier to create a test and a fix.
So it is rare that anyone would need to use it, but invaluable to have in place if they do.
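The idea of a request/response logger attached via callbacks might look roughly like the sketch below. This is purely illustrative: the real `GoogleRequestLogger` extends `GoogleRequestCallbackHandler` from the library, while the types and class here are simplified stand-ins:

```typescript
// Hypothetical, self-contained sketch of a request/response logger in the
// spirit of GoogleRequestLogger. It records each outgoing request and
// incoming response so a user can paste the log into a bug report.
interface LoggedRequest {
  url: string;
  body: unknown;
}

class SimpleRequestLogger {
  logs: string[] = [];

  handleRequest(req: LoggedRequest): void {
    this.logs.push(`request ${req.url}: ${JSON.stringify(req.body)}`);
  }

  handleResponse(res: unknown): void {
    this.logs.push(`response: ${JSON.stringify(res)}`);
  }
}

const logger = new SimpleRequestLogger();
logger.handleRequest({ url: "/v1/models:generate", body: { prompt: "hi" } });
logger.handleResponse({ text: "hello" });
```

In the real library, a user would add the logger instance to the model's callbacks list rather than calling these methods directly.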
```typescript
 * If the content is an array of just text fields, turn them into a string.
 * @param fields
 */
function newAIMessageChunk(fields: string | AIMessageFields): AIMessageChunk {
```
Think it's worth exposing and reusing some logic from `@langchain/anthropic`? https://github.com/langchain-ai/langchainjs/tree/main/libs/langchain-anthropic/src/utils
Could even make it a dep.
I considered it, and even tried for a bit, but ultimately decided not to for a few reasons:
- It adds an unnecessary dependency for people who don't intend to use Claude on GCP, including pulling in the Anthropic library.
- Some of the types aren't exactly compatible. (I don't remember which ones now; I just remember starting down this road and running into issues.)

The code you highlighted here has more to do with how my logic generates the content objects, and with expectations in parts of the base code that `AIMessageChunk.content` should be a string. So this normalizes it in those cases. (Which are almost all the cases for AI messages historically.)
(To be clear, however, I'm open minded on this. I don't like duplicating code.)
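The normalization being discussed, collapsing an all-text content array into a plain string, could be sketched roughly as below. The helper and its types are hypothetical; the real code lives inside `newAIMessageChunk` in the PR:

```typescript
// Hypothetical helper: if every part of a message's content array is a text
// block, collapse the array into one string, since parts of the base code
// expect AIMessageChunk.content to be a string for text-only messages.
type ContentPart = { type: string; text?: string };

function normalizeContent(
  content: string | ContentPart[]
): string | ContentPart[] {
  if (typeof content === "string") {
    return content;
  }
  const allText = content.every((part) => part.type === "text");
  if (allText) {
    return content.map((part) => part.text ?? "").join("");
  }
  // Mixed content (images, tool calls, etc.) is left as an array.
  return content;
}
```

A text-only array such as `[{type: "text", text: "Hello "}, {type: "text", text: "world"}]` collapses to `"Hello world"`, while anything containing non-text parts passes through unchanged.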
Overall looks fantastic - left some nits/comments and will try myself before cutting a release
@jacoblee93 - Thanks! Very much appreciate the feedback. Any thoughts on the problems with docs?
Fixes #2562
Fixes #6207
The ultimate objective is to add full support to Vertex AI for the various 3P models that are provided via API, as well as custom endpoints users may deploy for personal models.
This will be done through an API object that provides methods that both turn the LangChain objects into model-API-compatible objects to be formatted as JSON, and take the JSON result and convert it back into LangChain objects.
In this first batch, we get the Gemini support integrated with this API object and add Anthropic / Claude support.
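That translation layer might be sketched, in very simplified form, like the interface below. The names (`ModelApi`, `formatRequest`, `parseResponse`) are illustrative, not the actual types in `@langchain/google-common`, and the toy request/response shapes omit most real fields:

```typescript
// Illustrative sketch of a per-provider API adapter: one method turns
// LangChain-style messages into the provider's JSON request shape, and the
// other turns the provider's JSON response back into a message.
interface SimpleMessage {
  role: "user" | "assistant";
  content: string;
}

interface ModelApi<Req, Res> {
  formatRequest(messages: SimpleMessage[]): Req;
  parseResponse(response: Res): SimpleMessage;
}

// A toy "Anthropic-like" adapter, for shape only; real Claude-on-Vertex
// payloads carry many more fields.
const toyAnthropicApi: ModelApi<
  { messages: { role: string; content: string }[] },
  { content: { text: string }[] }
> = {
  formatRequest(messages) {
    return {
      messages: messages.map((m) => ({ role: m.role, content: m.content })),
    };
  },
  parseResponse(response) {
    return {
      role: "assistant",
      content: response.content.map((c) => c.text).join(""),
    };
  },
};

const req = toyAnthropicApi.formatRequest([{ role: "user", content: "Hi" }]);
const msg = toyAnthropicApi.parseResponse({ content: [{ text: "Hello!" }] });
```

With one such adapter per provider, the shared chat-model code can stay provider-agnostic and delegate only the JSON translation.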
This work was supported by Google Cloud Credits provided by Google.