feat(ollama): support calling the Ollama local process #2923
base: main
Conversation
✅ Deploy Preview for testcontainers-go ready!
Force-pushed from 0d0ff48 to 8619850
Force-pushed from 0146fc5 to 2f2865b
This LGTM, waiting for you to mark it as ready. Great job with the process execution handling 🏆
```go
// Terminate implements testcontainers.Container interface for the local Ollama binary.
// It stops the local Ollama process, removing the log file.
func (c *localProcess) Terminate(ctx context.Context) error {
```
todo: this conflicts with #2926, so depending on which one is merged first, we'll need to update it.
Yep, just replied to that PR. I don't think it works in its current form, as it breaks the interface separation.
Refactor local process handling for Ollama using a container implementation, avoiding the wrapping methods. This defaults to running the binary with an ephemeral port to avoid port conflicts. This behaviour can be overridden by setting OLLAMA_HOST either in the parent environment or in the values passed via WithUseLocal.

Improve API compatibility with:
- Multiplexed output streams
- State reporting
- Exec option processing
- WaitingFor customisation

Fix Container implementation:
- Port management
- Running checks
- Terminate processing
- Endpoint argument definition
- Add missing methods
- Consistent environment handling
Refactor local processing to use the new log sub match functionality.
Validate the container request to ensure the user configuration can be processed and no fields that would be ignored are present.
Remove temporary simple test.
Allow the local ollama binary name to be configured using the image name.
Detail the container request supported fields.
Force-pushed from 9927ad0 to 5e586c8
Update local process site docs to match recent changes.