
feat(ollama): support calling the Ollama local process #2923

Open

stevenh wants to merge 40 commits into main from refactor/ollama-local
Conversation

stevenh (Collaborator) commented Dec 17, 2024

Refactor local process handling for Ollama using a container implementation, avoiding the wrapping methods.

This defaults to running the binary with an ephemeral port to avoid port conflicts.
This behaviour can be overridden by setting OLLAMA_HOST either in the parent environment or in the values passed via WithUseLocal.
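The ephemeral-port default can be illustrated with a small Go sketch (not the PR's actual code): asking the OS to listen on port 0 yields a free port, which is the standard way to derive a conflict-free local address for a process such as the ollama binary.

```go
package main

import (
	"fmt"
	"net"
)

// freePort asks the OS for an ephemeral port by listening on port 0,
// then releases the listener and returns the allocated port number.
// This mirrors the "ephemeral port to avoid conflicts" behaviour
// described above; the real implementation in this PR may differ.
func freePort() (int, error) {
	l, err := net.Listen("tcp", "127.0.0.1:0")
	if err != nil {
		return 0, err
	}
	defer l.Close()
	return l.Addr().(*net.TCPAddr).Port, nil
}

func main() {
	port, err := freePort()
	if err != nil {
		panic(err)
	}
	fmt.Printf("ollama would bind to 127.0.0.1:%d\n", port)
}
```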

Improve API compatibility with:

  • Multiplexed output streams
  • State reporting
  • Exec option processing
  • WaitingFor customisation

Fix Container implementation:

  • Port management
  • Running checks
  • Terminate processing
  • Endpoint argument definition
  • Add missing methods
  • Consistent environment handling


netlify bot commented Dec 17, 2024

Deploy Preview for testcontainers-go ready!

Name Link
🔨 Latest commit 52bd20f
🔍 Latest deploy log https://app.netlify.com/sites/testcontainers-go/deploys/6765b0c2bbf985000831ca66
😎 Deploy Preview https://deploy-preview-2923--testcontainers-go.netlify.app

@stevenh stevenh changed the title refactor(ollama): local process feat(ollama): local process Dec 17, 2024
@stevenh stevenh changed the title feat(ollama): local process feat(ollama): refactor local process Dec 17, 2024
@stevenh stevenh force-pushed the refactor/ollama-local branch from 0d0ff48 to 8619850 on December 17, 2024 at 23:58
@github-actions github-actions bot added the feature New functionality or new behaviors on the existing one label Dec 17, 2024
@stevenh stevenh force-pushed the refactor/ollama-local branch 2 times, most recently from 0146fc5 to 2f2865b on December 18, 2024 at 19:02
mdelapenya previously approved these changes Dec 20, 2024

mdelapenya (Member) left a comment:


This LGTM, waiting for you to mark it as ready. Great job with the process execution handling 🏆

modules/ollama/local.go (outdated; resolved)
modules/ollama/local.go (resolved)

// Terminate implements testcontainers.Container interface for the local Ollama binary.
// It stops the local Ollama process, removing the log file.
func (c *localProcess) Terminate(ctx context.Context) error {
Member commented:

todo: conflict with #2926, so depending on which one is merged first, we need to update it.

Collaborator (Author) replied:

Just replied on that PR; I don't think it works in its current form, as it breaks the interface separation.

modules/ollama/local.go (resolved)
wait/log.go (outdated; resolved)
mdelapenya and others added 19 commits December 20, 2024 16:40
Refactor local process handling for Ollama using a container implementation,
avoiding the wrapping methods.

This defaults to running the binary with an ephemeral port to avoid port
conflicts. This behaviour can be overridden by setting OLLAMA_HOST either
in the parent environment or in the values passed via WithUseLocal.

Improve API compatibility with:

- Multiplexed output streams
- State reporting
- Exec option processing
- WaitingFor customisation

Fix Container implementation:

- Port management
- Running checks
- Terminate processing
- Endpoint argument definition
- Add missing methods
- Consistent environment handling
Refactor local processing to use the new log sub match functionality.
Validate the container request to ensure the user configuration can be processed
and no fields that would be ignored are present.
Remove temporary simple test.
Allow the local ollama binary name to be configured using the image name.
Detail the container request supported fields.
Update local process site docs to match recent changes.
@stevenh stevenh marked this pull request as ready for review December 20, 2024 18:05
@stevenh stevenh requested a review from a team as a code owner December 20, 2024 18:05
@mdelapenya mdelapenya changed the title feat(ollama): refactor local process feat(ollama): support calling the Ollama local process Dec 20, 2024