feat: Support ollama instead of llamafile #231
New command line flag: `--llm-type`
- Set to `litellm` by default; also accepts `'google'` or `'ollama'`.

Removed the `is_local` flag.

Example:

```
podcastfy/client.py --llm-type ollama --llm-model-name=llama3.1:8b-instruct-q8_0 --transcript ./data/transcripts/transcript_a846e44acfe143579b1fa570feb73328.txt
```
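As a rough illustration of the flag's behavior (a hypothetical sketch, not the actual podcastfy implementation — the function name and help strings here are invented), the `--llm-type` option with a `litellm` default and `google`/`ollama` alternatives could be declared with `argparse` like so:

```python
import argparse

def build_parser() -> argparse.ArgumentParser:
    # Hypothetical sketch of the new CLI flag; not podcastfy's real parser.
    parser = argparse.ArgumentParser(description="Generate a podcast from a transcript")
    parser.add_argument(
        "--llm-type",
        choices=["litellm", "google", "ollama"],
        default="litellm",  # litellm is the default backend per this PR's description
        help="LLM backend to use",
    )
    parser.add_argument(
        "--llm-model-name",
        default=None,
        help="Model name to pass to the selected backend",
    )
    return parser

# Parse the example invocation from the PR description.
args = build_parser().parse_args(
    ["--llm-type", "ollama", "--llm-model-name", "llama3.1:8b-instruct-q8_0"]
)
print(args.llm_type)        # -> ollama
print(args.llm_model_name)  # -> llama3.1:8b-instruct-q8_0
```

With no flags given, `args.llm_type` falls back to `litellm`, matching the stated default; passing any value outside the three choices is rejected by `argparse`.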