Add mlx_lm.server example #302
base: main
Conversation
Walkthrough: Adds a new model entry to config.example.yaml.
Actionable comments posted: 1
📒 Files selected for processing (1)
config.example.yaml (2 hunks)
🔇 Additional comments (1)
config.example.yaml, line 256: LGTM. Preload entry unchanged functionally.
@losuler Does it support the full set of llama.cpp endpoints?
Interesting... didn't realize that llama-swap supports mlx_lm as well!
I guess not, but it supports...
This one took me longer than I liked to figure out, so I wanted to add an example for anyone else coming across the same problem: without useModelName being set to the file path, mlx_lm.server uses the model name, which causes it to download the model to the Hugging Face cache path instead of using the one I manually downloaded to the specified --model path.
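For anyone wanting a concrete starting point, a config entry along these lines illustrates the idea. This is a minimal sketch, not the exact entry from this PR's diff: the model name and local directory below are placeholders, and only the llama-swap options I'm aware of (cmd with the ${PORT} macro, useModelName) and the mlx_lm.server flags mentioned above (--model, --port) are used.

```yaml
models:
  # Placeholder name; pick whatever model id you want clients to request.
  "mlx-example":
    # Placeholder path; point mlx_lm.server at the locally downloaded model directory.
    cmd: |
      mlx_lm.server --port ${PORT}
      --model /path/to/models/your-mlx-model
    # Forward the file path (not the short model name) as the model id, so
    # mlx_lm.server loads the local copy instead of re-downloading it into
    # the Hugging Face cache.
    useModelName: "/path/to/models/your-mlx-model"
```

With useModelName set to the same path passed to --model, the model field that llama-swap forwards upstream matches the local directory, so mlx_lm.server resolves it on disk rather than pulling it from the Hugging Face Hub.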