Add information about ollama.
Thanks to @krassowski for pointing out the issue.  See #840 for additional
suggestions on how to improve the UX for unlisted models; for now this only
addresses clarifying the docs.
fperez committed Feb 7, 2025
1 parent fbc4895 commit ac78b93
Showing 1 changed file with 15 additions and 1 deletion.
16 changes: 15 additions & 1 deletion docs/source/users/index.md
@@ -439,7 +439,20 @@ models.

### Ollama usage

To get started, follow the instructions on the [Ollama website](https://ollama.com/) to set up `ollama` and download the models locally. To select a model, enter the model name in the settings panel, for example `deepseek-coder-v2`.
To get started, follow the instructions on the [Ollama website](https://ollama.com/) to set up `ollama` and download the models locally. To select a model, enter the model name in the settings panel, for example `deepseek-coder-v2`. You can see all locally available models with `ollama list`.
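
For instance, a minimal sketch of downloading a model and confirming it is available from the terminal (the model name is only an example; the `ollama list` output will vary with the models you have pulled):

```
$ ollama pull deepseek-coder-v2
$ ollama list
NAME                        ID      SIZE     MODIFIED
deepseek-coder-v2:latest    ...     ...      ...
```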

Note that for the models to be available to Jupyter AI, your Ollama server must be running. You can check that this is the case by running `ollama serve` at the terminal; if the server is already running, the command fails with an "address already in use" error, which confirms it is up:

```
$ ollama serve
Error: listen tcp 127.0.0.1:11434: bind: address already in use
```
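
Alternatively, assuming the default host and port (the server listens on `127.0.0.1:11434` unless you changed `OLLAMA_HOST`), you can query the local API endpoint directly:

```
$ curl http://localhost:11434
Ollama is running
```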

On some platforms (e.g. macOS or Windows), there may also be a graphical application that lets you start and stop the Ollama server from a menu.

:::{tip}
If you don't see Ollama listed as a model provider in the configuration box, despite confirming that your Ollama server is active, you may not have installed the [`langchain-ollama` Python package](https://pypi.org/project/langchain-ollama/) that is necessary for Jupyter AI to interface with Ollama, as indicated in the [model providers](#model-providers) section above.
:::
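
If that is the case, a minimal sketch of installing the missing dependency into the same environment that runs JupyterLab (adapt to `conda`/`mamba` if you manage packages that way), followed by restarting JupyterLab so the provider is picked up:

```
$ pip install langchain-ollama
```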

### vLLM usage

@@ -710,6 +723,7 @@ We currently support the following language model providers:
- `cohere`
- `huggingface_hub`
- `nvidia-chat`
- `ollama`
- `openai`
- `openai-chat`
- `sagemaker-endpoint`
