[pre-commit.ci] auto fixes from pre-commit.com hooks
for more information, see https://pre-commit.ci
pre-commit-ci[bot] committed Feb 5, 2025
1 parent 6d4184f commit f072c66
Showing 2 changed files with 3 additions and 3 deletions.
docs/source/users/index.md: 1 addition & 1 deletion

@@ -443,7 +443,7 @@ To get started, follow the instructions on the [Ollama website](https://ollama.c

### vLLM usage

- `vLLM` is a fast and easy-to-use library for LLM inference and serving. The [vLLM website](https://docs.vllm.ai/en/latest/) explains installation and usage. To use `vLLM` in Jupyter AI, please see the dedicated documentation page on using [vLLM in Jupyter AI](vllm.md).
+ `vLLM` is a fast and easy-to-use library for LLM inference and serving. The [vLLM website](https://docs.vllm.ai/en/latest/) explains installation and usage. To use `vLLM` in Jupyter AI, please see the dedicated documentation page on using [vLLM in Jupyter AI](vllm.md).

### Asking about something in your notebook

docs/source/users/vllm.md: 2 additions & 2 deletions

@@ -4,7 +4,7 @@

`vLLM` is a fast and easy-to-use library for LLM inference and serving. The [vLLM website](https://docs.vllm.ai/en/latest/) explains installation and usage.

- Depending on your hardware set up you will install `vLLM` using these [instructions](https://docs.vllm.ai/en/latest/getting_started/installation/index.html). It is best to install it in a dedicated python environment.
+ Depending on your hardware set up you will install `vLLM` using these [instructions](https://docs.vllm.ai/en/latest/getting_started/installation/index.html). It is best to install it in a dedicated python environment.

Once it is installed you may start serving any model with the command:
```python
...
```

@@ -24,7 +24,7 @@ Start up Jupyter AI and update the AI Settings as follows (notice that we are us
alt="Screen shot of AI setting for using vllm."
class="screenshot" width="75%"/>

- Since vLLM may be addressed using OpenAI's API, you can test if the model is available using the API call as shown:
+ Since vLLM may be addressed using OpenAI's API, you can test if the model is available using the API call as shown:
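Such an availability check can also be scripted. A minimal sketch using only the standard library, assuming vLLM's OpenAI-compatible server at its default address `http://localhost:8000/v1` (the address is an assumption, not stated on this page; adjust it to your deployment):

```python
# Sketch: query a running vLLM server through its OpenAI-compatible
# REST API. BASE_URL assumes vLLM's default port 8000; change it to
# match how you launched the server.
import json
import urllib.request

BASE_URL = "http://localhost:8000/v1"


def model_ids(models_payload: dict) -> list[str]:
    """Extract model IDs from the body of a GET /v1/models response."""
    return [entry["id"] for entry in models_payload.get("data", [])]


def list_models(base_url: str = BASE_URL) -> list[str]:
    """Ask the server which models it is currently serving."""
    with urllib.request.urlopen(f"{base_url}/models") as resp:
        return model_ids(json.load(resp))
```

If the server is up, `list_models()` should return a list that includes the model name you passed when starting vLLM; a connection error means the server is not reachable at that address.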

<img src="../_static/vllm-api.png"
alt="Screen shot of using vllm programmatically with its API."
