Merged
Changes from 1 commit
1 change: 0 additions & 1 deletion doc/source/conf.py
@@ -239,7 +239,6 @@ def __init__(self, version: str):
"ray-overview/examples/llamafactory-llm-fine-tune/README.ipynb",
"ray-overview/examples/llamafactory-llm-fine-tune/**/*.ipynb",
"serve/tutorials/asynchronous-inference/content/asynchronous-inference.ipynb",
"serve/tutorials/asynchronous-inference/content/README.md",
] + autogen_files

# If "DOC_LIB" is found, only build that top-level navigation item.
Expand Down
@@ -1,3 +1,6 @@
+---
+orphan: true
+---
 # Asynchronous Inference with Ray Serve

**⏱️ Time to complete:** 30 minutes
@@ -54,14 +57,14 @@ Redis serves as both the message broker (task queue) and result backend.
**Install and start Redis (Google Colab compatible):**


-```python
+```bash
 # Install and start Redis server
-!sudo apt-get update -qq
-!sudo apt-get install -y redis-server
-!redis-server --port 6399 --save "" --appendonly no --daemonize yes
+sudo apt-get update -qq
+sudo apt-get install -y redis-server
+redis-server --port 6399 --save "" --appendonly no --daemonize yes
 
 # Verify Redis is running
-!redis-cli -p 6399 ping
+redis-cli -p 6399 ping
 ```
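The surrounding text notes that this single Redis instance acts as both the message broker (task queue) and the result backend. As a hedged illustration of what that wiring looks like in Celery (the app and task names below are hypothetical, not taken from this PR), a minimal configuration sketch:

```python
from celery import Celery

# Hypothetical Celery app: port 6399 matches the redis-server command above.
# The same Redis instance serves as the broker (task queue) and result backend.
app = Celery(
    "async_inference_demo",             # hypothetical app name
    broker="redis://localhost:6399/0",  # message broker (task queue)
    backend="redis://localhost:6399/0", # result backend
)

@app.task
def extract_text(pdf_url: str) -> str:
    """Placeholder task body; the tutorial's real tasks will differ."""
    return f"processed {pdf_url}"
```

This is a configuration sketch only; the actual task definitions come from the tutorial's application code.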

**Alternative methods:**
@@ -74,7 +77,7 @@ Redis serves as both the message broker (task queue) and result backend.


 ```python
-!pip install -q ray[serve-async-inference]>=2.50.0 requests>=2.31.0 PyPDF2>=3.0.0 celery[redis]
+pip install -q ray[serve-async-inference]>=2.50.0 requests>=2.31.0 PyPDF2>=3.0.0 celery[redis]
 ```
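One caveat with the de-notebooked form (an editor's observation, not part of this PR): once the `!` prefix is dropped and the command runs in a plain POSIX shell, an unquoted `>=` specifier is split at `>` and treated as output redirection, so `pip install requests>=2.31.0` actually runs `pip install requests` with stdout sent to a file named `=2.31.0`. A safe, self-contained demonstration of the parsing:

```shell
# Unquoted: the shell splits at '>' and redirects to a file named '=2.31.0'
echo requests >=2.31.0
cat '=2.31.0'
rm '=2.31.0'

# Quoted: the full specifier survives as a single argument for pip
echo "requests>=2.31.0"
```

For interactive use outside a notebook, quoting each requirement (e.g. `pip install -q "ray[serve-async-inference]>=2.50.0"`) avoids the issue; inside Jupyter/Colab the `!pip` form is handled by IPython and is less affected in practice.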

## Step 3: Start the Ray Serve Application