20 changes: 13 additions & 7 deletions doc/source/conf.py
@@ -231,15 +231,21 @@ def __init__(self, version: str):
"templates/*",
"cluster/running-applications/doc/ray.*",
"data/api/ray.data.*.rst",
"ray-overview/examples/**/README.md", # Exclude .md files in examples subfolders
"train/examples/**/README.md",
"serve/tutorials/deployment-serve-llm/README.*",
"serve/tutorials/deployment-serve-llm/*/notebook.ipynb",
"data/examples/**/content/README.md",
# Hide README.md used for display on the console (anyscale templates)
"serve/tutorials/**/content/**README.md",
"data/examples/**/content/**README.md",
"ray-overview/examples/**/content/**README.md",
"ray-core/examples/**/content/**README.md",
"train/examples/**/content/**README.md",
"tune/examples/**/content/**README.md",
# Other misc files (overviews, console-only examples, etc)
"ray-overview/examples/llamafactory-llm-fine-tune/README.ipynb",
"ray-overview/examples/llamafactory-llm-fine-tune/**/*.ipynb",
"serve/tutorials/asynchronous-inference/content/asynchronous-inference.ipynb",
"serve/tutorials/asynchronous-inference/content/README.md",
# Legacy/backward compatibility
"ray-overview/examples/**/README.md",
"train/examples/**/README.md",
"serve/tutorials/deployment-serve-llm/README.*",
"serve/tutorials/deployment-serve-llm/**.ipynb",
] + autogen_files

# If "DOC_LIB" is found, only build that top-level navigation item.
2 changes: 1 addition & 1 deletion doc/source/serve/examples.yml
@@ -153,5 +153,5 @@ examples:
skill_level: advanced
use_cases:
- generative ai
link: tutorials/asynchronous-inference/content/README
link: tutorials/asynchronous-inference/content/asynchronous-inference
related_technology: integrations
@@ -70,6 +70,11 @@ Redis serves as both the message broker (task queue) and result backend.
- **Docker:** `docker run -d -p 6379:6379 redis:latest`
- **Other platforms:** [Official Redis Installation Guide](https://redis.io/docs/getting-started/installation/)

**Note:** If you're using a hosted Redis instance, ensure that your Ray Serve cluster can access it. For example, when using AWS ElastiCache for Redis:

- Launch the ElastiCache instance in the same VPC that's attached to your Anyscale cloud.
- Attach IAM roles with read/write access to ElastiCache to your cluster instances.

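As a quick way to verify that access before deploying, a minimal connectivity check (not part of this tutorial; the endpoint hostname below is a hypothetical ElastiCache address) could look like:

```python
# Minimal sketch: confirm a hosted Redis endpoint is reachable from a cluster node.
import redis  # provided by the `redis` package

REDIS_URL = "redis://my-cache.abc123.use1.cache.amazonaws.com:6379/0"  # hypothetical endpoint

client = redis.Redis.from_url(REDIS_URL)
client.ping()  # raises redis.exceptions.ConnectionError if the instance isn't reachable
```
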
## Step 2: Install Dependencies


@@ -129,12 +129,10 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"```note\n",
"If you're using a hosted Redis instance, ensure that your Ray Serve cluster can access it. For example, when using AWS ElastiCache for Redis:\n",
"**Note:** If you're using a hosted Redis instance, ensure that your Ray Serve cluster can access it. For example, when using AWS ElastiCache for Redis:\n",
"\n",
"- Launch the ElastiCache instance in the same VPC that's attached to your Anyscale cloud.\n",
"- Attach IAM roles with read/write access to ElastiCache to your cluster instances.\n",
"```"
"- Attach IAM roles with read/write access to ElastiCache to your cluster instances."
]
},
{
@@ -601,7 +599,8 @@
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.12.12"
}
},
"orphan": true
},
"nbformat": 4,
"nbformat_minor": 0