This PR contains the following updates:

ray (ray-project/ray): `2.42.1` -> `2.43.0`

Release Notes

ray-project/ray (ray)

v2.43.0

Compare Source
Highlights

- New LLM APIs for Ray Data and Ray Serve: `ray.data.llm` and `ray.serve.llm`. See the below notes for more details.
- Ray Train V2 can be enabled with the `RAY_TRAIN_V2_ENABLED=1` environment variable. See the migration guide for more information.
- Support for running Ray applications with `uv run`, which allows easily specifying Python dependencies for both driver and workers in a consistent way and enables quick iterations for development of Ray applications (#50160, #50462); check out our blog post (see the sketch after this list).
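A minimal, hypothetical sketch of the `uv run` workflow mentioned above: the script is launched with `uv run` so the driver and the Ray workers resolve the same dependency set. The exact invocation, the `emoji` dependency, and any environment hooks required in this release are assumptions; the blog post and the linked PRs are the authoritative reference.

```python
"""Launch this file with uv, for example:

    uv run --with emoji example.py

so that the driver and every Ray worker share one dependency set.
(Hypothetical invocation; see the blog post for the supported workflow.)
"""
import ray
import emoji  # resolved by uv for the driver and, transitively, the workers


@ray.remote
def hello() -> str:
    # Runs on a worker that sees the same uv-managed environment as the driver.
    return emoji.emojize("Ray 2.43 :rocket:")


if __name__ == "__main__":
    ray.init()
    print(ray.get(hello.remote()))
```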
Ray Data
🎉 New Features:
- A new `Processor` abstraction that interoperates with existing Ray Data pipelines (see the sketch after this list). This abstraction can be configured in two ways:
  - `vLLMEngineProcessorConfig`, which configures vLLM to load model replicas for high-throughput model inference
  - `HttpRequestProcessorConfig`, which sends HTTP requests to an OpenAI-compatible endpoint for inference
- Added `UnionOperator` (#50436)
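As a rough illustration of how the `Processor` abstraction slots into an existing Ray Data pipeline, here is a minimal sketch built around `vLLMEngineProcessorConfig`. The `build_llm_processor` helper, the field names, the row schema, and the model are assumptions based on the alpha API and may differ from what ships in 2.43.0.

```python
import ray
from ray.data.llm import build_llm_processor, vLLMEngineProcessorConfig

# Configure vLLM to load model replicas for high-throughput inference.
# The model and engine/batch settings below are placeholders.
config = vLLMEngineProcessorConfig(
    model="Qwen/Qwen2.5-0.5B-Instruct",
    engine_kwargs={"max_model_len": 4096},
    concurrency=1,
    batch_size=32,
)

# Turn the config into a processor that maps a Dataset of prompts to a
# Dataset of generations (helper name and row fields are assumptions).
processor = build_llm_processor(
    config,
    preprocess=lambda row: dict(
        messages=[{"role": "user", "content": row["prompt"]}],
        sampling_params=dict(temperature=0.0, max_tokens=64),
    ),
    postprocess=lambda row: dict(answer=row["generated_text"]),
)

ds = ray.data.from_items([{"prompt": "What is Ray Data?"}])
ds = processor(ds)  # composes with the rest of the Ray Data pipeline
ds.show(limit=1)
```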
💫 Enhancements:
- `ShufflingBatcher` onto `try_combine_chunked_columns` (#50296)
- `ArrowBlockAccessor`, `PandasBlockAccessor` (#50498)
- Replaced `AggregateFn` with `AggregateFnV2`, cleaning up aggregation infrastructure (#50585)
- Added `TaskDurationStats` and `on_execution_step` callback (#50766)
🔨 Fixes:
- Fixed `grouped_data.py` docstrings (#50392)
- Fixed `test_map_batches_async_generator` (#50459)
- Fixed `pyarrow.infer_type` on datetime arrays (#50403)
📖 Documentation:
Ray Train
🎉 New Features:
- Ray Train V2 can be enabled with the `RAY_TRAIN_V2_ENABLED=1` environment variable. See the migration guide for more information (a minimal sketch follows below).
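A minimal sketch of opting in to Ray Train V2 via the environment variable. Setting the flag before `ray.train` is imported (rather than exporting it in the shell) is an assumption here; the migration guide is the authoritative reference.

```python
import os

# Opt in to Ray Train V2 for this process; `export RAY_TRAIN_V2_ENABLED=1`
# in the shell has the same effect.
os.environ["RAY_TRAIN_V2_ENABLED"] = "1"

import ray.train  # noqa: E402  (imported after the feature flag is set)

print("Ray Train V2 enabled:", os.environ["RAY_TRAIN_V2_ENABLED"])
```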
💫 Enhancements:
- `ray[train]` extra install (#46682)
🔨 Fixes:
📖 Documentation:
🏗 Architecture refactoring:
Ray Tune
🔨 Fixes:
📖 Documentation:
🏗 Architecture refactoring:
Ray Serve
🎉 New Features:
- `VLLMService`: A prebuilt deployment that offers a full-featured vLLM engine integration, with support for features such as LoRA multiplexing and multimodal language models (see the sketch after this list).
- `LLMRouter`: An out-of-the-box OpenAI-compatible model router that can route across multiple LLM deployments.
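A minimal sketch of composing the two new deployments, assuming an `LLMConfig` helper and `as_deployment()` builders in `ray.serve.llm`; the constructors, field names, and model identifiers shown here are assumptions and may not match the released alpha API exactly.

```python
from ray import serve
from ray.serve.llm import LLMConfig, LLMRouter, VLLMService

# Describe one model for the vLLM-backed deployment (field names assumed).
llm_config = LLMConfig(
    model_loading_config=dict(
        model_id="qwen-0.5b",
        model_source="Qwen/Qwen2.5-0.5B-Instruct",
    ),
    deployment_config=dict(
        autoscaling_config=dict(min_replicas=1, max_replicas=2),
    ),
)

# One VLLMService per model, fronted by an OpenAI-compatible LLMRouter.
vllm_deployment = VLLMService.as_deployment(
    llm_config.get_serve_options(name_prefix="vLLM:")
).bind(llm_config)
app = LLMRouter.as_deployment().bind([vllm_deployment])

serve.run(app)
```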
💫 Enhancements:
- Added `required_resources` to REST API (#50058)
🔨 Fixes:
RLlib
🎉 New Features:
💫 Enhancements:
- `eval_env_runner_group` from the training steps. (#50057)
- Improved the `OfflinePreLearner` docstring. (#50107)
🔨 Fixes:
- Fixed a bug where the `on_workers/env_runners_recreated` callback would be called twice. (#50172)
- Fixed `default_resource_request`: aggregator actors missing in placement group for local Learner. (#50219, #50475)
📖 Documentation:
Ray Core and Ray Clusters
Ray Core
💫 Enhancements:
🔨 Fixes:
Ray Clusters
📖 Documentation:
Ray Dashboard
🎉 New Features:
Thanks
Thank you to everyone who contributed to this release! 🥳
@liuxsh9, @justinrmiller, @CheyuWu, @400Ping, @scottsun94, @bveeramani, @bhmiller, @tylerfreckmann, @hefeiyun, @pcmoritz, @matthewdeng, @dentiny, @erictang000, @gvspraveen, @simonsays1980, @aslonnie, @shorbaji, @LeoLiao123, @justinvyu, @israbbani, @zcin, @ruisearch42, @khluu, @kouroshHakha, @sijieamoy, @SergeCroise, @raulchen, @anson627, @bluenote10, @allenyin55, @martinbomio, @rueian, @rynewang, @owenowenisme, @Betula-L, @alexeykudinkin, @crypdick, @jujipotle, @saihaj, @EricWiener, @kevin85421, @MengjinYan, @chris-ray-zhang, @SumanthRH, @chiayi, @comaniac, @angelinalg, @kenchung285, @tanmaychimurkar, @andrewsykim, @MortalHappiness, @sven1977, @richardliaw, @omatthew98, @fscnick, @akyang-anyscale, @cristianjd, @Jay-ju, @spencer-p, @win5923, @wxsms, @stfp, @letaoj, @JDarDagran, @jjyao, @srinathk10, @edoakes, @vincent0426, @dayshah, @davidxia, @DmitriGekhtman, @GeneDer, @HYLcool, @gameofby, @can-anyscale, @ryanaoleary, @eddyxu
Configuration
📅 Schedule: Branch creation - "every weekend" (UTC), Automerge - At any time (no schedule defined).
🚦 Automerge: Disabled by config. Please merge this manually once you are satisfied.
♻ Rebasing: Whenever PR becomes conflicted, or you tick the rebase/retry checkbox.
🔕 Ignore: Close this PR and you won't be reminded about this update again.
This PR was generated by Mend Renovate. View the repository job log.