chore(deps-dev): bump the gha group across 1 directory with 9 updates #2768

Open

dependabot[bot] wants to merge 1 commit into main
Conversation

dependabot[bot] (Contributor) commented on behalf of GitHub on Mar 10, 2025

Bumps the gha group with 9 updates in the /packages/opentelemetry-instrumentation-llamaindex directory:

| Package | From | To |
| --- | --- | --- |
| [flake8](https://github.com/pycqa/flake8) | `7.0.0` | `7.1.2` |
| [vcrpy](https://github.com/kevin1024/vcrpy) | `6.0.2` | `7.0.0` |
| [pytest-asyncio](https://github.com/pytest-dev/pytest-asyncio) | `0.23.8` | `0.25.3` |
| [chromadb](https://github.com/chroma-core/chroma) | `0.5.23` | `0.6.3` |
| [openai](https://github.com/openai/openai-python) | `1.58.1` | `1.65.5` |
| [llama-index](https://github.com/run-llama/llama_index) | `0.12.8` | `0.12.23` |
| [sqlalchemy](https://github.com/sqlalchemy/sqlalchemy) | `2.0.36` | `2.0.38` |
| llama-index-agent-openai | `0.4.1` | `0.4.6` |
| [onnxruntime](https://github.com/microsoft/onnxruntime) | `1.19.2` | `1.20.1` |

Updates flake8 from 7.0.0 to 7.1.2

Commits
  • fffee8b Release 7.1.2
  • 19001f7 Merge pull request #1966 from PyCQA/limit-procs-to-file-count
  • f35737a avoid starting unnecessary processes when file count is limited
  • cf1542c Release 7.1.1
  • 939ea3d Merge pull request #1949 from stephenfin/issue-1948
  • bdcd5c2 Handle escaped braces in f-strings
  • 2a811cc Merge pull request #1946 from Viicos/patch-1
  • 10314ad Fix wording of plugins documentation
  • 65a38c4 Release 7.1.0
  • 34c97e0 Merge pull request #1939 from PyCQA/new-pycodestyle
  • Additional commits viewable in compare view

Updates vcrpy from 6.0.2 to 7.0.0

Release notes

Sourced from vcrpy's releases.

v7.0.0

What's Changed

- Drop support for python 3.8 (major version bump) - thanks @jairhenrique
- Various linting and test fixes - thanks @jairhenrique
- Bugfix for urllib3>=2.3.0 - missing version_string ([#888](https://github.com/kevin1024/vcrpy/issues/888))
- Bugfix for asyncio.run - thanks @alekeik1

New Contributors

Changelog

Sourced from vcrpy's changelog.

Changelog

For a full list of triaged issues, bugs, and PRs and the releases they are targeted for, please see the following link.

ROADMAP MILESTONES: https://github.com/kevin1024/vcrpy/milestones

All help in providing PRs to close out bug issues is appreciated, even if that just means providing a repo that fully replicates an issue. Very generous contributors have added such repos to bug issues, which let other contributors pick up those bugs and close them out.

  • 7.0.0

  • 6.0.2

  • 6.0.1

    • Bugfix to the Tornado cassette generator (thanks @graingert)
  • 6.0.0

    • BREAKING: Fix issue with httpx support (thanks @parkerhancock) in #784. NOTE: You may have to recreate some of your cassettes produced in previous releases, due to the binary format being saved incorrectly.
    • BREAKING: Drop support for boto (vcrpy still supports boto3, but is dropping the deprecated boto support in this release) (thanks @jairhenrique)
    • Fix compatibility issue with Python 3.12 (thanks @hartwork)
    • Drop simplejson (fixes some compatibility issues) (thanks @jairhenrique)
    • Run CI on Python 3.12 and PyPy 3.9-3.10 (thanks @mgorny)
    • Various linting and docs improvements (thanks @jairhenrique)
    • Tornado fixes (thanks @graingert)
  • 5.1.0

  • 5.0.0

    • BREAKING CHANGE: Drop support for Python 3.7, which is EOL as of 6/27/23. Thanks @jairhenrique
    • BREAKING CHANGE: Custom cassette persisters no longer catch ValueError. If you have implemented a custom persister (has anyone implemented a custom persister? Let us know!), you will need to throw a CassetteNotFoundError when unable to find a cassette. See #681 for discussion and the reason for this change. Thanks @amosjyng for the PR and @hartwork for the review
  • 4.4.0

    • HUGE thanks to @hartwork for all the work done on this release!
    • Bring vcr/unittest into vcrpy as a full feature of vcr instead of a separate library. Big thanks to @hartwork for doing this and to @agriffis for originally creating the library
    • Make decompression robust towards already decompressed input (thanks @hartwork)

... (truncated)
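
In case it helps when reviewing the 5.0.0 breaking change quoted above, here is a minimal sketch of a custom persister that raises CassetteNotFoundError instead of ValueError. The persister class, environment variable, and the exception's import path are assumptions rather than vcrpy documentation; verify them against the installed version.

```python
import os

import vcr
# Import path is an assumption based on the changelog note; check your vcrpy version.
from vcr.persisters.filesystem import CassetteNotFoundError, FilesystemPersister


class EnvAwarePersister(FilesystemPersister):
    """Hypothetical persister that resolves cassette paths under $CASSETTE_DIR."""

    @classmethod
    def load_cassette(cls, cassette_path, serializer):
        full_path = os.path.join(os.environ.get("CASSETTE_DIR", "."), cassette_path)
        if not os.path.exists(full_path):
            # Since vcrpy 5.0.0, persisters signal a missing cassette with
            # CassetteNotFoundError rather than ValueError (see #681).
            raise CassetteNotFoundError(full_path)
        return super().load_cassette(full_path, serializer)


my_vcr = vcr.VCR()
my_vcr.register_persister(EnvAwarePersister)
```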

Commits
  • 3278619 Release v7.0.0
  • 3fb62e0 fix: correctly handle asyncio.run when loop exists
  • 8197865 build(deps): update sphinx requirement from <8 to <9
  • be651bd pre-commit: Autoupdate
  • a6698ed Fix aiohttp tests
  • 48d0a2e Fixed missing version_string attribute when used with urllib3>=2.3.0
  • 5b858b1 Fix lint
  • c8d99a9 Fix ruff configuration
  • ce27c63 Merge pull request #736 from kevin1024/drop-python38
  • ab8944d Drop python 3.8 support
  • Additional commits viewable in compare view

Updates pytest-asyncio from 0.23.8 to 0.25.3

Release notes

Sourced from pytest-asyncio's releases.

pytest-asyncio 0.25.3

  • Avoid errors in cleanup of async generators when event loop is already closed #1040

pytest-asyncio 0.25.2

  • Call loop.shutdown_asyncgens() before closing the event loop to ensure async generators are closed in the same manner as asyncio.run does #1034
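
For reference, loop.shutdown_asyncgens() is the standard-library cleanup step that asyncio.run() already performs; below is a minimal sketch of that sequence using plain asyncio (not pytest-asyncio internals).

```python
import asyncio


async def main() -> None:
    await asyncio.sleep(0)


loop = asyncio.new_event_loop()
try:
    loop.run_until_complete(main())
finally:
    # asyncio.run() finalizes async generators like this before closing the
    # loop; pytest-asyncio 0.25.2 now mirrors the same order for its loops.
    loop.run_until_complete(loop.shutdown_asyncgens())
    loop.close()
```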

pytest-asyncio 0.25.1

  • Fixes an issue that caused a broken event loop when a function-scoped test was executed in between two tests with wider loop scope #950
  • Improves test collection speed in auto mode #1020
  • Corrects the warning that is emitted upon redefining the event_loop fixture

pytest-asyncio 0.25.0

0.25.0 (2024-12-13)

  • Deprecated: Added warning when asyncio test requests async @pytest.fixture in strict mode. This will become an error in a future version of pytest-asyncio. #979
  • Updates the error message about pytest.mark.asyncio's scope keyword argument to say loop_scope instead. #1004
  • Verbose log displays correct parameter name: asyncio_default_fixture_loop_scope #990
  • Propagates contextvars set in async fixtures to other fixtures and tests on Python 3.11 and above. #1008

pytest-asyncio 0.24.0

0.24.0 (2024-08-22)

  • BREAKING: Updated minimum supported pytest version to v8.2.0
  • Adds an optional loop_scope keyword argument to pytest.mark.asyncio. This argument controls which event loop is used to run the marked async test. #706, #871
  • Deprecates the optional scope keyword argument to pytest.mark.asyncio for API consistency with pytest_asyncio.fixture. Users are encouraged to use the loop_scope keyword argument, which does exactly the same.
  • Raises an error when passing scope or loop_scope as a positional argument to @pytest.mark.asyncio. #812
  • Fixes a bug that caused module-scoped async fixtures to fail when reused in other modules #862 #668
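
A minimal sketch of the loop_scope argument added in 0.24.0; the test names and bodies are illustrative only.

```python
import asyncio

import pytest


@pytest.mark.asyncio(loop_scope="module")
async def test_records_loop():
    # Stash the running loop so the next test can compare against it.
    test_records_loop.loop = asyncio.get_running_loop()
    await asyncio.sleep(0)


@pytest.mark.asyncio(loop_scope="module")
async def test_shares_loop():
    # Both tests run on the same module-scoped event loop.
    assert asyncio.get_running_loop() is test_records_loop.loop
```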

pytest-asyncio 0.24.0a1

0.24.0 (UNRELEASED)

  • BREAKING: Updated minimum supported pytest version to v8.2.0
  • Adds an optional loop_scope keyword argument to pytest.mark.asyncio. This argument controls which event loop is used to run the marked async test. #706, #871
  • Deprecates the optional scope keyword argument to pytest.mark.asyncio for API consistency with pytest_asyncio.fixture. Users are encouraged to use the loop_scope keyword argument, which does exactly the same.
  • Raises an error when passing scope or loop_scope as a positional argument to @pytest.mark.asyncio. #812
  • Fixes a bug that caused module-scoped async fixtures to fail when reused in other modules #862 #668

pytest-asyncio 0.24.0a0

0.24.0 (UNRELEASED)

  • Adds an optional loop_scope keyword argument to pytest.mark.asyncio. This argument controls which event loop is used to run the marked async test. #706, #871
  • Deprecates the optional scope keyword argument to pytest.mark.asyncio for API consistency with pytest_asyncio.fixture. Users are encouraged to use the loop_scope keyword argument, which does exactly the same.
  • Raises an error when passing scope or loop_scope as a positional argument to @pytest.mark.asyncio. #812
Commits
  • 7c50192 fix: Avoid errors in cleanup of async generators when event loop is already c...
  • 2188cdb build: Prepare release of v0.25.2.
  • c3ad634 fix: Shutdown generators before closing event loops.
  • e8ffb10 [pre-commit.ci] pre-commit autoupdate
  • aae43d4 Build(deps): Bump hypothesis in /dependencies/default
  • 941e8b5 Build(deps): Bump pygments from 2.18.0 to 2.19.1 in /dependencies/docs
  • 623ab74 docs: Prepare release of v0.25.1.
  • c236550 docs: Fix broken link to the pytest.mark.asyncio reference.
  • 41c645b fix: Correct warning message when redefining the event_loop fixture.
  • 2fd10f8 docs: Clarify deprecation of event_loop fixture.
  • Additional commits viewable in compare view

Updates chromadb from 0.5.23 to 0.6.3

Release notes

Sourced from chromadb's releases.

0.6.3

  • Version: 0.6.3
  • Git ref: refs/tags/0.6.3
  • Build Date: 2025-01-14T22:21
  • PIP Package: chroma-0.6.3.tar.gz
  • GitHub Container Registry Image: ghcr.io/chroma-core/chroma:0.6.3
  • DockerHub Image: chromadb/chroma:0.6.3

What's Changed

New Contributors

... (truncated)

Commits

Updates openai from 1.58.1 to 1.65.5

Release notes

Sourced from openai's releases.

v1.65.5

1.65.5 (2025-03-09)

Full Changelog: v1.65.4...v1.65.5

Chores

v1.65.4

1.65.4 (2025-03-05)

Full Changelog: v1.65.3...v1.65.4

Bug Fixes

  • api: add missing file rank enum + more metadata (#2164) (0387e48)

v1.65.3

1.65.3 (2025-03-04)

Full Changelog: v1.65.2...v1.65.3

Chores

  • internal: remove unused http client options forwarding (#2158) (76ec464)
  • internal: run example files in CI (#2160) (9979345)

v1.65.2

1.65.2 (2025-03-01)

Full Changelog: v1.65.1...v1.65.2

Bug Fixes

  • azure: azure_deployment use with realtime + non-deployment-based APIs (#2154) (5846b55)

Chores

v1.65.1

1.65.1 (2025-02-27)

Full Changelog: v1.65.0...v1.65.1

Documentation

  • update URLs from stainlessapi.com to stainless.com (#2150) (dee4298)

... (truncated)

Changelog

Sourced from openai's changelog.

1.65.5 (2025-03-09)

Full Changelog: v1.65.4...v1.65.5

Chores

1.65.4 (2025-03-05)

Full Changelog: v1.65.3...v1.65.4

Bug Fixes

  • api: add missing file rank enum + more metadata (#2164) (0387e48)

1.65.3 (2025-03-04)

Full Changelog: v1.65.2...v1.65.3

Chores

  • internal: remove unused http client options forwarding (#2158) (76ec464)
  • internal: run example files in CI (#2160) (9979345)

1.65.2 (2025-03-01)

Full Changelog: v1.65.1...v1.65.2

Bug Fixes

  • azure: azure_deployment use with realtime + non-deployment-based APIs (#2154) (5846b55)

Chores

1.65.1 (2025-02-27)

Full Changelog: v1.65.0...v1.65.1

Documentation

  • update URLs from stainlessapi.com to stainless.com (#2150) (dee4298)

1.65.0 (2025-02-27)

Full Changelog: v1.64.0...v1.65.0

... (truncated)
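
For readers checking the Azure-related fix above, here is a minimal sketch of an AzureOpenAI client configured with azure_deployment; the endpoint, deployment, key, and model values are placeholders.

```python
from openai import AzureOpenAI

client = AzureOpenAI(
    azure_endpoint="https://example-resource.openai.azure.com",  # placeholder
    azure_deployment="example-deployment",                       # placeholder
    api_version="2024-10-21",                                    # placeholder
    api_key="sk-placeholder",
)

# Chat completions are routed through the configured deployment; 1.65.2 fixes
# how azure_deployment interacts with realtime and non-deployment-based APIs.
resp = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model name
    messages=[{"role": "user", "content": "ping"}],
)
print(resp.choices[0].message.content)
```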

Commits
  • a6b4930 release: 1.65.5
  • 530f9b8 chore: move ChatModel type to shared (#2167)
  • dfc4cfa release: 1.65.4
  • 5608d64 fix(api): add missing file rank enum + more metadata (#2164)
  • d6bb8c1 release: 1.65.3
  • b31f4d4 chore(internal): run example files in CI (#2160)
  • 65f2c5c chore(internal): remove unused http client options forwarding (#2158)
  • 64af9e8 release: 1.65.2
  • c98d740 fix(azure): azure_deployment use with realtime + non-deployment-based APIs (#...
  • ba2a8a0 chore(docs): update client docstring (#2152)
  • Additional commits viewable in compare view

Updates llama-index from 0.12.8 to 0.12.23

Changelog

Sourced from llama-index's changelog.

llama-index-core [0.12.23]

  • added merging_separator argument to allow for specifying chunk merge separator in semantic splitter (#18027)
  • Add support for running single-agent workflows within the BaseWorkflowAgent class (#18038)
  • Fix the error raised when ReactAgent is created without an explicit system message (#18041)
  • add a field keep_whitespaces to TokenTextSplitter (#17998)
  • do not convert raw tool output to string in AgentWorkflow (#18006)

llama-index-embeddings-ollama [0.6.0]

  • feat: add client_kwargs Parameter to OllamaEmbedding Class (#18012)

llama-index-llms-anthropic [0.6.10]

  • anthropic caching and thinking updates (#18039)
  • allow caching of tool results (#18028)
  • support caching of anthropic system prompt (#18008)
  • Ensure resuming a workflow actually works (#18023)
  • [MarkdownNodeParser] Adding customizable header path separator char (#17964)
  • feat: return event instance from run() when stop event is custom (#18001)

llama-index-llms-azure-openai [0.3.2]

  • AzureOpenAI: api_base and azure_endpoint are mutually exclusive (#18037)
  • Add base_url to AzureOpenAI (#17996)

llama-index-llms-bedrock-converse [0.4.8]

  • message text is required in boto3 model (#17989)

llama-index-llms-ollama [0.5.3]

  • Make request_timeout in Ollama LLM optional (#18007)

llama-index-llms-mistralai [0.4.0]

  • MistralAI support for multimodal content blocks (#17997)

llama-index-readers-file [0.4.6]

  • Bugfix: Use torch.no_grad() for inference in ImageVisionLLMReader when PyTorch is installed (#17970)

llama-index-storage-chat-store-mongo [0.1.0]

  • Feat/mongo chat store (#17979)

... (truncated)
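
A minimal sketch of the TokenTextSplitter change mentioned above; the keep_whitespaces field name is taken from the changelog entry (#17998) and should be verified against the installed release.

```python
from llama_index.core.node_parser import TokenTextSplitter

splitter = TokenTextSplitter(
    chunk_size=64,
    chunk_overlap=8,
    keep_whitespaces=True,  # new field per llama-index-core 0.12.23 (#17998)
)

chunks = splitter.split_text("Indented   or   spaced   text   keeps its whitespace.")
print(chunks)
```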

Commits
  • 4c8d1d6 v0.12.23 (#18050)
  • 54e0a2c Change AgentWorkflow to FunctionAgent in documentation. (#18042)
  • 13de07b chore: bump jinja version in docs dependencies (#18047)
  • d3a861f added merging_separator argument to allow for specifying chunk merge (#18027)
  • d569009 Fix the error raised when ReactAgent is created without an explicit system me...
  • 3dee964 Add support for running single-agent workflows within the BaseWorkflowAgent c...
  • 8b3e456 chore: skip deeplake tests when running on CI (#18015)
  • 37243c8 build(deps-dev): bump jinja2 from 3.1.5 to 3.1.6 (#18024)
  • 02a4c93 build(deps-dev): bump jinja2 from 3.1.5 to 3.1.6 in /llama-index-core (#18025)
  • fa51d5a AzureOpenAI: api_base and azure_endpoint are mutually exclusive (#18037)
  • Additional commits viewable in compare view

Updates sqlalchemy from 2.0.36 to 2.0.38

Release notes

Sourced from sqlalchemy's releases.

2.0.38

Released: February 6, 2025

engine

  • [engine] [bug] Fixed event-related issue where invoking Engine.execution_options() on a Engine multiple times while making use of event-registering parameters such as isolation_level would lead to internal errors involving event registration.

    References: #12289

sql

  • [sql] [bug] Reorganized the internals by which the .c collection on a FromClause gets generated so that it is resilient against the collection being accessed in concurrent fashion. An example is creating an Alias or Subquery and accessing it as a module-level variable. This impacts the Oracle dialect, which uses such module-level global alias objects, but is of general use as well.

    References: #12302

  • [sql] [bug] Fixed SQL composition bug which impacted caching where using a None value inside of an in_() expression would bypass the usual "expanded bind parameter" logic used by the IN construct, which allows proper caching to take place.

    References: #12314

postgresql

  • [postgresql] [usecase] [asyncio] Added an additional asyncio.shield() call within the connection terminate process of the asyncpg driver, to mitigate an issue where terminate would be prevented from completing under the anyio concurrency library.

    References: #12077

  • [postgresql] [bug] Adjusted the asyncpg connection wrapper so that the connection.transaction() call sent to asyncpg sends None for isolation_level if not otherwise set in the SQLAlchemy dialect/wrapper, thereby allowing asyncpg to make use of the server-level setting for isolation_level in the absence of a client-level setting. Previously, this behavior of asyncpg was blocked by a hardcoded read_committed.

... (truncated)
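
To make the in_() caching fix above concrete, here is a minimal sketch of the affected construct; the table and values are hypothetical.

```python
from sqlalchemy import Column, Integer, MetaData, Table, select

metadata = MetaData()
users = Table("users", metadata, Column("id", Integer, primary_key=True))

# Before 2.0.38, a literal None inside in_() bypassed the "expanded bind
# parameter" form that the statement cache relies on (#12314).
stmt = select(users).where(users.c.id.in_([1, 2, None]))
print(stmt)
```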

Commits

Updates llama-index-agent-openai from 0.4.1 to 0.4.6

Updates onnxruntime from 1.19.2 to 1.20.1

Release notes

Sourced from onnxruntime's releases.

ONNX Runtime v1.20.1

What's new?

Python Quantization Tool

CPU EP

QNN EP

TensorRT EP

Packaging

Contributions

Big thank you to the release manager @yf711, along with @adrianlizarraga, @HectorSVC, @jywu-msft, and everyone else who helped to make this patch release process a smooth one!

ONNX Runtime v1.20.0

Release Manager: @apsonawane

Announcements

  • All ONNX Runtime Training packages have been deprecated. ORT 1.19.2 was the last release for which onnxruntime-training (PyPI), onnxruntime-training-cpu (PyPI), Microsoft.ML.OnnxRuntime.Training (Nuget), onnxruntime-training-c (CocoaPods), onnxruntime-training-objc (CocoaPods), and onnxruntime-training-android (Maven Central) were published.
  • ONNX Runtime packages will stop supporting Python 3.8 and Python 3.9. This decision aligns with NumPy Python version support. To continue using ORT with Python 3.8 and Python 3.9, you can use ORT 1.19.2 and earlier.
  • ONNX Runtime 1.20 CUDA packages will include new dependencies that were not required in 1.19 packages. The following dependencies are new: libcudnn_adv.so.9, libcudnn_cnn.so.9, libcudnn_engines_precompiled.so.9, libcudnn_engines_runtime_compiled.so.9, libcudnn_graph.so.9, libcudnn_heuristic.so.9, libcudnn_ops.so.9, libnvrtc.so.12, and libz.so.1.

Build System & Packages

  • Python 3.13 support is included in PyPI packages.
  • ONNX 1.17 support will be delayed until a future release, but the ONNX version used by ONNX Runtime has been patched to include a shape inference change to the Einsum op.
  • DLLs in the Maven build are now digitally signed (fix for issue reported here).
  • (Experimental) vcpkg support added for the CPU EP. The DML EP does not yet support vcpkg, and other EPs have not been tested.

Core

  • MultiLoRA support.
  • Reduced memory utilization.
    • Fixed alignment that was causing mmap to fail for external weights.
    • Eliminated double allocations when deserializing external weights.
    • Added ability to serialize pre-packed weights so that they don’t cause an increase in memory utilization when the model is loaded.
  • Support bfloat16 and float8 data types in python I/O binding API.

Performance

  • INT4 quantized embedding support on CPU and CUDA EPs.

... (truncated)
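
For the Core bullet about the Python I/O binding API, here is a minimal sketch of that API using float32 (the 1.20.0 release extends it to bfloat16 and float8); the model path and tensor names are placeholders.

```python
import numpy as np
import onnxruntime as ort

# "model.onnx", "input", and "output" are placeholders for a real model.
sess = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])

x = np.random.rand(1, 3, 224, 224).astype(np.float32)

binding = sess.io_binding()
binding.bind_cpu_input("input", x)   # bind a host-memory input tensor
binding.bind_output("output")        # let ORT allocate the output buffer
sess.run_with_iobinding(binding)

result = binding.copy_outputs_to_cpu()[0]
print(result.shape)
```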

Commits

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore <dependency name> major version will close this group update PR and stop Dependabot creating any more for the specific dependency's major version (unless you unignore this specific dependency's major version or upgrade to it yourself)
  • @dependabot ignore <dependency name> minor version will close this group update PR and stop Dependabot creating any more for the specific dependency's minor version (unless you unignore this specific dependency's minor version or upgrade to it yourself)
  • @dependabot ignore <dependency name> will close this group update PR and stop Dependabot creating any more for the specific dependency (unless you unignore this specific dependency or upgrade to it yourself)
  • @dependabot unignore <dependency name> will remove all of the ignore conditions of the specified dependency
  • @dependabot unignore <dependency name> <ignore condition> will remove the ignore condition of the specified dependency and ignore conditions

dependabot[bot] added the `dependencies` label (Pull requests that update a dependency file) on Mar 10, 2025
Bumps the gha group with 9 updates in the /packages/opentelemetry-instrumentation-llamaindex directory:

| Package | From | To |
| --- | --- | --- |
| [flake8](https://github.com/pycqa/flake8) | `7.0.0` | `7.1.2` |
| [vcrpy](https://github.com/kevin1024/vcrpy) | `6.0.2` | `7.0.0` |
| [pytest-asyncio](https://github.com/pytest-dev/pytest-asyncio) | `0.23.8` | `0.25.3` |
| [chromadb](https://github.com/chroma-core/chroma) | `0.5.23` | `0.6.3` |
| [openai](https://github.com/openai/openai-python) | `1.58.1` | `1.65.5` |
| [llama-index](https://github.com/run-llama/llama_index) | `0.12.8` | `0.12.23` |
| [sqlalchemy](https://github.com/sqlalchemy/sqlalchemy) | `2.0.36` | `2.0.38` |
| llama-index-agent-openai | `0.4.1` | `0.4.6` |
| [onnxruntime](https://github.com/microsoft/onnxruntime) | `1.19.2` | `1.20.1` |



Updates `flake8` from 7.0.0 to 7.1.2
- [Commits](PyCQA/flake8@7.0.0...7.1.2)

Updates `vcrpy` from 6.0.2 to 7.0.0
- [Release notes](https://github.com/kevin1024/vcrpy/releases)
- [Changelog](https://github.com/kevin1024/vcrpy/blob/master/docs/changelog.rst)
- [Commits](kevin1024/vcrpy@v6.0.2...v7.0.0)

Updates `pytest-asyncio` from 0.23.8 to 0.25.3
- [Release notes](https://github.com/pytest-dev/pytest-asyncio/releases)
- [Commits](pytest-dev/pytest-asyncio@v0.23.8...v0.25.3)

Updates `chromadb` from 0.5.23 to 0.6.3
- [Release notes](https://github.com/chroma-core/chroma/releases)
- [Changelog](https://github.com/chroma-core/chroma/blob/main/RELEASE_PROCESS.md)
- [Commits](chroma-core/chroma@0.5.23...0.6.3)

Updates `openai` from 1.58.1 to 1.65.5
- [Release notes](https://github.com/openai/openai-python/releases)
- [Changelog](https://github.com/openai/openai-python/blob/main/CHANGELOG.md)
- [Commits](openai/openai-python@v1.58.1...v1.65.5)

Updates `llama-index` from 0.12.8 to 0.12.23
- [Release notes](https://github.com/run-llama/llama_index/releases)
- [Changelog](https://github.com/run-llama/llama_index/blob/main/CHANGELOG.md)
- [Commits](run-llama/llama_index@v0.12.8...v0.12.23)

Updates `sqlalchemy` from 2.0.36 to 2.0.38
- [Release notes](https://github.com/sqlalchemy/sqlalchemy/releases)
- [Changelog](https://github.com/sqlalchemy/sqlalchemy/blob/main/CHANGES.rst)
- [Commits](https://github.com/sqlalchemy/sqlalchemy/commits)

Updates `llama-index-agent-openai` from 0.4.1 to 0.4.6

Updates `onnxruntime` from 1.19.2 to 1.20.1
- [Release notes](https://github.com/microsoft/onnxruntime/releases)
- [Changelog](https://github.com/microsoft/onnxruntime/blob/main/docs/ReleaseManagement.md)
- [Commits](microsoft/onnxruntime@v1.19.2...v1.20.1)

---
updated-dependencies:
- dependency-name: flake8
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: gha
- dependency-name: vcrpy
  dependency-type: direct:development
  update-type: version-update:semver-major
  dependency-group: gha
- dependency-name: pytest-asyncio
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: gha
- dependency-name: chromadb
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: gha
- dependency-name: openai
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: gha
- dependency-name: llama-index
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: gha
- dependency-name: sqlalchemy
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: gha
- dependency-name: llama-index-agent-openai
  dependency-type: direct:development
  update-type: version-update:semver-patch
  dependency-group: gha
- dependency-name: onnxruntime
  dependency-type: direct:development
  update-type: version-update:semver-minor
  dependency-group: gha
...

Signed-off-by: dependabot[bot] <[email protected]>
dependabot[bot] force-pushed the dependabot/pip/packages/opentelemetry-instrumentation-llamaindex/gha-6220d0a66e branch from 4006992 to ed48694 on March 17, 2025 02:47