
Bump ddtrace from 2.13.0 to 2.14.2 #221

Closed
wants to merge 1 commit

Conversation


@dependabot dependabot bot commented on behalf of github Oct 7, 2024

Bumps ddtrace from 2.13.0 to 2.14.2.

Release notes

Sourced from ddtrace's releases.

2.14.2

Bug Fixes

  • Tracing

    • celery: Fixes an issue where celery.apply spans didn't close if the after_task_publish or task_postrun signals didn't get sent when using apply_async, which can happen if there is an internal exception during the handling of the task. This update also marks the span as an error if an exception occurs.
    • celery: Fixes an issue where celery.apply spans using task_protocol 1 didn't close by improving the check for the task id in the body.
  • Profiling

    • All files with platform-dependent code have had their filenames updated to reflect the platform they are for. This fixes issues where the wrong file would be used on a given platform.
    • Enables code provenance when using libdatadog exporter, DD_PROFILING_EXPORT_LIBDD_ENABLED, DD_PROFILING_STACK_V2_ENABLED, or DD_PROFILING_TIMELINE_ENABLED.
    • Fixes an issue where flamegraph was upside down for stack v2, DD_PROFILING_STACK_V2_ENABLED.
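The profiling fixes above are gated by environment variables. A minimal setup sketch, assuming the variable names quoted in the notes (the launch command and app name are placeholders, not part of the release notes):

```shell
# Opt into the libdatadog exporter and the v2 stack profiler
# (variable names taken from the 2.14.2 notes above).
export DD_PROFILING_ENABLED=true
export DD_PROFILING_EXPORT_LIBD D_ENABLED=true
export DD_PROFILING_STACK_V2_ENABLED=true

# Then launch the instrumented app under the tracer, e.g.:
#   ddtrace-run python app.py
```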

2.14.1

New Features

  • Code Security (IAST): Always report a telemetry log error when an IAST propagation error is raised, regardless of whether the _DD_IAST_DEBUG environment variable is enabled.

Bug Fixes

  • tracing: Removes a reference cycle that caused unnecessary garbage collection for top-level spans.
  • Code Security: Fixes a potential memory leak in IAST exception handling.
  • profiling: Fixes endpoint profiling when using libdatadog exporter, either with DD_PROFILING_EXPORT_LIBDD_ENABLED or DD_PROFILING_TIMELINE_ENABLED.

2.14.0

Deprecation Notes

  • Tracing
    • Deprecates the DD_TRACE_SPAN_AGGREGATOR_RLOCK environment variable. It will be removed in v3.0.0.
    • Deprecates support for APM Legacy App Analytics. This feature and its associated configuration options are deprecated and will be removed in v3.0.0.
    • DD_HTTP_CLIENT_TAG_QUERY_STRING configuration is deprecated and will be removed in v3.0.0. Use DD_TRACE_HTTP_CLIENT_TAG_QUERY_STRING instead.
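A migration sketch for the renamed query-string setting (the value shown is illustrative; the variable names come from the deprecation note above):

```shell
# Before (deprecated, removed in v3.0.0):
#   export DD_HTTP_CLIENT_TAG_QUERY_STRING=true

# After: drop the old variable and use the DD_TRACE_-prefixed one.
unset DD_HTTP_CLIENT_TAG_QUERY_STRING
export DD_TRACE_HTTP_CLIENT_TAG_QUERY_STRING=true
```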

New Features

  • DSM

    • Introduces new tracing and datastreams monitoring functionality for Avro Schemas.
    • Introduces new tracing and datastreams monitoring functionality for Google Protobuf.
  • LLM Observability

    • Adds support to automatically submit Gemini Python SDK calls to LLM Observability.
    • The OpenAI integration now captures tool calls returned from streamed responses when making calls to the chat completions endpoint.
    • The LangChain integration now submits tool spans to LLM Observability.
    • LLM Observability spans generated by the OpenAI integration now have updated span name and model_provider values. Span names are now prefixed with the OpenAI client name (possible values: OpenAI/AzureOpenAI) instead of the default openai prefix to better differentiate whether the request was made to Azure OpenAI or OpenAI. The model_provider field also now corresponds to openai or azure_openai based on the OpenAI client.
    • The OpenAI integration now ensures accurate token data from streamed OpenAI completions and chat completions, if provided in the streamed response. To ensure accurate token data in the traced streamed operation, ensure that the stream_options={"include_usage": True} option is set on the completion or chat completion call.
    • Introduces the LLMObs.annotation_context() context manager method, which allows modifying the tags of integration generated LLM Observability spans created while the context manager is active.
    • Introduces prompt template annotation, which can be passed as an argument to LLMObs.annotate(prompt={...}) for LLM span kinds. For more information on prompt annotations, see the docs.
    • google_generativeai: Introduces tracing support for Google Gemini API generate_content calls.
      See the docs for more information.
    • openai: The OpenAI integration now includes a new openai.request.client tag with the possible values OpenAI/AzureOpenAI to help differentiate whether the request was made to Azure OpenAI or OpenAI.
    • openai: The OpenAI integration now captures token data from streamed completions and chat completions, if provided in the streamed response. To ensure accurate token data in the traced streamed operation, ensure that the stream_options={"include_usage": True} option is set on the completion or chat completion call.
  • Profiling

... (truncated)
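The streamed-token-usage notes above hinge on one request option. A sketch of how a chat-completions call would opt in, assuming the openai Python client; the model name and message are placeholders, and the API call itself is shown commented out since it needs a live key:

```python
# The option named in the release notes: enables a final stream chunk
# carrying a `usage` object with token counts.
stream_options = {"include_usage": True}

# from openai import OpenAI
# client = OpenAI()
# stream = client.chat.completions.create(
#     model="gpt-4o-mini",          # placeholder model
#     messages=[{"role": "user", "content": "Hello"}],
#     stream=True,
#     stream_options=stream_options,
# )
# for chunk in stream:
#     ...  # the last chunk includes `chunk.usage` with token counts
```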


Commits
  • 73d3d41 fix(profiling): code provenance using libdatadog exporter [backport 2.14] (#1...
  • a213e8d fix(celery): close celery.apply spans even without after_task_publish, when...
  • c7df637 fix(profiling): platform-specific files should have platform-specific filenam...
  • c2c5727 fix(celery): close celery.apply spans using task_protocol 1 [backport 2.14] (...
  • 51eb0e2 fix(profiling): reverse locations for stack v2 [backport-2.14] (#10871)
  • 65a9f20 chore(iast): memory leak in pypika and pydantic [backport 2.14] (#10858)
  • 300dfcc fix(tracing): avoid assigning span's local root to self, so that the python G...
  • 467db5a chore(iast): django Invalid or empty source_value [backport 2.14] (#10819)
  • a5ce818 chore(profiling): prevent strings from GC'ed whose string_views are passed to...
  • cd8d72d chore(iast): disable error log metric [backport 2.14] (#10815)
  • Additional commits viewable in compare view

Dependabot compatibility score

Dependabot will resolve any conflicts with this PR as long as you don't alter it yourself. You can also trigger a rebase manually by commenting @dependabot rebase.


Dependabot commands and options

You can trigger Dependabot actions by commenting on this PR:

  • @dependabot rebase will rebase this PR
  • @dependabot recreate will recreate this PR, overwriting any edits that have been made to it
  • @dependabot merge will merge this PR after your CI passes on it
  • @dependabot squash and merge will squash and merge this PR after your CI passes on it
  • @dependabot cancel merge will cancel a previously requested merge and block automerging
  • @dependabot reopen will reopen this PR if it is closed
  • @dependabot close will close this PR and stop Dependabot recreating it. You can achieve the same result by closing it manually
  • @dependabot show <dependency name> ignore conditions will show all of the ignore conditions of the specified dependency
  • @dependabot ignore this major version will close this PR and stop Dependabot creating any more for this major version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this minor version will close this PR and stop Dependabot creating any more for this minor version (unless you reopen the PR or upgrade to it yourself)
  • @dependabot ignore this dependency will close this PR and stop Dependabot creating any more for this dependency (unless you reopen the PR or upgrade to it yourself)

Bumps [ddtrace](https://github.com/DataDog/dd-trace-py) from 2.13.0 to 2.14.2.
- [Release notes](https://github.com/DataDog/dd-trace-py/releases)
- [Changelog](https://github.com/DataDog/dd-trace-py/blob/main/CHANGELOG.md)
- [Commits](DataDog/dd-trace-py@v2.13.0...v2.14.2)

---
updated-dependencies:
- dependency-name: ddtrace
  dependency-type: direct:production
  update-type: version-update:semver-minor
...

Signed-off-by: dependabot[bot] <[email protected]>
@dependabot dependabot bot added the dependencies label (Pull requests that update a dependency file) Oct 7, 2024
dependabot bot commented on behalf of github Oct 21, 2024

Superseded by #224.

@dependabot dependabot bot closed this Oct 21, 2024
@dependabot dependabot bot deleted the dependabot/pip/ddtrace-2.14.2 branch October 21, 2024 19:15