🐛 Bug Report: incompatibilities with LLM semantics #1455
What seems similar to the spec is if you splat out JSON-decoded events into attributes. It is also possible that there are some implicit understandings of how to interpret the spec that I'm lacking, so feel free to correct me.
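For context, here is roughly what I mean by splatting JSON-decoded events out into attributes; the helper and the example payload below are made up for illustration, not OpenLLMetry code:

```python
import json


def splat(span, prefix: str, payload: str) -> None:
    """Flatten one JSON-decoded event into dotted span attributes."""
    for key, value in json.loads(payload).items():
        if isinstance(value, (str, int, float, bool)):
            span.set_attribute(f"{prefix}.{key}", value)
        else:
            # Nested structures get re-serialized rather than splatted further.
            span.set_attribute(f"{prefix}.{key}", json.dumps(value))


# e.g. splat(span, "gen_ai.completion.0", '{"role": "assistant", "content": "hi"}')
# would yield gen_ai.completion.0.role and gen_ai.completion.0.content
```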
Thanks for this @codefromthecrypt. OpenLLMetry was released in Oct 2023 and pre-dates the semantic conventions defined by otel. The semantic convention work is actually the result of this project, OpenLLMetry, and is very much still work in progress. When we started the OSS, there were no semantic conventions for LLMs, so we decided to add attributes for things that we thought would be important to users. These became the basis for the discussions we've had in the otel working group, where some attributes were officially adopted and some were changed slightly (for example, we decided to change the `llm` prefix to `gen_ai`). The incompatibilities you mentioned are just things we haven't gotten the chance to formalize in the otel working group but will be adopted soon.
Can you please link to the upstream issues about "the incompatibilities you mentioned are just things we haven't gotten the chance to formalize in the otel working group but will be adopted soon"? That would be easier to track.
This is an update based on the latest OpenLLMetry, which now includes metrics, using the following sample code and what the semantic conventions pending release 1.27.0 will define.

**Sample Code**

```python
import os

from openai import OpenAI
from traceloop.sdk import Traceloop

# Set the service name such that it is different from other experiments
app_name = "openllmetry-python-ollama-traceloop"

# Default the SDK endpoint ENV variable to localhost
api_endpoint = os.getenv("TRACELOOP_BASE_URL", "http://localhost:4318")

# Don't batch spans, as this is a demo
Traceloop.init(app_name=app_name, api_endpoint=api_endpoint, disable_batch=True)


def main():
    ollama_host = os.getenv('OLLAMA_HOST', 'localhost')

    # Use the OpenAI endpoint, not the Ollama API.
    base_url = 'http://' + ollama_host + ':11434/v1'
    client = OpenAI(base_url=base_url, api_key='unused')

    messages = [
        {
            'role': 'user',
            'content': '<|fim_prefix|>def hello_world():<|fim_suffix|><|fim_middle|>',
        },
    ]

    chat_completion = client.chat.completions.create(model='codegemma:2b-code', messages=messages)
    print(chat_completion.choices[0].message.content)


if __name__ == "__main__":
    main()
```

**Spans**

Semantic evaluation on spans.

compatible:
missing required fields:
deprecated fields:
incompatible:
not yet defined in the standard:
**Metrics**

Semantic evaluation on `gen_ai.client.token.usage`

input

compatible:
missing:
not yet defined in the standard:
output

compatible:
missing:
not yet defined in the standard:
`gen_ai.client.operation.duration`

compatible:
missing:
not yet defined in the standard:
`gen_ai.client.generation.choices` (custom)

This is not defined in the spec.

**Example collector log**
Which component is this bug for?
LLM Semantic Conventions
📜 Description
As a first-timer, I tried the ollama instrumentation and sent a trace to a local collector. Then I compared the output with the LLM semantics defined by otel. I noticed as many compatibilities as incompatibilities, which made me concerned that other instrumentations may have other large glitches.
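To make the comparison concrete, this is a minimal sketch of the kind of check I did; the spec attribute set below is an illustrative subset I typed in by hand, not the full conventions:

```python
# Illustrative subset of the otel gen_ai.* span attributes; not exhaustive.
SPEC_ATTRIBUTES = {
    "gen_ai.system",
    "gen_ai.request.model",
    "gen_ai.response.model",
    "gen_ai.usage.input_tokens",
    "gen_ai.usage.output_tokens",
}


def classify(attributes: dict) -> dict:
    """Bucket a span's attribute keys like the category lists in this report."""
    keys = set(attributes)
    return {
        "compatible": sorted(keys & SPEC_ATTRIBUTES),
        "missing required fields": sorted(SPEC_ATTRIBUTES - keys),
        "not yet defined in the standard": sorted(
            k for k in keys - SPEC_ATTRIBUTES if k.startswith("gen_ai.")
        ),
    }
```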
👟 Reproduction steps
Use ollama-python with the instrumentation here. It doesn't matter if you use the traceloop-sdk or normal otel to initialize the instrumentation (I checked both just in case); a sketch of the latter follows below.
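For the "normal otel" path, the initialization I used looked roughly like this; the OTLP endpoint is my local collector, and the exporter/processor wiring is my own setup rather than anything OpenLLMetry prescribes:

```python
from opentelemetry import trace
from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.instrumentation.ollama import OllamaInstrumentor
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import SimpleSpanProcessor

# Standard OTel SDK wiring instead of Traceloop.init()
provider = TracerProvider()
provider.add_span_processor(
    SimpleSpanProcessor(OTLPSpanExporter(endpoint="http://localhost:4318/v1/traces"))
)
trace.set_tracer_provider(provider)

# OpenLLMetry's ollama instrumentation, applied directly
OllamaInstrumentor().instrument()
```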
👍 Expected behavior
The otel spec should be a subset of the openllmetry semantics, so there should be no incompatible attributes.
👎 Actual Behavior with Screenshots
compatible:
incompatible:
not yet defined in the standard:
🤖 Python Version
3.12
📃 Provide any additional context for the Bug.
partially addressed by @gyliu513 in #884
👀 Have you spent some time to check if this bug has been raised before?
Are you willing to submit PR?
None