Promptflow Tracing Error with Custom LLM Node #3778
-
I found the solution: I was missing the OpenTelemetry and OpenAI packages, which were causing issues in the trace.
-
Hi everyone,
We are getting this error from promptflow tracing:
Traceback (most recent call last):
File "/usr/local/lib/python3.11/site-packages/litellm/llms/azure.py", line 643, in completion
response = azure_client.chat.completions.create(**data, timeout=timeout) # type: ignore
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/promptflow/tracing/_integrations/_openai_injector.py", line 95, in wrapper
return f(*args, **kwargs)
^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/promptflow/tracing/_trace.py", line 561, in wrapped
token_collector.collect_openai_tokens_for_parent_span(span)
File "/usr/local/lib/python3.11/site-packages/promptflow/tracing/_trace.py", line 143, in collect_openai_tokens_for_parent_span
merged_tokens = {
^
File "/usr/local/lib/python3.11/site-packages/promptflow/tracing/_trace.py", line 144, in <dictcomp>
key: self._span_id_to_tokens[parent_span_id].get(key, 0) + tokens.get(key, 0)
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~^~~~~~~~~~~~~~~~~~~~
TypeError: unsupported operand type(s) for +: 'NoneType' and 'NoneType'
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.11/site-packages/opentelemetry/trace/__init__.py", line 590, in use_span
yield span
File "/usr/local/lib/python3.11/site-packages/opentelemetry/sdk/trace/__init__.py", line 1108, in start_as_current_span
yield span
File "/usr/local/lib/python3.11/site-packages/promptflow/tracing/_trace.py", line 98, in start_as_current_span
yield span
File "/usr/local/lib/python3.11/site-packages/promptflow/executor/flow_executor.py", line 906, in _start_flow_span
yield span
File "/usr/local/lib/python3.11/site-packages/promptflow/executor/flow_executor.py", line 937, in _exec_inner_with_trace
output, nodes_outputs = self._traverse_nodes(inputs, context)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/promptflow/executor/flow_executor.py", line 1215, in _traverse_nodes
nodes_outputs, bypassed_nodes = self._submit_to_scheduler(context, inputs, batch_nodes)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/promptflow/executor/flow_executor.py", line 1270, in _submit_to_scheduler
return scheduler.execute(self._line_timeout_sec)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/promptflow/executor/_flow_nodes_scheduler.py", line 131, in execute
raise e
File "/usr/local/lib/python3.11/site-packages/promptflow/executor/_flow_nodes_scheduler.py", line 113, in execute
self._dag_manager.complete_nodes(self._collect_outputs(completed_futures))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/promptflow/executor/_flow_nodes_scheduler.py", line 160, in _collect_outputs
each_node_result = each_future.result()
^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/concurrent/futures/_base.py", line 449, in result
return self.__get_result()
^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/concurrent/futures/_base.py", line 401, in __get_result
raise self._exception
File "/usr/local/lib/python3.11/concurrent/futures/thread.py", line 58, in run
result = self.fn(*self.args, **self.kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/promptflow/executor/_flow_nodes_scheduler.py", line 181, in _exec_single_node_in_thread
result = context.invoke_tool(node, f, kwargs=kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/promptflow/_core/flow_execution_context.py", line 90, in invoke_tool
result = self._invoke_tool_inner(node, f, kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/promptflow/_core/flow_execution_context.py", line 201, in _invoke_tool_inner
raise e
File "/usr/local/lib/python3.11/site-packages/promptflow/_core/flow_execution_context.py", line 182, in _invoke_tool_inner
return f(**kwargs)
^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/promptflow/tracing/_trace.py", line 556, in wrapped
output = func(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^
File "/usr/local/lib/python3.11/site-packages/promptflow/tools/common.py", line 416, in wrapper
raise WrappedOpenAIError(e)
promptflow.tools.exception.WrappedOpenAIError: OpenAI API hits APIError: litellm.APIError: AzureException APIError - unsupported operand type(s) for +: 'NoneType' and 'NoneType' [Error reference: https://platform.openai.com/docs/guides/error-codes/api-errors]
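The root TypeError can be reproduced in isolation. `dict.get(key, 0)` only falls back to the default when the key is missing, not when the stored value is `None`, so merging two token dictionaries whose counts are `None` raises exactly this error (a minimal standalone repro, not promptflow code):

```python
# Minimal reproduction of the TypeError in the trace above.
# dict.get(key, default) returns the stored value even when it is None;
# the default is only used when the key is absent entirely.
parent_tokens = {"completion_tokens": None}
child_tokens = {"completion_tokens": None}

try:
    merged = {
        key: parent_tokens.get(key, 0) + child_tokens.get(key, 0)
        for key in parent_tokens
    }
except TypeError as e:
    print(e)  # unsupported operand type(s) for +: 'NoneType' and 'NoneType'
```

This matches the merge in `collect_openai_tokens_for_parent_span`: if any span ever recorded `None` as a token count, the addition fails the next time counts are merged upward.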
It occurs when we loop through and run multiple LLMs in our custom LLM node; the exception is thrown only after the loop completes.
The error comes from promptflow tracing: when I remove the start_trace() call, no error occurs.
It is a simple loop that triggers the error. This is what the interface looks like:
Has anyone seen this before, or can anyone provide guidance on how to loop through multiple LLM calls with tracing enabled?
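One possible stopgap, assuming the `None` counts originate from usage payloads produced inside the custom node, is to normalize token dictionaries so missing or `None` counts become 0 before they reach the tracer. The helper below is hypothetical, not a promptflow API:

```python
def normalize_usage(usage: dict) -> dict:
    """Replace None token counts with 0 so downstream summation is safe.

    Hypothetical helper -- not part of promptflow. Apply it to whatever
    usage dict your custom LLM node attaches to each response before
    the token collector merges counts across spans.
    """
    return {key: (value if value is not None else 0) for key, value in usage.items()}

print(normalize_usage({"prompt_tokens": None, "completion_tokens": 42}))
# {'prompt_tokens': 0, 'completion_tokens': 42}
```

This avoids the `None + None` addition inside promptflow's token collector, though it only masks the symptom rather than fixing why the counts are `None` in the first place.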