Error Handling
The implementation in semantic_router/layer.py lines 300-307 introduces a default behavior: if no LLM is provided, it falls back to OpenAILLM with use_async=True. This could surprise users who are unaware that no LLM was provided and that the system silently defaults to an asynchronous LLM. It would be beneficial to document this behavior clearly, or to reconsider whether silently defaulting is appropriate without explicit user consent or notification.
Exception Handling
In the __init__ method of the OpenAILLM class, the exception handling when initializing self.client could be more specific. Currently it catches a general Exception, which can obscure the root cause of initialization failures. Catching specific exceptions raised by the API client would give users clearer error messages.
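As a sketch of the narrower pattern: ClientError and make_client below are illustrative stand-ins for the client library's own exception type and constructor (e.g. openai.OpenAIError and openai.OpenAI), not the actual implementation.

```python
class ClientError(Exception):
    """Stand-in for the client library's base exception (e.g. openai.OpenAIError)."""

def init_client(make_client, api_key):
    # Catch only client-specific failures; unrelated bugs (TypeError from a
    # bad argument, etc.) propagate unchanged instead of being masked as an
    # "initialization failed" error.
    try:
        return make_client(api_key=api_key)
    except ClientError as e:
        raise ValueError(f"API client failed to initialize. Error: {e}") from e

def bad_factory(api_key):
    raise ClientError("invalid API key")

try:
    init_client(bad_factory, api_key="sk-test")
except ValueError as e:
    print(e)  # API client failed to initialize. Error: invalid API key
```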
 if function_schemas:
+    if not completion.choices:
+        raise ValueError("No choices available in the completion response.")
     tool_calls = completion.choices[0].message.tool_calls
     if tool_calls is None:
         raise ValueError("Invalid output, expected a tool call.")
     if len(tool_calls) < 1:
         raise ValueError(
             "Invalid output, expected at least one tool to be specified."
         )
Suggestion importance[1-10]: 10
Why: This suggestion addresses a potential bug that could cause an IndexError, thus improving the robustness and reliability of the code.
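The failure mode is easy to reproduce in isolation; here SimpleNamespace stands in for the real completion object returned by the API:

```python
from types import SimpleNamespace

def extract_tool_calls(completion):
    # Suggested guard: an empty choices list would otherwise raise an
    # opaque IndexError on completion.choices[0].
    if not completion.choices:
        raise ValueError("No choices available in the completion response.")
    tool_calls = completion.choices[0].message.tool_calls
    if tool_calls is None:
        raise ValueError("Invalid output, expected a tool call.")
    if len(tool_calls) < 1:
        raise ValueError("Invalid output, expected at least one tool to be specified.")
    return tool_calls

empty = SimpleNamespace(choices=[])
try:
    extract_tool_calls(empty)
except ValueError as e:
    print(e)  # No choices available in the completion response.
```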
Maintainability
Simplify the client initialization logic to reduce redundancy and enhance code clarity
Refactor the exception handling in the initialization of self.client to reduce code duplication and improve maintainability.
-if use_async:
-    try:
-        self.client = openai.AsyncOpenAI(api_key=api_key)
-    except Exception as e:
-        raise ValueError(
-            f"AsyncOpenAI API client failed to initialize. Error: {e}"
-        ) from e
-else:
-    try:
-        self.client = openai.OpenAI(api_key=api_key)
-    except Exception as e:
-        raise ValueError(
-            f"OpenAI API client failed to initialize. Error: {e}"
-        ) from e
+try:
+    self.client = openai.AsyncOpenAI(api_key=api_key) if use_async else openai.OpenAI(api_key=api_key)
+except Exception as e:
+    raise ValueError(
+        f"{'Async' if use_async else ''}OpenAI API client failed to initialize. Error: {e}"
+    ) from e
Suggestion importance[1-10]: 9
Why: This suggestion significantly improves code maintainability and readability by reducing redundancy in exception handling, making the code easier to manage and less error-prone.
Reset execution count to None for reproducibility
To ensure that the notebook cells are reproducible and do not depend on external state, it is advisable to reset the execution count to None before sharing or exporting the notebook.
Suggestion importance[1-10]: 8
Why: Resetting the execution count to None ensures that the notebook cells are reproducible and do not depend on external state, which is important for maintainability and sharing.
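One way to apply this programmatically, sketched here by treating the .ipynb file as plain JSON (nbformat offers an equivalent higher-level API; the file path in the usage comment is illustrative):

```python
import json

def reset_execution_counts(nb: dict) -> dict:
    """Set execution_count to None on every code cell and its outputs."""
    for cell in nb.get("cells", []):
        if cell.get("cell_type") == "code":
            cell["execution_count"] = None
            for out in cell.get("outputs", []):
                if "execution_count" in out:
                    out["execution_count"] = None
    return nb

# Usage: load, reset, and rewrite a notebook file.
# with open("docs/02-dynamic-routes.ipynb") as f:
#     nb = json.load(f)
# with open("docs/02-dynamic-routes.ipynb", "w") as f:
#     json.dump(reset_execution_counts(nb), f, indent=1)
```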
Best practice
Add exception handling to asynchronous calls to improve robustness
It's recommended to handle potential exceptions when using asynchronous calls to avoid runtime errors that could halt the execution of your notebook. You can use a try-except block to catch exceptions and handle them gracefully.
-response = await rl.acall("what is the time in new york city?")
+try:
+    response = await rl.acall("what is the time in new york city?")
+except Exception as e:
+    response = f"An error occurred: {str(e)}"
response
Suggestion importance[1-10]: 9
Why: Adding exception handling to asynchronous calls is a best practice that improves the robustness and reliability of the code by preventing runtime errors from halting execution.
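The pattern can be exercised end to end with a stand-in coroutine; acall below simulates rl.acall failing at runtime and is not the library's function:

```python
import asyncio

async def acall(query: str) -> str:
    # Stand-in for rl.acall that simulates a runtime failure.
    raise RuntimeError("model unavailable")

async def main() -> str:
    try:
        response = await acall("what is the time in new york city?")
    except Exception as e:
        response = f"An error occurred: {str(e)}"
    return response

print(asyncio.run(main()))  # An error occurred: model unavailable
```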
Enhancement
Improve the readability and maintainability of the acall method by extracting message transformation into a separate method
Refactor the acall method to separate the logic of message transformation and API call into a private method to improve code readability and maintainability.
+transformed_messages = self._transform_messages(messages)
 completion = await self.client.chat.completions.create(
     model=self.name,
-    messages=[m.to_openai() for m in messages],
+    messages=transformed_messages,
     temperature=self.temperature,
     max_tokens=self.max_tokens,
     tools=tools,  # type: ignore # We pass a list of dicts which get interpreted as Iterable[ChatCompletionToolParam].
 )
+
+def _transform_messages(self, messages):
+    return [m.to_openai() for m in messages]
Suggestion importance[1-10]: 8
Why: This suggestion enhances code readability and maintainability by separating concerns, making the acall method cleaner and easier to understand.
Add comments to clarify the context of outputs
When displaying outputs in a notebook, it's useful to add explanatory text or comments to clarify the context or purpose of the output, especially when the output is a plain text or a simple data structure.
+# Output showing the route choice for a casual conversation query
"RouteChoice(name='chitchat', function_call=None, similarity_score=None)"
Suggestion importance[1-10]: 7
Why: Adding comments to clarify the context of outputs enhances readability and helps users understand the purpose of the output, especially when dealing with plain text or simple data structures.
Possible issue
Ensure route.llm is not set to None by checking self.llm existence before assignment
Consider checking for the existence of self.llm before assigning route.llm to avoid potential issues where route.llm could be set to None inadvertently if self.llm has not been initialized properly.
if not self.llm:
logger.warning(
"No LLM provided for dynamic route, will use OpenAI LLM default"
)
self.llm = OpenAILLM(use_async=True)
-    route.llm = self.llm
-else:
-    route.llm = self.llm
+route.llm = self.llm if self.llm else None
Suggestion importance[1-10]: 7
Why: The suggestion improves the robustness of the code by ensuring route.llm is not set to None, but the likelihood of self.llm being None after initialization is low, making this a minor improvement.
PR Type
Enhancement, Documentation
Description
- Added async support for dynamic routes in RouteLayer.
- Updated OpenAILLM to support both synchronous and asynchronous clients.
- Added acall method in OpenAILLM for handling async requests.
- Updated the dynamic routes documentation with async examples.
Changes walkthrough 📝
layer.py (semantic_router/layer.py)
Add async support for dynamic routes in RouteLayer
openai.py (semantic_router/llms/openai.py)
Enhance OpenAILLM to support async client and calls; add acall method for handling async requests.
02-dynamic-routes.ipynb (docs/02-dynamic-routes.ipynb)
Update dynamic routes documentation with async examples