[feature request] Add support for schema definition with tools/function calling #251
@nobilelucifero could you share the full relevant code excerpt you tried running, and how it should look to achieve tools customizability? We'll think about how we might improve that area.
Hi, we are defining something similar, stay tuned!
Amazing! This is the code (more or less, because I've been playing with it) without using Scrapegraph:

# %%capture
# !pip install openai

import json

from openai import OpenAI

# Client for the OpenAI Chat Completions API
llm_client = OpenAI(
    api_key="YOUR_OPENAI_KEY"
)
"""Set up tools"""
def get_tov(audience, purpose):
result = {
"audience": audience,
"purpose": purpose
}
return json.dumps(result)
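# Note: in this pattern the tool does no real work; it simply echoes the
# model's arguments back, so the useful output is the schema-shaped JSON itself.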
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_tov",
            "description": "Get the Tone of Voice of a content source piece written by the author themselves.",
            "parameters": {
                "type": "object",
                "properties": {
                    "audience": {
                        "type": "string",
                        "description": "The primary audiences or target demographics of the input. Each item will be a 2-3 word description.",
                    },
                    "purpose": {
                        "type": "string",
                        "description": "What's the main purpose or goal of the text?",
                    },
                },
                "required": ["audience", "purpose"],
            },
        }
    }
]
tool_choice = {"type": "function", "function": {"name": "get_tov"}}
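# Forcing tool_choice to a specific function, as above, guarantees a get_tov
# call whose arguments follow the JSON schema; the "auto" setting used inside
# converse() below instead lets the model decide whether to call the tool.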
def converse(messages):
    response = llm_client.chat.completions.create(
        model="gpt-3.5-turbo",
        messages=messages,
        tools=tools,
        tool_choice="auto"
        # tool_choice=tool_choice
    )
    response_message = response.choices[0].message
    tool_calls = response_message.tool_calls
    if tool_calls:
        # Keep the assistant's tool-call message in the history
        messages.append(response_message)
        available_functions = {
            "get_tov": get_tov
        }
        for tool_call in tool_calls:
            print(f"Function: {tool_call.function.name}")
            print(f"Params: {tool_call.function.arguments}")
            function_name = tool_call.function.name
            function_to_call = available_functions[function_name]
            function_args = json.loads(tool_call.function.arguments)
            function_response = function_to_call(
                audience=function_args.get("audience"),
                purpose=function_args.get("purpose"),
            )
            print(f"Tool response: {function_response}")
            # Feed each tool result back so the model can use it in its answer
            messages.append({
                "tool_call_id": tool_call.id,
                "role": "tool",
                "name": function_name,
                "content": function_response
            })
        second_response = llm_client.chat.completions.create(
            model="gpt-3.5-turbo",
            messages=messages,
        )
        return second_response.choices[0].message.content
        # return response.choices[0].message
    # If no tool was called, return the plain completion
    return response_message.content
input = """
A poor Woodman was cutting down a tree near the edge of a deep pool in the forest. It was late in the day and the Woodman was tired. He had been working since sunrise and his strokes were not so sure as they had been early that morning. Thus it happened that the axe slipped and flew out of his hands into the pool.
The Woodman was in despair. The axe was all he possessed with which to make a living, and he had not money enough to buy a new one. As he stood wringing his hands and weeping, the god Mercury suddenly appeared and asked what the trouble was. The Woodman told what had happened, and straightway the kind Mercury dived into the pool. When he came up again he held a wonderful golden axe.
"Is this your axe?" Mercury asked the Woodman.
"No," answered the honest Woodman, "that is not my axe."
Mercury laid the golden axe on the bank and sprang back into the pool. This time he brought up an axe of silver, but the Woodman declared again that his axe was just an ordinary one with a wooden handle.
Mercury dived down for the third time, and when he came up again he had the very axe that had been lost.
The poor Woodman was very glad that his axe had been found and could not thank the kind god enough. Mercury was greatly pleased with the Woodman's honesty.
"I admire your honesty," he said, "and as a reward you may have all three axes, the gold and the silver as well as your own."
The happy Woodman returned to his home with his treasures, and soon the story of his good fortune was known to everybody in the village. Now there were several Woodmen in the village who believed that they could easily win the same good fortune. They hurried out into the woods, one here, one there, and hiding their axes in the bushes, pretended they had lost them. Then they wept and wailed and called on Mercury to help them.
And indeed, Mercury did appear, first to this one, then to that. To each one he showed an axe of gold, and each one eagerly claimed it to be the one he had lost. But Mercury did not give them the golden axe. Oh no! Instead he gave them each a hard whack over the head with it and sent them home. And when they returned next day to look for their own axes, they were nowhere to be found.
Honesty is the best policy.
"""
result = converse(messages=[
    {
        "role": "system",
        "content": "You are a savvy copywriter for SEO, Social Media, and Blogs."
    }, {
        "role": "user",
        "content": "What's the tone of voice of this text?"
    }, {
        "role": "user",
        "content": input_text,
    },
])

print(result)
Is your feature request related to a problem? Please describe.
I was playing with Scrapegraph, and I wanted to define my structured output using tools both in the SmartScraperGraph() function and the graph_config configuration object.

Describe the solution you'd like
Is there a way to do something like this currently?
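A hypothetical sketch of what this could look like, reusing the tools list from the comment above (neither a "tools" key in graph_config nor a tools argument on SmartScraperGraph is an existing Scrapegraph API; they stand in for the requested feature):

from scrapegraphai.graphs import SmartScraperGraph

graph_config = {
    "llm": {
        "api_key": "YOUR_OPENAI_KEY",
        "model": "gpt-3.5-turbo",
    },
    # "tools": tools,  # hypothetical: declare the tool/function schema in the config...
}

smart_scraper_graph = SmartScraperGraph(
    prompt="What's the tone of voice of this page?",
    source="https://example.com",
    config=graph_config,
    # tools=tools,  # ...or, hypothetically, pass it to the graph directly
)

result = smart_scraper_graph.run()
print(result)  # ideally JSON matching the get_tov parameters schema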
Describe alternatives you've considered
Besides re-prompting the initial output, I've tried:
and then
Which returns the following error:
Additional context
This guide explains it better than I could:
https://cookbook.openai.com/examples/how_to_call_functions_with_chat_models
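For reference, a minimal sketch (not from the issue) of the forced-tool-call variant described in that guide, which is what turns a tool definition into an output schema; it assumes the llm_client, tools, tool_choice, and input_text defined in the comment above:

response = llm_client.chat.completions.create(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": f"What's the tone of voice of this text?\n\n{input_text}"}],
    tools=tools,
    tool_choice=tool_choice,  # force a get_tov call instead of "auto"
)

# The returned arguments follow the declared JSON schema
structured = json.loads(response.choices[0].message.tool_calls[0].function.arguments)
print(structured["audience"], structured["purpose"])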