diff --git a/docs/how-to-guides/develop-a-tool/create-custom-llm-tool.md b/docs/how-to-guides/develop-a-tool/create-custom-llm-tool.md
new file mode 100644
index 000000000000..5e11d8a69e47
--- /dev/null
+++ b/docs/how-to-guides/develop-a-tool/create-custom-llm-tool.md
@@ -0,0 +1,64 @@
+# Create Custom LLM Tool
+In this document, we will guide you through the process of developing a tool of the `custom_llm` type, which connects a prompt to a customized large language model.
+
+A custom LLM is a large language model that you have fine-tuned yourself. If it performs well and is useful, you can follow this guide to turn it into a tool so that it can be shared with other people for greater impact.
+
+## Prerequisites
+- Please ensure that your `Prompt flow` extension for VS Code is updated to version 1.2.0 or later.
+
+## How to create a custom LLM tool
+Here we use [an existing tool package](../../../examples/tools/tool-package-quickstart/my_tool_package) as an example. If you want to create your own tool, please refer to [create and use tool package](create-and-use-tool-package.md).
+
+### Add a `PromptTemplate` input for your tool, as in [this example](../../../examples/tools/tool-package-quickstart/my_tool_package/tools/tool_with_custom_llm_type.py)
+
+```python
+from jinja2 import Template
+from promptflow import tool
+from promptflow.connections import CustomConnection
+# 1. Import the PromptTemplate type.
+from promptflow.contracts.types import PromptTemplate
+
+
+# 2. Add a PromptTemplate input for your tool method.
+@tool
+def my_tool(connection: CustomConnection, prompt: PromptTemplate, **kwargs) -> str:
+    # 3. The prompt is used by your custom LLM for inference; customize your own code here to handle and use the prompt.
+    message = Template(prompt, trim_blocks=True, keep_trailing_newline=True).render(**kwargs)
+    return message
+```
+
+### Generate custom LLM tool YAML
+
+Run the command below in your tool project directory to automatically generate your tool YAML. Use _-t "custom_llm"_ or _--tool-type "custom_llm"_ to indicate that this is a custom LLM tool:
+```
+python <promptflow-repo>\scripts\tool\generate_package_tool_meta.py -m <tool_module> -o <tool_yaml_path> -t "custom_llm"
+```
+Here we use [an existing tool](../../../examples/tools/tool-package-quickstart/my_tool_package/yamls/tool_with_custom_llm_type.yaml) as an example.
+```
+cd D:\proj\github\promptflow\examples\tools\tool-package-quickstart
+
+python D:\proj\github\promptflow\scripts\tool\generate_package_tool_meta.py -m my_tool_package.tools.tool_with_custom_llm_type -o my_tool_package\yamls\tool_with_custom_llm_type.yaml -n "Custom LLM Tool" -d "This is a tool to demonstrate the custom_llm tool type" -t "custom_llm"
+```
+This command will generate a tool YAML as follows:
+
+```yaml
+my_tool_package.tools.tool_with_custom_llm_type.my_tool:
+  name: Custom LLM Tool
+  description: This is a tool to demonstrate the custom_llm tool type
+  # The type is custom_llm.
+  type: custom_llm
+  module: my_tool_package.tools.tool_with_custom_llm_type
+  function: my_tool
+  inputs:
+    connection:
+      type:
+      - CustomConnection
+```
+
+## Use custom LLM tool in VS Code extension
+Follow the steps to [build and install your tool package](create-and-use-tool-package.md#build-and-share-the-tool-package) and [use your tool from VS Code extension](create-and-use-tool-package.md#use-your-tool-from-vscode-extension).
+
+Here we use an existing flow to demonstrate the experience: open [this flow](../../../examples/flows/standard/custom_llm_tool_showcase/flow.dag.yaml) in the VS Code extension.
+- There is a node named "custom_llm_tool" with a prompt template file. You can either use an existing file or create a new one as the prompt template file.
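The rendering step at the heart of this tool type is plain Jinja2: the `PromptTemplate` input is rendered with the tool's keyword arguments. As a minimal sketch outside promptflow (assuming only the `jinja2` package; the `"Hello {{text}}"` prompt mirrors the package's unit test), the same call behaves like this:

```python
from jinja2 import Template

# The same rendering call the tool body performs: a Jinja2 template
# string plus keyword arguments supplying the template variables.
prompt = "Hello {{text}}"
message = Template(prompt, trim_blocks=True, keep_trailing_newline=True).render(text="Microsoft")
print(message)  # Hello Microsoft
```

In the real tool, `prompt` arrives as the `PromptTemplate` input and `**kwargs` supplies the variables, so any Jinja2 syntax (loops, conditionals, filters) is available in the prompt file.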
+
+![use_custom_llm_tool](../../media/how-to-guides/develop-a-tool/use_custom_llm_tool.png)
\ No newline at end of file
diff --git a/docs/how-to-guides/develop-a-tool/index.md b/docs/how-to-guides/develop-a-tool/index.md
index ff2bd4b3a883..81352b72bc0a 100644
--- a/docs/how-to-guides/develop-a-tool/index.md
+++ b/docs/how-to-guides/develop-a-tool/index.md
@@ -8,5 +8,5 @@ We provide guides on how to develop a tool and use it.
 create-and-use-tool-package
 add-a-tool-icon
 add-category-and-tags-for-tool
-use-custom-llm-tool
+create-custom-llm-tool
 ```
\ No newline at end of file
diff --git a/docs/how-to-guides/develop-a-tool/use-custom-llm-tool.md b/docs/how-to-guides/develop-a-tool/use-custom-llm-tool.md
deleted file mode 100644
index b0516b7ed72b..000000000000
--- a/docs/how-to-guides/develop-a-tool/use-custom-llm-tool.md
+++ /dev/null
@@ -1,52 +0,0 @@
-# Use Custom LLM Tool
-Users sometimes need to use a prompt template within their tools. To simplify this, we've introduced the `PromptTemplate` feature.
-In this guide, we will provide a detailed walkthrough on how to use `PromptTemplate` as a tool input. We will also demonstrate the user experience when utilizing this type of tool within a flow.
-
-## Prerequisites
-- Create a tool package as described in [Create and Use Tool Package](create-and-use-tool-package.md).
-- Ensure that the tool's type is `custom_llm`.
-
-## How to create a tool with prompt template
-Here we use [an existing tool package](../../../examples/tools/tool-package-quickstart/my_tool_package) as an example.
-
-1. Add a `PromptTemplate` input for your tool, such as in [this example](../../../examples/tools/tool-package-quickstart/my_tool_package/tools/custom_llm_tool_showcase.py)
-
-    ```python
-    from jinja2 import Template
-    from promptflow import tool
-    from promptflow.connections import CustomConnection
-    # 1. import the PromptTemplate type
-    from promptflow.contracts.types import PromptTemplate
-
-
-    # 2. add a PromptTemplate input for your tool method
-    @tool
-    def my_tool(connection: CustomConnection, prompt: PromptTemplate, **kwargs) -> str:
-        # 3. customise your own code to handle and use the prompt here
-        message = Template(prompt, trim_blocks=True, keep_trailing_newline=True).render(**kwargs)
-        return message
-    ```
-
-2. Configure the tool YAML. Please note that the `PromptTemplate` input should not be included in the YAML. You can check out an example in [this location](../../../examples/tools/tool-package-quickstart/my_tool_package/yamls/custom_llm_tool_showcase.yaml):
-
-    ```yaml
-    my_tool_package.tools.custom_llm_tool_showcase.my_tool:
-      name: Custom LLM Tool
-      description: This is a custom LLM tool
-      type: custom_llm
-      module: my_tool_package.tools.custom_llm_tool_showcase
-      function: my_tool
-      inputs:
-        connection:
-          type:
-          - CustomConnection
-
-    ```
-
-## Use a tool with a prompt template input in VS Code extension
-To use your tool with a prompt template input, follow the steps to [build and install your tool package](create-and-use-tool-package.md#build-and-share-the-tool-package) and [use your tool from VS Code extension](create-and-use-tool-package.md#use-your-tool-from-vscode-extension).
-
-Here, we will use an existing flow to demonstrate the experience. Open [this flow](../../../examples/flows/standard/prompt-template-input-tool-showcase/flow.dag.yaml) in VS Code extension.
-- There is a node named "tool_with_prompt_template" with a prompt template file, and the inputs of this node contain the input for the prompt template.
-
-![use_prompt_template_in_flow](../../media/how-to-guides/develop-a-tool/use_prompt_template_in_flow.png)
\ No newline at end of file
diff --git a/docs/media/how-to-guides/develop-a-tool/use_custom_llm_tool.png b/docs/media/how-to-guides/develop-a-tool/use_custom_llm_tool.png
new file mode 100644
index 000000000000..1ab389aa3b15
Binary files /dev/null and b/docs/media/how-to-guides/develop-a-tool/use_custom_llm_tool.png differ
diff --git a/docs/media/how-to-guides/develop-a-tool/use_prompt_template_in_flow.png b/docs/media/how-to-guides/develop-a-tool/use_prompt_template_in_flow.png
deleted file mode 100644
index 9ebd1afed417..000000000000
Binary files a/docs/media/how-to-guides/develop-a-tool/use_prompt_template_in_flow.png and /dev/null differ
diff --git a/examples/tools/tool-package-quickstart/my_tool_package/tools/custom_llm_tool_showcase.py b/examples/tools/tool-package-quickstart/my_tool_package/tools/tool_with_custom_llm_type.py
similarity index 96%
rename from examples/tools/tool-package-quickstart/my_tool_package/tools/custom_llm_tool_showcase.py
rename to examples/tools/tool-package-quickstart/my_tool_package/tools/tool_with_custom_llm_type.py
index 79ab574b7d7b..b9535467ffed 100644
--- a/examples/tools/tool-package-quickstart/my_tool_package/tools/custom_llm_tool_showcase.py
+++ b/examples/tools/tool-package-quickstart/my_tool_package/tools/tool_with_custom_llm_type.py
@@ -10,4 +10,4 @@ def my_tool(connection: CustomConnection, prompt: PromptTemplate, **kwargs) -> str:
     # Usually connection contains configs to connect to an API.
     # Not all tools need a connection. You can remove it if you don't need it.
     message = Template(prompt, trim_blocks=True, keep_trailing_newline=True).render(**kwargs)
-    return message
+    return message
\ No newline at end of file
diff --git a/examples/tools/tool-package-quickstart/my_tool_package/yamls/custom_llm_tool_showcase.yaml b/examples/tools/tool-package-quickstart/my_tool_package/yamls/custom_llm_tool_showcase.yaml
deleted file mode 100644
index 9c29351e6c6b..000000000000
--- a/examples/tools/tool-package-quickstart/my_tool_package/yamls/custom_llm_tool_showcase.yaml
+++ /dev/null
@@ -1,10 +0,0 @@
-my_tool_package.tools.custom_llm_tool_showcase.my_tool:
-  name: Custom LLM Tool
-  description: This is a custom LLM tool
-  type: custom_llm
-  module: my_tool_package.tools.custom_llm_tool_showcase
-  function: my_tool
-  inputs:
-    connection:
-      type:
-      - CustomConnection
diff --git a/examples/tools/tool-package-quickstart/my_tool_package/yamls/tool_with_custom_llm_type.yaml b/examples/tools/tool-package-quickstart/my_tool_package/yamls/tool_with_custom_llm_type.yaml
new file mode 100644
index 000000000000..8b90501802e7
--- /dev/null
+++ b/examples/tools/tool-package-quickstart/my_tool_package/yamls/tool_with_custom_llm_type.yaml
@@ -0,0 +1,10 @@
+my_tool_package.tools.tool_with_custom_llm_type.my_tool:
+  description: This is a tool to demonstrate the custom_llm tool type
+  function: my_tool
+  inputs:
+    connection:
+      type:
+      - CustomConnection
+  module: my_tool_package.tools.tool_with_custom_llm_type
+  name: Custom LLM Tool
+  type: custom_llm
diff --git a/examples/tools/tool-package-quickstart/tests/test_custom_llm_tool_showcase.py b/examples/tools/tool-package-quickstart/tests/test_tool_with_custom_llm_type.py
similarity index 76%
rename from examples/tools/tool-package-quickstart/tests/test_custom_llm_tool_showcase.py
rename to examples/tools/tool-package-quickstart/tests/test_tool_with_custom_llm_type.py
index cacb91b78c0c..08c2af8d672c 100644
--- a/examples/tools/tool-package-quickstart/tests/test_custom_llm_tool_showcase.py
+++ b/examples/tools/tool-package-quickstart/tests/test_tool_with_custom_llm_type.py
@@ -2,7 +2,7 @@
 import unittest
 
 from promptflow.connections import CustomConnection
-from my_tool_package.tools.custom_llm_tool_showcase import my_tool
+from my_tool_package.tools.tool_with_custom_llm_type import my_tool
 
 
 @pytest.fixture
@@ -17,8 +17,8 @@ def my_custom_connection() -> CustomConnection:
     return my_custom_connection
 
 
-class TestToolWithPromptTemplateInput:
-    def test_custom_llm_tool_showcase(self, my_custom_connection):
+class TestToolWithCustomLLMType:
+    def test_tool_with_custom_llm_type(self, my_custom_connection):
         result = my_tool(my_custom_connection, "Hello {{text}}", text="Microsoft")
         assert result == "Hello Microsoft"