Merge branch 'main' into yigao/fix_matrix
crazygao authored Sep 8, 2023
2 parents 493dc6c + 2077f58 commit cb90270
Showing 73 changed files with 1,416 additions and 1,279 deletions.
1 change: 1 addition & 0 deletions .github/workflows/publish_doc.yml
@@ -11,6 +11,7 @@ on:
- 'docs/**'
- 'scripts/docs/**'
- '.github/workflows/publish_doc.yml'
- 'src/promptflow/promptflow/**'

# Sets permissions of the GITHUB_TOKEN to allow deployment to GitHub Pages
permissions:
2 changes: 1 addition & 1 deletion .pre-commit-config.yaml
@@ -1,7 +1,7 @@
# See https://pre-commit.com for more information
# See https://pre-commit.com/hooks.html for more hooks

exclude: '(^docs/)|flows|examples|scripts|src/promptflow/promptflow/azure/_restclient|src/promptflow/tests/test_configs|src/promptflow-tools'
exclude: '(^docs/)|flows|examples|scripts|src/promptflow/promptflow/azure/_restclient/|src/promptflow/tests/test_configs|src/promptflow-tools'

repos:
- repo: https://github.com/pre-commit/pre-commit-hooks
@@ -21,11 +21,11 @@ git clone https://github.com/microsoft/promptflow.git
### Create custom tool package
Run the command below under the repository root folder to quickly create your tool project:
```
python scripts\tool\generate_tool_package_template.py --destination <your-tool-project> --package-name <your-package-name> --tool-name <your-tool-name> --function-name <your-tool-function-name>
python <path-to-scripts>\tool\generate_tool_package_template.py --destination <your-tool-project> --package-name <your-package-name> --tool-name <your-tool-name> --function-name <your-tool-function-name>
```
For example:
```
python scripts\tool\generate_tool_package_template.py --destination hello-world-proj --package-name hello-world --tool-name hello_world_tool --function-name get_greeting_message
python D:\proj\github\promptflow\scripts\tool\generate_tool_package_template.py --destination hello-world-proj --package-name hello-world --tool-name hello_world_tool --function-name get_greeting_message
```
This script auto-generates a tool project containing one tool for you. The parameters _destination_ and _package-name_ are mandatory. The parameters _tool-name_ and _function-name_ are optional. If left unfilled, _tool-name_ will default to _hello_world_tool_, and _function-name_ will default to _tool-name_.

@@ -64,11 +64,11 @@ hello-world-proj/

> [!Note] If you create a new tool, don't forget to also create the corresponding tool YAML. You can run the command below under your tool project to auto-generate your tool YAML. You may want to specify `-n` for `name` and `-d` for `description`, which would be displayed as the tool name and tooltip in the prompt flow UI.
```
python ..\scripts\tool\generate_package_tool_meta.py -m <tool_module> -o <tool_yaml_path> -n <tool_name> -d <tool_description>
python <path-to-scripts>\tool\generate_package_tool_meta.py -m <tool_module> -o <tool_yaml_path> -n <tool_name> -d <tool_description>
```
For example:
```
python ..\scripts\tool\generate_package_tool_meta.py -m hello_world.tools.hello_world_tool -o hello_world\yamls\hello_world_tool.yaml -n "Hello World Tool" -d "This is my hello world tool."
python D:\proj\github\promptflow\scripts\tool\generate_package_tool_meta.py -m hello_world.tools.hello_world_tool -o hello_world\yamls\hello_world_tool.yaml -n "Hello World Tool" -d "This is my hello world tool."
```
To populate your tool module, adhere to the pattern \<package_name\>.tools.\<tool_name\>, which represents the folder path to your tool within the package.
6. **tests**: This directory contains all your tests, though they are not required for creating your custom tool package. When adding a new tool, you can also create corresponding tests and place them in this directory. Run the command below under your tool project:
34 changes: 33 additions & 1 deletion docs/how-to-guides/init-and-test-a-flow.md
@@ -222,7 +222,7 @@ Promptflow CLI provides a way to start an interactive chat session for chat flow
pf flow test --flow <flow-name> --interactive
```

After executing this command, customer can interact with the chat flow in the terminal. Customer can press **Enter** to send the message to chat flow. And customer can quit with **ctrl+Z**.
After executing this command, customer can interact with the chat flow in the terminal. Customer can press **Enter** to send the message to chat flow. And customer can quit with **ctrl+C**.
Promptflow CLI will distinguish the output of different roles by color: <span style="color:Green">User input</span>, <span style="color:Gold">Bot output</span>, <span style="color:Blue">Flow script output</span>, <span style="color:Cyan">Node output</span>.

We use this [chat flow](https://github.com/microsoft/promptflow/tree/main/examples/flows/chat/basic-chat) to show how to use interactive mode.
@@ -243,6 +243,38 @@ If a flow contains chat inputs or chat outputs in the flow interface, there will

::::

When the [LLM node](https://promptflow.azurewebsites.net/tools-reference/llm-tool.html) in the chat flow is connected to the flow output, the promptflow SDK streams the results of the LLM node.

::::{tab-set}
:::{tab-item} CLI
:sync: CLI
The flow result will be streamed in the terminal as shown below.

![streaming_output](../media/how-to-guides/init-and-test-a-flow/streaming_output.gif)

:::

:::{tab-item} SDK
:sync: SDK

In the return value of the `test` function, the LLM node's output is a generator; you can consume the result like this:

```python
from promptflow import PFClient

pf_client = PFClient()

# Test flow
inputs = {"<flow_input_name>": "<flow_input_value>"} # The inputs of the flow.
flow_result = pf_client.test(flow="<flow_folder_path>", inputs=inputs)
for item in flow_result["<LLM_node_output_name>"]:
    print(item)
```

:::

::::

### Debug a single node in the flow

You can debug a single Python node in VS Code with the extension.
8 changes: 4 additions & 4 deletions docs/how-to-guides/quick-start.md
@@ -139,10 +139,10 @@ api_version: <test_version>
If you are using OpenAI, sign up for an account via the [OpenAI website](https://openai.com/), log in and [find your personal API key](https://platform.openai.com/account/api-keys), then use this yaml:
```yaml
$schema: https://azuremlschemas.azureedge.net/promptflow/latest/AzureOpenAIConnection.schema.json
$schema: https://azuremlschemas.azureedge.net/promptflow/latest/OpenAIConnection.schema.json
name: open_ai_connection
type: azure_open_ai
api_key: <test_key>
type: open_ai
api_key: "<user-input>"
organization: "" # optional
```
Then we can use the CLI command to create the connection.
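For reference, the same connection can also be created from Python rather than the CLI. A minimal sketch, assuming the `OpenAIConnection` entity and the `PFClient.connections.create_or_update` API exposed by the promptflow SDK (neither appears in this diff, so verify the import path against your installed version):

```python
from promptflow import PFClient
from promptflow.entities import OpenAIConnection  # assumed import path; not part of this diff

pf = PFClient()

# Mirror the yaml above: an OpenAI connection named open_ai_connection
connection = OpenAIConnection(
    name="open_ai_connection",
    api_key="<user-input>",
    organization="",  # optional
)
pf.connections.create_or_update(connection)
```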
@@ -273,7 +273,7 @@ Use the code lens action on the top of the yaml editor to trigger flow test


Click the run flow button on the top of the visual editor to trigger flow test.
![visual_editor_flow_test](../media/how-to-guides/quick-start/run_flow_visual_editor.png)
![visual_editor_flow_test](../media/how-to-guides/quick-start/test_flow_dag_editor.gif)
:::

::::
Binary file modified docs/media/how-to-guides/init-and-test-a-flow/chat.png
Binary file not shown.
Binary file modified docs/media/how-to-guides/quick-start/vs_code_interpreter_1.png
Binary file modified docs/media/how-to-guides/vscode_interactive_chat_1.png
18 changes: 9 additions & 9 deletions examples/flows/chat/chat-with-pdf/tests/chat_with_pdf_test.py
@@ -70,15 +70,15 @@ def test_bulk_run_valid_mapping(self):
details = self.pf.get_details(run)
self.assertEqual(details.shape[0], 3)

def test_bulk_run_mapping_missing_one_column(self):
# in this case, run won't be created.
with self.assertRaises(ValidationException):
self.create_chat_run(
column_mapping={
"question": "${data.question}",
"pdf_url": "${data.pdf_url}",
}
)
# def test_bulk_run_mapping_missing_one_column(self):
# # in this case, run won't be created.
# with self.assertRaises(ValidationException):
# self.create_chat_run(
# column_mapping={
# "question": "${data.question}",
# "pdf_url": "${data.pdf_url}",
# }
# )

def test_bulk_run_invalid_mapping(self):
# in this case, run won't be created.

This file was deleted.

25 changes: 23 additions & 2 deletions scripts/docs/doc_generation.ps1
@@ -21,6 +21,11 @@ param(
[string] $TempDocPath = New-TemporaryFile | % { Remove-Item $_; New-Item -ItemType Directory -Path $_ }
[string] $PkgSrcPath = [System.IO.Path]::Combine($RepoRootPath, "src\promptflow\promptflow")
[string] $OutPath = [System.IO.Path]::Combine($ScriptPath, "_build")
[string] $SphinxApiDoc = [System.IO.Path]::Combine($DocPath, "sphinx_apidoc.log")
[string] $SphinxBuildDoc = [System.IO.Path]::Combine($DocPath, "sphinx_build.log")
[string] $WarningErrorPattern = "WARNING:|ERROR:"
$apidocWarningsAndErrors = $null
$buildWarningsAndErrors = $null

if (-not $SkipInstall){
# Prepare doc generation packages
@@ -65,7 +70,8 @@ if($WithReferenceDoc){
}
Remove-Item $RefDocPath -Recurse -Force
Write-Host "===============Build Promptflow Reference Doc==============="
sphinx-apidoc --module-first --no-headings --no-toc --implicit-namespaces "$PkgSrcPath" -o "$RefDocPath"
sphinx-apidoc --module-first --no-headings --no-toc --implicit-namespaces "$PkgSrcPath" -o "$RefDocPath" | Tee-Object -FilePath $SphinxApiDoc
$apidocWarningsAndErrors = Select-String -Path $SphinxApiDoc -Pattern $WarningErrorPattern
}


@@ -78,7 +84,22 @@ if($WarningAsError){
if($BuildLinkCheck){
$BuildParams.Add("-blinkcheck")
}
sphinx-build $TempDocPath $OutPath -c $ScriptPath $BuildParams
sphinx-build $TempDocPath $OutPath -c $ScriptPath $BuildParams | Tee-Object -FilePath $SphinxBuildDoc
$buildWarningsAndErrors = Select-String -Path $SphinxBuildDoc -Pattern $WarningErrorPattern

Write-Host "Clean path: $TempDocPath"
Remove-Item $TempDocPath -Recurse -Confirm:$False -Force

if ($apidocWarningsAndErrors) {
Write-Host "=============== API doc warnings and errors ==============="
foreach ($line in $apidocWarningsAndErrors) {
Write-Host $line -ForegroundColor Red
}
}

if ($buildWarningsAndErrors) {
Write-Host "=============== Build warnings and errors ==============="
foreach ($line in $buildWarningsAndErrors) {
Write-Host $line -ForegroundColor Red
}
}
8 changes: 6 additions & 2 deletions src/promptflow/CHANGELOG.md
@@ -1,11 +1,15 @@
# Release History

## 0.1.0b5 (Upcoming)
## 0.1.0b5 (2023.09.08)

### Features added
### Features Added

- **pf run visualize**: support lineage graph & display name in visualize page

### Bugs Fixed

- Add missing requirement `psutil` in `setup.py`

## 0.1.0b4 (2023.09.04)

### Features added
9 changes: 7 additions & 2 deletions src/promptflow/promptflow/_cli/_pf/_flow.py
@@ -298,7 +298,12 @@ def _init_flow_by_template(flow_name, flow_type, overwrite=False):
copy_extra_files(flow_path=flow_path, extra_files=["requirements.txt", ".gitignore"])

print(f"Done. Created {flow_type} flow folder: {flow_path.resolve()}.")
flow_test_args = "--interactive" if flow_type == "chat" else f"--input {os.path.join(flow_name, 'data.jsonl')}"
if flow_type == "chat":
flow_test_args = "--interactive"
print("The generated chat flow is requiring a connection named open_ai_connection, "
"please follow the steps in README.md to create if you haven't done that.")
else:
flow_test_args = f"--input {os.path.join(flow_name, 'data.jsonl')}"
flow_test_command = f"pf flow test --flow {flow_name} " + flow_test_args
print(f"You can execute this command to test the flow, {flow_test_command}")

@@ -340,7 +345,7 @@ def test_flow(args):
environment_variables=environment_variables,
variant=args.variant,
node=args.node,
streaming_output=False,
allow_generator_output=False,
)
# Dump flow/node test info
flow = load_flow(args.flow)
22 changes: 19 additions & 3 deletions src/promptflow/promptflow/_cli/_utils.py
@@ -23,6 +23,7 @@

from promptflow._sdk._utils import print_red_error, print_yellow_warning
from promptflow._utils.utils import is_in_ci_pipeline
from promptflow._utils.exception_utils import ExceptionPresenter
from promptflow.exceptions import ErrorTarget, PromptflowException, UserErrorException

AzureMLWorkspaceTriad = namedtuple("AzureMLWorkspace", ["subscription_id", "resource_group_name", "workspace_name"])
@@ -334,6 +335,12 @@ def pretty_print_dataframe_as_table(df: pd.DataFrame) -> None:
print(tabulate(df, headers="keys", tablefmt="grid", maxcolwidths=column_widths, maxheadercolwidths=column_widths))


def is_format_exception():
if os.environ.get("PROMPTFLOW_STRUCTURE_EXCEPTION_OUTPUT", "false").lower() == "true":
return True
return False


def exception_handler(command: str):
"""Catch known cli exceptions."""

@@ -342,9 +349,18 @@ def decorator(func):
def wrapper(*args, **kwargs):
try:
return func(*args, **kwargs)
except PromptflowException as e:
print_red_error(f"{command} failed with {e.__class__.__name__}: {str(e)}")
exit(1)
except Exception as e:
if is_format_exception():
# When the flag format_exception is set in command,
# it will write a json with exception info and command to stderr.
error_msg = ExceptionPresenter.create(e).to_dict(include_debug_info=True)
error_msg["command"] = " ".join(sys.argv)
sys.stderr.write(json.dumps(error_msg))
if isinstance(e, PromptflowException):
print_red_error(f"{command} failed with {e.__class__.__name__}: {str(e)}")
exit(1)
else:
raise e

return wrapper

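As a side note on the change above: when `PROMPTFLOW_STRUCTURE_EXCEPTION_OUTPUT` is set to `true`, a failing `pf` command writes a JSON payload with the exception details and a `command` field to stderr. A hypothetical sketch of driving that from Python; the flow path is a placeholder and the exact stderr layout is only what the diff implies:

```python
import json
import os
import subprocess

# Enable structured exception output for the child pf process (env var name taken from the diff above)
env = {**os.environ, "PROMPTFLOW_STRUCTURE_EXCEPTION_OUTPUT": "true"}

result = subprocess.run(
    ["pf", "flow", "test", "--flow", "./my-flow"],  # "./my-flow" is a placeholder flow folder
    env=env,
    capture_output=True,
    text=True,
)

if result.returncode != 0:
    try:
        error_info = json.loads(result.stderr)  # exception details plus the failing "command"
        print(error_info.get("command"))
    except json.JSONDecodeError:
        print(result.stderr)  # stderr may also carry plain error text
```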
12 changes: 8 additions & 4 deletions src/promptflow/promptflow/_cli/data/chat_flow/README.md
@@ -8,7 +8,11 @@ Currently, there are two connection types supported by LLM tool: "AzureOpenAI" a

```bash
# Override keys with --set to avoid yaml file changes
pf connection create --file openai.yaml --set api_key=<your_api_key> organization=<your_organization> --name open_ai_connection
# Create open ai connection
pf connection create --file openai.yaml --set api_key=<your_api_key> --name open_ai_connection

# Create azure open ai connection
# pf connection create --file azure_openai.yaml --set api_key=<your_api_key> api_base=<your_api_base> --name open_ai_connection
```

Note in [flow.dag.yaml](flow.dag.yaml) we are using connection named `open_ai_connection`.
@@ -38,13 +42,13 @@ Promptflow CLI provides a way to start an interactive chat session for chat flow
pf flow test --flow <flow_folder> --interactive
```

After executing this command, customer can interact with the chat flow in the terminal. Customer can press **Enter** to send the message to chat flow. And customer can quit with **ctrl+Z**.
After executing this command, customer can interact with the chat flow in the terminal. Customer can press **Enter** to send the message to chat flow. And customer can quit with **ctrl+C**.
Promptflow CLI will distinguish the output of different roles by color: <span style="color:Green">User input</span>, <span style="color:Gold">Bot output</span>, <span style="color:Blue">Flow script output</span>, <span style="color:Cyan">Node output</span>.

> =========================================<br>
> Welcome to chat flow, <You-flow-name>.<br>
> Press Enter to send your message.<br>
> You can quit with ctrl+Z.<br>
> You can quit with ctrl+C.<br>
> =========================================<br>
> <span style="color:Green">User:</span> What types of container software there are<br>
> <span style="color:Gold">Bot:</span> There are several types of container software available, including:<br>
@@ -61,7 +65,7 @@ If customer adds "--verbose" in the pf command, the output of each step will be
> =========================================<br>
> Welcome to chat flow, Template Chat Flow.<br>
> Press Enter to send your message.<br>
> You can quit with ctrl+Z.<br>
> You can quit with ctrl+C.<br>
> =========================================<br>
> <span style="color:Green">User:</span> What types of container software there are<br>
> <span style="color:Cyan">chat:</span> There are several types of container software available, including:<br>
@@ -0,0 +1,6 @@
$schema: https://azuremlschemas.azureedge.net/promptflow/latest/AzureOpenAIConnection.schema.json
name: open_ai_connection
type: azure_open_ai
api_key: "<user-input>"
api_base: "<user-input>"
api_type: "azure"
8 changes: 7 additions & 1 deletion src/promptflow/promptflow/_core/_errors.py
@@ -1,7 +1,13 @@
from traceback import TracebackException

from promptflow._utils.exception_utils import ADDITIONAL_INFO_USER_EXECUTION_ERROR, last_frame_info
from promptflow.exceptions import ErrorTarget, UserErrorException, ValidationException
from promptflow.exceptions import ErrorTarget, SystemErrorException, UserErrorException, ValidationException


class NotSupported(SystemErrorException):
"""Exception raised when the feature is not supported."""

pass


class PackageToolNotFoundError(ValidationException):
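The new `NotSupported` class is a `SystemErrorException`; a minimal, hypothetical sketch of how such an exception might be raised (the function and message below are invented for illustration):

```python
from promptflow._core._errors import NotSupported


def run_node(node_name: str, mode: str = "sync"):
    # Hypothetical guard: this sketch only handles synchronous execution
    if mode != "sync":
        raise NotSupported(message=f"Execution mode '{mode}' is not supported for node '{node_name}'.")
    return f"{node_name} executed"
```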
9 changes: 9 additions & 0 deletions src/promptflow/promptflow/_core/metric_logger.py
@@ -45,6 +45,15 @@ def remove_metric_logger(self, logger_func: Callable):


def log_metric(key, value, variant_id=None):
"""Log a metric for current promptflow run.
:param key: Metric name.
:type key: str
:param value: Metric value.
:type value: float
:param variant_id: Variant id for the metric.
:type variant_id: str
"""
MetricLoggerManager.get_instance().log_metric(key, value, variant_id)


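Since this hunk only adds a docstring, here is a brief, hypothetical usage sketch for `log_metric` inside an aggregation node (the node function and metric name are invented for illustration):

```python
from promptflow import log_metric, tool


@tool
def aggregate(grades: list):
    # Hypothetical aggregation node: compute an accuracy metric and log it for the current run
    accuracy = round(grades.count("Correct") / len(grades), 2)
    log_metric(key="accuracy", value=accuracy)
    return accuracy
```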
12 changes: 10 additions & 2 deletions src/promptflow/promptflow/_core/tool.py
@@ -7,7 +7,7 @@
import logging
from abc import ABC
from enum import Enum
from typing import Optional
from typing import Callable, Optional

module_logger = logging.getLogger(__name__)

@@ -39,7 +39,15 @@ def active_instance(cls) -> Optional["ToolInvoker"]:
return cls._active_tool_invoker


def tool(f):
def tool(f: Callable) -> Callable:
"""Decorator for tool functions. The decorated function will be registered as a tool and can be used in a flow.
:param f: The tool function.
:type f: Callable
:return: The decorated function.
:rtype: Callable
"""

@functools.wraps(f)
def new_f(*args, **kwargs):
tool_invoker = ToolInvoker.active_instance()
Expand Down
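To close, a minimal usage sketch for the now-documented `tool` decorator (the function below is invented for illustration):

```python
from promptflow import tool


@tool
def greet(user_name: str) -> str:
    # Any plain Python function becomes a flow/package tool once decorated with @tool
    return f"Hello, {user_name}!"
```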