
Commit 9345ad3

Fix Build Doc CI issues (#3783)
# Description

Please add an informative description that covers the changes made by the pull request and link all relevant issues.

# All Promptflow Contribution checklist:

- [ ] **The pull request does not introduce [breaking changes].**
- [ ] **CHANGELOG is updated for new features, bug fixes or other significant changes.**
- [ ] **I have read the [contribution guidelines](https://github.com/microsoft/promptflow/blob/main/CONTRIBUTING.md).**
- [ ] **I confirm that all new dependencies are compatible with the MIT license.**
- [ ] **Create an issue and link to the pull request to get dedicated review from the promptflow team. Learn more: [suggested workflow](../CONTRIBUTING.md#suggested-workflow).**

## General Guidelines and Best Practices

- [ ] Title of the pull request is clear and informative.
- [ ] There are a small number of commits, each of which has an informative message. This means that previously merged commits do not appear in the history of the PR. For more information on cleaning up the commits in your PR, [see this page](https://github.com/Azure/azure-powershell/blob/master/documentation/development-docs/cleaning-up-commits.md).

### Testing Guidelines

- [ ] Pull request includes test coverage for the included changes.
1 parent f08e576 commit 9345ad3

File tree

7 files changed (+8, −145 lines changed)

docs/cloud/azureai/tracing/index.md

-103
This file was deleted.

docs/cloud/azureai/tracing/run_tracking.md

-32
This file was deleted.

docs/concepts/concept-connections.md

+1-1
```diff
@@ -15,7 +15,7 @@ Prompt flow provides a variety of pre-built connections, including Azure OpenAI,
 | [OpenAI](https://openai.com/) | LLM or Python |
 | [Cognitive Search](https://azure.microsoft.com/products/search) | Vector DB Lookup or Python |
 | [Serp](https://serpapi.com/) | Serp API or Python |
-| [Serverless](https://learn.microsoft.com/azure/ai-studio/concepts/deployments-overview#deploy-models-as-serverless-apis) | LLM or Python |
+| [Serverless](https://learn.microsoft.com/azure/ai-studio/concepts/deployments-overview) | LLM or Python |
 | Custom | Python |
 
 By leveraging connections in prompt flow, you can easily establish and manage connections to external APIs and data sources, facilitating efficient data exchange and interaction within their AI applications.
```
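To make the connection concept above concrete: a connection is essentially a named, typed bundle of secrets and endpoint settings that tools resolve by name at runtime. Here is a rough stdlib-only sketch of that idea; all class and method names here are hypothetical illustrations, not the promptflow API.

```python
from dataclasses import dataclass, field

@dataclass
class Connection:
    """Illustrative stand-in for a prompt flow connection: a named, typed
    bundle of secrets and endpoint configs. Not the actual promptflow API."""
    name: str
    type: str                          # e.g. "AzureOpenAI", "Serp", "Custom"
    secrets: dict = field(default_factory=dict)
    configs: dict = field(default_factory=dict)

class ConnectionStore:
    """Hypothetical store; tools look connections up by name at runtime."""
    def __init__(self):
        self._conns: dict[str, Connection] = {}

    def create_or_update(self, conn: Connection) -> None:
        self._conns[conn.name] = conn

    def get(self, name: str) -> Connection:
        return self._conns[name]

store = ConnectionStore()
store.create_or_update(Connection(
    name="my_aoai",
    type="AzureOpenAI",
    secrets={"api_key": "<your-key>"},
    configs={"api_base": "https://<resource>.openai.azure.com/"},
))
print(store.get("my_aoai").type)
# AzureOpenAI
```

Separating `secrets` from `configs` mirrors why connections exist at all: credentials stay out of flow definitions and are swapped per environment by name.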

docs/reference/tools-reference/llm-tool.md

+3-3
```diff
@@ -1,7 +1,7 @@
-# LLM
+# LLM
 
 ## Introduction
-Prompt flow LLM tool enables you to leverage widely used large language models like [OpenAI](https://platform.openai.com/), [Azure OpenAI (AOAI)](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/overview), and models in [Azure AI Studio model catalog](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/model-catalog) for natural language processing.
+Prompt flow LLM tool enables you to leverage widely used large language models like [OpenAI](https://platform.openai.com/), [Azure OpenAI (AOAI)](https://learn.microsoft.com/en-us/azure/cognitive-services/openai/overview), and models in [Azure AI Studio model catalog](https://learn.microsoft.com/en-us/azure/ai-studio/how-to/model-catalog) for natural language processing.
 
 > [!NOTE]
 > The previous version of the LLM tool is now being deprecated. Please upgrade to latest [promptflow-tools](https://pypi.org/project/promptflow-tools/) package to consume new llm tools.
@@ -25,7 +25,7 @@ Create OpenAI resources, Azure OpenAI resources or MaaS deployment with the LLM
 
 - **MaaS deployment**
 
-Create MaaS deployment for models in Azure AI Studio model catalog with [instruction](https://learn.microsoft.com/azure/ai-studio/concepts/deployments-overview#deploy-models-as-serverless-apis)
+Create MaaS deployment for models in Azure AI Studio model catalog with [instruction](https://learn.microsoft.com/azure/ai-studio/concepts/deployments-overview)
 
 You can create serverless connection to use this MaaS deployment.
```

examples/tutorials/run-flow-with-pipeline/pipeline.ipynb

+2-2
```diff
@@ -136,7 +136,7 @@
 "cell_type": "markdown",
 "metadata": {},
 "source": [
-"When using the `load_component` function and the flow YAML specification, your flow is automatically transformed into a __[parallel component](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-use-parallel-job-in-pipeline?view=azureml-api-2&tabs=cliv2#why-are-parallel-jobs-needed)__. This parallel component is designed for large-scale, offline, parallelized processing with efficiency and resilience. Here are some key features of this auto-converted component:\n",
+"When using the `load_component` function and the flow YAML specification, your flow is automatically transformed into a __[parallel component](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-use-parallel-job-in-pipeline?view=azureml-api-2&tabs=cliv2)__. This parallel component is designed for large-scale, offline, parallelized processing with efficiency and resilience. Here are some key features of this auto-converted component:\n",
 "\n",
 " - Pre-defined input and output ports:\n",
 "\n",
@@ -176,7 +176,7 @@
 "## 3.1 Declare input and output\n",
 "To supply your pipeline with data, you need to declare an input using the `path`, `type`, and `mode` properties. Please note: `mount` is the default and suggested mode for your file or folder data input.\n",
 "\n",
-"Declaring the pipeline output is optional. However, if you require a customized output path in the cloud, you can follow the example below to set the path on the datastore. For more detailed information on valid path values, refer to this documentation - [manage pipeline inputs outputs](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-manage-inputs-outputs-pipeline?view=azureml-api-2&tabs=cli#path-and-mode-for-data-inputsoutputs)."
+"Declaring the pipeline output is optional. However, if you require a customized output path in the cloud, you can follow the example below to set the path on the datastore. For more detailed information on valid path values, refer to this documentation - [manage pipeline inputs outputs](https://learn.microsoft.com/en-us/azure/machine-learning/how-to-manage-inputs-outputs-pipeline?view=azureml-api-2&tabs=cli)."
 ]
 },
 {
```
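The notebook text in this diff mentions setting a customized pipeline output path on a datastore. As a hedged illustration (the helper function is hypothetical; only the `azureml://datastores/<name>/paths/<path>` URI shape follows Azure ML's documented datastore-path convention), a minimal sketch of building such a path:

```python
def datastore_output_path(datastore_name: str, relative_path: str) -> str:
    """Build an Azure ML datastore URI of the form
    azureml://datastores/<datastore_name>/paths/<path>.
    Hypothetical helper for illustration; not part of promptflow or azure-ai-ml.
    """
    relative_path = relative_path.strip("/")
    return f"azureml://datastores/{datastore_name}/paths/{relative_path}"

# Example: a custom location for pipeline outputs on the workspace blob store.
print(datastore_output_path("workspaceblobstore", "promptflow/pipeline_outputs"))
# azureml://datastores/workspaceblobstore/paths/promptflow/pipeline_outputs
```

A string of this shape is what you would assign as the `path` of a pipeline output when the default output location is not suitable.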

examples/tutorials/tracing/autogen-groupchat/trace-autogen-groupchat.ipynb

+1-3
```diff
@@ -15,8 +15,6 @@
 "AutoGen offers conversable agents powered by LLM, tool or human, which can be used to perform tasks collectively via automated chat. This framework allows tool use and human participation through multi-agent conversation.\n",
 "Please find documentation about this feature [here](https://microsoft.github.io/autogen/docs/Use-Cases/agent_chat).\n",
 "\n",
-"This notebook is modified based on [autogen agent chat example](https://github.com/microsoft/autogen/blob/main/notebook/agentchat_groupchat.ipynb). \n",
-"\n",
 "**Learning Objectives** - Upon completing this tutorial, you should be able to:\n",
 "\n",
 "- Trace LLM (OpenAI) Calls and visualize the trace of your application.\n",
@@ -45,7 +43,7 @@
 "\n",
 "You can create the config file named `OAI_CONFIG_LIST.json` from example file: `OAI_CONFIG_LIST.json.example`.\n",
 "\n",
-"Below code use the [`config_list_from_json`](https://microsoft.github.io/autogen/docs/reference/oai/openai_utils#config_list_from_json) function loads a list of configurations from an environment variable or a json file. \n"
+"Below code use the [`config_list_from_json`](https://microsoft.github.io/autogen/0.2/docs/reference/oai/openai_utils/#config_list_from_json) function loads a list of configurations from an environment variable or a json file. \n"
 ]
 },
 {
```

examples/tutorials/tracing/langchain/trace-langchain.ipynb

+1-1
```diff
@@ -15,7 +15,7 @@
 "The tracing capability provided by Prompt flow is built on top of [OpenTelemetry](https://opentelemetry.io/) that gives you complete observability over your LLM applications. \n",
 "And there is already a rich set of OpenTelemetry [instrumentation packages](https://opentelemetry.io/ecosystem/registry/?language=python&component=instrumentation) available in OpenTelemetry Eco System. \n",
 "\n",
-"In this example we will demo how to use [opentelemetry-instrumentation-langchain](https://github.com/traceloop/openllmetry/tree/main/packages/opentelemetry-instrumentation-langchain) package provided by [Traceloop](https://www.traceloop.com/) to instrument [LangChain](https://python.langchain.com/docs/get_started/quickstart) apps.\n",
+"In this example we will demo how to use [opentelemetry-instrumentation-langchain](https://github.com/traceloop/openllmetry/tree/main/packages/opentelemetry-instrumentation-langchain) package provided by [Traceloop](https://www.traceloop.com/) to instrument [LangChain](https://python.langchain.com/docs/tutorials/) apps.\n",
 "\n",
 "\n",
 "**Learning Objectives** - Upon completing this tutorial, you should be able to:\n",
```
