Commit a6301db
Merge pull request #2214 from MicrosoftDocs/main638937262564322782sync_temp
For protected branch, push strategy should use PR and merge to target branch method to work around git push error
2 parents: f87ec1d + 5e297d0

19 files changed (+109 −91 lines)

docs/data-factory/connector-amazon-s3-compatible-overview.md

Lines changed: 7 additions & 12 deletions

```diff
@@ -14,19 +14,14 @@ ms.custom:
 
 This Amazon S3 Compatible connector is supported in Data Factory for [!INCLUDE [product-name](../includes/product-name.md)] with the following capabilities.
 
-## Support in Dataflow Gen2
+## Supported capabilities
 
-Data Factory in [!INCLUDE [product-name](../includes/product-name.md)] doesn't currently support Amazon S3 Compatible connectors in Dataflow Gen2.
+| Supported capabilities| Gateway | Authentication|
+|---------| --------| --------|
+| **Data pipeline**<br>- [Copy activity](connector-amazon-s3-compatible-copy-activity.md) (source/destination) <br>- Lookup activity<br>- Get Metadata activity<br>- Delete activity |None<br> On-premises<br> Virtual network |Access Key |
+| **Copy job** (source/-) <br>- Full load |None<br> On-premises<br> Virtual network |Access Key |
 
-## Support in a pipeline
+## Related content
 
-The Amazon S3 Compatible connector supports the following capabilities in a pipeline:
+To learn more about the copy activity configuration for Amazon S3 Compatible in data pipelines, go to [Configure in a data pipeline copy activity](connector-amazon-s3-compatible-copy-activity.md).
 
-| Supported capabilities | Gateway | Authentication |
-| --- | --- | ---|
-| **Copy activity (source/destination)** | None <br> On-premises | Access Key |
-| **Lookup activity** | None <br> On-premises | Access Key |
-| **GetMetadata activity** | None <br> On-premises | Access Key |
-| **Delete activity** | None <br> On-premises | Access Key |
-
-To learn more about the copy activity configuration for Amazon S3 Compatible in a pipeline, go to [Configure in a pipeline copy activity](connector-amazon-s3-compatible-copy-activity.md).
```
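The new capability matrix pairs the **Data pipeline** row with the copy activity article it links to. As a purely illustrative sketch of what that maps to, a copy activity reading from an S3-compatible store might be described with an ADF-style activity fragment like the one below (property names are assumed from the ADF-style schema and are not confirmed by this commit):

```json
{
  "name": "CopyFromAmazonS3Compatible",
  "type": "Copy",
  "typeProperties": {
    "source": {
      "type": "BinarySource",
      "storeSettings": {
        "type": "AmazonS3CompatibleReadSettings",
        "recursive": true,
        "wildcardFileName": "*.csv"
      }
    }
  }
}
```

Access Key authentication (the only method listed in the matrix) would be supplied by the linked connection, not in the activity body.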

docs/data-factory/connector-greenplum-for-pipeline-overview.md

Lines changed: 6 additions & 10 deletions

```diff
@@ -13,17 +13,13 @@ ms.custom:
 
 The Greenplum for Pipeline connector is supported in Data Factory for [!INCLUDE [product-name](../includes/product-name.md)] with the following capabilities.
 
-## Support in Dataflow Gen2
+## Supported capabilities
 
-Data Factory in Microsoft Fabric doesn't currently support Greenplum for Pipeline in Dataflow Gen2.
+| Supported capabilities| Gateway | Authentication|
+|---------| --------| --------|
+| **Data pipeline**<br>- [Copy activity](connector-greenplum-for-pipeline-copy-activity.md) (source/-) <br>- Lookup activity |None<br> On-premises<br> Virtual network |Basic |
 
-## Support in a pipeline
+## Related content
 
-The Greenplum for Pipeline connector supports the following capabilities in a pipeline:
+To learn more about the copy activity configuration for Greenplum for Pipeline in data pipelines, go to [Configure in a data pipeline copy activity](connector-greenplum-for-pipeline-copy-activity.md).
 
-| Supported capabilities | Gateway | Authentication |
-| --- | --- | ---|
-| **Copy activity (source/-)** | None <br>On-premises| Basic |
-| **Lookup activity** | None <br>On-premises | Basic |
-
-To learn more about the copy activity configuration for Greenplum for Pipeline in a pipeline, go to [Configure in a pipeline copy activity](connector-greenplum-for-pipeline-copy-activity.md).
```

docs/data-factory/connector-impala-overview.md

Lines changed: 8 additions & 4 deletions

```diff
@@ -14,11 +14,15 @@ ms.custom:
 
 The Impala connector is supported in Data Factory for [!INCLUDE [product-name](../includes/product-name.md)] with the following capabilities.
 
+## Supported capabilities
 
-## Support in Dataflow Gen2
+This Impala connector is supported for the following capabilities:
 
-For information on how to connect to an Impala database in Dataflow Gen2, go to [Set up your Impala database connection](connector-impala.md).
+| Supported capabilities| Gateway | Authentication|
+|---------| --------| --------|
+| **Dataflow Gen2** (source/-)|None<br> On-premises<br> Virtual network |Anonymous<br> Database<br> Windows (Only for on-premises gateway) |
 
-## Support in pipelines
+## Related content
+
+For information on how to connect to an Impala database, go to [Set up your Impala database connection](connector-impala.md).
 
-Data Factory in [!INCLUDE [product-name](../includes/product-name.md)] doesn't currently support an Impala database in pipelines.
```

docs/data-factory/connector-overview.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -219,7 +219,7 @@ Fabric supports these connectors in Dataflow Gen2, pipelines, and Copy job. Sele
 | Topcon Aptix Insights | ✓/− | | |
 | [Usercube (Beta)](/power-query/connectors/usercube) | ✓/− | | |
 | Vena | ✓/− | | |
-| [Vertica](connector-vertica-overview.md) | ✓/− | ✓/ | ✓/− |
+| [Vertica](connector-vertica-overview.md) | ✓/− | ✓/- | ✓/− |
 | [Vessel Insight](/power-query/connectors/vessel-insight) | ✓/− | | |
 | Viva Insights | ✓/− | | |
 | [Warehouse](/power-query/connectors/warehouse) | ✓/− | | |
```

docs/data-factory/connector-salesforce-service-cloud-overview.md

Lines changed: 6 additions & 10 deletions

```diff
@@ -14,17 +14,13 @@ ms.custom:
 
 The Salesforce Service Cloud connector is supported in Data Factory for [!INCLUDE [product-name](../includes/product-name.md)] with the following capabilities.
 
-## Support in Dataflow Gen2
+## Supported capabilities
 
-Data Factory in Microsoft Fabric doesn't currently support Salesforce reports in Dataflow Gen2.
+| Supported capabilities| Gateway | Authentication|
+|---------| --------| --------|
+| **Data pipeline**<br>- [Copy activity](connector-salesforce-service-cloud-copy-activity.md) (source/destination) <br>- Lookup activity |None<br> On-premises<br> Virtual network |Organizational account |
+| **Copy job** (source/destination) <br>- Full load<br>- Append<br>- Merge |None<br> On-premises<br> Virtual network |Organizational account |
 
-## Support in pipelines
-
-The Salesforce Service Cloud connector supports the following capabilities in pipelines:
-
-| Supported capabilities | Gateway | Authentication |
-| --- | --- | ---|
-| **Copy activity (source/destination)** | None <br> On-premises | Organizational account |
-| **Lookup activity** | None <br> On-premises | Organizational account |
+## Related content
 
 To learn more about the copy activity configuration for Salesforce Service Cloud in pipelines, go to [Configure in a pipeline copy activity](connector-salesforce-service-cloud-copy-activity.md).
```

docs/data-factory/connector-vertica-overview.md

Lines changed: 10 additions & 9 deletions

```diff
@@ -14,16 +14,17 @@ ms.custom:
 
 The Vertica connector is supported in Data Factory in [!INCLUDE [product-name](../includes/product-name.md)] with the following capabilities.
 
-## Support in pipelines
+## Supported capabilities
 
-The Vertica connector supports the following capabilities in pipelines:
-
-| Supported capabilities | Gateway | Authentication |
-| --- | --- | ---|
-| **Copy activity (source/-)** | On-premises (version 3000.238.11 or above) | Basic |
-| **Lookup activity** | On-premises (version 3000.238.11 or above) | Basic |
-
-To learn about the copy activity configuration for Vertica in pipelines, go to [Configure Vertica in a copy activity](connector-vertica-copy-activity.md).
+| Supported capabilities| Gateway | Authentication|
+|---------| --------| --------|
+| **Dataflow Gen2** (source/-)|On-premises |Basic |
+| **Data pipeline**<br>- [Copy activity](connector-vertica-copy-activity.md) (source/-) <br>- Lookup activity |On-premises (version 3000.238.11 or above)|Basic |
+| **Copy job** (source/-) <br>- Full load<br>- Incremental load |On-premises |Basic |
 
 > [!NOTE]
 > To use the Vertica connector in data pipelines, install the [Vertica ODBC driver](https://www.vertica.com/download/vertica/client-drivers/) on the computer running the on-premises data gateway. For detailed steps, go to [Prerequisites](connector-vertica-copy-activity.md#prerequisites).
+
+## Related content
+
+To learn about the copy activity configuration for Vertica in data pipelines, go to [Configure Vertica in a copy activity](connector-vertica-copy-activity.md).
```

docs/data-factory/dataflows-gen2-fast-copy.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -45,7 +45,7 @@ Fast copy works with these Dataflow Gen2 connectors:
 - Warehouse
 - Oracle
 - Snowflake
-- Fabric SQL database
+- SQL database in Fabric
 
 ### Transformation limitations
 
```
docs/data-factory/decision-guide-data-movement.md

Lines changed: 5 additions & 5 deletions

```diff
@@ -13,17 +13,17 @@ ai-usage: ai-assisted
 
 Microsoft Fabric gives you several ways to bring data into Fabric, based on what you need. Today, you can use **Mirroring**, **Copy activities in Pipelines**, or **Copy job**. Each option offers a different level of control and complexity, so you can pick what fits your scenario best.
 
-Mirroring is designed to be simple and free, but it won't cover every advanced scenario. Copy activities in pipelines give you powerful data ingestion features, but they require you to build and manage pipelines. Copy job fills the gap between these options. It gives you more flexibility and control than Mirroring, plus native support for both batch and incremental copying, without the complexity of building pipelines.
+Mirroring is designed to be a simple, free solution for mirroring a database to Fabric, but it won't cover every advanced scenario. Copy activities in pipelines give you fully customizable data ingestion features, but they require you to build and manage pipelines yourself. Copy job fills the gap between these two options. It gives you more flexibility and control than Mirroring, plus native support for both batch and incremental copying, without the complexity of building pipelines.
 
 :::image type="content" source="media/decision-guide-data-movement/decision-guide-data-movement.svg" alt-text="Screenshot of a data movement strategy decision tree, comparing mirroring, copy job, and copy activity." lightbox="media/decision-guide-data-movement/decision-guide-data-movement.svg":::
 
 ## Key concepts
 
-- **Mirroring** gives you a **simple and free** way to copy operational data into Fabric for analytics. It's optimized for ease of use with minimal setup, and it writes to a single, read-only destination in OneLake.
+- **Mirroring** gives you a **simple and free** way to mirror operational data into Fabric for analytics. It's optimized for ease of use with minimal setup, and it writes to a single, read-only destination in OneLake.
 
-- **Copy activities in Pipelines** is built for users who need **orchestrated, pipeline-based data ingestion workflows**. You can customize it extensively and add transformation logic, but you need to define and manage pipeline components.
+- **Copy activities in Pipelines** is built for users who need **orchestrated, pipeline-based data ingestion workflows**. You can customize it extensively and add transformation logic, but you need to define and manage pipeline components yourself, including tracking the state of the last run for incremental copy.
 
-- **Copy Job** gives you a complete data ingestion experience from any source to any destination. It **makes data ingestion easier with native support for both batch and incremental copying, so you don't need to build pipelines**, while still giving you access to many advanced options. It supports many sources and destinations and works well when you want more control than Mirroring but less complexity than managing pipelines with Copy activity.
+- **Copy Job** gives you a complete data ingestion experience from any source to any destination. It **makes data ingestion easier with native support for multiple delivery styles, including bulk copy, incremental copy, and change data capture (CDC) replication, and you don't need to build pipelines**, while still giving you access to many advanced options. It supports many sources and destinations and works well when you want more control than Mirroring but less complexity than managing pipelines with Copy activity.
 
 ## Data movement decision guide
 
@@ -86,4 +86,4 @@ Now that you have an idea of which data movement strategy to use, you can get st
 
 - [Get started with Mirroring](/fabric/mirroring/overview)
 - [Create a Copy Job](/fabric/data-factory/create-copy-job)
-- [Create a Copy Activity](/fabric/data-factory/copy-data-activity)
+- [Create a Copy Activity](/fabric/data-factory/copy-data-activity)
```
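The revised Copy activities bullet in this diff notes that pipeline authors must track the state of the last run for incremental copy themselves. A minimal sketch of that high-watermark pattern (hypothetical helper and file names; plain Python, not a Fabric API):

```python
import json
from pathlib import Path

STATE_FILE = Path("copy_watermark.json")  # hypothetical state store the author must manage

def load_watermark(default=0):
    # Return the highest modification timestamp copied so far (0 on the first run).
    if STATE_FILE.exists():
        return json.loads(STATE_FILE.read_text())["watermark"]
    return default

def incremental_copy(rows):
    # Copy only rows newer than the stored watermark, then advance the watermark.
    wm = load_watermark()
    new_rows = [r for r in rows if r["modified"] > wm]
    if new_rows:
        STATE_FILE.write_text(
            json.dumps({"watermark": max(r["modified"] for r in new_rows)})
        )
    return new_rows

# First run copies everything; an identical second run finds nothing new.
rows = [{"id": 1, "modified": 100}, {"id": 2, "modified": 200}]
print(len(incremental_copy(rows)))  # 2
print(len(incremental_copy(rows)))  # 0
```

Copy job maintains this state natively, which is exactly the bookkeeping the decision guide says Mirroring and Copy job spare you.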

docs/data-science/ai-functions/overview.md

Lines changed: 10 additions & 6 deletions

```diff
@@ -33,12 +33,16 @@ You can incorporate these functions as part of data-science and data-engineering
 
 - To use AI functions with the built-in AI endpoint in Fabric, your administrator needs to enable [the tenant switch for Copilot and other features that are powered by Azure OpenAI](../../admin/service-admin-portal-copilot.md).
 - Depending on your location, you might need to enable a tenant setting for cross-geo processing. Learn more about [available regions for Azure OpenAI Service](../../get-started/copilot-fabric-overview.md#available-regions-for-azure-openai-service).
-- You also need an F2 or later edition or a P edition. If you use a trial edition, you can bring your own Azure Open AI resource.
+- You need a paid Fabric capacity (F2 or higher, or any P edition). Bring-your-own Azure OpenAI resources aren't supported on the Fabric trial edition.
+
+> [!IMPORTANT]
+>
+> The Fabric trial edition doesn't support bring-your-own Azure OpenAI resources for AI functions. To connect a custom Azure OpenAI endpoint, upgrade to an F2 (or higher) or P capacity.
 
 > [!NOTE]
 >
 > - AI functions are supported in [Fabric Runtime 1.3](../../data-engineering/runtime-1-3.md) and later.
-> - AI functions use the *gpt-4o-mini (2024-07-18)* model by default. Learn more about [billing and consumption rates](../ai-services/ai-services-overview.md).
+> - Unless you configure a different model, AI functions default to *gpt-4o-mini (2024-07-18)*. Learn more about [billing and consumption rates](../ai-services/ai-services-overview.md).
 > - Most of the AI functions are optimized for use on English-language texts.
 
 ## Getting started with AI functions
```

```diff
@@ -109,7 +113,7 @@ Each of the following functions allows you to invoke the built-in AI endpoint in
 
 ### Calculate similarity with ai.similarity
 
-The `ai.similarity` function invokes AI to compare input text values with a single common text value, or with pairwise text values in another column. The output similarity score values are relative, and they can range from `-1` (opposites) to `1` (identical). A score of `0` indicates that the values are unrelated in meaning. Get [detailed instructions](./similarity.md) about the use of `ai.similarity`.
+The `ai.similarity` function compares each input text value either to one common reference text or to the corresponding value in another column (pairwise mode). The output similarity score values are relative, and they can range from `-1` (opposites) to `1` (identical). A score of `0` indicates that the values are unrelated in meaning. Get [detailed instructions](./similarity.md) about the use of `ai.similarity`.
 
 #### Sample usage
 
```

```diff
@@ -242,7 +246,7 @@ The `ai.extract` function invokes AI to scan input text and extract specific typ
 # Read terms: https://azure.microsoft.com/support/legal/preview-supplemental-terms/.
 
 df = pd.DataFrame([
-    "MJ Lee lives in Tuscon, AZ, and works as a software engineer for Microsoft.",
+    "MJ Lee lives in Tucson, AZ, and works as a software engineer for Microsoft.",
     "Kris Turner, a nurse at NYU Langone, is a resident of Jersey City, New Jersey."
 ], columns=["descriptions"])
 
```

```diff
@@ -257,7 +261,7 @@ display(df_entities)
 # Read terms: https://azure.microsoft.com/support/legal/preview-supplemental-terms/.
 
 df = spark.createDataFrame([
-    ("MJ Lee lives in Tuscon, AZ, and works as a software engineer for Microsoft.",),
+    ("MJ Lee lives in Tucson, AZ, and works as a software engineer for Microsoft.",),
     ("Kris Turner, a nurse at NYU Langone, is a resident of Jersey City, New Jersey.",)
 ], ["descriptions"])
 
```

```diff
@@ -450,7 +454,7 @@ display(responses)
 - Calculate similarity with [`ai.similarity`](./similarity.md).
 - Detect sentiment with [`ai.analyze_sentiment`](./analyze-sentiment.md).
 - Categorize text with [`ai.classify`](./classify.md).
-- Extract entities with [`ai_extract`](./extract.md).
+- Extract entities with [`ai.extract`](./extract.md).
 - Fix grammar with [`ai.fix_grammar`](./fix-grammar.md).
 - Summarize text with [`ai.summarize`](./summarize.md).
 - Translate text with [`ai.translate`](./translate.md).
```
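The `ai.similarity` score range this diff documents (−1 for opposites, 0 for unrelated, 1 for identical) behaves like cosine similarity between text representations. A plain-Python illustration of that geometry over toy vectors (this is not the Fabric `ai.similarity` implementation, which invokes a hosted model):

```python
from math import sqrt

def cosine(u, v):
    # Dot product divided by the product of magnitudes:
    # 1 = same direction, 0 = orthogonal (unrelated), -1 = opposite direction.
    dot = sum(a * b for a, b in zip(u, v))
    norm = sqrt(sum(a * a for a in u)) * sqrt(sum(b * b for b in v))
    return dot / norm

# Toy "embeddings" standing in for encoded text values.
print(cosine([1.0, 2.0], [2.0, 4.0]))    # 1.0  (identical meaning)
print(cosine([1.0, 0.0], [0.0, 1.0]))    # 0.0  (unrelated)
print(cosine([1.0, 2.0], [-1.0, -2.0]))  # -1.0 (opposites)
```

The scores are relative, so they order pairs by closeness rather than giving an absolute measure of meaning.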

docs/data-science/data-agent-configuration-best-practices.md

Lines changed: 1 addition & 1 deletion

```diff
@@ -6,7 +6,7 @@ author: jonburchel
 ms.reviewer: midesa
 reviewer: midesa
 ms.topic: how-to
-ms.date: 06/13/2024
+ms.date: 08/15/2025
 ---
 
 # Best practices for configuring your data agent
```
