Update from SAP DITA CMS (squashed):
commit 563939aeb3e0ad32505737130272bcda56eb68a9
Author: REDACTED
Date:   Mon Feb 12 17:57:24 2024 +0000

    Update from SAP DITA CMS 2024-02-12 17:57:24
    Project: dita-all/dca1705009695343
    Project map: af2fcb3e6dd448f3af3c0ff9c70daaf9.ditamap
    Language: en-US

commit c4fc3b5d1776019eed5de210f0ad1c8cb8a69984
Author: REDACTED
Date:   Mon Feb 12 17:54:54 2024 +0000

    Update from SAP DITA CMS 2024-02-12 17:54:54
    Project: dita-all/dca1705009695343
    Project map: af2fcb3e6dd448f3af3c0ff9c70daaf9.ditamap
    Language: en-US

commit 645483bbe863faa521c869b9bced7cfce49fc819
Author: REDACTED
Date:   Mon Feb 12 17:53:27 2024 +0000

    Update from SAP DITA CMS 2024-02-12 17:53:27
    Project: dita-all/dca1705009695343
    Project map: af2fcb3e6dd448f3af3c0ff9c70daaf9.ditamap
    Language: en-US

commit b27ce8afb48735d4ad76caaf08ffcf19a9c326b3
Author: REDACTED
Date:   Mon Feb 12 17:30:30 2024 +0000

    Update from SAP DITA CMS 2024-02-12 17:30:30

##################################################
[Remaining squash message was removed before commit...]
ditaccms-bot committed Feb 13, 2024
1 parent fe57cef commit ecf55c7
Showing 83 changed files with 1,204 additions and 136 deletions.
Original file line number Diff line number Diff line change
@@ -27,7 +27,7 @@ Space administrators and integrators prepare connections and other sources to al

## Federate and Replicate Data in Remote Tables

Many connections \(including most connections to SAP systems\) support importing remote tables to federate or replicate data \(see [Integrating Data via Connections](https://help.sap.com/viewer/9f36ca35bc6145e4acdef6b4d852d560/DEV_CURRENT/en-US/eb85e157ab654152bd68a8714036e463.html "Connections provide access to data from a wide range of sources, cloud as well as on-premise sources, SAP as well as Non-SAP sources, and partner tools. They allow users assigned to a space to use objects from the connected source to acquire, prepare and access data from those sources in SAP Datasphere. In addition, you can use certain connections to define targets for replication flows.") :arrow_upper_right:\).
Many connections \(including most connections to SAP systems\) support importing remote tables to federate or replicate data \(see [Integrating Data via Connections](https://help.sap.com/viewer/9f36ca35bc6145e4acdef6b4d852d560/DEV_CURRENT/en-US/eb85e157ab654152bd68a8714036e463.html "Connections provide access to data from a wide range of remote systems, cloud as well as on-premise, SAP as well as Non-SAP, and partner tools. They allow users assigned to a space to use objects from the connected remote system as source to acquire, prepare and access data from those sources in SAP Datasphere. In addition, you can use certain connections to define targets for replication flows.") :arrow_upper_right:\).

You can import remote tables to make the data available in your space from the *Data Builder* start page, in an entity-relationship model, or directly as a source in a view.

@@ -48,7 +48,7 @@ You can import remote tables to make the data available in your space from the *

## Extract, Transform, and Load Data with Data Flows

Many connections \(including most connections to SAP systems\) support loading data to SAP Datasphere via data flows \(see [Integrating Data via Connections](https://help.sap.com/viewer/9f36ca35bc6145e4acdef6b4d852d560/DEV_CURRENT/en-US/eb85e157ab654152bd68a8714036e463.html "Connections provide access to data from a wide range of sources, cloud as well as on-premise sources, SAP as well as Non-SAP sources, and partner tools. They allow users assigned to a space to use objects from the connected source to acquire, prepare and access data from those sources in SAP Datasphere. In addition, you can use certain connections to define targets for replication flows.") :arrow_upper_right:\).
Many connections \(including most connections to SAP systems\) support loading data to SAP Datasphere via data flows \(see [Integrating Data via Connections](https://help.sap.com/viewer/9f36ca35bc6145e4acdef6b4d852d560/DEV_CURRENT/en-US/eb85e157ab654152bd68a8714036e463.html "Connections provide access to data from a wide range of remote systems, cloud as well as on-premise, SAP as well as Non-SAP, and partner tools. They allow users assigned to a space to use objects from the connected remote system as source to acquire, prepare and access data from those sources in SAP Datasphere. In addition, you can use certain connections to define targets for replication flows.") :arrow_upper_right:\).

Data flows support a wide range of extract, transform, and load \(ETL\) operations.

@@ -85,7 +85,7 @@ Create a transformation flow to load data from one or more source repository tab

## Import Entities from SAP S/4HANA

The *Import Entities* wizard allows you to import entities from SAP S/4HANA Cloud and SAP S/4HANA on-premise systems with rich metadata \(see [Importing Entities with Semantics from SAP S/4HANA](importing-entities-with-semantics-from-sap-s-4hana-845fedb.md)\).
The *Import Entities* wizard allows you to import entities from SAP S/4HANA Cloud and SAP S/4HANA on-premise systems with rich metadata \(see [Importing Entities with Semantics from SAP S/4HANA](importing-entities-with-semantics-from-sap-s-4hana-845fedb.md)\).



@@ -211,7 +211,7 @@ All the objects you import or create in the *Data Builder* are listed on the *Da
- Local Table \(see [Creating a Local Table](creating-a-local-table-2509fe4.md)\)
- Graphical View \(see [Creating a Graphical View](../creating-a-graphical-view-27efb47.md)\)
- SQL View \(see [Creating an SQL View](../creating-an-sql-view-81920e4.md)\)
- Data Access Control \(see [Create a Data Access Control](https://help.sap.com/viewer/9f36ca35bc6145e4acdef6b4d852d560/DEV_CURRENT/en-US/5246328ec59045cb9c2aa693daee2557.html "Users with the DW Space Administrator role (or equivalent privileges) can create data access controls in which criteria are defined as single values. Each user can only see the records that match any of the single values she is authorized for in the permissions entity.") :arrow_upper_right:\)
- Data Access Control \(see [Create a "Single Values" Data Access Control](https://help.sap.com/viewer/9f36ca35bc6145e4acdef6b4d852d560/DEV_CURRENT/en-US/5246328ec59045cb9c2aa693daee2557.html "Users with the DW Space Administrator role (or equivalent privileges) can create data access controls in which criteria are defined as single values. Each user can only see the records that match any of the single values she is authorized for in the permissions entity.") :arrow_upper_right:\)
- Analytic Model \(see [Creating an Analytic Model](../Modeling-Data-in-the-Data-Builder/creating-an-analytic-model-e5fbe9e.md)\)
- Task Chain \(see [Creating a Task Chain](creating-a-task-chain-d1afbc2.md)\)

@@ -12,7 +12,7 @@ Add a source table to read data from. You can add multiple source tables and com

## Procedure

1. Navigate to the *View Transform Editor*.
1. Navigate to the *Graphical View Editor*.

2. Drag a table from the *Repository* and drop it onto the diagram.

@@ -106,4 +106,7 @@ Add a source table to read data from. You can add multiple source tables and com

If the delta capture setting is enabled for a table, two additional columns are present in the table to track changes. For more information, see [Capturing Delta Changes in Your Local Table](capturing-delta-changes-in-your-local-table-154bdff.md).

> ### Note:
> If the delta capture setting is enabled for a source table, the columns *Change Date* and *Change Type* are automatically mapped to these columns in the target table. Mapping these columns \(or a calculated column that contains the content of these columns\) to any other target column is not permitted. For more information, see [Capturing Delta Changes in Your Local Table](capturing-delta-changes-in-your-local-table-154bdff.md).

@@ -145,6 +145,6 @@ The 2 objects are consumed differently by SAP Datasphere apps:



> ### Note:
> The Delta Capture Table is an internal table whose structure can incompatibly change at any time. It is not allowed for external data access and is only consumed by the above SAP Datasphere internal apps.
> ### Restriction:
> The Delta Capture Table is an internal table whose structure can incompatibly change at any time. It is not permitted for external data access and is only consumed by the above SAP Datasphere internal apps. Using the internal delta capture columns \(*Change Date* or *Change Type*\) or their content directly or indirectly for external delta replication outside the Premium Outbound Integration is also not permitted. For more information, see [Premium Outbound Integration](https://blogs.sap.com/2023/11/16/replication-flow-blog-series-part-2-premium-outbound-integration/).
@@ -18,7 +18,7 @@ Define settings and properties for your replication flow and individual replicat

- *Initial Only*: Load all selected data once.

- *Initial and Delta*: After the initial load, the system checks for source data changes \(delta\) at regular intervals and copies the changes to the target. The default value for the delta load interval is 60 minutes. You can change it in the side panel by entering an integer between 0 and 24 for hours and 0 and 59 for minutes, respectively. The maximum allowed value is 24 hours 0 minutes. If you enter 0 hours and 0 seconds, the system replicates any source changes immediately.
- *Initial and Delta*: After the initial load, the system checks for source data changes \(delta\) at regular intervals and copies the changes to the target. The default value for the delta load interval is 60 minutes. You can change it in the side panel by entering an integer between 0 and 24 for hours and 0 and 59 for minutes, respectively. The maximum allowed value is 24 hours 0 minutes. If you enter 0 hours and 0 minutes, the system replicates any source changes immediately.

> ### Note:
> - A replication flow that contains objects with load type *Initial and Delta* does not have an end date. Once started, it remains in status *Running* until it is stopped or paused or an issue occurs.
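The delta load interval limits described above can be sketched as a small validation helper. This is illustrative only, assuming the documented bounds; the function name is not part of SAP Datasphere:

```python
def validate_delta_interval(hours: int, minutes: int) -> str:
    """Check a delta load interval against the documented limits:
    hours in 0..24, minutes in 0..59, maximum 24 hours 0 minutes.
    An interval of 0 hours 0 minutes means immediate replication."""
    if not (0 <= hours <= 24) or not (0 <= minutes <= 59):
        raise ValueError("hours must be 0-24 and minutes 0-59")
    if hours == 24 and minutes > 0:
        raise ValueError("maximum allowed interval is 24 hours 0 minutes")
    if hours == 0 and minutes == 0:
        return "replicate changes immediately"
    return f"check for changes every {hours * 60 + minutes} minutes"
```

For example, the default of 1 hour 0 minutes yields a 60-minute interval, while 24 hours 1 minute is rejected.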
@@ -34,6 +34,9 @@ If you are not comfortable with SQL, you can still build a view by using the *Gr

- Drag source tables from the *Repository* to the *SQL View Editor*.

> ### Note:
> If the delta capture setting is enabled for a source table, the columns *Change Date* and *Change Type* are automatically mapped to these columns in the target table. Mapping these columns \(or a calculated column that contains the content of these columns\) to any other target column is not permitted. For more information, see [Capturing Delta Changes in Your Local Table](capturing-delta-changes-in-your-local-table-154bdff.md).
- Add comments to document your code:

- Comment out a single line or the rest of a line with a double dash: `-- Your comment here`.
@@ -80,7 +80,10 @@ Create a data flow to move and transform data in an intuitive graphical interfac

> ### Note:
> - If the table is wide or contains a number of large column types, the result in data preview may be truncated in order to avoid out of memory issues.
> - Data preview is **not** available for ABAP sources.
> - Data preview is **not** available for ABAP sources, except for:
>   - CDS views, if the source connection is SAP S/4HANA Cloud 2302 or SAP S/4HANA on-premise 1909 or higher.
>   - SAP Landscape Transformation Replication Server objects, if ABAP Add-on DMIS 2018 SP08 / DMIS 2020 SP04 is installed.
>
> - You can't perform data preview on the transformation operators.

@@ -248,7 +251,7 @@ Create a data flow to move and transform data in an intuitive graphical interfac

- Under Input Parameters: Create a new input parameter or modify an existing one. For more information, see [Create an Input Parameter](create-an-input-parameter-a6fb3e7.md).
- Under Advanced Properties
- *Dynamic Memory Allocation*: You can allocate memory usage manually. Set the *Expected Data Volume* to *Small*, *Medium*, or *Large*.
- *Dynamic Memory Allocation*: You can allocate memory usage manually. Set the *Expected Data Volume* to *Small*, *Medium*, or *Large*.

> ### Note:
> - Dynamic memory allocation should be done only if your data flow run is facing out-of-memory failures.
@@ -19,7 +19,7 @@ You can use replication flows to copy data from the following source objects:
- Objects from ODP providers, such as extractors or SAP BW artifacts


For more information about available connection types, sources, and targets, see [Integrating Data via Connections](https://help.sap.com/viewer/9f36ca35bc6145e4acdef6b4d852d560/DEV_CURRENT/en-US/eb85e157ab654152bd68a8714036e463.html "Connections provide access to data from a wide range of sources, cloud as well as on-premise sources, SAP as well as Non-SAP sources, and partner tools. They allow users assigned to a space to use objects from the connected source to acquire, prepare and access data from those sources in SAP Datasphere. In addition, you can use certain connections to define targets for replication flows.") :arrow_upper_right:.
For more information about available connection types, sources, and targets, see [Integrating Data via Connections](https://help.sap.com/viewer/9f36ca35bc6145e4acdef6b4d852d560/DEV_CURRENT/en-US/eb85e157ab654152bd68a8714036e463.html "Connections provide access to data from a wide range of remote systems, cloud as well as on-premise, SAP as well as Non-SAP, and partner tools. They allow users assigned to a space to use objects from the connected remote system as source to acquire, prepare and access data from those sources in SAP Datasphere. In addition, you can use certain connections to define targets for replication flows.") :arrow_upper_right:.

> ### Note:
> Replication flows may not be available in SAP Datasphere tenants provisioned prior to version 2021.03. To request the migration of your tenant, see SAP note [3268282](https://launchpad.support.sap.com/#/notes/3268282).
@@ -4,7 +4,7 @@

# Creating a Task Chain

Group multiple tasks into a task chain and run them manually once, or periodically, through a schedule. You can create linear task chains in which one task is run after another. Or, you can create task chains in which individual tasks are run in parallel and successful continuation of the entire task chain run depends on whether ANY or ALL parallel tasks are completed successfully. In addition, when creating or editing a task chain, you can also set up email notification for deployed task chains to notify selected users of task chain completion.
Group multiple tasks into a task chain and run them manually once, or periodically, through a schedule. You can create linear task chains in which one task is run after another. \(You can also nest other task chains within a task chain.\) Or, you can create task chains in which individual tasks are run in parallel and successful continuation of the entire task chain run depends on whether ANY or ALL parallel tasks are completed successfully. In addition, when creating or editing a task chain, you can also set up email notification for deployed task chains to notify selected users of task chain completion.



@@ -30,7 +30,7 @@ Linear task chains allow you to define a group or series of tasks and execute th

Parallel task chain branches allow you to specify that some individual tasks are run in parallel and successful continuation of the entire task chain run depends on whether ANY or ALL parallel tasks are completed successfully.

Tasks chain scheduling may include execution of Remote Table Replication, View Persistency, Intelligent Lookup, Data Flow, and Transformation Flow runs.
Task chain scheduling may include execution of Remote Table Replication, View Persistency, Intelligent Lookup, Data Flow, and Transformation Flow runs. You can also nest other task chains within a task chain.

> ### Note:
> For optimal performance, it is recommended that you consider staggering the scheduled run time of tasks such as data flows and task chains that may contain these tasks. There is a limit on how many tasks can be started at the same time. If you come close to this limit, scheduled task runs may be delayed and, if you go beyond the limit, some scheduled task runs might even be skipped.
@@ -295,7 +295,12 @@ A basic or linear task chain allows you to define a group or series of tasks and

![](images/Properties_Update_with_Deploy_3674719.png)

Once the task chain is deployed, you can then run the task chain or create a schedule to run your task chain periodically, and navigate to the *Task Chains* monitor to chech your task chain runs. For more information, see [Scheduling Data Integration Tasks](https://help.sap.com/viewer/9f36ca35bc6145e4acdef6b4d852d560/DEV_CURRENT/en-US/7fa07621d9c0452a978cb2cc8e4cd2b1.html "Schedule data integration tasks to run periodically at a specified date or time.") :arrow_upper_right: and [Monitoring Task Chains](https://help.sap.com/viewer/9f36ca35bc6145e4acdef6b4d852d560/DEV_CURRENT/en-US/4142201ec1aa49faad89a688a2f1852c.html "Monitor the status and progress of running and previously run task chains.") :arrow_upper_right:.
Once the task chain is deployed, you can then run the task chain or create a schedule to run your task chain periodically, and navigate to the *Task Chains* monitor to check your task chain runs. For more information, see [Scheduling Data Integration Tasks](https://help.sap.com/viewer/9f36ca35bc6145e4acdef6b4d852d560/DEV_CURRENT/en-US/7fa07621d9c0452a978cb2cc8e4cd2b1.html "Schedule data integration tasks to run periodically at a specified date or time.") :arrow_upper_right: and [Monitoring Task Chains](https://help.sap.com/viewer/9f36ca35bc6145e4acdef6b4d852d560/DEV_CURRENT/en-US/4142201ec1aa49faad89a688a2f1852c.html "Monitor the status and progress of running and previously run task chains.") :arrow_upper_right:.

Once a task chain run has started, it continues running as long as possible. Until all tasks in the chain have been executed and are in a non-running state, the task chain itself is considered to be running. When finished, the overall status of the task chain is reported as FAILED if any task in the chain has failed. The final status of COMPLETED is reported only if all tasks are COMPLETED.

> ### Note:
> If a nested task chain within a parent task chain fails, you need to retry the parent task chain, not the specific nested chain that failed. In that case, when you retry the parent task chain, only the nested task chain that failed will be run again, not any of the other tasks in the chain that had already run successfully.
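The status rules above can be sketched as a small aggregation function. This is an illustrative sketch only; the function and status names are not part of SAP Datasphere:

```python
def chain_status(task_statuses):
    """Aggregate per-task statuses into an overall task chain status,
    following the documented rules: the chain keeps running until every
    task is in a non-running state; it is FAILED if any task failed, and
    COMPLETED only if all tasks completed successfully."""
    if any(s == "RUNNING" for s in task_statuses):
        return "RUNNING"   # chain continues as long as any task is still running
    if any(s == "FAILED" for s in task_statuses):
        return "FAILED"    # a single failed task fails the whole chain
    return "COMPLETED"     # reported only when every task is COMPLETED
```

For example, a chain with statuses `["COMPLETED", "FAILED", "COMPLETED"]` is reported as FAILED even though most tasks succeeded.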

**Executing Parallel Tasks in a Task Chain**
@@ -356,8 +361,6 @@ In addition to linear task chains in which one task is executed after another, y
When a task chain is run that includes a parallel task chain branch, execution of all the branch tasks are triggered to be run in parallel. The ANY or ALL condition applied to the branch specifies whether ANY or ALL branch tasks must be completed successfully to continue with execution of remaining tasks in the chain.

A task chain will continue running as long as possible. Until all tasks in the chain have been executed and are in a non-running state, the task chain itself is considered to be "running". When finished, the overall state or status of the task chain will be reported as “failed” if any task in the chain is "failed". The final status of COMPLETED for a task chain is reported only if all tasks are COMPLETED.

After a task chain run that includes one or more parallel task branches finishes, one or more tasks may be reported in an error state \(in each branch\), for example in branches where completion of tasks is evaluated with the ANY operator. In that case, if you restart or retry the task chain, SAP Datasphere restarts the previously failed tasks and runs all subsequent tasks that had not yet run. In particular, if a failed task is in a parallel branch that was evaluated with the ANY operator, the tasks in the same branch that had run successfully will not be run again; only the tasks that failed are retried.
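The ANY/ALL branch condition and the retry selection described above can be illustrated with a minimal sketch; the function names are hypothetical and not part of the product:

```python
def branch_succeeds(task_statuses, condition):
    """A parallel branch allows the chain to continue if ANY or ALL of
    its tasks completed successfully, per the branch's condition."""
    completed = [s == "COMPLETED" for s in task_statuses]
    return any(completed) if condition == "ANY" else all(completed)

def tasks_to_retry(task_statuses):
    """On retry, only previously failed tasks are run again; tasks that
    already completed successfully are skipped."""
    return [i for i, s in enumerate(task_statuses) if s == "FAILED"]
```

With an ANY branch whose tasks ended as `["FAILED", "COMPLETED"]`, the chain continues, and a retry would re-run only the first task.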


@@ -22,6 +22,9 @@ Creating a transformation flow involves two important steps:
> ### Note:
> A transformation flow only supports the loading of data to a local table in the SAP Datasphere repository.
> ### Note:
> A transformation flow generates internal SQL objects that must only be consumed by SAP Datasphere internal apps, and are not allowed for external data access.


## Procedure
