Update from SAP DITA CMS (squashed):
commit 2db343bb1e5d134c172d64995f6842e393ad0f9c
Author: REDACTED
Date:   Tue Jan 9 08:08:31 2024 +0000

    Update from SAP DITA CMS 2024-01-09 08:08:31
    Project: dita-all/mpr1699965423580
    Project map: af2fcb3e6dd448f3af3c0ff9c70daaf9.ditamap
    Language: en-US

commit 2a17bbf27d3ee3d779ce14b6ff945d0f3e5b0bc8
Author: REDACTED
Date:   Tue Jan 9 08:07:12 2024 +0000

    Update from SAP DITA CMS 2024-01-09 08:07:12
    Project: dita-all/mpr1699965423580
    Project map: af2fcb3e6dd448f3af3c0ff9c70daaf9.ditamap
    Language: en-US

commit 70172f8f87ec07187a181cd98d97f33fa4525a5b
Author: REDACTED
Date:   Tue Jan 9 08:04:57 2024 +0000

    Update from SAP DITA CMS 2024-01-09 08:04:57
    Project: dita-all/mpr1699965423580
    Project map: af2fcb3e6dd448f3af3c0ff9c70daaf9.ditamap
    Language: en-US

commit 000bda54e86a515786a3e537bd8aa9045fa1d170
Author: REDACTED
Date:   Tue Jan 9 07:53:50 2024 +0000

    Update from SAP DITA CMS 2024-01-09 07:53:50

##################################################
[Remaining squash message was removed before commit...]
ditaccms-bot committed Jan 9, 2024
1 parent 182d820 commit c2b0b26
Showing 172 changed files with 3,652 additions and 4,638 deletions.
@@ -25,7 +25,7 @@ Space administrators and integrators prepare connections and other sources to al

## Federate and Replicate Data in Remote Tables

-Many connections \(including most connections to SAP systems\) support importing remote tables to federate or replicate data \(see [Integrating Data via Connections](https://help.sap.com/viewer/9f36ca35bc6145e4acdef6b4d852d560/DEV_CURRENT/en-US/eb85e157ab654152bd68a8714036e463.html "Connections provide access to data from a wide range of sources, cloud as well as on-premise sources, SAP as well as Non-SAP sources, and partner tools. They allow users assigned to a space to use objects from the connected source to acquire, prepare and access data from those sources in SAP Datasphere. To connect to different sources, SAP Datasphere provides different connection types.") :arrow_upper_right:\).
+Many connections \(including most connections to SAP systems\) support importing remote tables to federate or replicate data \(see [Integrating Data via Connections](https://help.sap.com/viewer/9f36ca35bc6145e4acdef6b4d852d560/DEV_CURRENT/en-US/eb85e157ab654152bd68a8714036e463.html "Connections provide access to data from a wide range of sources, cloud as well as on-premise sources, SAP as well as Non-SAP sources, and partner tools. They allow users assigned to a space to use objects from the connected source to acquire, prepare and access data from those sources in SAP Datasphere. In addition, you can use certain connections to define targets for replication flows.") :arrow_upper_right:\).

You can import remote tables to make the data available in your space from the *Data Builder* start page, in an entity-relationship model, or directly as a source in a view.
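Once imported, a remote table can be consumed like any other table, for example as the source of a SQL view. A minimal sketch, assuming a hypothetical imported remote table `REMOTE_SUPPLIERS` (all identifiers here are illustrative, not taken from the documentation):

```sql
-- Federation sketch: the view reads through the remote table, so queries
-- are delegated to the source system unless the table's data is replicated.
CREATE VIEW "V_SUPPLIERS_DE" AS
  SELECT "SUPPLIER_ID", "SUPPLIER_NAME", "CITY"
  FROM "REMOTE_SUPPLIERS"   -- imported remote table
  WHERE "COUNTRY" = 'DE';
```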

@@ -46,7 +46,7 @@ You can import remote tables to make the data available in your space from the *

## Extract, Transform, and Load Data with Data Flows

-Many connections \(including most connections to SAP systems\) support loading data to SAP Datasphere via data flows \(see [Integrating Data via Connections](https://help.sap.com/viewer/9f36ca35bc6145e4acdef6b4d852d560/DEV_CURRENT/en-US/eb85e157ab654152bd68a8714036e463.html "Connections provide access to data from a wide range of sources, cloud as well as on-premise sources, SAP as well as Non-SAP sources, and partner tools. They allow users assigned to a space to use objects from the connected source to acquire, prepare and access data from those sources in SAP Datasphere. To connect to different sources, SAP Datasphere provides different connection types.") :arrow_upper_right:\).
+Many connections \(including most connections to SAP systems\) support loading data to SAP Datasphere via data flows \(see [Integrating Data via Connections](https://help.sap.com/viewer/9f36ca35bc6145e4acdef6b4d852d560/DEV_CURRENT/en-US/eb85e157ab654152bd68a8714036e463.html "Connections provide access to data from a wide range of sources, cloud as well as on-premise sources, SAP as well as Non-SAP sources, and partner tools. They allow users assigned to a space to use objects from the connected source to acquire, prepare and access data from those sources in SAP Datasphere. In addition, you can use certain connections to define targets for replication flows.") :arrow_upper_right:\).

Data flows support a wide range of extract, transform, and load \(ETL\) operations.

@@ -91,7 +91,7 @@ The *Import Entities* wizard allows you to import entities from SAP S/4HANA Clou

## Import Entities from SAP BW Bridge

-SAP BW bridge enables you to use SAP BW functionality in the public cloud and to import bridge entities into SAP Datasphere \(see[Importing Entities with Semantics from SAP BW Bridge](importing-entities-with-semantics-from-sap-bw-bridge-7bcd321.md) \).
+SAP BW bridge enables you to use SAP BW functionality in the public cloud and to import bridge entities into SAP Datasphere \(see [Importing Entities with Semantics from SAP BW∕4HANA or SAP BW Bridge](importing-entities-with-semantics-from-sap-bw-4hana-or-sap-bw-br-7bcd321.md)\).



@@ -132,7 +132,7 @@ You can create and import empty tables and views to receive and prepare data:
All the objects you import or create in the *Data Builder* are listed on the *Data Builder* start page. You can act on objects in the list in the following ways:

- Click one of the tabs to filter the list by object type.
-- Click a tile to create a new object.
+- Click a tile to create a new object
- Enter a string in the *Search* field to filter the list on business and technical names and users.
- Click a column header to sort or filter the list by values in the column.
- Select one or more objects and use any of the following tools:
@@ -208,7 +208,7 @@ All the objects you import or create in the *Data Builder* are listed on the *Da
- Local Table \(see [Creating a Local Table](creating-a-local-table-2509fe4.md)\)
- Graphical View \(see [Creating a Graphical View](../creating-a-graphical-view-27efb47.md)\)
- SQL View \(see [Creating an SQL View](../creating-an-sql-view-81920e4.md)\)
-- Data Access Control \(see [Create a Data Access Control](https://help.sap.com/viewer/9f36ca35bc6145e4acdef6b4d852d560/DEV_CURRENT/en-US/5246328ec59045cb9c2aa693daee2557.html "Space administrators can create data access controls to define criteria on which data can be displayed to users.") :arrow_upper_right:\)
+- Data Access Control \(see [Create a "Simple Values" Data Access Control](https://help.sap.com/viewer/9f36ca35bc6145e4acdef6b4d852d560/DEV_CURRENT/en-US/5246328ec59045cb9c2aa693daee2557.html "Users with the DW Space Administrator role (or equivalent privileges) can create data access controls in which criteria are defined as simple values. Each user can only see the records that match any of the single values she is authorized for in the permissions entity.") :arrow_upper_right:\) \(a conceptual sketch follows this list\)
- Analytic Model \(see [Creating an Analytic Model](../Modeling-Data-in-the-Data-Builder/creating-an-analytic-model-e5fbe9e.md)\)
- Task Chain \(see [Creating a Task Chain](creating-a-task-chain-d1afbc2.md)\)
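To illustrate the "Simple Values" mechanism referenced above: when such a data access control is applied to a view, the effect is as if every query were filtered against the permissions entity. The sketch below is conceptual only, with hypothetical names (`SALES`, `PERMISSIONS`, `COUNTRY`); the actual filtering is enforced by SAP Datasphere when the control is applied, not by SQL you write yourself.

```sql
-- Conceptual effect of a "Simple Values" data access control: a user
-- sees only rows whose criterion column matches one of the single
-- values granted to them in the permissions entity.
SELECT s.*
FROM "SALES" s                        -- hypothetical protected view
WHERE EXISTS (
  SELECT 1
  FROM "PERMISSIONS" p                -- hypothetical permissions entity
  WHERE p."USER_NAME" = CURRENT_USER
    AND p."COUNTRY"   = s."COUNTRY"   -- the DAC criterion
);
```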

@@ -33,16 +33,24 @@ Define the source for your replication flow \(connection, container, and objects

To narrow down the selection, start typing a part of the folder name in the *Search* field.

+If you choose SAP Datasphere as the source connection, the source container is automatically defined as the space you are in. In addition, the load type is automatically set to *Initial Only* because *Initial and Delta* is not supported for SAP Datasphere as the source.

3. Choose *Add Source Objects*. A list of available objects appears. Select the relevant ones for your use case and choose *Next*. A list of the objects you selected appears.

> ### Note:
-> Objects for which replication is not supported are not shown in the list. For example, when replicating data from an SAP S/4HANA source you will only be shown CDS views that have the required annotations for data extraction.
+> - The list only shows objects for which replication is supported. For example, when replicating data from an SAP S/4HANA source you will only be shown CDS views that have the required annotations for data extraction.
>
> If a CDS view for which replication is enabled is not shown in the CDS\_EXTRACTION folder, please ensure that the user in the source connection has the required authorizations. For connections to an SAP S/4HANA Cloud source system, this might mean that the user must be assigned to an authorization group that contains the CDS view \(as described in [Integrating CDS Views Using ABAP CDS Pipeline](https://help.sap.com/docs/SAP_S4HANA_CLOUD/0f69f8fb28ac4bf48d2b57b9637e81fa/f509eddda867452db9631dae1ae442a3.html?version=2308.503)\).
>
+> - If you use SAP Datasphere as the source connection, your source objects must be local tables that have been deployed, are **not** enabled for delta capturing, and have a primary key \(see the sketch after these steps\).
4. If you decide that you do not want to include an object after all, select it and choose *Remove from Selection*. If you want to include more objects, go back to the *Available* tab and select the relevant objects. When you are done, choose *Add Selection*. The system then imports the object definitions so that they are available for the subsequent process steps.

> ### Note:
> An object can be included in only one replication flow at any given time.
5. \(Optional\) To modify the throughput, you can change the number of replication threads. To do so, choose <span class="FPA-icons"></span> \(Browse source settings\), replace the default value of 10 with the value you want to use, and save your change.
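As an orientation for the note in step 3: below is a conceptual DDL sketch of a local table that would qualify as a replication flow source when SAP Datasphere itself is the source connection. Local tables are normally created in the *Data Builder*, not with SQL; the statement and all names are illustrative assumptions only.

```sql
-- Conceptually qualifies as a source: deployed, has a primary key,
-- and delta capture is NOT switched on (no delta-capture columns).
CREATE TABLE "SALES_ORDERS" (
  "ORDER_ID" INTEGER       NOT NULL,
  "CUSTOMER" NVARCHAR(80),
  "AMOUNT"   DECIMAL(15,2),
  PRIMARY KEY ("ORDER_ID")
);
```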




@@ -16,7 +16,7 @@ Add a source to read data from. You can add multiple sources and combine them to

2. Browse or search for the object you want to add on either of the tabs.

-- The *Repository* tab lists all the tables, views, and intelligent lookups that are available in the space \(including objects shared to the space\).. You can search and expand the categories \(see [Add Objects from the Repository](../add-objects-from-the-repository-13fcecd.md)\).
+- The *Repository* tab lists all the tables, views, and intelligent lookups that are available in the space \(including objects shared to the space\). For more information, see [Add Objects from the Repository](../add-objects-from-the-repository-13fcecd.md).

- The *Sources* tab lists all the connections and other data sources that have been integrated into the space and from which you can import tables. However, it shows only a limited number of records. If you can't see the sources you are looking for, use *Import from Connection* to perform a search. You can:

@@ -12,7 +12,7 @@ Select a target \(connection and container\) to define the target environment fo

## Context

-- Connections are created by your system administration. You can only use a target if a connection has been created for it in your space and if you have the necessary authorizations.
+- Connections are created by your system administration. You can only use a target if a connection has been created for it in your space and if you have the necessary authorizations. For more information about connections and connection types, see [Integrating Data via Connections](https://help.sap.com/docs/SAP_DATASPHERE/be5967d099974c69b77f4549425ca4c0/eb85e157ab654152bd68a8714036e463.html).

- Containers are the parent objects that hold the data.

@@ -31,7 +31,7 @@ Select a target \(connection and container\) to define the target environment fo

To narrow down the selection, start typing a part of the container name in the *Search* field.

-4. If you select a cloud storage \(Data Lake Files from SAP HANA Cloud, data lake\) as the target, a list of additional options is displayed. Choose the relevant ones for your use case.
+4. If you select a cloud storage provider as the target, a list of additional options is displayed. Choose the relevant ones for your use case.


<table>
@@ -95,4 +95,6 @@ Select a target \(connection and container\) to define the target environment fo
</tr>
</table>

+5. \(Optional\) To modify the throughput, you can change the number of replication threads. To do so, choose <span class="FPA-icons"></span> \(Browse source settings\), replace the default value of 10 with the value you want to use, and save your change.


@@ -27,7 +27,7 @@ Add an *Aggregation* node to perform `SUM`, `COUNT`, `MIN`, and `MAX` calculatio

3. Select the projection node in order to display its context tools, and click <span class="FPA-icons"></span> \(Aggregation\).

-![](images/Create_Aggregation_Gif_DWC_e8edbdf.gif)An *Aggregation* node is created, its symbol is selected, and its properties are displayed in the side panel.
+![](images/Create_Aggregation_Gif_DWC_e8edbdf.gif) An *Aggregation* node is created, its symbol is selected, and its properties are displayed in the side panel.

4. Optional. Rename the node in its side panel to clearly identify it. This name can be changed at any time and can contain only alphanumeric characters and underscores.
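For orientation, the result of an *Aggregation* node corresponds to a grouped SQL aggregation. The node is configured graphically, so no SQL is written in the editor itself; the sketch below uses hypothetical column names to show what the node computes.

```sql
-- What an Aggregation node computes: aggregated measures per group
-- of the remaining (non-aggregated) columns.
SELECT
  "REGION",
  SUM("AMOUNT")   AS "TOTAL_AMOUNT",
  COUNT("AMOUNT") AS "AMOUNT_COUNT",
  MIN("AMOUNT")   AS "MIN_AMOUNT",
  MAX("AMOUNT")   AS "MAX_AMOUNT"
FROM "SALES"
GROUP BY "REGION";
```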

@@ -79,7 +79,7 @@ When *Delta Capture* is switched on:
</td>
<td valign="top">

-This column will track the type of last change made to a record.When a record is inserted or updated corresponding change types are used \(for example "I" or "U"\). When an existing record is deleted other specific change types are used \(for example "D"\). Note that deleting a record will not physically delete it, so that the changes can be propagated to the different objects that consume it in delta mode. It is however filtered out when accessing the Local Table \(using the Active Records Table\). Also, note that the change types provided by the different SAP Datasphere apps vary and may depend on the actual source that is connected. The handling of the different change types is implemented internally by SAP Datasphere apps that consume the Delta Capture Table with no need for consideration in modeling.
+This column will track the type of last change made to a record. When a record is inserted or updated, corresponding change types are used \(for example "I" or "U"\). When an existing record is deleted, other specific change types are used \(for example "D"\). Note that deleting a record will not physically delete it, so that the changes can be propagated to the different objects that consume it in delta mode. It is, however, filtered out when accessing the Local Table \(using the Active Records Table\). Also, note that the change types provided by the different SAP Datasphere apps vary and may depend on the actual source that is connected. The handling of the different change types is implemented internally by SAP Datasphere apps that consume the Delta Capture Table, with no need for consideration in modeling. For more information on record deletion, see [Load or Delete Local Table Data](load-or-delete-local-table-data-870401f.md).

</td>
</tr>
@@ -133,7 +133,7 @@ Change Type
</td>
<td valign="top">

-This column will track the type of last change made to a record. When a record is inserted or updated corresponding change types are used \(for example "I" or "U"\). When an existing record is deleted other specific change types are used \(for example "D"\). Note that deleting a record will not physically delete it, so that the changes can be propagated to the different objects that consume it in delta mode. It is however filtered out when accessing the Local Table \(using the Active Records Table\). Also, note that the change types provided by the different SAP Datasphere apps vary and may depend on the actual source that is connected. The handling of the different change types is implemented internally by SAP Datasphere apps that consume the Delta Capture Table with no need for consideration in modeling.
+This column will track the type of last change made to a record. When a record is inserted or updated, corresponding change types are used \(for example "I" or "U"\). When an existing record is deleted, other specific change types are used \(for example "D"\). Note that deleting a record will not physically delete it, so that the changes can be propagated to the different objects that consume it in delta mode. It is, however, filtered out when accessing the Local Table \(using the Active Records Table\). Also, note that the change types provided by the different SAP Datasphere apps vary and may depend on the actual source that is connected. The handling of the different change types is implemented internally by SAP Datasphere apps that consume the Delta Capture Table, with no need for consideration in modeling. For more information on record deletion, see [Load or Delete Local Table Data](load-or-delete-local-table-data-870401f.md).

</td>
</tr>
@@ -14,9 +14,14 @@ Define general settings for your replication flow, such as the load type.

- *Initial Only*: Load all selected data once.

-- *Initial and Delta*: After the initial load, the system checks for source data changes \(delta\) once every 60 minutes and copies the changes to the target.
+- *Initial and Delta*: After the initial load, the system checks for source data changes \(delta\) at regular intervals and copies the changes to the target. The default value for the delta load interval is 60 minutes. You can change it in the *Details* side panel by entering an integer between 0 and 24 for hours and 0 and 59 for minutes, respectively. The maximum allowed value is 24 hours 0 minutes. If you enter 0 hours and 0 minutes, the system replicates any source changes immediately.

-For more information about using this load type in connection with local tables, see [Capturing Delta Changes in Your Local Table](capturing-delta-changes-in-your-local-table-154bdff.md).
+> ### Note:
+> - The system load caused by the delta load operations can vary substantially depending on the frequency of changes in your data source in combination with the interval length you define. Make sure that your tenant configuration supports your settings. For more information, see [Configure the Size of your SAP Datasphere Tenant](https://help.sap.com/docs/SAP_DATASPHERE/9f804b8efa8043539289f42f372c4862/33f8ef4ec359409fb75925a68c23ebc3.html).
+>
+> - The next interval starts after all changes from the previous interval have been replicated. For example, if replicating a set of changes starts at 10:30 a.m. and takes until 10:45 a.m., and you have defined one-hour intervals, the next delta replication starts at 11:45 a.m.
+For more information about using this load type in connection with local tables, see [Capturing Delta Changes in Your Local Table](https://help.sap.com/docs/SAP_DATASPHERE/c8a54ee704e94e15926551293243fd1d/154bdffb35814d5481d1f6de143a6b9e.html).


3. The *Truncate* setting is relevant if the target structure already exists and contains data. Review the default setting and change it if required:
@@ -13,7 +13,7 @@ Create a data flow to move and transform data in an intuitive graphical interfac
## Context

> ### Note:
-> Data flows support loading data exclusively to local tables in the SAP Datasphere repository.
+> For optimal performance, it is recommended that you stagger the scheduled run times of tasks such as data flows and of task chains that may contain them. There is a limit on how many tasks can be started at the same time. If you come close to this limit, scheduled task runs may be delayed; if you go beyond the limit, some scheduled task runs might even be skipped.


@@ -25,11 +25,12 @@ Create a data flow to move and transform data in an intuitive graphical interfac

2. Add one or more objects from the *Source Browser* panel on the left of the screen as sources \(see [Add a Source](add-a-source-7b50e8e.md)\).

-> ### Note:
-> Data flow currently doesn't support double quotes in column names, table names, owners, or other identifiers. If the source or target operators in a data flow contains double quotes, we recommend you to create a view in the source or in SAP Datasphere that renames the columns containing double quotes.
> ### Restriction:
-> Data flows don't support spatial data type columns.
+> - Data flows support loading data exclusively to local tables in the SAP Datasphere repository.
+>
+> - Data flows currently don't support double quotes in column names, table names, owners, or other identifiers. If the source or target operators in a data flow contain double quotes, we recommend that you create a view in the source or in SAP Datasphere that renames the columns containing double quotes \(see the sketch below\).
+>
+> - Data flows don't support spatial data type columns.
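The recommended workaround, sketched as SQL with hypothetical names (`SALES_RAW`, `SALES_FOR_DATAFLOW`): expose the data through a view whose identifiers contain no double quotes, and use that view as the data flow's source or target object.

```sql
-- "Net""Amount" is the SQL spelling of a column literally named  Net"Amount .
-- The view renames it so that no identifier contains a double quote.
CREATE VIEW "SALES_FOR_DATAFLOW" AS
  SELECT
    "ID",
    "Net""Amount" AS "NET_AMOUNT"   -- renamed: double quote removed
  FROM "SALES_RAW";
```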
3. Transform your data using one or more operators:

