
Azure Data Factory Integration


SAP ECC connector notes:
• Run on the Self-hosted Integration Runtime if the SAP system is in a private network; if your ECC is publicly accessible, you can use the Azure Integration Runtime.
• SAP side config: set up SAP Gateway, activate the OData service, and expose the entities.
• Data path: Pipeline → ADF Self-hosted Integration Runtime → SAP Gateway (OData) → Azure data stores.


Transcription of Azure Data Factory Integration

1 SAP Data Integration Using Azure Data Factory. Update: Jun 28, 2020.

Azure Data Factory provides data pipeline orchestration and monitoring across the stages Ingest, Prepare, Transform & Enrich, Serve, Store, and Visualize, covering on-premises data, cloud data, SaaS data, and SAP data.

Typical SAP data integration scenarios:
- Ongoing batch ETL from SAP to a data lake
- Historical migration from SAP to Azure

Azure Data Factory is a fully managed data integration service for cloud-scale analytics in Azure:
- Connected & Integrated: rich connectivity, built-in transformation, flexible orchestration, full integration with Azure data services
- Productive: drag & drop UI, single-pane-of-glass monitoring, CI/CD model
- Scalable & Cost-Effective: serverless scalability without infrastructure management, pay for use
- Secure & Compliant: certified compliance, enterprise-grade

2 security, MSI and AKV support. Further capabilities: Azure Machine Learning integration and code-free data transformation.

SAP data ingestion: a single tool enables data ingestion from SAP as well as various other sources, plus data transformation via the built-in Data Flow and integration with Databricks & DW.

Supported connectors by category:
- Azure: Blob Storage, Cosmos DB SQL API, Cosmos DB MongoDB API, ADLS Gen1, ADLS Gen2, Data Explorer, Database for MariaDB, Database for MySQL, Database for PostgreSQL, File Storage, SQL Database, SQL Managed Instance, Synapse Analytics, Search Index, Table Storage
- Database: Amazon Redshift, DB2, Drill, Google BigQuery, Greenplum, HBase, Hive, Impala, Informix, MariaDB, Microsoft Access, MySQL, Netezza, Oracle, Phoenix, PostgreSQL, Presto, SAP BW Open Hub, SAP BW MDX, SAP HANA, SAP Table, SAP ECC, Snowflake, Spark, SQL Server, Sybase, Teradata, Vertica
- File Storage: Amazon S3, File System, FTP, Google Cloud Storage, HDFS, SFTP
- File Formats: Avro, Binary, Common Data Model, Delimited Text, Excel, JSON, ORC, Parquet
- NoSQL: Cassandra, Couchbase, MongoDB
- Services & Apps: Amazon MWS, CDS for Apps, Concur, Dynamics 365, Dynamics AX, Dynamics CRM, SAP C4C, Google AdWords, HubSpot, Jira, Magento, Marketo, Office 365, Oracle Eloqua, Oracle Responsys, Oracle Service Cloud, PayPal, QuickBooks, Salesforce, Salesforce Service Cloud, Salesforce Marketing Cloud, ServiceNow, SharePoint List, Shopify, Square, Web Table, Xero, Zoho
- Generic: HTTP, OData, ODBC, REST

3 Agenda: SAP Data Integration Overview; SAP HANA Connector; SAP Table Connector; SAP BW Open Hub Connector; SAP ECC Connector; SAP BW MDX Connector; More about Azure Data Factory Copy Activity; Resources.

I want to extract data from an SAP HANA database → use the SAP HANA ADF connector (connector deep-dive).

4 I want to extract data from SAP BW — ADF connector options (a connector deep-dive is available for each):
- SAP Table — objects to extract: tables (transparent, pooled, cluster) and views; SAP side configuration: N/A; performance: fast, with built-in parallel loading based on configurable partitioning; suitable workload: large volume.
- SAP BW Open Hub — objects to extract: DSO, InfoCube, MultiProvider, DataSource, etc.; SAP side configuration: SAP Open Hub Destination; performance: fast, with built-in parallel loading based on the OHD-specific schema; suitable workload: well-thought-through workloads at large volume.
- SAP BW via MDX — objects to extract: InfoCubes, QueryCubes; SAP side configuration: N/A; performance: slower; suitable workload: exploratory, small volume.

The slide suggests a decision direction across these options based on the workload characteristics above.
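The decision table above can be encoded as a small helper. This is only an illustrative sketch, not part of ADF: the function name and the three workload traits are assumptions made to mirror the slide's criteria.

```python
def pick_bw_connector(large_volume, exploratory, can_configure_ohd):
    """Pick an ADF extraction option for SAP BW from workload traits.

    - Exploratory, small-volume work tolerates the slower MDX path.
    - Large volumes want parallel loading: Open Hub if the SAP side can
      be configured with an Open Hub Destination, otherwise the SAP
      Table connector (which needs no SAP-side configuration).
    """
    if exploratory and not large_volume:
        return "SAP BW via MDX"
    if can_configure_ohd:
        return "SAP BW Open Hub"
    return "SAP Table"

# A large, well-governed workload lands on the Open Hub option
print(pick_bw_connector(large_volume=True, exploratory=False,
                        can_configure_ohd=True))  # → SAP BW Open Hub
```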

5 I want to extract data from SAP ECC, S/4HANA, or other SAP applications — ADF connector options (a connector deep-dive is available for each):
- SAP Table — objects to extract: tables (transparent, pooled, cluster) and views; SAP side configuration: N/A; performance: fast, with built-in parallel loading; suitable workload: large volume.
- SAP ECC — objects to extract: OData entities exposed via SAP Gateway (BAPI, ODP); SAP side configuration: SAP Gateway; performance: slower; suitable workload: small volume.

SAP HANA connector:
- Supported versions: all SAP HANA versions, on-premises or in the cloud
- Supported SAP objects: HANA information models (analytic/calculation views)
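The OData exposure described above is plain HTTP underneath. A minimal sketch of building an OData query URL of the kind the SAP ECC connector issues against SAP Gateway; the host, service name `ZSALES_SRV`, and entity set `OrderSet` are made-up examples, while `/sap/opu/odata/sap/` is the typical Gateway base path:

```python
def odata_url(host, service, entity, select=None, filter_=None, top=None):
    """Build an OData query URL against a hypothetical SAP Gateway service.

    Values are not percent-encoded here, to keep the sketch readable;
    a real client should encode them.
    """
    params = []
    if select:
        params.append("$select=" + ",".join(select))
    if filter_:
        params.append("$filter=" + filter_)
    if top is not None:
        params.append(f"$top={top}")
    base = f"https://{host}/sap/opu/odata/sap/{service}/{entity}"
    return base + ("?" + "&".join(params) if params else "")

# Fetch two columns of the first 10 (hypothetical) sales orders
print(odata_url("my-ecc.example.com", "ZSALES_SRV", "OrderSet",
                select=["OrderID", "Amount"], top=10))
```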

6 - Supported SAP objects (continued): row and column tables
- Supported authentications: basic username & password; Windows single sign-on via Kerberos-constrained delegation
- Mechanism and prerequisites: built on top of SAP's HANA ODBC driver; pull data via a custom query; run on the Self-hosted Integration Runtime
- Performance & scalability: built-in parallel loading option based on configurable data partitioning (NEW); handles TB-level data with hundreds of millions to a billion rows per run; observed throughput of several to several dozen MB/s (varies with each customer's data and environment)

Data path: Pipeline → ADF Self-hosted Integration Runtime (SAP HANA ODBC driver) → Azure data stores.

Out-of-box optimization for SAP HANA: built-in parallel copy by partitions to boost performance for large table ingestion.
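The parallel copy by partitions works by splitting one big read into disjoint range queries. A sketch under assumed names (table SALES, numeric column ID are illustrative) of how a value range can be carved into per-partition queries that copy activities could issue concurrently:

```python
def partition_queries(table, column, lo, hi, partitions):
    """Split [lo, hi] into contiguous ranges, one range query per partition.

    Each query reads a disjoint slice, so the slices can be loaded in
    parallel and their union covers the whole range exactly once.
    """
    step, rem = divmod(hi - lo + 1, partitions)
    queries, start = [], lo
    for i in range(partitions):
        # Spread any remainder rows over the first `rem` partitions
        end = start + step - 1 + (1 if i < rem else 0)
        queries.append(
            f'SELECT * FROM "{table}" WHERE "{column}" BETWEEN {start} AND {end}'
        )
        start = end + 1
    return queries

for q in partition_queries("SALES", "ID", 1, 100, 4):
    print(q)
```

The same idea applies whether the ranges come from HANA physical table partitions or from a dynamic range over a chosen partition column.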

7 Partition options: HANA physical table partitions or dynamic ranges. With Parallel Copy set on the copy activity, ADF issues the specified query to the source for each copy activity run, for example:

SELECT * FROM MyTable
WHERE LastModifiedDate >= 'yyyy/MM/dd' AND LastModifiedDate < 'yyyy/MM/dd'

Delta extraction with this query:
- Execution start time 2019/03/19 00:00:00 (window end time) → extracts rows with last modified time between 2019/03/18 and 2019/03/19
- Execution start time 2019/03/20 00:00:00 (window end time) → extracts rows with last modified time between 2019/03/19 and 2019/03/20

SAP Table connector:
- Supported versions: SAP ECC or other applications in the SAP Business Suite, on-premises or in the cloud; S/4HANA
- Supported SAP objects: SAP transparent tables, pooled tables, cluster tables, and views
- Supported server types: connect to an Application Server or a Message Server
- Supported authentications: basic username & password; SNC (Secure Network Communications)

8 - Mechanism and prerequisites: built on top of the SAP .NET Connector; pull data via NetWeaver RFC with field selection and row filtering; run on the Self-hosted Integration Runtime
- Performance & scalability: built-in parallel loading option based on configurable data partitioning; handles TB-level data, with tens of millions to a billion rows per run; observed throughput of several to 20s of MB/s (varies with each customer's data and environment)
- Retrieval: field/column selection; row filter using SAP query operators; use the default /SAPDS/RFC_READ_TABLE2 or a custom RFC module to retrieve data

Data path: Pipeline (single copy activity with Parallel Copy set) → ADF Self-hosted Integration Runtime (SAP .NET Connector) → Azure data stores.

Capabilities: field selection; row filter (SAP query operators); default or custom RFC function; built-in partitioning with parallel load.

Tips: enable partitioning when ingesting large datasets (tens of millions of rows or more) to speed up extraction; choose a proper partition column and partition count, and adjust the parallelism as needed.

Pattern I: my data has a timestamp or calendar-date column. Solution:
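The tumbling-window delta extraction can be sketched in Python. The table and column names mirror the slide's MyTable/LastModifiedDate example; in a real pipeline the window boundaries would come from ADF trigger system variables rather than being computed locally, so this is only an illustration of the window arithmetic:

```python
from datetime import datetime, timedelta

def delta_query(window_end, table="MyTable", ts_col="LastModifiedDate"):
    """Build the delta-extraction query for one tumbling-window run.

    The window is one day wide: rows whose timestamp falls in
    [window_end - 1 day, window_end) are extracted, matching the
    slide's 2019/03/19 example.
    """
    start = window_end - timedelta(days=1)
    fmt = "%Y/%m/%d"
    return (f"SELECT * FROM {table} "
            f"WHERE {ts_col} >= '{start.strftime(fmt)}' "
            f"AND {ts_col} < '{window_end.strftime(fmt)}'")

# Run starting 2019/03/19 00:00:00 extracts the 2019/03/18..2019/03/19 slice
print(delta_query(datetime(2019, 3, 19)))
```

Because each window's query is half-open on the right, consecutive runs extract adjacent, non-overlapping slices.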

9 tumbling-window trigger plus a dynamic query with system variables, via the SAP Table filter option.

Pattern II: my data has an incremental column (id or last-copied date). Solution: an external control table/file plus a high watermark; get started via the solution template.

SAP BW Open Hub connector:
- Supported versions: SAP BW, on-premises or in the cloud
- Supported SAP objects: Open Hub Destination (OHD) local table; the underlying objects can be a DSO, InfoCube, MultiProvider, DataSource, etc.
- Supported server types: connect to an Application Server or a Message Server (NEW)
- Supported authentications: basic username & password

10 - Mechanism and prerequisites: built on top of the SAP .NET Connector; pull data via NetWeaver RFC; run on the ADF Self-hosted Integration Runtime; SAP side config: create an SAP OHD in SAP BW to expose the data
- Performance & scalability: built-in parallel loading option based on the OHD-specific schema; handles TB-level data, with tens of millions to a billion rows per run; observed throughput of several to 20s of MB/s (varies with each customer's data and environment)
- Base request ID for incremental copy, to filter out already-copied data
- Exclude the last request, to avoid copying partial data
- Built-in parallel copy to boost performance, based on the OHD's specific schema

What is an OHD (diagram summary): in SAP BW, data from ECC DataSources is extracted (InfoPackage), activated, and moved by Data Transfer Processes (DTP) from objects such as DataStore Objects (DSO), cubes (F-fact/E-fact tables), and InfoObject master data into Open Hub Destination tables, which ADF then reads.
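The base-request-ID and exclude-last-request behavior above amounts to a filter over the OHD table's request IDs. A sketch with illustrative request IDs (the function and its parameters are assumptions, not ADF API names):

```python
def requests_to_copy(request_ids, base_request_id=0, exclude_last=True):
    """Select which Open Hub Destination requests a copy run should read.

    - base_request_id: everything at or below this was copied by an
      earlier run and is filtered out (incremental copy).
    - exclude_last: drop the newest request, which may still be being
      written and would otherwise yield partial data.
    """
    ids = sorted(set(request_ids))
    ids = [r for r in ids if r > base_request_id]
    if exclude_last and ids:
        ids = ids[:-1]
    return ids

# Requests 1-5 exist; 1-2 were copied earlier; 5 may still be in flight
print(requests_to_copy([1, 2, 3, 4, 5], base_request_id=2))  # → [3, 4]
```

On the next run, the highest copied ID (4 here) becomes the new base request ID, so each request is copied exactly once.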

