New to real-time replication? With this blog I would like to share basic information on SAP Landscape Transformation Replication Server (SLT) running on the SAP NetWeaver platform.
SLT is SAP's first ETL tool that allows you to load and replicate data, in real time or on a schedule, from SAP and non-SAP source systems into the SAP HANA database.
The SAP SLT server uses a trigger-based replication approach to pass data from the source system to the target system. The SLT server can be installed on a separate system or on the SAP ECC system.
INTRODUCTION:
SAP HANA does not really differentiate whether the data comes from SAP or from non-SAP source systems. Data is data, regardless of where it comes from. Combined with the fact that SAP HANA is a full platform, not just a database, this makes it an excellent choice for many companies. SAP HANA is meant not only for traditional SAP customers but also for a wide range of other companies, from small startups to large corporations. In fact, SAP now actively supports thousands of startups, helping them use SAP HANA to deliver innovative solutions that are "nontraditional" in the SAP sense.
Data provisioning is used to get data from a source system and provide that data to a target system. In our case, we get this data from various source systems, whether they’re SAP or non-SAP systems, and our target is always the SAP HANA system.
Data provisioning goes beyond merely loading data into the target system. With SAP HANA and some of its new approaches to information modeling, we also have more options available when working with data from other systems. We do not always have to store the data we use in SAP HANA, as we would with traditional approaches.
Benefits of the SLT system:
◉ Allows real-time or scheduled data replication.
◉ While replicating data in real time, it can migrate data into SAP HANA format.
◉ SLT handles cluster and pool tables.
◉ Automatically supports non-Unicode to Unicode conversion during load/replication. (Unicode is a character encoding standard that covers far more characters than ASCII; non-Unicode systems use older, more limited code pages.) See the small conversion sketch after this list.
◉ Fully integrated with SAP HANA Studio.
◉ SLT has table settings and transformation capabilities.
◉ SLT has monitoring capabilities via SAP Solution Manager.
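The non-Unicode to Unicode conversion mentioned above can be illustrated with a short Python sketch. The Latin-1 code page used here is just an example stand-in for a non-Unicode SAP code page.

```python
# A non-Unicode system stores text in a single-byte code page (Latin-1 here as an example).
raw_bytes = "Münster".encode("latin-1")   # bytes as they might sit in a non-Unicode source
text = raw_bytes.decode("latin-1")        # decode using the source code page
utf8_bytes = text.encode("utf-8")         # re-encode as Unicode (UTF-8) for the target system

print(raw_bytes)   # b'M\xfcnster'      - one byte for the umlaut
print(utf8_bytes)  # b'M\xc3\xbcnster'  - two bytes for the umlaut in UTF-8
```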
Extract, Transform, and Load
Extract, Transform, and Load (ETL) is viewed by many people as the traditional process of retrieving data from a source system and loading it into a target system. Figure 1.1 illustrates the traditional ETL process of data provisioning.
Figure 1.1 Extract, Transform, and Load Process
The ETL process consists of the following phases:
Extraction
During this phase, data is read from a source system. Traditional ETL tools can read many data sources, and most can also read data from SAP source systems.
Transform
During this phase, the data is transformed. Normally this means cleaning the data, but it can refer to any data manipulation: combining fields to calculate something (e.g., sales tax), filtering data, or limiting the data to a specific year (e.g., 2016). You set up transformation rules and data flows in the ETL tool.
Load
In the final phase, the data is written into the target system.
The ETL process is normally a batch process. Due to the time required for the complex transformations that are sometimes necessary, tools do not deliver the data in real time.
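To make the three phases concrete, here is a minimal ETL sketch in Python. It uses in-memory SQLite databases as stand-ins for the source and target systems; the table names, fields, and tax rate are invented for the example.

```python
import sqlite3

# In-memory databases stand in for the source system and SAP HANA (illustration only).
source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")

source.execute("CREATE TABLE orders (id INTEGER, net_amount REAL, order_year INTEGER)")
source.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                   [(1, 100.0, 2016), (2, 250.0, 2015), (3, 80.0, 2016)])
target.execute("CREATE TABLE orders_2016 (id INTEGER, net_amount REAL, sales_tax REAL)")

# Extract: read all rows from the source table.
rows = source.execute("SELECT id, net_amount, order_year FROM orders").fetchall()

# Transform: filter to the year 2016 and derive a sales tax field.
TAX_RATE = 0.15  # assumed rate, purely illustrative
transformed = [(order_id, amount, round(amount * TAX_RATE, 2))
               for (order_id, amount, year) in rows if year == 2016]

# Load: write the transformed rows into the target table.
target.executemany("INSERT INTO orders_2016 VALUES (?, ?, ?)", transformed)
target.commit()
print(target.execute("SELECT * FROM orders_2016").fetchall())
```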
Initial Load and Delta Loads
It is important to understand that loading data into a target system happens in two distinct phases: an initial (full) load of the existing table contents, followed by delta loads that transfer only the records created or changed since the previous load.
Figure 1.2 Initial Load of Tables and Subsequent Delta Updates
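A simplified illustration of the two phases follows, assuming the source table carries a change timestamp. Real replication tools, including SLT, track changes differently (for example with database triggers), so this is only a conceptual sketch.

```python
import sqlite3
from datetime import datetime, timedelta

source = sqlite3.connect(":memory:")  # stand-in for the source system
source.execute("CREATE TABLE customers (id INTEGER, name TEXT, changed_at TEXT)")
now = datetime.now()
source.executemany("INSERT INTO customers VALUES (?, ?, ?)", [
    (1, "Acme",   (now - timedelta(days=2)).isoformat()),
    (2, "Globex", (now - timedelta(hours=1)).isoformat()),
])

def initial_load(conn):
    """Phase 1: copy the complete table content to the target."""
    return conn.execute("SELECT id, name FROM customers").fetchall()

def delta_load(conn, last_run):
    """Phase 2: copy only records changed since the previous run."""
    return conn.execute(
        "SELECT id, name FROM customers WHERE changed_at > ?",
        (last_run.isoformat(),),
    ).fetchall()

print("initial:", initial_load(source))
print("delta:  ", delta_load(source, now - timedelta(hours=6)))
```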
Replication:
Replication emphasizes a different aspect of the data loading process.
ETL focuses on ensuring that the data is clean and in the correct format in the target system. Replication does less of that, but gets the data into the target system as quickly as possible. As such, the replication process appears quite simple, as shown in Figure 1.3.
The extraction process of ETL tools can be demanding on a source system. Because the large volume of read operations has the potential to dramatically slow down the source system, extraction processes are often run only after hours.
Replication tools aim to get the data into the target system as fast as possible. This implies that data must be read from the source system at all times of the day and with minimal impact to the performance of the source system. The exact manner in which different replication tools achieve this can range from using database triggers to reading database log files.
Figure 1.3 Replication Process
With SAP HANA as the target system, we achieve real-time replication speeds. (I have worked on projects where the average time between a record being updated in the source system and the same record being updated in SAP HANA was only about 50 milliseconds!)
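As a conceptual illustration of the trigger-based variant (the approach SLT itself uses), the sketch below uses a SQLite trigger to record every change in a logging table, which a replication job then drains and applies to the target. All table and column names are invented for the example.

```python
import sqlite3

source = sqlite3.connect(":memory:")
target = sqlite3.connect(":memory:")
source.execute("CREATE TABLE materials (id INTEGER PRIMARY KEY, descr TEXT)")
source.execute("CREATE TABLE log_materials (id INTEGER, descr TEXT)")  # logging table

# Database trigger: every insert on the application table is recorded in the logging table.
source.execute("""
    CREATE TRIGGER trg_materials AFTER INSERT ON materials
    BEGIN
        INSERT INTO log_materials VALUES (NEW.id, NEW.descr);
    END
""")
target.execute("CREATE TABLE materials (id INTEGER PRIMARY KEY, descr TEXT)")

# Normal application activity on the source system.
source.execute("INSERT INTO materials VALUES (1, 'Steel bolt')")
source.execute("INSERT INTO materials VALUES (2, 'Copper wire')")

# Replication job: drain the logging table and apply the changes to the target.
changes = source.execute("SELECT id, descr FROM log_materials").fetchall()
target.executemany("INSERT OR REPLACE INTO materials VALUES (?, ?)", changes)
source.execute("DELETE FROM log_materials")  # changes are now replicated
print(target.execute("SELECT * FROM materials").fetchall())
```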
SAP Extractors:
Normally, we think of data provisioning as reading the data from a single table in the source system and then writing the same data to a similar table in the target system. However, it is possible to perform this process differently, such as reading the data from a group of tables and delivering all this data as a single integrated data unit to the target system. This is the idea behind SAP extractors.
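The idea can be sketched as follows: instead of replicating a header table and an item table separately, an extractor-style read joins the related tables and delivers one integrated data set. The table and field names here are invented.

```python
import sqlite3

src = sqlite3.connect(":memory:")
src.execute("CREATE TABLE order_header (order_id INTEGER, customer TEXT)")
src.execute("CREATE TABLE order_item (order_id INTEGER, material TEXT, qty INTEGER)")
src.executemany("INSERT INTO order_header VALUES (?, ?)", [(10, "Acme")])
src.executemany("INSERT INTO order_item VALUES (?, ?, ?)",
                [(10, "Steel bolt", 5), (10, "Copper wire", 2)])

# Extractor-style read: several related tables are joined and delivered
# to the target as one integrated data unit.
integrated = src.execute("""
    SELECT h.order_id, h.customer, i.material, i.qty
    FROM order_header AS h
    JOIN order_item AS i ON i.order_id = h.order_id
""").fetchall()
print(integrated)
```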
Database Connections:
SAP HANA provides drivers, known as SAP HANA clients, that allow applications and other systems to connect to the SAP HANA database. Let's look at some of the common database connectivity terminology you might encounter.
Open Database Connectivity (ODBC):
ODBC acts as a translation layer between an application and a database via an ODBC driver. You write your database queries using a standard application programming interface (API) for accessing database information. The ODBC driver translates these queries into database-specific queries, making your database queries database and operating system independent. By changing the ODBC driver to that of another database and changing your connection information, your application will work with another database.
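In Python, for example, the ODBC route might look like the sketch below. It assumes the SAP HANA ODBC driver (HDBODBC) and the pyodbc package are installed; the host, port, and credentials are placeholders.

```python
import pyodbc  # generic ODBC bridge; requires an installed ODBC driver

# The application speaks standard ODBC; the HDBODBC driver translates the calls
# into SAP HANA-specific ones. Swapping the driver and connection string would
# point the same code at a different database.
conn = pyodbc.connect(
    "DRIVER={HDBODBC};"
    "SERVERNODE=hana-host:30015;"    # placeholder host and SQL port
    "UID=MY_USER;PWD=MY_PASSWORD"    # placeholder credentials
)
cursor = conn.cursor()
cursor.execute("SELECT CURRENT_UTCTIMESTAMP FROM DUMMY")  # DUMMY is HANA's one-row system table
print(cursor.fetchone())
conn.close()
```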
Java Database Connectivity (JDBC):
JDBC is similar to ODBC but specifically aimed at the Java programming language.
Object Linking and Embedding, Database for Online Analytical Processing (ODBO):
ODBO provides an API for exchanging metadata and data between an application and an OLAP server (like a data warehouse using cubes). You can use ODBO to connect Microsoft Excel to SAP HANA.
Multidimensional Expressions (MDX):
MDX is a query language for OLAP databases, similar to how SQL is a query language for relational (OLTP) databases. The MDX standard was adopted by a wide range of OLAP vendors, including SAP.
Business Intelligence Consumer Services (BICS):
BICS is an SAP-proprietary database connection. It is a direct client connection that performs better and faster than MDX or SQL. Hierarchies are supported, negating the need for MDX in SAP environments. Because this is an SAP-only connection type, you can only use it between two SAP systems, for example from an SAP reporting tool to SAP BW.
The fastest way to connect to a database is via dedicated database libraries. ODBC and JDBC insert another layer in the middle that can impact your database query performance. Many times, however, convenience is more important than speed.
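For SAP HANA, the dedicated-library route in Python would be the hdbcli client. A minimal sketch with placeholder connection details:

```python
from hdbcli import dbapi  # SAP HANA Python client (install with: pip install hdbcli)

# Direct connection through the dedicated SAP HANA client library,
# without an ODBC/JDBC translation layer in between.
conn = dbapi.connect(
    address="hana-host",   # placeholder host
    port=30015,            # placeholder SQL port
    user="MY_USER",
    password="MY_PASSWORD",
)
cursor = conn.cursor()
cursor.execute("SELECT * FROM DUMMY")
print(cursor.fetchone())
conn.close()
```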
SAP HANA replication allows migration of data from source systems into the SAP HANA database. The simplest way to move data from an existing SAP system to SAP HANA is to use one of the available data replication techniques. Replication can be set up on the console via the command line or by using SAP HANA Studio, and the primary ECC or transaction systems can stay online during this process. There are three data replication methods in the SAP HANA system:
◉ SAP LT Replication method
◉ ETL tool SAP BusinessObjects Data Services (BODS) method
◉ Direct Extractor Connection (DXC) method
SAP Landscape Transformation (SLT)
One of the main features of SAP HANA is that it can provide real-time data to the customer at any point in time. This is made possible with the help of SLT (SAP Landscape Transformation), which loads real-time data into SAP HANA from SAP or non-SAP source systems.
SAP Landscape Transformation Replication Server (“SLT”)
◉ Is for all SAP HANA customers who need real-time or scheduled data replication, sourcing from SAP and non-SAP systems.
◉ Uses trigger-based technology to transfer the data from any source to SAP HANA in real time.
The SLT server can be installed on a separate system or on the SAP ECC system.
Benefits of the SLT system:
◉ Allows real-time or scheduled data replication.
◉ While replicating data in real time, it can migrate data into SAP HANA format.
◉ SLT handles cluster and pool tables.
◉ Fully integrated with SAP HANA Studio.
◉ SLT has table settings and transformation capabilities.
◉ SLT has monitoring capabilities via SAP Solution Manager.
SLT Architecture overview between SAP System and SAP HANA:
SLT Replication Server transforms all metadata table definitions from the ABAP source system to SAP HANA.
For an SAP source, the SLT connection has the following features:
◉ If your source system is SAP, you can install SLT as a separate system or on the source system itself.
◉ When a table is replicated, SLT Replication Server creates logging tables in the source system.
◉ The read module is created in the SAP source system.
◉ The connection between SLT and the SAP source is established as an RFC connection.
◉ The connection between SLT and SAP HANA is established as a DB connection.
◉ If you install SLT on the source system itself, an RFC connection is no longer needed.
◉ The SLT server automatically creates the DB connection to the SAP HANA database when a new configuration is created via transaction LTR; there is no need to create it manually.
SAP Note 1605140 provides complete information on installing the SLT system.
If SLT Replication Server is installed in the source system, the architecture is as shown below.
SLT Architecture overview between Non-SAP System and SAP HANA:
◉ The figure above shows real-time replication of data from non-SAP sources to the SAP HANA system. When the source is non-SAP, SLT has to be installed as a separate system.
◉ The main changes compared to the first scenario (where the source is an SAP system) are that the connection between the source and SLT is a DB connection, and that the read modules reside in SLT instead of in the source.
Components of SLT:
The main components involved in real-time replication using SLT are described below; a small conceptual sketch follows.
Logging Tables: Logging tables capture the changed and new records from the application tables since the last successful replication to SAP HANA.
Read Modules: Read modules read the data from the application tables for the initial load and convert cluster-type tables into transparent format.
Control Module: The control module performs small transformations on the source data. From here, the data is passed to the write modules.
Write Modules: The write module writes the data to the SAP HANA system.
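To tie the components together, here is a highly simplified Python sketch of one replication cycle. It is a conceptual illustration only; the real modules are ABAP objects generated by SLT, and all names used here are invented.

```python
# Conceptual sketch of one SLT replication cycle (not actual SLT code).

def read_module(logging_table):
    """Read module: pick up the changed records captured by the triggers."""
    return list(logging_table)

def control_module(records):
    """Control module: apply small transformations before the write step."""
    return [{**r, "NAME": r["NAME"].upper()} for r in records]  # example rule

def write_module(target_table, records):
    """Write module: persist the transformed records in the target (SAP HANA)."""
    target_table.extend(records)

# Simulated logging table filled by database triggers on the source system.
logging_table = [{"ID": 1, "NAME": "acme"}, {"ID": 2, "NAME": "globex"}]
hana_table = []

write_module(hana_table, control_module(read_module(logging_table)))
logging_table.clear()  # entries are removed once they have been replicated
print(hana_table)
```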
Multi System Support:
SLT Replication Server supports both 1:N and N:1 replication.
Multiple source systems can be connected to one SAP HANA system.
One source system can be connected to multiple SAP HANA systems (limited to 1:4).