Friday, 30 December 2022

Consuming SAP HANA Cloud from the Kyma environment

Overview


With the December 2022 release of the SAP HANA Cloud Tools, you can now develop Kyma applications that work with HANA HDI containers and schemas. This blog post explains one essential part of what you need to do.

The overall process is as follows:

1. From the multi-environment edition of SAP HANA Cloud Central or using the btp CLI, provision a HANA database instance in a subaccount.

2. From the multi-environment edition of SAP HANA Cloud Central, use the “instance mapping” feature to map the instance into a Kyma namespace (or Cloud Foundry space), either in the same subaccount or in a different subaccount.

Thursday, 29 December 2022

HANA Docker Install with NFS mounts – (Fully Automated)

With the evolving, modern methodologies around many SAP applications and database technologies, I am trying to contribute an idea that may be helpful or spark new ideas for others.

The thought is …

What if we could create a HANA Docker container with the database filesystems mounted into the container from an external NFS source?

This allows us to create HANA containers instantly with existing data and no dependency on the container build process; we can even use native Linux container images directly without modification. This reduces the HANA container build time, keeps a persistent layer for the data, requires less image space for the container, and lets us reuse or redeploy the container instantly.

Wednesday, 28 December 2022

Accessing SAP HANA Cloud, data lake Files from Python

Overview:


In this blog, we will learn how to use the SAP HANA data lake Files REST API to create/write, access/read and list your files through a Python script. The REST API reference documentation (SAP HANA Cloud, Data Lake Files REST API) describes how to access the file containers of the SAP HANA data lake; the Python demonstrations that follow, however, use only some of the most typical endpoints. We will learn how to use a Python HTTP client to fire an HTTP request, parse the response status and read the response body. Using the Python http module, we will make connections and issue HTTP requests such as GET, POST, PUT and DELETE. Let’s get started.
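As a flavour of what such a script can look like, here is a minimal sketch that lists the files in a container, assuming the WebHDFS-style LISTSTATUS endpoint from the REST API reference and client-certificate authentication; the host name, container ID and certificate paths are placeholders.

import http.client
import json
import ssl

FILES_REST_API = "<your-instance>.files.hdl.<region>.hanacloud.ondemand.com"  # placeholder host
CONTAINER_ID = "<file-container-id>"                                          # placeholder container

# Client certificate/key registered as trust for the file container (assumption).
context = ssl.create_default_context()
context.load_cert_chain(certfile="client.crt", keyfile="client.key")

conn = http.client.HTTPSConnection(FILES_REST_API, port=443, context=context)
conn.request(
    "GET",
    "/webhdfs/v1/?op=LISTSTATUS",                   # list the container root
    headers={"x-sap-filecontainer": CONTAINER_ID},  # container header per the API reference
)
resp = conn.getresponse()
print(resp.status, resp.reason)
print(json.loads(resp.read()))                      # FileStatuses payload with the file entries
conn.close()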

Friday, 23 December 2022

NSE Implementation Experience

Introduction:

NSE is a fully functional warm data store in the HANA database, which we can leverage to hold less frequently accessed data without loading it fully into memory.

This is a very useful feature, because it lets us avoid increasing server capacity and thereby control hardware costs.

For example, one of our customers has a 2TB production instance on Azure with an average memory consumption of 1.7-1.8 TB. To increase hardware capacity while keeping a similar CPU size, we would need to move to a larger VM of around 4TB, which is oversized for the current growth rate of the database; the customer would end up paying more for a larger VM even though it is not fully utilized.

Wednesday, 21 December 2022

Getting Started on your SAP S/4HANA implementation

Starting line

Are you ready to modernize your information technology platform because it is a set of outdated legacy systems that no longer supports your business adequately? Today’s technology provides innovative solutions that bring more value, streamline processes, improve business agility, accelerate efficiency, and improve the bottom line. They can help scale, grow, and transform your business into an industry-leading entity. Many organizations are migrating from SAP ECC to S/4HANA or implementing SAP for the first time with S/4HANA. The key to success is getting started the right way.

Monday, 19 December 2022

MTA project Integration with Git in Business Application Studio: HANA XSA

Today I am going to discuss MTA project integration with GitHub as source control in SAP Business Application Studio (HANA XSA). Earlier I discussed the integration with Web IDE, which is slightly different.

SAP Business Application Studio supports personal access tokens instead of passwords, so to access a GitHub repository from SAP Business Application Studio we need to create an access token on GitHub.

Before performing the steps, I want everybody to know the flow, so keep the diagram in mind to have a clear concept of this migration –

Friday, 16 December 2022

MTA project Integration with Git in Web IDE: HANA XSA

Today I am going to discuss MTA project integration with GitHub as source control in Web IDE (HANA XSA).

Before performing the steps, I want everybody to know the flow, so keep the diagram in mind to have a clear concept of this migration –


Wednesday, 14 December 2022

Add New Fields To “S/4HANA Manage Purchase Requisition- Professional Fiori App” With CDS Extension

This topic demonstrates how you can extend the original CDS view with a view extension to provide some additional fields.

Problem:


Our customer asked us to add some fields that are in the “EBAN” table but not in the Manage Purchase Requisition Professional Fiori application (F2229).

Saturday, 10 December 2022

Spend Reporting with S/4HANA Embedded Analytics

Spend reporting is an important AP requirement that helps companies better understand and control their expenses. Traditionally, companies that run SAP approach Spend reporting with BW solutions.
When it comes to Spend reporting, BW offers little to no advantage over S/4HANA Embedded Analytics, due to its rigid extraction process and the need for complex transformations and native modeling. It makes total sense to bring Spend reporting back to S/4HANA where it belongs.

S/4HANA Embedded Analytics Spend data can be sliced and diced in Analysis for Office or visualized using SAP Analytics Cloud or Power BI.

Friday, 9 December 2022

Replicate artifacts data from an HDI Container in SAP Business Application Studio to SAP HANA On-Premise

The SAP HANA Deployment Infrastructure (HDI) is a service layer of the SAP HANA database and helps to create runtime database objects from design-time artifacts.

It uses containers to store design-time artifacts and the corresponding deployed run-time objects.

Inside the database, the HDI container is represented as a schema, but it is owned by a technical user and isolated from other database objects. Only local object access inside the container is allowed; to access objects outside the container, explicit grants are required from the object owner.

Wednesday, 7 December 2022

Consume CDS View inside CDS Table Function by using AMDP

We are familiar with getting data from a table using an AMDP procedure or an AMDP table function. But what about a CDS view? I tried a case and got the following error:


Basically, AMDP gets data directly from the database, so it reads data for all clients (assuming your system has several clients); that is the reason you catch this error when working with a CDS view.

Friday, 2 December 2022

Use multi-tenant capabilities of SAP Job Scheduling Service to schedule tenant-specific jobs to pull data from the S/4HANA system

In this blog, we will see how to schedule periodic jobs for each of the subscribing tenants to pull data periodically from the backend system and ensure data isolation for each tenant.

SAP Job Scheduling Service


SAP Job Scheduling service allows you to define and manage jobs that run once or on a recurring schedule. Use this runtime-agnostic service to schedule action endpoints in your application or long-running processes using Cloud Foundry tasks. Use REST APIs to schedule jobs, including long-running jobs asynchronously, and create multiple schedule formats for simple and complex recurring schedules. Manage jobs and tasks and manage schedules with a web-based user interface.
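As a hedged sketch of the REST flavour, the snippet below creates one recurring job for a single tenant; it assumes the /scheduler/jobs endpoint and an OAuth token obtained from the service key's credentials. The URLs, credentials and action endpoint are placeholders, and the exact schedule format should be checked against the service documentation.

import requests

UAA_URL = "https://<subdomain>.authentication.<region>.hana.ondemand.com"  # from the service key
SCHEDULER_URL = "https://jobscheduler-rest.<region>.hana.ondemand.com"      # from the service key
CLIENT_ID, CLIENT_SECRET = "<clientid>", "<clientsecret>"                   # from the service key

# 1. Obtain an OAuth client-credentials token for the service instance.
token = requests.post(
    f"{UAA_URL}/oauth/token",
    data={"grant_type": "client_credentials"},
    auth=(CLIENT_ID, CLIENT_SECRET),
).json()["access_token"]

# 2. Create a recurring job that calls a tenant-specific action endpoint (placeholder URL).
job = {
    "name": "pull-s4-data-tenant-a",
    "action": "https://<app>.cfapps.<region>.hana.ondemand.com/api/pull?tenant=a",
    "active": True,
    "httpMethod": "POST",
    "schedules": [{"repeatInterval": "15 minutes", "active": True}],
}
resp = requests.post(
    f"{SCHEDULER_URL}/scheduler/jobs",
    json=job,
    headers={"Authorization": f"Bearer {token}"},
)
print(resp.status_code, resp.json())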

Wednesday, 30 November 2022

Whatsapp Integration in SAP S/4 HANA On-Premise

Introduction:


In this blog I am going to share my experience with WhatsApp integration. Nowadays, people expect instant communication, and a WhatsApp integration module is an effective way to provide it, for business as well as personal use. It increases business profitability and helps a business strategize its goals to stand out from the competition. WhatsApp is used everywhere in the world for fast communication.

This also makes the solution more secure than conventional email.

Monday, 28 November 2022

Automating SAP HANA Installation in Minutes (AWS) – Part 2

Introduction:


This blog is part 2 of 2 in the series on automating SAP installation in minutes; refer to part 1 of this series for the prerequisites to get started.


This article provides the detailed steps and commands required to get an EC2 instance provisioned, with the XFS file system built automatically, followed by a successful SAP HANA DB installation in AWS.

Monday, 21 November 2022

Automating SAP HANA Installation in Minutes (AWS) – Part 1

This blog describes installing a HANA database automatically in less than 15 minutes in AWS, with some prerequisites such as VPC, AMI, Subnet, Security Group, IG, EC2, EBS, AWS CLI, AWS access keys, the SAP HANA media, and other mandatory services already in place to host the HANA workloads in the AWS cloud.

Operating System (AMI):

The AMI is created with a SUSE Linux environment (SLES 15 SP4) with the latest OS updates and the following packages: amazon-ssm-agent.rpm, nfs-utility, insserv-compat and libltdl7.

Terraform & AWS CLI :

Terraform v0.15.5 for Windows (64-bit) has been installed for building the workloads through the API together with the AWS CLI.

Friday, 18 November 2022

Extensibility Guide for SAP S/4HANA Cloud: SAP Extensibility Explorer

When I first started looking into various ways to extend SAP S/4HANA Cloud, I found a wide range of information from multiple sources. As a result, finding the most recent extension choices required me to browse through several blog posts, which was cumbersome. Then I came across the SAP Extensibility Explorer tool, which presents the various options, and I could find all the information in one place.

In this post, I wanted to share my knowledge and experience about the SAP Extensibility Explorer tool.

Wednesday, 16 November 2022

Building Multi-tenant SaaS Solution and Extension using Nodejs on BTP

This will be a blog series in which we will see how to build Node.js-based applications using the multi-tenant capabilities offered by SAP Business Technology Platform (BTP) to build extensions.

Context


When developing tenant-aware applications in the Cloud Foundry environment, keep in mind the following general programming guidelines:
 
◉ Avoid shared in-memory data, as it may be visible to all tenants.

◉ Avoid any possibility that an application user can execute custom code in the application JVM, as this may give them access to other tenants’ data.

Monday, 14 November 2022

Load Tables Asynchronously in SAP HANA Cloud, data lake Relational Engine

Overview


It is common to load data into HANA Cloud, Data Lake. Loads can also take a long time, and because they run through a database connection, it can be tedious to keep an open DBX (or client application) session to facilitate a long-running load from object storage.

A built-in event scheduler in SAP HANA Cloud, data lake relational engine can be used to schedule SQL functionality. Through this blog you will learn how to schedule data movement from a SAP HANA Cloud, HANA database to a SAP HANA Cloud, data lake relational engine instance using this event scheduler.
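As a rough sketch of the idea (not the blog's exact setup), the statement below creates a nightly event in the relational engine; the connection driver, endpoint and the LOAD_FROM_HANA procedure are assumptions and placeholders for illustration only.

from hdbcli import dbapi  # driver choice is an assumption; use whichever client you connect with

conn = dbapi.connect(address="<hdl-re-sql-endpoint>", port=443,
                     user="<hdl-user>", password="<password>", encrypt=True)
cursor = conn.cursor()
cursor.execute(
    """
    CREATE EVENT nightly_hana_to_hdl
    SCHEDULE nightly_schedule START TIME '02:00 AM' EVERY 24 HOURS
    HANDLER
    BEGIN
        CALL LOAD_FROM_HANA();  -- placeholder procedure that performs the actual data movement
    END
    """
)
conn.close()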

Friday, 11 November 2022

Installation Guide Eclipse in Mac OS

Introduction:

Before diving deep into the technical details, let me briefly explain why we need to do this. The ABAP Development Tool (ADT) is an Eclipse-based tool provided by SAP; you will need ADT if you have to work on ABAP CDS views. Even though CDS views are embedded into the ABAP Dictionary, there are some differences in the features available between the Eclipse and the Data Dictionary environments.

In this blog we will help troubleshoot any issues and learn how to install the studio on Mac OS X from the update site.

Wednesday, 9 November 2022

How can a single Supplier have multiple Contact Persons in SAP S/4HANA Cloud?

Lately, we have received queries about a single supplier having multiple contact persons, belonging to different sales organizations, who raise purchase orders in SAP S/4HANA Cloud.

In this blog post, we shall see how to set up this use case in SAP S/4HANA Cloud.

Step 1: Login to SAP S/4HANA Cloud system with user: BUPA_MASTER_SPECIALIST.

Step 2: Once logged in search for the app “Maintain Business Partner“.

Monday, 7 November 2022

Linking Contract Accounting with an operative service system (Disconnection and Reconnection of Services)

Ever wondered how to link your sub-ledger accounting with an operative service system? My blog article will show you how to use the Disconnection and Reconnection of Services function in SAP S/4HANA Cloud  (also available in on-premise versions) and how to configure it to your own solution!

Content:

◉ Understand the feature
◉ Understand the process
◉ Master data
◉ Disconnection of services
◉ Reconnection of services
◉ Transfer disconnection and reconnection requests
◉ Monitoring of disconnection and reconnection requests
◉ Understand the configuration

Saturday, 5 November 2022

Keep-Data-Clean: Practical approaches to keep your core (S/4 HANA) system clean

“Every company will become a technology company, and every company will become a data company” – Steve Brown, ‘The Innovation Ultimatum’

In the SAP world, almost every company intends to use S/4HANA as its digital core.
But to run any sort of business, many companies also need various other SAP and non-SAP systems and SaaS solutions on top of S/4HANA, such as:
– PIM to manage all categories of products,
– WebShop to handle direct customer orders,
– EWM (advanced) to manage a warehouse,
– CRM to manage customer relations,
– PSP (payment service provider) for digital payments,
– SAP Ariba for spend management,
– Success factors for human capital management,
Etc… just to give some names, there could be many possibilities.

Saturday, 22 October 2022

SAP Data Intelligence and SAP PaPM Cloud integration

Are you someone who enjoys using SAP Data Intelligence Cloud (SAP DI Cloud) for data management, integration and processing, and wonders how it can be integrated with SAP Profitability and Performance Management Cloud (SAP PaPM Cloud)? Well, you’re in luck, because this blog post covers exactly this topic, so just keep on reading!

Scenario:


A source table is created as an input for the View function. 

Note: 

The source table can be created in: 

a) SAP HANA Cloud and consumed by SAP PaPM Cloud using Model Table HANA

b) SAP PaPM Cloud directly using Model Table and Environment fields

Friday, 21 October 2022

Analyze SAP S/4HANA On-Premises Data using SAP Analytics Cloud Powered by SAP Data Warehouse Cloud

This simple blog post helps you understand how data from an SAP S/4HANA on-premises system can be analyzed in SAP Analytics Cloud, connected through SAP Data Warehouse Cloud on SAP Business Technology Platform (BTP).

This blog post covers –

◉ Connectivity of SAP S/4HANA On-premise system to SAP Data Warehouse Cloud.
◉ Creation of a View using SAP Data Warehouse Cloud Data Builder.
◉ Consume the View in SAP Analytics Cloud and create a story.

Wednesday, 19 October 2022

AWS EC2 OS patching automation for SAP Landscape

For an SAP landscape hosted on cloud infrastructure, keeping Linux patching up to date is a key requirement for security hardening, vulnerability remediation and other mandatory compliance requirements on a regular basis. Applying Linux patches across a large SAP landscape is a very time-consuming effort, as it requires a clean stop and start of all SAP applications in the landscape during the exercise.

This document describes how OS patching can be fully automated using AWS Systems Manager for an SAP landscape hosted on AWS EC2, running on Red Hat Linux, with multiple SAP applications running on HANA DB.
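As one hedged building block of such an automation (the stop/start orchestration of the SAP applications is not shown), patching can be triggered with the AWS-RunPatchBaseline document via the Systems Manager API; the region and the tag used for targeting are placeholders.

import boto3

ssm = boto3.client("ssm", region_name="eu-central-1")  # placeholder region

# Run the patch baseline on all instances tagged as part of the SAP landscape (placeholder tag).
response = ssm.send_command(
    Targets=[{"Key": "tag:Landscape", "Values": ["SAP-DEV"]}],
    DocumentName="AWS-RunPatchBaseline",
    Parameters={"Operation": ["Install"], "RebootOption": ["RebootIfNeeded"]},
    Comment="Monthly OS patching for the SAP landscape",
)
print(response["Command"]["CommandId"])  # track the command status in Systems Manager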

Monday, 17 October 2022

SAP S/4HANA Cloud Content Federation with SAP BTP Launchpad Site

In today’s world, end users spend valuable time going through multiple access points to reach their required apps and content. Business processes often span multiple entry points as well and do not provide the single point of entry that users require.

With companies adopting the Two-Tier strategy, SAP is also providing various deployment options for realizing this two-tier ERP strategy. Two-Tier provides enterprises with an opportunity to standardize the end-to-end business processes across multiple tiers. By using SAP S/4HANA Cloud for their Tier 2, customers get the benefit of Software as a Service (SaaS) which can be implemented by standard template, thereby reducing the cost and ancillary IT expenses by having pre-configured solution. But this also introduces an additional entry point from an end user perspective.

A central entry point for business applications simplifies access and increases user productivity. Designing and configuring a central point of access to SAP and third-party solutions (both cloud and on-premise), in particular accessing multiple SAP S/4HANA systems from one common launchpad on SAP Business Technology Platform alleviates a lot of the pain points mentioned above for end-users

Saturday, 15 October 2022

Receive Notifications from Amazon Simple Notification Service for SAP S/4HANA BTP Extension App

Nowadays many companies choose to build business process extension projects for the SAP products they use and deploy them on SAP Business Technology Platform, so that they can write custom code and integrate with other SAP managed services or SaaS (software as a service) offerings easily and quickly. At the same time, hyperscalers (AWS, GCP, Azure, etc.) are playing an increasingly significant role, as they help companies lower capital expenses, increase the scalability and elasticity of their systems, and enhance system performance. Under such circumstances, it is worth understanding how to leverage the services provided by the hyperscaler while developing a BTP-based business process extension project, so that we can benefit from the advantages of both SAP Business Technology Platform and the hyperscaler.

In this blog, I will show you how to integrate Amazon Simple Notification Service (Amazon SNS) with the SAP Cloud Application Programming Model (CAP) to build an SAP S/4HANA business process extension app and receive email notifications by leveraging Amazon SNS. We will focus on how to implement the Amazon SNS service within the CAP app to send out email notifications.

Friday, 14 October 2022

Develop a Machine Learning Application on SAP BTP – Data Science Part

In a series of blog posts, we address the topic of how to develop a Machine Learning Application on SAP BTP. The overall sequence of steps performed by the involved personas is depicted below:


In this particular blog of the series, we focus on the data scientist’s work, i.e., understanding the business problem, performing experiments, creating appropriate machine learning models and finally generating the corresponding design time artifacts. These artifacts can then be exchanged with the application developer by pushing/pulling them to a common git repository.

Wednesday, 12 October 2022

HA/DR Architecture on SAP Shared File system on Windows in Azure Cloud

Introduction: This blog explains a reference architecture for a highly available SAP system on Windows in the Azure cloud with a DR solution. This is one of the recommended architectures from Microsoft and SAP for Windows in Azure, where we utilize Windows DFS-N to support flexible SAP shared file systems.

We will refer to the shared file systems as below, but the same architecture can also be followed for any other shared file system (e.g. interfaces that we would like to have in both regions):

1. \\<ANF or AFS name >\sapmnt
2. \\<ANF or AFS name >\trans

Scenario: This architecture applies in the following scenario:

Monday, 10 October 2022

Data migration from SAP S/4HANA Cloud and SAP HANA Smart Data Integration

Background


This blog post aims to share my experience with data migration activities, migrating data from an SAP S/4HANA Cloud system to an SAP S/4HANA On-Premise system, and how the SAP Migration Cockpit together with SAP HANA Smart Data Integration (SAP HANA SDI) helps with this purpose.

There are a few challenges on this scenario for data migration:

◉ Extract data from SAP S/4HANA Cloud
◉ Transform the data extracted (in files) to use as a source data for Migration Cockpit

Friday, 7 October 2022

How to recreate a HANA Cloud service key aka password rotation

Problem:


SAP HANA Cloud uses BTP services and service keys.

There might be the need to update the service-keys.

Solution:


Warning: This is advanced scripting and you could harm your configurations. Please test carefully in dedicated spaces before applying this to production instances. This also includes development environments/spaces.

Wednesday, 5 October 2022

GDAL with SAP HANA driver in OSGeo4W

I got great news from our SAP HANA multi-model engineering team: the OGR driver for SAP HANA is now included in OSGeo4W too!

OSGeo4W is a binary distribution of a broad collection of open-source geospatial software for Windows environments.

Setup SAP HANA plug-in…


Now, when installing or configuring OSGeo4W on your Windows machine, search for hana and include gdal-hana package to be installed.

Monday, 3 October 2022

Python hana_ml: Classification Training with APL(GradientBoostingBinaryClassifier)

I am writing this blog to show training with APL using the Python package hana_ml. With APL, you can automate preprocessing to some extent; a minimal sketch follows the environment details below.

Environment


Environment is as below.

◉ Python: 3.7.14(Google Colaboratory)
◉ HANA: Cloud Edition 2022.16
◉ APL: 2209
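A minimal sketch of such an APL training run, assuming a training table with an ID key and a binary label column (connection details, table and column names are placeholders):

from hana_ml import dataframe
from hana_ml.algorithms.apl.gradient_boosting_classification import GradientBoostingBinaryClassifier

# Placeholder connection and training table.
conn = dataframe.ConnectionContext(address="<host>", port=443, user="<user>",
                                   password="<password>", encrypt="true")
train_df = conn.table("TRAIN_DATA", schema="ML")

model = GradientBoostingBinaryClassifier()
model.fit(train_df, key="ID", label="IS_FRAUD")   # APL handles most preprocessing itself

# Model quality indicators computed by APL during training (e.g. AUC).
print(model.get_performance_metrics())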

Wednesday, 28 September 2022

SAP HANA Cockpit Installation.

In this blog, we would like to explain the HANA cockpit installation and how to monitor the HANA landscape systems using the HANA cockpit.

In older HANA 1.0 SPS 12 releases, the cockpit was installed along with the HANA server by default, but from HANA 2.0 SPS 01 and higher SP levels the cockpit must be installed as a separate component to manage and monitor the HANA landscape systems.

The HANA cockpit can be installed on a separate server, or within the HANA server itself.

SAP recommends installing the HANA cockpit on a separate server for production environments.

Monday, 26 September 2022

Hana Table Migration using Export & Import

Requirement –


One of the most common requirements in HANA is to move a user-defined table from one HANA environment to another (e.g. Dev > QA > Prod). Export & Import is one of the methods we can use.

Solution –


We will migrate a HANA table from a source HANA environment to a target HANA environment.

Step 1 Export Table – First we have to export the table from the source system.

Connect to the source HANA system and create a sample table, then load data using the script below or any other script.
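The sample-table script from the original post is not reproduced in this excerpt; as a hedged sketch of the export step itself, HANA's EXPORT ... AS BINARY statement can be issued from any SQL client (here via Python), with schema, table and server path as placeholders:

from hdbcli import dbapi

conn = dbapi.connect(address="<source-hana-host>", port=30015,
                     user="<user>", password="<password>")
cursor = conn.cursor()

# Export the table as binary files to a directory on the source HANA server (placeholder path).
cursor.execute(
    """EXPORT "MYSCHEMA"."SALES_DATA" AS BINARY
       INTO '/usr/sap/HDB/HDB00/work/export_sales' WITH REPLACE"""
)
conn.close()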

Wednesday, 21 September 2022

What’s New in SAP HANA Cloud in September 2022

Summer is slowly coming to an end, at least for the colleagues located in the northern hemisphere.  And with the end of the summer, we are also approaching yet another end of a quarter. With this blogpost, I want to provide you with a summary of the exciting innovations that have been introduced to SAP HANA Cloud in Q3 2022.     

To get a detailed overview of individual functionalities and changes, including demos and a Q&A, don’t miss out and register now for our release webinar hosted by the SAP HANA Cloud product management team.

Friday, 16 September 2022

Jupyter Notebook and SAP HANA: Persisting DataFrames in SAP HANA

Introduction


Jupyter Notebook and R are often the tools of choice for data scientists; they make data operations, data exploration and sharing convenient. SAP HANA is the database of choice for many of the most important businesses around the globe. Connecting the two worlds, Jupyter Notebook and SAP HANA, provides incredible potential which needs to be seized.

SAP HANA


Jürgen Müller already provided a great blog post about SAP HANA, but I want to briefly describe in my own words what SAP HANA is. SAP HANA is an in-memory database which provides a lot of benefits, especially for analytical use cases. It combines different aspects of databases within one database: besides the typical properties of relational databases, it also delivers properties of NoSQL databases, such as column-based storage. Depending on the use case, it is possible to activate or deactivate specific properties so that you can get the best performance out of your system.
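A minimal sketch of the persisting step with hana_ml, assuming a running connection to SAP HANA (host, credentials and table name are placeholders):

import pandas as pd
from hana_ml import dataframe

conn = dataframe.ConnectionContext(address="<host>", port=443, user="<user>",
                                   password="<password>", encrypt="true")

local_df = pd.DataFrame({"ID": [1, 2, 3], "VALUE": [10.5, 20.1, 30.7]})

# Persist the pandas DataFrame as a table in SAP HANA and get a hana_ml DataFrame handle back.
hana_df = dataframe.create_dataframe_from_pandas(
    connection_context=conn,
    pandas_df=local_df,
    table_name="DEMO_PERSISTED",  # placeholder table name
    force=True,                   # replace the table if it already exists
)
print(hana_df.collect())          # pull the rows back into the notebook to verify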

Wednesday, 14 September 2022

Python with SAP Databases

How can we leverage the Python programming language in an SAP infrastructure? In multiple ways, I must say.

In this blog we will try to understand how we can establish a connection with the database(s) and how we can execute some basic queries.

But why Python?

Python is easy to learn and flexible, supports connectivity to almost all databases and OS platforms, supports connectivity to SAP via RFC modules, has strong community support, and has a lot of potential for automation.
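A minimal sketch of such a connection using the SAP HANA Python driver hdbcli (host, port and credentials are placeholders):

from hdbcli import dbapi

conn = dbapi.connect(
    address="<hana-host>",
    port=30015,               # e.g. 443 with encrypt=True for SAP HANA Cloud
    user="<user>",
    password="<password>",
)
cursor = conn.cursor()
cursor.execute("SELECT CURRENT_DATE, CURRENT_USER FROM DUMMY")  # basic smoke-test query
print(cursor.fetchone())
cursor.close()
conn.close()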

Monday, 12 September 2022

HANA SPS upgrade from HANA 2.0 Rev 37 to HANA 2.0 Rev 59 on the DR server when primary and secondary server setup as replication are present

There are many environments and larger landscapes where we have a cluster setup. In this environment, however, we have the primary DB replicated to a secondary DB/DR server: the primary system is up and running and is replicated to the secondary DB/DR server.

Note:

While performing an HANA SPS upgrade or HANA revision upgrade, the upgrade first needs to be performed on the DR/secondary server before performing it on the primary server.

Reason: The version present on the DR/secondary server should be higher than or the same as that of the primary server.

Saturday, 10 September 2022

SAP Analytics Cloud – TroubleShooting – Timeline Traces

There are tons of different ways you can run performance traces on SAC for troubleshooting. In this blog I will talk about one of the tracing methods, timeline traces, which can be very helpful when performance problems are observed in SAP Analytics Cloud (SAC), regardless of whether the connection is to BW, S/4HANA, etc.

Step 1:

Log on to your SAC Tenant.

Step 2:

Once you are logged on to the SAC tenant open CHROME developer tools in the same browser window where you have the Tenant open.

Friday, 9 September 2022

SAP HANA Capture and Replay Tool End-to-end Hands-On Tutorial

This tutorial will walk us through the end-to-end steps in SAP HANA capture and replay tool.

What can I do with SAP HANA capture and replay?


The capture and replay tool can capture the real workload from a source SAP HANA system and replay the captured workload against a target SAP HANA system. Both the source system and the target system must be on-premise systems.


Wednesday, 7 September 2022

Python hana_ml: PAL Classification Training(UnifiedClassification)

I am writing this blog to show a basic classification training procedure using the Python package hana_ml. With the class UnifiedClassification, you can use several classification algorithms, and the training result can easily be exported as an HTML report.
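A minimal sketch of such a run, assuming a training table with an ID key and a CLASS label (connection details and names are placeholders):

from hana_ml import dataframe
from hana_ml.algorithms.pal.unified_classification import UnifiedClassification
from hana_ml.visualizers.unified_report import UnifiedReport

conn = dataframe.ConnectionContext(address="<host>", port=443, user="<user>",
                                   password="<password>", encrypt="true")
train_df = conn.table("TRAIN_DATA", schema="ML")   # placeholder training table

# UnifiedClassification wraps several PAL algorithms; pick one via func.
uc = UnifiedClassification(func="RandomDecisionTree")
uc.fit(data=train_df, key="ID", label="CLASS")

# Build and display the HTML model report for the training result.
UnifiedReport(uc).build().display()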


Sunday, 28 August 2022

Load Data from A Local File into HANA Cloud, Data Lake

Overview

Ever wondered how to load data or files from your local machine into the HANA Cloud, data lake?

Then you are at the right place. This blog provides a step-by-step guide on how anyone can easily load data from their local machine into the HANA Cloud, data lake using the data lake Files store.

Step-by-step process:

Firstly, one must provision a Data Lake instance from the SAP HANA Cloud Central through SAP BTP Cockpit. One can learn to do so by going through the following tutorial – Provision a Standalone Data Lake in SAP HANA Cloud | Tutorials for SAP Developers  

Friday, 26 August 2022

Integrating SAP PaPM Cloud & SAP DWC

Since integration topics are currently the talk of the town when it comes to SAP Profitability and Performance Management Cloud (SAP PaPM Cloud), it makes sense to let the community know that one of the most popular data warehousing services, SAP Data Warehouse Cloud (SAP DWC), can easily be integrated with your SAP PaPM Cloud tenant. If you’re curious how to do it, read on.

At the end of this blog post, my goal is for you to be able to: 

◉ Consume a HANA Table from SAP PaPM Cloud’s underlying HANA Database into SAP DWC 
◉ Consume a View created from SAP DWC into SAP PaPM Cloud’s Modeling via Connections 

Let’s start!

Wednesday, 24 August 2022

E-Mail Templates in S/4 HANA- Display table in Email Template

SAP has a very interesting feature in S/4 HANA (cloud and on premise both) – E-Mail Templates.

In this blog post, we will learn how to embed a table with multiple records in an SE80 email template.

Example:

◉ The requirement is to send an email at the end of the month to all employees who have pending hours in their time sheets. Data is taken from standard HR tables, and pending hours are calculated for each employee per project they are assigned to, using the formula: Pending hours = Planned hours – (Approved + Submitted) hours.

Monday, 22 August 2022

CDS Views – selection on date plus or minus a number of days or months

Problem

We need to be able to select data in a CDS view where the selected records are less than 12 months old. This requires comparing a date field in the view with the system date minus 12 months.

The WHERE clause should look something like the following.

Where row_date >= DATS_ADD_MONTHS ($session.system_date,-12,'UNCHANGED')

The problem is that the SQL statement above is not permitted in a CDS view, giving the following error message on activation.

Friday, 19 August 2022

Monitoring Table Size in SAP HANA

This is the second blog post about the RybaFish Charts tool. If you have never heard about the tool, please check the introduction article.

The real power of RybaFish Charts is in custom KPIs. RybaFish supports three KPI types: regular, Gantt and multiline. Today we are going to create a regular KPI to track the memory consumption of a certain column store (CS) table:

Table Size Monitoring
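RybaFish has its own KPI definition format, which is not shown here; independent of the tool, the memory figure such a KPI charts comes from the M_CS_TABLES monitoring view, which you can also query directly, for example from Python (schema and table name are placeholders):

from hdbcli import dbapi

conn = dbapi.connect(address="<host>", port=30015, user="<user>", password="<password>")
cursor = conn.cursor()
cursor.execute(
    """
    SELECT SCHEMA_NAME, TABLE_NAME,
           ROUND(MEMORY_SIZE_IN_TOTAL / 1024 / 1024, 2) AS MEM_MB,
           RECORD_COUNT
    FROM M_CS_TABLES
    WHERE SCHEMA_NAME = ? AND TABLE_NAME = ?
    """,
    ("MYSCHEMA", "MYTABLE"),  # the column store table you want to track
)
print(cursor.fetchall())
conn.close()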

Monday, 8 August 2022

Flatten Parent-Child Hierarchy into Level Hierarchy using HANA (2.0 & above) Hierarchy Functions in SQL

Knock knock! Anyone else also looking for handy illustrations of the hierarchy functions introduced with HANA 2.0? Well count me in then.

If you are trying hard to avoid writing SQL with recursive self-joins to flatten hierarchical data presented as a parent-child relationship, the hierarchy functions in HANA can be a saviour. Let’s look at something easy to implement using the pre-defined hierarchy functions available with HANA 2.0 and above.

As a starter, let’s assume we have a miniature article hierarchy structured as below:
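Since the sample hierarchy table is not reproduced in this excerpt, the sketch below assumes a hypothetical articles(node_id, parent_id) table and shows the HIERARCHY() generator function (HANA 2.0 SPS 03 and above) flattening it with generated rank and level columns:

from hdbcli import dbapi

conn = dbapi.connect(address="<host>", port=30015, user="<user>", password="<password>")
cursor = conn.cursor()
cursor.execute(
    """
    SELECT hierarchy_rank, hierarchy_level, node_id, parent_id
    FROM HIERARCHY (
           SOURCE ( SELECT node_id, parent_id FROM articles )
           START WHERE parent_id IS NULL
         )
    ORDER BY hierarchy_rank
    """
)
for row in cursor.fetchall():
    print(row)   # each node with its generated rank and level
conn.close()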

Saturday, 6 August 2022

Integration of SAP Ariba Sourcing with Qualtrics XM for Suppliers, HANA Cloud Database

In this blog, I will give you an overview of a solution to extract supplier data from a Sourcing event in SAP Ariba Sourcing, and save it in a mailing list in SAP Qualtrics XM for Suppliers, using BTP services.

Process Part 1

First, to extract the information from SAP Ariba Sourcing, I use the Operational Reporting for Sourcing (Synchronous) API to get events by date range, and also the Event Management API which returns supplier bid and invitation information from the sourcing events.

Then, I store the information I need in a SAP HANA Cloud database. I created 3 tables to store the information that I will send to SAP Qualtrics XM for Suppliers: RFx header information, Invitations, and Organizations contact data.

Finally, I send all the information needed to a SAP Qualtrics XM for Suppliers mailing list, which will then handle automatically sending surveys to suppliers that participated in the Ariba sourcing events.

To send the information to SAP Qualtrics XM for Suppliers, I use the Create Mailing List API.

Integration

All this is orchestrated by the SAP Integration Suite, where I created 2 iFlows:

◉ The first iFlow is to get the information from the SAP Ariba APIs, and store it in the SAP HANA Cloud database.

Get RFx Information from Ariba, and store it in SAP HANA Cloud

◉ The second iFlow is to get the information from the SAP HANA Cloud database, and send it to SAP Qualtrics XM for Suppliers via the Mailing List API.

Get information from HANA Cloud and Send Contacts information to Qualtrics 

The SAP HANA Cloud database was created with a CAP application developed in Visual Studio Code.

Final Thoughts

By using SAP Ariba and Qualtrics APIs, as well as a couple of BTP services, integration between the two can be achieved in a very simple way with only a few steps.

Process Part 2


There are two methods to create SAP HANA Cloud artifacts:

HANA Database project

In this method we create database artifacts in a classic database schema using declarative SQL

For this exercise, I created the project in the SAP Business Application Studio as explained in the tutorial. I used these files to create the tables:

rfx.hdbtable

COLUMN TABLE rfx (
  id NVARCHAR(15) NOT NULL COMMENT 'Internal ID',
  title NVARCHAR(1024) COMMENT 'Title',
  created_at DATETIME COMMENT 'Created at',
  updated_at DATETIME COMMENT 'Updated at',
  event_type NVARCHAR(30) COMMENT 'Event Type',
  event_state NVARCHAR(30) COMMENT 'Event State',
  status NVARCHAR(30) COMMENT 'Event Status',
  PRIMARY KEY(id)
) COMMENT 'RFx Information'
 
rfx_invited_users.hdbtable

COLUMN TABLE rfx_invited_users (
  id NVARCHAR(15) NOT NULL COMMENT 'Internal ID',
  unique_name NVARCHAR(1024) NOT NULL COMMENT 'Contact Unique Name',
  full_name NVARCHAR(1024) COMMENT 'Full Name',
  first_name NVARCHAR(250) COMMENT 'First Name',
  last_name NVARCHAR(250) COMMENT 'Last Name',
  email NVARCHAR(250) COMMENT 'Email',
  phone NVARCHAR(30) COMMENT 'Phone',
  fax NVARCHAR(30) COMMENT 'Fax',
  awarded BOOLEAN COMMENT 'Awarded',
  PRIMARY KEY(id, unique_name)
) COMMENT 'Users invited to RFx'
 
rfx_organizations.hdbtable

COLUMN TABLE rfx_organizations (
  id NVARCHAR(15) NOT NULL COMMENT 'Internal ID',
  item INTEGER COMMENT 'Item No',
  org_id NVARCHAR(15) COMMENT 'Organization ID',
  name NVARCHAR(1024) COMMENT 'Name',
  address NVARCHAR(1024) COMMENT 'Address',
  city NVARCHAR(1024) COMMENT 'City',
  state NVARCHAR(1024) COMMENT 'State',
  postal_code NVARCHAR(10) COMMENT 'Postal Code',
  country NVARCHAR(100) COMMENT 'Country',
  contact_id NVARCHAR(1024) NOT NULL COMMENT 'Contact Unique Name',
  PRIMARY KEY(id, item)
) COMMENT 'RFx Organizations'

After creating all files, the project should look like this:


After deployment, you should see all 3 tables in the SAP HANA Cloud database explorer:


Now you can use JDBC (for example) to access the tables in the SAP HANA Cloud database.
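As an alternative to JDBC, here is a hedged Python sketch with hdbcli; the host, credentials and HDI container schema come from the container's service key and are placeholders here:

from hdbcli import dbapi

conn = dbapi.connect(address="<hana-cloud-host>", port=443, user="<hdi-runtime-user>",
                     password="<password>", encrypt=True)
cursor = conn.cursor()

# The hdbtable artifacts above create the tables with lowercase names inside the container schema.
cursor.execute('SELECT id, title, event_type, status FROM "<HDI_SCHEMA>"."rfx"')
for row in cursor.fetchmany(10):
    print(row)
conn.close()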

Multi-target application project, CAP and CDS

In this method we create a multi-target application and use CAP and CDS to generate the SAP HANA database tables, as well as the OData services to access the database.

I used these files for the CDS artifacts:

schema.cds

namespace com.aribaxm.service;

type InternalId : String(15);
type SDate : DateTime;
type XLText : String(2050);
type LText : String(1024);
type SText : String(30);
type MText : String(250);

entity Rfx {
    key id           : InternalId;
        title        : LText;
        createdAt    : SDate;
        updatedAt    : SDate;
        eventType    : SText;
        eventState   : SText;
        status       : SText;
}

entity RfxInvitedUsers {
    key id         : InternalId;
    key uniqueName : LText;
    fullName       : LText;
    firstName      : MText;
    lastName       : MText;
    email          : MText;
    phone          : SText;
    fax            : SText;
    awarded        : Boolean;
}

entity RfxOrganizations {
    key id      : InternalId;
    key item    : Integer;
    orgId       : SText;
    name        : LText;
    address     : LText;
    city        : LText;
    state       : LText;
    postalCode  : SText;
    country     : MText;
    contactId   : LText;
}
 
service.cds

using com.aribaxm.service as aribaxm from '../db/schema';

service CatalogService @(path:'/api/v1') {
    entity Rfx as projection on aribaxm.Rfx;
    entity RfxInvitedUsers  as projection on aribaxm.RfxInvitedUsers;
    entity RfxOrganizations as projection on aribaxm.RfxOrganizations;
}

server.js

"use strict";

const cds = require("@sap/cds");
const cors = require("cors");
//const proxy = require("@sap/cds-odata-v2-adapter-proxy");

cds.on("bootstrap", app => app.use(cors()));

module.exports = cds.server;

The mta.yaml should look something like this:

---
_schema-version: '3.1'
ID: AribaXM
version: 1.0.0
parameters:
  enable-parallel-deployments: true
build-parameters:
  before-all:
    - builder: custom
      commands:
        - npm install --production
        - npx -p @sap/cds-dk cds build --production

modules:
  - name: aribaxm-srv
    type: nodejs
    path: gen/srv
    parameters:
      buildpack: nodejs_buildpack
    build-parameters:
      builder: npm-ci
    provides:
      - name: srv-api # required by consumers of CAP services (e.g. approuter)
        properties:
          srv-url: ${default-url}
    requires:
      - name: aribaxm-db

  - name: aribaxm-db-deployer
    type: hdb
    path: gen/db
    parameters:
      buildpack: nodejs_buildpack
    requires:
      - name: aribaxm-db

resources:
  - name: aribaxm-db
    type: com.sap.xs.hdi-container
    parameters:
      service: hana # or 'hanatrial' on trial landscapes
      service-plan: hdi-shared
    properties:
      hdi-service-name: ${service-name}
 
And the package.json should look something like this:

{
  "name": "aribaxm",
  "version": "1.0.0",
  "description": "A simple CAP project.",
  "repository": "<Add your repository here>",
  "license": "UNLICENSED",
  "private": true,
  "dependencies": {
    "@sap/cds": "^5",
    "express": "^4",
    "@sap/hana-client": "^2",
    "cors": "^2"
  },
  "devDependencies": {
    "sqlite3": "^5",
    "@sap/hdi-deploy": "^4"
  },
  "engines": {
    "node": "^16"
  },
  "scripts": {
    "start": "cds run"
  },
  "eslintConfig": {
    "extends": "eslint:recommended",
    "env": {
      "es2020": true,
      "node": true,
      "jest": true,
      "mocha": true
    },
    "globals": {
      "SELECT": true,
      "INSERT": true,
      "UPDATE": true,
      "DELETE": true,
      "CREATE": true,
      "DROP": true,
      "CDL": true,
      "CQL": true,
      "CXL": true,
      "cds": true
    },
    "rules": {
      "no-console": "off",
      "require-atomic-updates": "off"
    }
  },
  "cds": {
    "requires": {
        "db": {
            "kind": "hana"
        }
    },
    "hana": {
        "deploy-format": "hdbtable"
    }
  }
}
 
After creating all files, the project should look like this:


After deployment, you should see all 3 tables in the SAP HANA Cloud Database Explorer:


And these 2 applications in the BTP space:


Now you can use OData to access the database. In this exercise I didn’t add access control, so you can use your BTP user to execute the OData services from Postman and check the access.
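A minimal sketch of such a call from Python instead of Postman, using the /api/v1 path defined in service.cds (the application URL and credentials are placeholders):

import requests

APP_URL = "https://<aribaxm-srv-route>.cfapps.<region>.hana.ondemand.com"  # placeholder app route

resp = requests.get(
    f"{APP_URL}/api/v1/Rfx",
    params={"$top": 5},                    # standard OData paging option
    auth=("<btp-user>", "<password>"),     # adjust to however your route is protected
)
resp.raise_for_status()
for rfx in resp.json()["value"]:           # OData V4 wraps the entities in "value"
    print(rfx["id"], rfx["title"], rfx["status"])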

Final Thoughts

Now you have two methods of creating SAP HANA Cloud database artifacts, from either SAP Business Application Studio or Visual Studio Code, and you can access the database via JDBC or OData services.

Friday, 5 August 2022

Processing of Prepayments with SAP S/4HANA Accruals Management

Surely you have already heard about SAP S/4HANA Accruals Management?

Starting with S/4HANA OP 2021, Accruals Management provides new functionality to process deferrals. Until now, the processing of prepayments has been a typical manual activity for Finance users. With this new functionality, Finance can achieve efficiency gains, and it is quite easy to implement as well. You can also use the deferrals part even if you have not implemented the accruals part yet.

In this blog I will show you how to set up the SAP S/4HANA Accruals Management for deferrals in order to optimize this process.

Thursday, 4 August 2022

App Extensibility for Create Sales Orders – Automatic Extraction: Custom Proposal for Sales Order Request Fields (Using BAdI)

In the Create Sales Orders – Automatic Extraction app, the system starts data extraction and data proposal for sales order requests immediately after your purchase order files are uploaded. If SAP pre-delivered proposal rules do not satisfy your business needs, key users can create custom logic to implement your own proposal rules.

Here is an example procedure. In your custom logic, you want the system to set the sales area to 0001/01/01 if the company code is 0001, and to set the requested delivery date to the current date plus seven days if this date is initial, for example when ML cannot extract the date from the file or the date does not exist in the file.

Wednesday, 3 August 2022

How to Display Situations in Your Custom Apps with the Extended Framework of Situation Handling

Situation Handling in SAP S/4HANA and SAP S/4HANA Cloud detects exceptional circumstances and displays them to the right users.

I’m the lead UI developer within the Situation Handling framework for end-user facing apps and I’m excited to present a new feature in the Situation Handling extended framework to you. Together with the team from SAP Fiori elements, we introduce a new, simplified way of displaying situations right in your business apps. You can enable a great user experience for your end users with low effort. You should read through this tutorial-style article which explains the details and points you to further resources that you can take as a blueprint for your successful implementation.

With the extended framework of Situation Handling, situations can be displayed in apps based on SAP Fiori elements for OData version 4. You can do this without any front-end development, just by extending the app’s OData service.

Tuesday, 2 August 2022

Currency conversion in BW/4HANA, Enterprise HANA Modelling

In most business models, cost centers or profit centers are located in different countries, and profit and cost are generated in different currencies. At the end of the year, the finance team needs to calculate total cost or profit in a target currency (USD, EUR or another one) of the country where the company’s headquarters is located, in order to generate the ledger and balance sheet. In that scenario we need to perform currency conversion to generate analytics reports. In this blog I am going to discuss the currency conversion steps for different scenarios, i.e. BW/4HANA, Enterprise HANA Modelling and SAP Analytics Cloud.

1. Currency conversion in BW/4HANA:

In our scenario, a finance analyst wants all the profits generated in the Belgium and France plants (in EUR) to be converted into USD, the currency of the organization’s head office, in order to generate the ledger postings.

Monday, 1 August 2022

Backup and Recovery for the SAP HANA (BTP)

SAP HANA (HANA Cloud, HaaS, …) offers comprehensive functionality to safeguard your database: SAP HANA offers automatic backups to back up your database and ensure that it can be recovered quickly and with maximum business continuity, even in cases of emergency. The recovery point objective (RPO) is no more than 15 minutes.

A full backup of every SAP HANA Cloud instance is taken automatically once per day and kept for the last 14 days.

These backups are encrypted using the capabilities of SAP Business Technology Platform. The retention time for backups is 14 days, which means that an instance can be recovered within a 14-day window.

Wednesday, 27 July 2022

Decoding S/4HANA Conversion

To start with, there is always a lot of confusion around S/4HANA conversion projects.

Too many tasks, too many teams, and too many responsibilities. Also, moving away from our beloved Business Suite sparks fear in us.

Let’s explain why we should move away from ECC 6.0:

SAP Maintenance Strategy:

SAP provides mainstream maintenance for core applications of SAP Business Suite 7 software (including SAP ERP 6.0, SAP Customer Relationship Management 7.0, SAP Supply Chain Management 7.0, and SAP Supplier Relationship Management 7.0 applications and SAP Business Suite powered by SAP HANA®) on the respective latest three Enhancement Packages (EhPs) until December 31, 2027.

Saturday, 23 July 2022

XaaS Digital Assets with SAP S/4HANA Public Cloud

In this blog let us go into more detail on the various possible business models and corresponding pricing models that we can have as part of XaaS digital assets. Our focus will be the E2E business process for subscription-based products in SAP S/4HANA Public Cloud with out-of-the-box integration with the SAP Subscription Billing & Entitlement Management solutions.

Let us imagine a digital assets software company with a software portfolio comprising various software products and different pricing models, such as:

◉ Fixed Recurring Charge

◉ Tier Based Pricing

◉ Volume/Usage Based Pricing

Friday, 22 July 2022

XSD Validation for DMEEX


There is new functionality available in DMEEX and delivered across SAP S/4HANA on-premise that allows you to define XSD (XML Schema Definition) validation for your format trees.

If the requester (e.g. a bank or other financial institution) provided you with an XSD defining criteria for the output file, you can now upload the XSD file and take advantage of the file being checked against the schema as soon as it is created.

Wednesday, 20 July 2022

SAP Data Warehouse Cloud bulk provisioning

As our customers adopt SAP Data Warehouse Cloud, we often need to help them set up new users for both training and productive use.  This can be a significant administrative task when there are many users, spaces, connections, and shares needed for each user.  NOTE: SAP provides the SAP Data Warehouse Cloud command line interface (CLI) for automating some provisioning tasks.

For a specific customer, we needed to create 30 training users, along with a Space per user, multiple Connections per user, and numerous shares from a common space.  This could all have been accomplished using the SAP Data Warehouse Cloud user interface but we wanted to go faster, and more importantly make it repeatable.

Monday, 18 July 2022

How can SAP applications support the New Product Development & Introduction (NPDI) Process?

In this blog you will get an overview of how SAP applications can support the New Product Development and Introduction (NPDI) process for the discrete and process industries.

Introduction of NPDI Process:

NPDI stands for “New Product Development and Introduction”, the complete process of bringing a new product to the customer/market. New product development is described in the literature as the transformation of a market opportunity into a product available for sale, and the product can be tangible (something physical you can touch) or intangible (like a service, experience, or belief).

Friday, 15 July 2022

SAP AppGyver – Handling Image : Loading and displaying data

This article is a continuation of the previous one. This article assumes the environment created in the previous article, so please refer to that article if you have not yet done so.

This time, I will explain how to display the image data stored in the BLOB type column of HANA Cloud using the SAP AppGyver application.

Additional development to the AppGyver application

Add a page

In this case, I would like to create a function that displays a list of image IDs and, when one is tapped, displays the corresponding image. I would like to add a separate page for this function, although it could be created on the same page.

Wednesday, 13 July 2022

The SAP Geoenablement Framework (GEF) now authenticates with ArcGIS Enterprise

Transmission line repair

GEF is integrated into SAP Plant Maintenance. GEF allows Plant Maintenance users to do their tasks using a map – regardless of whether the customer is running SAP ERP Central Component (ECC), SAP Business Suite powered by SAP HANA (Suite on HANA), or SAP S/4HANA.

Monday, 11 July 2022

Difference between Role, Authorization Object/s, and Profile

As a Functional Consultant, one may wonder what a Role is and how different it is from the Authorization Object and Profile. While it is mostly the job of the Security team to assign the required Role for a user, it is also the Functional Consultant’s responsibility to provide inputs about the required Transactions, restrictions within a Transaction, and how these restrictions should vary depending on the user.

Let’s begin this blog by defining what a user is. In simple terms, only if a user has already been created in a system can we log in to it with a username and password. In SAP, transaction code SU01 is used to create a user. Using this transaction code, users can be created, modified, deleted, locked, unlocked, and copied to create a new one. Typically, in a project, user creation has certain prerequisites. Initially, the user or the concerned manager requests the user creation by filling in the access form and providing all the required details. This is followed by one or two stages of approval and finally the creation of the user by the Security team.

Friday, 8 July 2022

How to use Custom Analytical Queries in SAP S/4HANA Embedded Analytics?

In this blog post you will learn step by step how to create a report in an SAP Fiori environment on operational SAP S/4HANA data. This is done using the ‘Custom Analytical Query’ SAP Fiori app. This SAP Fiori app is available as standard in SAP S/4HANA Embedded Analytics and allows users to create reports themselves, directly on the operational data. These reports can be consumed in SAP Fiori or in any other visualization application such as SAP Analysis for Office or SAP Analytics Cloud.

How to create a Custom Analytical Query?

To create a Custom Analytical Query the following steps need to be executed:

Step 1: Start the Custom Analytical Query app

Step 2: Create a new report

Step 3: Select the fields

Step 4: Assign to rows and columns

Step 5: Add custom fields

Step 6: Add filters

Step 7: Publish

Wednesday, 6 July 2022

Pass Input Parameters to CV from Parameters derived from Procedure/Scalar Function

I am writing this blog post on SAP HANA input parameters. There are a few blogs on HANA IPs, but they do not give a clear understanding. Here I give a basic example which will make them easy to understand for HANA developers.

Those who have been working on HANA for quite some time and have developed SAP HANA CVs must have worked with input parameters and variables.

A Variable:

Variables are bound to columns and are used for filtering using WHERE clauses. As such, they can only contain the values available in the Columns they relate to.
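To make the contrast concrete, here is a hedged sketch of consuming a calculation view from SQL: the input parameter is passed with the PLACEHOLDER syntax, while a variable value simply ends up as a WHERE filter. Package, view, parameter and column names are hypothetical.

from hdbcli import dbapi

conn = dbapi.connect(address="<host>", port=30015, user="<user>", password="<password>")
cursor = conn.cursor()
cursor.execute(
    """
    SELECT region, SUM(sales_amount)
    FROM "_SYS_BIC"."demo.pkg/CV_SALES"
         ( PLACEHOLDER."$$IP_CURRENCY$$" => 'USD' )   -- input parameter
    WHERE region = 'EMEA'                             -- filter a variable would apply
    GROUP BY region
    """
)
print(cursor.fetchall())
conn.close()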

Friday, 1 July 2022

SAP AppGyver – Handling Image: Data Writing

Now, here is an article on SAP AppGyver.

Today I would like to explain how to handle image data. Images can be found at ….. There are pros and cons to storing them in a BLOB-type column in HANA Cloud.

Assumption

In this article, I will explain how to create an application that takes a photo and stores it in a BLOB type column in HANA Cloud.

Wednesday, 29 June 2022

Providing a solution to an agile business requirement with SAP BTP

In this blog, we will describe the process of identifying and adjusting the right pieces of the SAP BTP platform in order to solve a specific customer request. This process starts with fully understanding the business needs, and then how these translate to different SAP BTP components, in order to answer not only the current requirement but also future ones.

Business Case

A self-service mechanism was requested by a customer (mainly by business users) to quickly create or edit new derived, time-dependent measures. This mechanism will help them make faster and better business decisions. The key points which drove the provided solution were mainly two: who will use it (business users) and how/where (from the reporting layer).

Monday, 27 June 2022

Analyzing High User Load Scenarios in SAP HANA

If you are an ERP/NetWeaver system administrator, you will face many scenarios where you experience high resource utilization in the HANA DB. In order to correct these situations, you need to analyze the root cause of the load. This blog post will help you with the analysis process to find the root cause of the load and identify the exact application user which caused it on your HANA DB.

When you get reports from your monitoring tools or users about performance issues in the system, do the following:

◉ Login to HANA Cockpit

◉ Open the Database (usually tenant) which is affected by the issue.

◉ Go to CPU Usage -> Analyze Workloads

Friday, 24 June 2022

SAP PaPM Cloud: Downloading Output Data Efficiently

Let’s say that as a Modeler, you have successfully uploaded your data into SAP Profitability and Performance Management Cloud (SAP PaPM Cloud) and utilized SAP PaPM Cloud’s extensive modeling functions for enrichment and calculation. And as a result of your Modeling efforts, you now have the desired output that you would like to download from the solution. The question is: Depending on the number of records, what would be the most efficient way to do this?

To keep it simple, I’ll use ranges to differentiate small and large output data. Under these two sections are step-by-step procedures for downloading the results generated from SAP PaPM Cloud which, based on my experience, are the most efficient way.

Friday, 10 June 2022

SAP Tech Bytes: CF app to upload CSV files into HANA database in SAP HANA Cloud

Prerequisites

◉ SAP BTP Trial account with SAP HANA Database created and running in SAP HANA Cloud

◉ cf command-line tool (CLI)

If you are not familiar with deploying Python applications to SAP BTP, CloudFoundry environment, then please check Create a Python Application via Cloud Foundry Command Line Interface tutorial first.

I won’t repeat the steps from there, like how to log on to your SAP BTP account using the cf CLI, but I will cover the extras we are going to work with and experiment with here.
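Independent of the Cloud Foundry wrapper, the core upload step looks roughly like the sketch below: read a local CSV and bulk-insert it into a table in SAP HANA Cloud (host, credentials, table and file names are placeholders).

import csv
from hdbcli import dbapi

conn = dbapi.connect(address="<hana-cloud-host>", port=443, user="<user>",
                     password="<password>", encrypt=True)
cursor = conn.cursor()
cursor.execute('CREATE COLUMN TABLE "CSV_UPLOAD" (ID INTEGER, NAME NVARCHAR(100), AMOUNT DECIMAL(15,2))')

# Read the local file and bulk-insert all rows in one round trip.
with open("data.csv", newline="") as f:
    rows = [(int(r["ID"]), r["NAME"], r["AMOUNT"]) for r in csv.DictReader(f)]

cursor.executemany('INSERT INTO "CSV_UPLOAD" VALUES (?, ?, ?)', rows)
conn.commit()
conn.close()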

Thursday, 9 June 2022

SAP HANA On-Premise SDA remote source in SAP HANA Cloud using Cloud Connector

SAP Cloud Connector serves as a link between SAP BTP applications and on-premise systems. It runs as an on-premise agent in a secured network and provides control over the on-premise systems and resources that can be accessed by cloud applications.

In this blog, you will learn how to enable the Cloud Connector for a HANA Cloud instance, how to install and configure the Cloud Connector, and how to connect an SAP HANA on-premise database to SAP HANA Cloud using an SDA remote source.
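
For orientation, the HANA Cloud side of the setup ultimately comes down to a CREATE REMOTE SOURCE statement using the hanaodbc adapter, which could be issued from Python as sketched below. The host, port, and credentials are placeholders, and the Cloud Connector-specific configuration options are deliberately left out; take the exact configuration string from the SAP documentation for your landscape.

from hdbcli import dbapi

conn = dbapi.connect(address="<hana-cloud-host>", port=443, user="<admin-user>", password="<password>")
cursor = conn.cursor()

# Sketch of an SDA remote source pointing to an on-premise HANA system.
# Cloud Connector-specific configuration options are omitted on purpose;
# take the full CONFIGURATION string from the SAP documentation for your landscape.
cursor.execute(
    """CREATE REMOTE SOURCE "ONPREM_HANA" ADAPTER "hanaodbc"
       CONFIGURATION 'ServerNode=<virtual-host>:<virtual-port>'
       WITH CREDENTIAL TYPE 'PASSWORD' USING 'user=<remote-user>;password=<remote-password>'"""
)

cursor.close()
conn.close()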

Wednesday, 8 June 2022

How to deal with imported Input Data’s NULL values and consume it in SAP PaPM Cloud

Hello there! I will not bother you with an enticing introduction and will get straight to the point. If you are:

(a) Directed here because of my previous blog post SAP PaPM Cloud: Uploading Input Data Efficiently or;

(b) Redirected here because of a quick Google search result or what not…

Either way, you are curious about how a user could work with a HANA table containing NULL values upon data import and consume this model in SAP Profitability and Performance Management Cloud (SAP PaPM Cloud). In that case, I have you covered with this blog post.
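
One common database-side approach, sketched below with hypothetical table and column names, is to expose the table through a view that replaces NULLs with neutral defaults using IFNULL (or COALESCE) before the model consumes it. Whether a default value or keeping the NULL is appropriate depends on your PaPM model.

from hdbcli import dbapi

conn = dbapi.connect(address="<hana-host>", port=443, user="<user>", password="<password>")
cursor = conn.cursor()

# Hypothetical view that replaces NULLs with neutral defaults before consumption
cursor.execute(
    """CREATE VIEW "INPUT_DATA_CLEAN" AS
       SELECT "COMPANY_CODE",
              IFNULL("PROFIT_CENTER", '#') AS "PROFIT_CENTER",
              IFNULL("AMOUNT", 0)          AS "AMOUNT"
       FROM "INPUT_DATA" """
)

cursor.close()
conn.close()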

Monday, 6 June 2022

Exception Aggregation in SAP SAC, BW/BI and HANA: A Practical approach

Today I am going to discuss a very useful topic, exception aggregation, in terms of concepts and usage scenarios in SAP Analytics Cloud, BW/BI, and HANA.

In all analytics reports and dashboards, key figures are shown at an aggregated level. But the main question is: how is the aggregation done and shown in the report?

With standard aggregation applied to a calculated key figure, the key figure is aggregated by grouping by all the dimensions in the query, producing one row per dimension combination in a single select.
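
To make the difference concrete, here is a small, purely illustrative pandas sketch with made-up data: standard aggregation simply sums the key figure per product, whereas an exception aggregation such as "average over CALDAY" first aggregates per calendar day and only then applies the average.

import pandas as pd

# Hypothetical fact data: one key figure (AMOUNT), two dimensions (PRODUCT, CALDAY)
df = pd.DataFrame({
    "PRODUCT": ["A", "A", "A", "B"],
    "CALDAY":  ["2022-06-01", "2022-06-01", "2022-06-02", "2022-06-01"],
    "AMOUNT":  [10, 20, 30, 40],
})

# Standard aggregation: simply SUM the key figure per product
standard = df.groupby("PRODUCT")["AMOUNT"].sum()           # A: 60, B: 40

# Exception aggregation "AVG over CALDAY": first SUM per day, then average the daily sums
daily = df.groupby(["PRODUCT", "CALDAY"])["AMOUNT"].sum()
exception = daily.groupby(level="PRODUCT").mean()          # A: (30 + 30) / 2 = 30, B: 40

print(standard)
print(exception)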

Monday, 30 May 2022

Transforming Hierarchy using HANA Calculation view

Introduction:

This blog post is about the usage of two powerful nodes, namely the Hierarchy Function node and the Minus node, in HANA calculation views. Both nodes are available in SAP HANA 2.0 XSA and SAP HANA Cloud.

On premise, the Minus node and the Hierarchy Function node are available starting with SAP HANA 2.0 SPS01 and SPS03 respectively, and both are available in SAP HANA Cloud.

This use case will be helpful in business scenarios where one wants to migrate from SAP BW 7.x to SAP HANA 2.0 XSA or SAP HANA Cloud. SAP BW is known for data warehousing and strong reporting capabilities, and when migrating from SAP BW to SAP HANA 2.0 or SAP HANA Cloud, some of those features are not available out of the box. In this case, HANA Cloud is used as the backend for data processing and modelling, and Analysis for Office is used for reporting.
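
At the SQL level, which the calculation view nodes roughly correspond to, the two techniques can be sketched as below via the hdbcli client. The HIERARCHY function, its generated HIERARCHY_* columns, and the NODE_ID/PARENT_ID requirement on the source query come from the SAP HANA SQL documentation, but the table and column names here are hypothetical; double-check the exact column set available in your HANA version.

from hdbcli import dbapi

conn = dbapi.connect(address="<hana-host>", port=443, user="<user>", password="<password>")
cursor = conn.cursor()

# Flatten a hypothetical parent-child table with the HIERARCHY function;
# the source subquery must expose NODE_ID and PARENT_ID columns
cursor.execute(
    """SELECT NODE_ID, PARENT_ID, HIERARCHY_LEVEL, HIERARCHY_RANK
       FROM HIERARCHY(SOURCE (
           SELECT "COSTCENTER" AS NODE_ID, "PARENT_COSTCENTER" AS PARENT_ID
           FROM "COSTCENTER_HIER"
       ))"""
)
print(cursor.fetchall())

# Minus-style logic in plain SQL: rows present in the current load but not in the previous one
cursor.execute(
    """SELECT "COSTCENTER" FROM "CURRENT_LOAD"
       EXCEPT
       SELECT "COSTCENTER" FROM "PREVIOUS_LOAD" """
)
print(cursor.fetchall())

cursor.close()
conn.close()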

Saturday, 28 May 2022

Extending business processes with product footprints using the Key User Extensibility in SAP S/4HANA

With SAP Product Footprint Management, SAP provides a solution, giving customers transparency on their product footprints, as well as visibility into the end-to-end business processes and supply chains. 

SAP S/4HANA comes with the Key User Extensibility concept, which is available both in the cloud and on-premise versions.

Key User Extensibility, together with product footprints calculated in SAP Product Footprint Management, enables customers to enrich end-to-end business processes with sustainability information, helping to implement the “green line” in the sustainable enterprise. With Key User Extensibility, this can be achieved immediately: the extension of the business processes can be introduced right away by customers and partners during an implementation project.

Friday, 27 May 2022

Configuration of Fiori User/Web Assistant with/without Web Dispatcher for S/4 HANA On-Premise System

Overview –

The Web Assistant provides context-sensitive in-app help and is an essential part of the user experience in SAP cloud applications. It displays as an overlay on top of the current application.

You can use the Web Assistant to provide two forms of in-app help in SAP Fiori apps:

◉ Context help: Context-sensitive help for specific UI elements.

◉ Guided tours: Step-by-step assistance to lead users through a process.

Wednesday, 25 May 2022

Migration Cockpit App Step by Step

Migration Cockpit is an S/4HANA app that replaces LTMC as of release 2020 (on-premise).

This is a powerful data migration tool included in the S/4HANA license, and it delivers preconfigured content with automated mapping between source and target. This means that if your needs match the available migration objects, you do not have to build a tool from scratch; everything is ready to use, reducing the effort for your data load team.

Migration Cockpit Illustration by SAP

Monday, 23 May 2022

LO Data source enhancement using SAPI

In this blog we will discuss LO DataSource enhancement using SAPI. The scenario is the same.

For a particular order, we need the material status and other fields in our data flow; the material number for this is available in our DataSource 2LIS_04_P_MATNR.

But before going into the implementation, I want to discuss the enhancement framework architecture.

It is better practice to enhance, i.e. append to, the communication structure instead of directly appending to the extract structure, as this increases the scope for reusability.

Sunday, 22 May 2022

SAP BW/4HANA DS connection (MS SQL DB) source system via SDA

Introduction:

As you are all aware by now, you cannot connect DS directly to the SAP BW/4HANA system; hence, you need to connect the DS DB to the HANA database via SDA and create a source system.

Based on the customer requirements, we set up a HANA DB connection to the MS SQL DB and then set up the source system.

DISCLAIMER

The content of this blog post is provided “AS IS”. This information could contain technical inaccuracies, typographical errors, and out-of-date information. This document may be updated or changed without notice at any time. Use of the information is therefore at your own risk. In no event shall SAP be liable for special, indirect, incidental, or consequential damages resulting from or related to the use of this document.

Friday, 20 May 2022

Data driven engineering change process drives Industry 4.0

We are “Bandleaders for the Process.” Our mission is to orchestrate plant operation, with leadership across the manufacturing value chain. This reminds me of my job in the plant 20 years ago.

One important mission was to manage engineering change; it required a lot of time and attention to plan, direct, control and track all the activities across the team with multiple files and paper documents:

◉ What is the impact of change?

◉ When will the new parts come from the suppliers? How many old parts do we have in stock?

◉ Which production order should be changed? What is the status of production orders?

◉ Are the new tools ready? Have all of the build package documents been revised?

Wednesday, 18 May 2022

Rise with SAP: Tenancy Models with SAP Cloud Services

Introduction

Transitioning to Rise with SAP cloud services, SAP customers have a choice of opting for either a single-tenanted or a multi-tenanted landscape. The choice of tenancy model largely depends on the evaluation of risk, type of industry, classification of data, security, and sectoral and data privacy regulations. Other considerations include performance, reliability, shared security governance, migration, cost, and connectivity. While customer data is always isolated and segregated between tenants, the level of isolation is a paramount consideration in choosing a tenancy model.

In this blog, we will cover the tenancy models available under Rise with SAP cloud services and explore their nuanced differences and some of the considerations for choosing each of them.