Thursday, 29 December 2016

SAP HANA 2.0 SPS 00 What’s New: Administration – by the SAP HANA Academy

Introduction

We will be posting new videos to the SAP HANA Academy to show new features and functionality introduced with SAP HANA 2.0 Support Package Stack (SPS) 00.

Tutorial Video

Tuesday, 27 December 2016

Input parameter based on procedure of type Date

Use case: the user is asked to enter a date. If nothing is specified, it should default to the first day of the current month; otherwise the user-specified value should be used to filter the data, using a graphical calculation view.

If you are thinking of doing this with an input parameter of type “Derived from Procedure/Scalar function”,
then you are almost there, with just two hurdles to cross.

For this demonstration I’m using the table structure below, in which the field
DOJ is of type DATE; the input parameter will be applied to this field.
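The defaulting rule the procedure has to implement can be sketched as follows — shown here in Python purely to illustrate the logic, not as the SQLScript implementation itself:

```python
from datetime import date

def resolve_date_filter(user_input=None):
    """Return the filter date: the user's value if supplied,
    otherwise the first day of the current month (the default)."""
    if user_input:
        return user_input
    return date.today().replace(day=1)
```

Whatever the procedure returns is then applied as the filter value on DOJ.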

Port of Antwerp from the Opendata challenge perspective – part 4

In the last part (part 3), we saw how to import a CSV file into HANA using HANA Studio, converting the geoJSON field into WKT. Let’s now see how we can handle the same content in JSON format.
If you remember, our CSV file (http://datasets.antwerpen.be/v4/gis/grondwaterstroming.csv) is also available in JSON format.

So the link to our JSON content is: http://datasets.antwerpen.be/v4/gis/grondwaterstroming.json. A request will only return the first thousand rows (luckily this dataset has only 123 rows, but I will show you how to implement pagination anyway).
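The pagination loop can be sketched as follows — a Python illustration that assumes the service accepts a `page` query parameter and wraps its rows in a `data` array (check the actual API documentation for the real parameter names):

```python
import json
import urllib.request

def collect_pages(fetch_page):
    """Accumulate rows page by page until an empty page is returned.
    fetch_page(n) must return the list of rows for page n."""
    rows, page = [], 1
    while True:
        batch = fetch_page(page)
        if not batch:
            break
        rows.extend(batch)
        page += 1
    return rows

def fetch_json_page(base_url, page):
    """Fetch one page; assumes a 'page' query parameter and a
    'data' array in the response (an assumption, not the documented API)."""
    with urllib.request.urlopen(f"{base_url}?page={page}") as resp:
        return json.load(resp).get("data", [])
```

Separating the paging loop from the HTTP call makes the loop easy to test without a network connection.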

Saturday, 24 December 2016

Port of Antwerp from the Opendata challenge perspective – part 3

In the last part (part 2), we saw how to import a CSV file into HANA using HANA Studio. Let’s now see how we can take care of the geometry column.
If you want to learn more about geoJSON, you can check the following website: http://geojson.org/

If you remember, our CSV file (http://datasets.antwerpen.be/v4/gis/grondwaterstroming.csv) has a field named “geometry” which is a geoJSON stream.

As stated before, HANA does not yet support this format with the ST_GEOMETRY data type constructor, so we will need to convert this field to the “Well-Known Text” (WKT) format.
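The conversion itself is mechanical. Here is a minimal Python sketch of turning a geoJSON (Multi)Polygon into WKT, assuming coordinates come as [x, y] pairs — it illustrates the mapping rather than replacing the approach shown in this post:

```python
def geojson_to_wkt(geometry):
    """Convert a (Multi)Polygon geoJSON geometry dict to Well-Known Text,
    the format accepted by the ST_GEOMETRY constructor."""
    def ring(coords):
        return "(" + ", ".join(f"{x} {y}" for x, y in coords) + ")"
    gtype = geometry["type"]
    if gtype == "Polygon":
        return "POLYGON (" + ", ".join(ring(r) for r in geometry["coordinates"]) + ")"
    if gtype == "MultiPolygon":
        polys = ", ".join(
            "(" + ", ".join(ring(r) for r in poly) + ")"
            for poly in geometry["coordinates"])
        return "MULTIPOLYGON (" + polys + ")"
    raise ValueError(f"unsupported geometry type: {gtype}")
```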

Friday, 23 December 2016

Port of Antwerp from the Opendata challenge perspective – part 2

Now that we have our HANA instance up and running on the SAP HANA Cloud Platform trial, as described in part 1 of this blog series, we can start importing CSV data.

This dataset is available in different formats: CSV, JSON, XML, KML and MAP.

Step 1: Download and explore the file locally

Open the following URL and save the file locally: http://datasets.antwerpen.be/v4/gis/grondwaterstroming.csv

Open the file with a text editor such as Notepad++, TextPad or any editor you are used to. It contains only 123 rows.

Thursday, 15 December 2016

Enterprise Readiness with SAP HANA – Persistence, Backup & Recovery

Enterprise Readiness – Planning Your Data Center

As High Availability and Disaster Recovery become increasingly important topics for many enterprise customers embarking on digitization, we will be taking time to explain more on this topic in a blog series to help customers better plan and prepare their data center operations, with SAP HANA as the platform of focus.

We began this series with an earlier blog on Enterprise Readiness, and will continue with more topics in the following months.

Wednesday, 14 December 2016

Enterprise Readiness with SAP HANA

Data Centers as the “Power Plants” of the digital Economy

Data centers are the power plants of today’s businesses. However, planning high availability into a system landscape is often overlooked. As more elements, ranging from devices and people to equipment and other systems, become increasingly connected, the cost of an unplanned outage is becoming more significant. In the always-on digital economy, companies that rely on data to make decisions, conduct transactions, and interact with consumers cannot afford data center blackouts.

Tuesday, 13 December 2016

Port of Antwerp from the Opendata challenge perspective – part 1

Getting Access to the SAP HANA Cloud Platform trial platform

Here is how to register for a developer/trial account: sign up for a free trial account on SAP HANA Cloud Platform.

It should not take more than 5 minutes to receive the account activation link and get to the HCP Cockpit. You will be assigned a “P” user ID (unless you already have an SAP account and want to use your “S” user ID).

Make the new standalone HANA 2 Cockpit work for you

After installing my first HANA 2 system, I realized that the HANA Cockpit is no longer an integral part of the database instance as it was with HANA 1, as is clearly stated in the HANA Cockpit Installation and Update Guide:


Saturday, 10 December 2016

EHP upgrade and HANA migration. Lessons learnt

The system was based on SAP EHP release 7.01, running on an Oracle database of over 8.7 terabytes. Both the pre-existing landscape and the target one are hosted on premises, on virtual servers under vSphere 6.0. Software-wise, our customer’s system is based on the Vehicle Management System (IS-A-VMS), although with a significant amount of customizations and custom code.

The original motivation to initiate this project followed an analysis of the usage of the system, and based on our predictions of additional data volume, additional markets, etc. it was quite clear that we needed to take some measures to keep the response times within reasonable limits. Furthermore, it was quite obvious that HANA was the future for SAP based systems, so at some point we would have had to contemplate a migration.

Friday, 9 December 2016

Dynamic filter / User Exit Variable for HANA Views

Requirement: Based on user input or the current month, we want to restrict the data in our view. A common requirement is to show the last 12/13 months of data dynamically. In general, we want to go 'n' months back from the current month, or from whatever month the user selects.

Solution: From SPS 09, HANA allows an input parameter of type "Derived From Procedure/Scalar Function", and I will use this to meet the requirement. The good thing about this solution is that we can restrict the data at the first projection node and avoid bringing unnecessary data to the upper nodes. We shall write a procedure which takes a month input from the user (or the current month if nothing is supplied) and returns the "to" and "from" months.
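The month arithmetic the procedure must perform can be sketched in Python (illustration only; the actual implementation is a SQLScript procedure, and the 'YYYYMM' format is an assumption for the sketch):

```python
from datetime import date

def month_window(months_back, as_of=None):
    """Return (from_month, to_month) as 'YYYYMM' strings, going
    months_back months back from as_of (default: today)."""
    as_of = as_of or date.today()
    # Count months since year 0, step back, then split into year/month again
    total = as_of.year * 12 + (as_of.month - 1) - months_back
    from_y, from_m = divmod(total, 12)
    return (f"{from_y}{from_m + 1:02d}", f"{as_of.year}{as_of.month:02d}")
```

The same window is then applied as the filter at the first projection node.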

Thursday, 8 December 2016

Upgrade your HANA, express edition to HANA 2

  1. Temporarily increase memory to 23.9 GB
  2. Create temporary user
  3. Uninstall HANA 1
  4. Install HANA 2
  5. Install AFL
In order to allow for the upgrade to HANA 2, I have to temporarily increase the memory of my virtual machine to 23.9 GB as before:

Wednesday, 7 December 2016

Using Synonyms in SAP HANA

In this blog post I try to explain what synonyms are and how they work in HANA SQL, but most of it will apply to other databases as well.

Use Case, Users and Schemas

Let's consider the following use case:
  • One user (SYN_PROV) creates/owns DB objects like tables, and provides access to its objects via synonyms. That could be, e.g., a replicated ERP schema.

Monday, 5 December 2016

Enable Operational Process Intelligence on your HANA, express edition

With the HANA Rules Framework (HRF) from my previous blog in place, I was wondering whether I could take my HANA, express edition to the next level and enable Operational Process Intelligence (OPI), which requires HRF as a prerequisite.

As it turned out, this was again quite straightforward:
  1. Download and install the delivery units
  2. Configure OPI
  3. Install Eclipse Business Scenario and Workflow Editor plugins
  4. Model Business Scenario

Saturday, 3 December 2016

SAP HANA 2.0 SPS 00 What’s New: Security – by the SAP HANA Academy

Encryption

Data volume encryption was introduced with SAP HANA 1.0 SPS 09. With 2.0, log volume encryption is now also supported. You enable log volume encryption using SQL. Data volume encryption can also be enabled using the new SAP HANA cockpit.

Friday, 2 December 2016

SAP HANA and BW Mixed Scenarios Architecture

What is a Data Warehouse Mixed Architecture?

It’s an SAP best practice for modern data warehousing. Simply put, it’s a data model that is implemented at the same time in SAP BW and native SAP HANA.


Thursday, 1 December 2016

Enable the HANA Rules Framework on your HANA, express edition

With SDI in the bag, I was wondering whether I could enable the HANA Rules Framework (HRF) on my HANA, express edition in a similar way.

As it turned out, this was also absolutely possible and in fact quite straightforward as well:
  1. Increase the virtual machine memory to 12GB
  2. Create a tenant database
  3. Enable HTTP and HTTPS access to the tenant database
  4. Import the HRF delivery unit
  5. Set Up a Technical User

Monday, 28 November 2016

Visualizing SAP HRF Rule Results in SAP Lumira – Part 2

SAP Lumira Desktop Preparation and Presentation

Data Source

The first thing to understand is that SAP Lumira cannot be directly connected to the SAP HRF column view “SAP_HRF”.“sap.demo-store.sales.resources ::Sales Analysis.VIEW”. The SAP Lumira data source must be a cube such as those created by an Analytic View or a Calculation View. I therefore need to build one of these against my rule service column view.

Saturday, 26 November 2016

SAP Earth Observation Analysis Microservices on YaaS

Getting Started:

The following describes the steps to use the SAP Earth Observation Analysis service:

1. Sign In under My Account on YaaS. If not a member, you will need to Register under My Account on YaaS. It is free.
2. Go to the YaaS Market and choose the option Worldwide (Beta) at the top-right corner of the window. Scroll down and click on the SAP Earth Observation Analysis package.
3. Once the request for subscription is made, create a project for the organization to subscribe to a package. Enter the display name of the project as you want it to display in the Builder. The project identifier is a unique identifier.

Friday, 25 November 2016

Visualizing SAP HRF Rule Results in SAP Lumira – Part 1

SAP HANA Rules Framework Overview

SAP HANA Rules Framework is well suited to handling complex analytical rule constructs in real-time (online) and batch (offline) scenarios, and makes the best use of the power of SAP HANA to achieve this. The built-in Fiori user interface distributed as part of SAP HRF is excellent at rule creation and management, and allows rule results to be displayed in textual form. SAP HRF is all about allowing business users to control, influence and personalize decision making in highly dynamic scenarios within the organisation in the most efficient and robust manner. Outside of SAP HRF, it is the calling application's responsibility to make use of the decision results and actions that SAP HRF has returned.

Securing SuccessFactors Extensions on SAP HANA Cloud Platform

Overview

SAP HANA Cloud Platform, extension package for SuccessFactors allows you to extend your SuccessFactors scope with applications running on the platform. The extension package makes it quick and easy for companies to adapt and integrate SuccessFactors cloud applications to their existing business processes, thus helping them maintain competitive advantage, engage their workforce and improve their bottom line.

Thursday, 24 November 2016

Enable point-in-time recovery for your HANA, express edition

When you first connect HANA Studio to your HANA, express edition, you will be presented with 1 alert with HIGH priority:


HANA Cockpit paints the same picture:

Monday, 21 November 2016

Installing HANA Express on AWS – Detailed Walkthrough

Introduction

This blog details the steps to deploy your own HANA Express (HXE) Instance on Amazon Web Services (AWS).

A couple of reasons I used AWS to host a HXE instance –
  • Didn’t have a computer with the recommended specs (namely the 16GB of RAM)
  • Initially I tried using SAP’s Cloud Appliance Library to deploy an HXE instance, but unfortunately it’s restricted to the “US EAST” region, which isn’t ideal from Australia.
  • AWS provides a lot of flexibility and is cost effective for hosting your own HANA system

Tuesday, 15 November 2016

Increasing the SAP-NLS Performance


With the introduction of smart data access (SDA), especially between SAP HANA and IQ, the data provisioning process can be optimized. Nevertheless, some additional parameters have to be introduced on both the ABAP and HANA back end.

Sunday, 13 November 2016

SAP HANA XS Classic, Access your first data in a SAP HANA XSC Application

Open the Web-based Development Workbench

Using the SAP HANA Developer Edition or SAP HANA Cloud Platform

The workbench allows you to develop on HANA without the need to set up a local development environment.

Login to the HANA Cloud Cockpit with your free developer edition account.

Saturday, 12 November 2016

SAP Text Analysis Microservices on YaaS

Introducing Text Analysis for everyone…

Text Analysis microservices – built on SAP HANA – allow you to process unstructured textual content from a variety of document formats and transform it into structured data without the liability or concern for hardware infrastructure, installation, or maintenance. This information discovery service facilitates enterprise decision-making processes by extending the value of your business intelligence investments that report off structured data. You can find these analytical microservices on the SAP Hybris as a Service (YaaS) Market and incorporate them into whichever technology platform you use.

Thursday, 10 November 2016

Transforming Database Management with the new SAP HANA 2 feature, Active/Active read-enabled

SAP TechEd Barcelona 2016 is happening this week, from 8 to 10 November, and an exciting announcement was made for SAP HANA 2. With the launch of the next-generation platform SAP HANA 2, SAP is looking to bring greater stability and agility to customers building a solid foundation for innovation and digitalization.

A key feature announced is Active/Active Read-Enabled. Along with enhanced data encryption, workload management and the SAP HANA cockpit, these new features help simplify operations and free up IT to focus on innovation.

Wednesday, 9 November 2016

Backup the Database(s) for SAP HANA Express – by the SAP HANA Academy

There are three tools you can use to make a backup of your SAP HANA database:
  1. SAP HANA cockpit
  2. SAP HANA studio
  3. hdbsql
For the VM, I should perhaps add that you can of course also create a snapshot of the VM, but this requires quite a bit of hard disk space.
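As an illustration of the hdbsql route, here is a hedged Python sketch that assembles the call; the instance number, user and backup prefix are placeholders, and the BACKUP DATA statement writes files starting with the given prefix into the configured backup directory:

```python
import subprocess

def backup_command(instance, user, prefix):
    """Assemble an hdbsql invocation that triggers a complete data backup."""
    sql = f"BACKUP DATA USING FILE ('{prefix}')"
    return ["hdbsql", "-i", str(instance), "-u", user, sql]

# On a host where hdbsql is on the PATH (it will prompt for the password):
# subprocess.run(backup_command(90, "SYSTEM", "MONDAY"), check=True)
```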

Tuesday, 8 November 2016

SAP HANA Live Installation Tips + Tables.

While importing delivery units into your HANA system, you can sometimes run into common errors which can easily be fixed without opening an SAP incident.

Let's look at an example.

Here you are importing SAP HANA Analytics into your system. During the import you see an error:


Monday, 7 November 2016

SAP HANA TAKEOVER AND FAILBACK TEST

We have a database on two different servers, melcoprd and melcodev. Our database is an MCOD scenario with 3 tenant databases.


Saturday, 5 November 2016

Secure your HANA Cloud Connector with OpenSSL certificates – Part 3

In parts 1 and 2 of this blog series, I showed how to secure your SCC with a trusted UI Certificate as well as how to further secure your SCC with a trusted System Certificate, put your CA certificate in the Trust Store, install a SCC CA Certificate and with that enable Principal Propagation. As a result, we got 4 green boxes in the SCC General Security Status:


Friday, 4 November 2016

Secure your HANA Cloud Connector with OpenSSL certificates – Part 2

In part 1 of this blog series, I showed how to secure your SCC with a trusted UI Certificate:


Therefore, in this blog, I will show how to further secure your SCC with a trusted System Certificate, put your CA certificate in the Trust Store, install a SCC CA Certificate and with that enable Principal Propagation.

Thursday, 27 October 2016

Enabling on premise Fiori SSO with OpenSSL certificates – Part 1

I had described how to enable single sign-on based on OpenSSL for Windows on a Web AS ABAP sandbox system, and this information is still valid. However, both the NWAS ABAP and the OpenSSL tools have evolved considerably since 2012, so I will describe an updated approach for Fiori single sign-on in this blog series. I will start by explaining how to set up a secure SSL connection to the Fiori Launchpad based on OpenSSL certificates.

Again, this blog is intended for you to learn and understand the concepts. Neither key lengths nor other security considerations except for making this example work have been considered.

Wednesday, 26 October 2016

Secure your HANA Cloud Connector with OpenSSL certificates – Part 1

Out of the box, the HANA Cloud Connector (SCC) is not secure, as clearly documented by the General Security Status:


As mentioned in the General Security Status, the out of the box SSL certificate does not use the host name as its common name (CN) and is therefore not trusted:

Tuesday, 25 October 2016

Enable TLS on HANA Web Dispatcher with OpenSSL certificates

Out of the box, my HANA Web Dispatcher comes with a self-signed SSL certificate, which makes its connections insecure:


Monday, 24 October 2016

Enable Smart Data Integration on your HANA, express edition

  1. Increase the virtual machine memory to 12GB
  2. Create a tenant database with the Data Provisioning Server enabled
  3. Import the Smart Data Integration delivery unit
  4. Configure the Smart Data Integration agent
  5. Verify the Smart Data Integration agent connection
To make room for an additional HANA database container, I increase the memory of my HANA, express edition from the 9 GB it has after the upgrade to 12 GB:

Friday, 21 October 2016

SAP HANA XS Classic, Develop your first SAP HANA XSC Application

Using HANA Cloud Platform


Each Trial HANA instance comes with the HANA Web-based Development Workbench. The workbench allows you to develop on HANA without the need to set up a local development environment.

Login to the HANA Cloud Cockpit with your free developer edition account.
Choose Databases & Schemas. You will need to create your new instance: simply give it a name, enable web access and, of course, set a password. You will need to remember this password, as it is the password for your SYSTEM user and the way you will access the server.

Thursday, 20 October 2016

Deploy your mobile web app to SAP HANA Cloud Platform

Since any project that is created initially in the SAP Web IDE contains a neo-app.json file, it is ready to be deployed to HANA Cloud Platform. During the deployment process, Web IDE creates the HTML5 application in HANA Cloud Platform and also the related Git repository (which will track code changes) for your app automatically.

1. Open the SAP Web IDE.

2. In SAP Web IDE, select the northwind project folder and open the context menu by right-clicking on it. Choose Deploy > Deploy to SAP HANA Cloud Platform.

Wednesday, 19 October 2016

Upgrade your HANA, express edition

In my previous blog Secure your HANA, express edition I described how to register your HANA, express edition system with SUSE to receive critical security updates.

In this blog I will leverage this work to describe how to upgrade your HANA, express edition to the latest patch level. However, please be aware, that

  1. This increases the memory required from 7 GB to 9 GB for the Server only option.
  2. Also, temporarily, 24 GB of available RAM are needed for the upgrade.

To start with, I update the preinstalled VMware Tools to mount shared folders with the upgrade software packages.

Tuesday, 18 October 2016

Secure your HANA, express edition

As soon as I heard of the HANA, express edition I got it installed on my laptop. The Server only option requires 7 GB of available RAM, the Server + applications 12 GB.

Importing the images into VMware Player 7.1.4 went like a charm. However, the underlying SUSE Linux Enterprise Server for SAP Applications has received security updates since the HANA, express edition images were built.

Therefore, the first step for me after logging into the hxehost system and changing the hxeadm password was to start the SLES setup tool YaST to add a SUSE subscription to the installation:

Monday, 17 October 2016

DIY: HANA express edition on Amazon Cloud

If you want to use HANA, express edition on AWS, you probably want to install your own instance to get familiar with SAP HANA installation in the cloud and gain more control over your configuration and application deployments.

Follow the blog below to install your free HANA, express edition and make it accessible from the internet. Most of the steps are very straightforward, and it will take you 3-4 hours to download and install.

1. Download the HANA express binary

Get yourself familiar with SAP HANA, express edition.

Saturday, 15 October 2016

HANA SP12 Upgrade

Software/Patch required 

IMDB_SERVER100_122_1-10009569
IMDB_SERVER100_122_1-10009569.SAR
SAPCAR
hdblcm_prepare.sh

Prepare HANA Package for Upgrade

Download the following components into a directory of your choice (here I have taken ‘/media/hana_sp12’ as the download directory) from the SAP Service Marketplace using the Maintenance Optimizer:

Saturday, 8 October 2016

SAP HANA Calculation View Columns Origin

After using so many calculation views that call another bunch of calculation views, and so on, it is really difficult to identify the correct origin (table.column/formula) of a given field. This was causing rework to understand all the information needed and to correctly check and model the data.

To solve that, I developed a really simple way to connect to HANA, get the dependency data and generate a JSON file, using Python and jQuery:
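A minimal sketch of the Python side — the connection itself (e.g. via the hdbcli driver) is omitted; the function just turns (base object, dependent object) pairs, such as those returned by HANA's OBJECT_DEPENDENCIES system view, into nested JSON (the object names below are made up):

```python
import json

def build_dependency_tree(rows, root):
    """Turn (base_object, dependent_object) pairs into a nested dict
    rooted at the view whose column origins we want to trace."""
    children = {}
    for base, dependent in rows:
        children.setdefault(dependent, []).append(base)
    def expand(name, seen=()):
        if name in seen:            # guard against cyclic dependencies
            return {"name": name}
        return {"name": name,
                "uses": [expand(b, seen + (name,)) for b in children.get(name, [])]}
    return expand(root)

rows = [("SCHEMA.TABLE_A", "PKG/CV_INNER"), ("PKG/CV_INNER", "PKG/CV_TOP")]
print(json.dumps(build_dependency_tree(rows, "PKG/CV_TOP"), indent=2))
```

The resulting JSON can then be rendered in the browser with jQuery, as the post describes.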

Friday, 7 October 2016

What is VORA and How it helps to Bridge the gap between Enterprise data and Big Data

Before getting into the topic of VORA, let us first try to understand what enterprise data, big data, Hadoop and Spark are.

What is enterprise data? Data that comes from day-to-day business transactions, e.g. sales orders, purchase orders, etc.

What is big data? Data that comes from information-sensing mobile devices, aerial (remote sensing) systems, software logs, cameras, microphones, radio-frequency identification (RFID) readers, wireless sensor networks, social media and archived data.

Thursday, 6 October 2016

Transferring Eclipse Project Artifacts between HANA systems using Web IDE w/b Editor

Introduction

In some environments, strict control and security are not always necessary, especially in home-based setups such as HANA, express edition, where you may wish to lift your work and take it to another system.

If you are developing content in Eclipse under a Project and have Functions among other Project artifacts, using Eclipse File -> Export will not pick up certain objects, i.e. functions.

The various Functions exist in the Project along with the Views. In the Export dialog window, the Functions are not visible like the Views are; even adding the Function package does not find them.

Wednesday, 5 October 2016

Switching over to new HANA hardware

Basically, the migration to the new HANA server is done via backup/restore.
The difficult part was that this HANA server got its data from SAP ECC via an SAP SLT server,
so replication and database triggers had to be taken into account when switching over.

The connections between the applications look like:


Tuesday, 4 October 2016

How to Install the Automated Predictive Library in SAP HANA

Having recently installed the new Automated Predictive Library (APL), the former KXEN Infinite Insight libraries inside HANA, I wanted to share my experience.

The benefit the APL brings is that we can now utilise the KXEN libraries directly in-memory within the HANA Platform so the data never needs to leave HANA.  These libraries are a core part of Predictive Analytics 2.0 and are found in both the Automated and Expert perspectives within this.  As the name suggests when using the APL we have attempted to automate as much of the predictive process as possible.

Monday, 3 October 2016

Thoughts on SAP HANA Express

Upon my return from SAP TechEd Las Vegas last week I posted SAP TechEd & HANA Express Edition that covered the SAP HANA Express announcement.


Asked the question: 

"HANA Express appears to give individual developers the chance not just to learn but to actually build a HANA app … do you feel that ... HANA Express gives you the tools and licensing you need to take an app to market? … would this now make a viable launch point for entrepreneurial developers with fire in the belly?”

Saturday, 1 October 2016

How to analyze and retain Unused Memory from the HANA/BODS/BW servers

This document is based on one of the major memory-bottleneck issues we face in our HANA landscape, where BODS, BW and the HANA DB all run on the same server. The capacity of the server is quite high, with around 512 GB of memory distributed among the different servers. To get details on how much RAM each user is using, run the following script:

#!/bin/sh
# Sum the resident memory (RSS) reported by ps per OS user, in MB
OLDIFS=$IFS
IFS=$'\n'
for m in `ps -eo user,rss --sort user | awk 'NR>1 {sum[$1]+=$2} END {for (u in sum) printf "%-12s %8.0f MB\n", u, sum[u]/1024}'`; do
  echo "$m"
done
IFS=$OLDIFS

Friday, 30 September 2016

SAP TechEd & HANA Express Edition

As always it was great to catch up with lots of friends in the SAP ecosystem - and to make some new ones. The show is a highlight for the SAP techie community and this year they celebrated the 20th SAP TechEd in North America.

Generally speaking those going to their first or second TechEd thought the show was great. Those with more TechEd experience were less enthusiastic - the best adjective I can come up with to describe the show is “subdued”.

In evaluating the success of TechEd 2016 there are many facets that need to be considered - but there are two obvious issues that changed the feel of the event.

Thursday, 29 September 2016

Handling Internal-table inside AMDP

While developing an AMDP class, you will sooner or later face the question of how to handle an internal table inside the AMDP.


Well, below is an example for that, in a short and sweet manner.

Scenario: Suppose you have the airline code (Carrid) and connection number (Connid) in an internal table, and based on that you need to fetch data inside the AMDP.

Steps: AMDP definition:

Wednesday, 28 September 2016

The Parent Child hierarchy in HANA

There are two types of hierarchies supported in HANA:

  • Parent-Child hierarchy
  • Level hierarchy

In this document, I would like to take the readers through the parent-child hierarchy: how to create a very simple parent-child hierarchy and how to use it in a reporting tool that supports multidimensional reporting, like A-Office.
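In a parent-child hierarchy each row only stores a link to its parent, and the levels are resolved at query time by walking those links. A small Python sketch of that resolution (the node names below are made up):

```python
def path_to_root(parent_of, node):
    """Resolve a node's full path from the root in a parent-child
    hierarchy represented as a {child: parent} mapping."""
    path = [node]
    while parent_of.get(node) is not None:
        node = parent_of[node]
        path.append(node)
    return list(reversed(path))

# A tiny region hierarchy: root has parent None
parent_of = {"EMEA": None, "DE": "EMEA", "Berlin": "DE"}
```

A level hierarchy, by contrast, stores each level in its own column, so no traversal is needed.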

Monday, 26 September 2016

SAP HANA DB Installation Steps (Part 2)

Steps to follow:


1) Download the SAP HANA media file from the SAP Service Marketplace
2) Place the media file in a safe directory location
3) Log in at OS level with the "root" user or <SID>adm
4) Go to the directory containing the HANA media file: cd <installation medium>/DATA_UNITS/HDB_LCM_LINUX_X86_64
5) Start the SAP HANA platform lifecycle management tool:
6) Find the file and execute “./hdblcm --component_root=/DVD"

Saturday, 24 September 2016

SAP HANA Installation Steps (Part 1)

Pre-requisites:


1) Pre-study and go through the SAP Notes & recommendations
2) Prepare the hardware requirements
3) Prepare the OS environment, e.g. UNIX, SUSE Linux, Windows, etc.
4) Install Java 1.6 or a higher level
5) Install the GUI component
6) Download the HANA database software from the SAP Service Marketplace

Friday, 23 September 2016

Choosing The Most Effective Approach for Technical Migration to SAP HANA

There are three main technical migration paths to SAP HANA. Each method has its own advantages and disadvantages compared to the others. My aim is to provide guidance and recommendations so you can plan your HANA technical migration using the most effective approach for your business case.

Before starting, I have to mention that all these methods (except perhaps the greenfield approach) require general technical planning and preparation, such as:
  • Housekeeping activities, removing obsolete data or archiving old data.
  • Collect and update database statistics
  • Perform consistency check for tables and dictionary objects.
  • Operating system and database related checks and verifications.
  • Parameterization for improved performance.

Thursday, 22 September 2016

Securing the Communication between SAP HANA Studio and SAP HANA Server through SSL

Pre-requisites:

HANA Server is installed and running
HANA studio is installed in the local system
Access to the HANA server
Putty / WinSCP tools

HANA Server and client without SSL configured:



Wednesday, 21 September 2016

How to Get Dependent Object List (Models & Tables) of a Model using Graphical Calculation View

Finding dependent objects / catalog objects in a HANA system is always a challenge. Before making a change to an information model, we should analyze the list of dependent objects. Here is a graphical way to find the dependent objects from the Active_ObjectCrossRef table.

This will help you to understand the footprint of your model and the other affected areas. Moreover, it is required before modifying a modeling object or information model, to know which objects are affected by the change. If you need the list of models and tables behind a particular top-level information model, you can use this model to identify them.

Monday, 19 September 2016

Optimising HANA Query push-down from Apache Spark

As you start using it with larger tables or views in HANA, you may notice that the query pushed down into HANA is not optimised efficiently. SUM and GROUP BY clauses are NOT pushed down into HANA, which may cause a large, granular result set to be moved across the network only to be aggregated in Spark. That is certainly a waste of HANA's powerful query engine.
In this blog I will demonstrate the problem and show several ways to work around it using Apache Spark.

Using the same dataset from the earlier blog, we can see how a more complex Spark SQL statement (executing against a test table in HANA, "RDATA") is actually pushed down into HANA.
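A common workaround is to hand Spark an already-aggregated subquery as the JDBC `dbtable` option, so the SUM/GROUP BY runs inside HANA. A hedged sketch — the column names and connection details are placeholders; only "RDATA" comes from this post:

```python
def pushdown_table(sql):
    """Wrap an aggregating SQL statement so Spark's JDBC reader treats
    it as the source table, forcing HANA to execute the aggregation."""
    return f"({sql}) AS pushed"

query = pushdown_table(
    'SELECT "REGION", SUM("SALES") AS "SALES" FROM "RDATA" GROUP BY "REGION"')

# With a SparkSession (host/port/credentials are placeholders):
# df = (spark.read.format("jdbc")
#       .option("url", "jdbc:sap://hanahost:30015")
#       .option("dbtable", query)
#       .load())
```

Spark then only transfers the already-aggregated rows across the network.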

Saturday, 17 September 2016

How to Find Dependent Objects in SAP HANA

This is a blog to help you find dependent objects / catalog objects in a HANA system. Before making a change to an information model, we should analyze the list of dependent objects.

This will help you to understand the footprint of your model and the other affected areas. Moreover, it is required before modifying a modeling object or information model, to know which objects are affected by the change.

If you need the list of tables for a particular schema, or the list of fields of a particular table, this is the easiest way. Even if you would like to know which tables use the same field, this can be a simple way to find out.

Friday, 16 September 2016

ABAP new Open SQL and CDS runtime

With the introduction of NW AS ABAP 740 SPS5, the new enhanced Open SQL was introduced. It brings some good features, such as aggregate functions, right outer joins and casting, making it a first step toward the code push-down paradigm.

The code push-down techniques (CDS and the new Open SQL) are compared in the demo here. Below is a demo program of data retrieval from BKPF and BSEG.

1) ABAP program using inner join between BKPF and BSEG
2) ABAP program using CDS view entity
3) ABAP program using enhanced Open SQL features

Thursday, 15 September 2016

SAP HANA: Using SAX Parser for Loading XML response from outbound HTTP into table

In this blog, I would like to share my thoughts on using XS destinations for internet connectivity, capturing the XML response, and loading it into HANA with the help of a SAX XML parser.

Scenario :

We shall take the familiar Google Maps API and see how we can connect from SAP HANA and capture the XML response. We need to define an xshttpdest for outbound connectivity.

Tuesday, 13 September 2016

HANA XS Dynamic Job Scheduling through UI5 application the Easy Way

So the process is very simple: we define an "empty" job in the XS Job Dashboard, then add/delete the job schedule through the UI5 application with a switch action.

First, the user you authenticate with in your UI5 application must have the following role assigned:

sap.hana.xs.admin.roles::RuntimeConfAdministrator


Monday, 12 September 2016

Calling HANA Views from Apache Spark

Open-source Apache Spark is fast becoming the de facto standard for Big Data processing and analytics. It's an in-memory data processing engine, utilising the distributed computing power of tens or even thousands of logically linked host machines (a cluster). It's able to crunch through vast quantities of both structured and unstructured data, and you can easily scale out your cluster as your data appetite grows.

In addition, it can be used as a data federation layer spanning both traditional databases and other popular big data platforms, such as Hadoop HDFS, Hadoop HBase, Cassandra, Amazon Redshift and S3, to name a few.

Sunday, 11 September 2016

Table Functions in SAP HANA - Demo

Since SPS 11 there have been changes and restructuring in the way HANA modeling is done. In this document I discuss table functions (user-defined functions).
The recommendation is to use graphical calculation views as the end product, so that the complexity of the information views is hidden inside table functions.
SAP recommends migrating script-based calculation views to table functions and then into graphical calculation views. Going forward, SAP recommends graphical calculation views for modeling any analytic use case. A migration tool within the SAP HANA modeler allows you to convert existing script-based calculation views in your system to table functions and graphical calculation views.

Saturday, 10 September 2016

My Life Cycle Experience (SP8)

I’m writing this blog about the SAP HANA Lifecycle Management tool in SP8, hdblcm (SAP Note 1997526). The plan is to first post about installing HANA (and components) using hdblcm (also taking a detour to add a node), and then, when SAP HANA SP9 comes out, to upgrade using the new hdblcm that ships with SP9.

So where is the hdblcm tool? It is found in the installation media (if this is downloaded from the SAP Service Marketplace you will need SAPCAR on your system to un-archive the file). It can be started in two ways: by calling ./hdblcm or ./hdblcmgui.

Friday, 9 September 2016

SAP HANA: Exploring data using SAP Visual Intelligence

SAP Visual Intelligence is SAP’s latest innovation in the SAP BusinessObjects Explorer solution family. It is a desktop-based visualization and data manipulation solution that allows business users to acquire data from a variety of corporate and personal data sources and manipulate it without any scripting.
They can then analyze this data with beautiful visualizations, quickly discovering unique insights that can be easily shared, shaping the business faster than ever before. -- As mentioned by SAP

Let us see what SAP says that we can do with SAP HANA using SAP Visual Intelligence:

Thursday, 8 September 2016

SAP HANA Migration from Multi-Node to Single-Node

Environment
SAP HANA Database SPS 08 and higher

Install a New HANA Database as Single-Host System


An SAP HANA database cannot be recovered to an SAP HANA database with a lower software version. The SAP HANA software version used for the recovery must always be the same version or higher than the SAP HANA database used to create the data backup or storage snapshot.

Wednesday, 7 September 2016

Network Bandwidth Test for HANA Server and Replication Calculation

For network bandwidth testing you need the newest NIPING version; please follow SAP Note 799428 to get it.

First, start the NIPING server on the secondary site, which you plan to use as the data replication target, with the following command:

niping -s -I 0 (the last character is zero, not the letter O)

One can also use the command: niping -s

Tuesday, 6 September 2016

Create Cumulative figure in HANA Graphical Calculation View

This is a document on how you can create a cumulative calculation using a HANA graphical calculation view. It is a simple example, but you can extend it to any complex scenario.

I am creating a simple file like this, and our aim is to create the cumulative column as shown below.

Product   Annual Sales (USD)   Cumulative Sum of Sales
1070      3600                 3600
1050      336                  3936
1020      220                  4156
1060      192                  4348
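The running total above can be produced with a standard SQL window function. The sketch below uses Python's built-in sqlite3 module (SQLite 3.25+) purely so the example is self-contained; the SUM(...) OVER (ORDER BY ...) clause itself is standard SQL. Note that with the sample figures, 3600 + 336 = 3936.

```python
import sqlite3

# Sketch of the cumulative-sum logic with a window function. sqlite3 is used
# only to keep the example self-contained; SUM(...) OVER (...) is standard SQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (product TEXT, annual_sales INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?)",
                 [("1070", 3600), ("1050", 336), ("1020", 220), ("1060", 192)])

rows = conn.execute("""
    SELECT product,
           annual_sales,
           SUM(annual_sales) OVER (ORDER BY rowid) AS cum_sum
      FROM sales
     ORDER BY rowid
""").fetchall()

for product, annual, cum in rows:
    print(product, annual, cum)
# cumulative column: 3600, 3936, 4156, 4348
```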

Saturday, 3 September 2016

SAP HANA Scripted Calculation View

When I was trying to learn HANA scripted calculation views, I had to spend a lot of time creating tables, views and data records in order to get my hands dirty and learn how scripted calculation views work. What I am trying to do here is gather all of this information in one blog, so that you can create your own tables, data and finally scripted calculation views in the following ways.

We will discuss calculation views using:

1. SQL Script - Using CE Functions
2. Table Functions
3. Procedure

Friday, 2 September 2016

Upload and Retrieve Image using SAP HANA XS & SAP UI5

One of the most common use cases in building XS-based native HANA applications is working with file uploads (image, CSV, etc.).
In this blog I will explain how to:
  1. Use the SAPUI5 File Uploader to upload an image
  2. Render a preview of the uploaded image
  3. Save it to the HANA DB using an XSJS service
  4. Retrieve the saved image using an XSJS service and render it on the UI
Please note this approach can be applied to any type of file, e.g. .csv, .xlsx, etc.
Upload an Image using the SAPUI5 File Uploader:
Upload an Image using SAPUI5 File Uploader:

Wednesday, 31 August 2016

SAP HANA XS Advanced Installation through resident hdblcm (Command based)

SAP HANA XS Advanced Installation:

Prerequisites:
SAP HANA 1.0 SPS 11 or higher; kindly upgrade your SAP HANA to 1.0 SPS 11+ to use the benefits of XSA

1. Download XS Advanced Run-time from SMP


Tuesday, 30 August 2016

Data Loading to HANA using DXC Connection

As we know, there are three types of data provisioning tools for a HANA system:
1.  SAP BODS – can connect SAP and non-SAP systems
2.  SAP SLT – can connect SAP and non-SAP systems
3.  DXC – can connect only SAP systems

Now we will discuss extracting data to a HANA system using a DXC connection.

SAP HANA Direct Extractor Connection (DXC) is available as a simple option in ETL (batch) scenarios for data replication from existing SAP DataSource extractors into SAP HANA.

Monday, 29 August 2016

Vora 1.2 Modeling Tool

SAP HANA Vora provides an in-memory processing engine which can scale up to thousands of nodes, both on premise and in cloud. Vora fits into the Hadoop Ecosystem and extends the Spark execution framework.

The following image shows where Vora fits in the Hadoop ecosystem:


Tuesday, 23 August 2016

New Hierarchy SQL enablement with Calculation Views in SAP HANA 1.0 SPS 10

SQL enabled hierarchies with SAP HANA Calculation Views


Modeling SAP HANA Calculation Views is the key approach to successfully exploit the power of the SAP HANA Platform and leverage key SAP HANA capabilities. With SAP HANA SPS 10 (find enhancement overview here), calculation views provide a deeper integration of hierarchy objects and their exposure for usage within SQL. By leveraging the SQL integration of hierarchy objects, hierarchy-based filters, aggregations and hierarchy-driven analytic privileges are enabled.

Monday, 22 August 2016

ALV and FPM on SAP HANA

With ALV and FPM lists, SAP provided very powerful and convenient tools to represent lists in SAPGUI, WebDynpro UIs or SAP NetWeaver Business Client. These tools have been very well adapted to the paradigm of databases being the bottleneck. In this context it is an efficient design to select the data into an ABAP table and to execute the UI operations like paging, sorting, aggregation and filtering on the ABAP application server. Handing over this internal table as data source to the reuse components ALV and FPM list is their basic principle and the reason why they can provide these operations out of the box: they have control over the result set.

Sunday, 21 August 2016

XS application for table distribution in scale out HANA system

HANA demands optimal distribution of data across all HANA blades for best performance. Proper table distribution enables better load balancing and parallelization. In this blog we will cover only the table distribution part; table partitioning optimization will be described in another SCN article.
Here are several basic rules for general DB table distribution which this app follows:
  1. Large tables should not be replicated.
  2. Small tables can exist on different nodes to enable more optimal joins and prevent network transfer between the DB nodes.
  3. Tables are distributed as evenly as possible.

Saturday, 20 August 2016

HOW TO GENERATE ROW NUMBER OR SEQUENCE NUMBER USING HANA GRAPHICAL CALC VIEW RANK

Create a table named COUNTRY in SAP HANA with the following columns:
COUNTRY_NAME   VARCHAR(50)
COUNTRY_ID     INTEGER
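Before wiring up the Rank node, the expected output can be sketched in plain SQL with the standard ROW_NUMBER() window function. The sketch below uses Python's sqlite3 module to stay self-contained, and the country rows are purely illustrative:

```python
import sqlite3

# Sketch: the sequence number the graphical Rank node should produce,
# expressed with the standard ROW_NUMBER() window function.
# The country data is purely illustrative.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE country (country_name TEXT, country_id INTEGER)")
conn.executemany("INSERT INTO country VALUES (?, ?)",
                 [("India", 10), ("Germany", 20), ("Brazil", 30)])

rows = conn.execute("""
    SELECT ROW_NUMBER() OVER (ORDER BY country_id) AS row_num,
           country_name
      FROM country
""").fetchall()
print(rows)  # [(1, 'India'), (2, 'Germany'), (3, 'Brazil')]
```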


Friday, 19 August 2016

Licensing, Sizing and Architecting BW on HANA

I've had more than a few questions on BW on HANA Licensing and Sizing, and it seems that there isn't anything authoritative in the public domain. So here we go, but before we start...

Caveats


Architecting BW on HANA systems requires some care. First, database usage, the number of indexes and aggregates, use of database compression, reorgs and non-Unicode systems all cause variance in compression in the HANA DB. The best way to size a HANA DB is to do a migration.

Thursday, 18 August 2016

First Steps of Code Quality Analysis for SAP HANA SQLScript

One of these is SAP HANA SQLScript, which is used to develop high-performance stored procedures for the SAP HANA in-memory database. Unfortunately, SAP does not provide any static code analysis for SQLScript (in contrast to SAP Code Inspector for ABAP). Moreover, there are so far no precise guidelines on how to develop good SQLScript code. In this post I'll present our initial thoughts on assessing the code quality of SQLScript.

The starting point to identify relevant static checks was the SAP HANA SQLScript Reference, which already mentions some (very general) best practices (chapter 13). Some of the recommendations there are very easy to detect automatically, e.g.

Wednesday, 17 August 2016

How to Define Role to different Server Nodes in Multi Node HANA System

The standard SAP recommended Node role would be as follows:


In the above screenshot we have three nodes, the first of which has been set as the master node for the index server and name server.

Tuesday, 16 August 2016

Connecting SAP HANA Views to Sensor Data from Osisoft PI

Organizations across different industries leverage Osisoft PI systems to collect operational data (e.g. temperature, pressure, flow…) from sensors. These sensor data can be used to get a real-time view of the operational performance of assets, monitor the quality of products, or identify machine failures, to highlight just a few examples. Moreover, sensor data from an Osisoft PI system can be consumed in SAP HANA views and thus enable new insights for business users.
In this blog, I give an overview of how these sensor data stored in an Osisoft PI system can be accessed from SAP HANA views in real time via the SAP Manufacturing Integration and SAP Plant Connectivity solutions, without having to persist the data in the SAP HANA database. I focus on accessing the PI Data Archive and PI Asset Framework (PI AF) of an Osisoft PI system.

Monday, 15 August 2016

How and why to activate asynchronous queuing of IO requests

Why Asynchronous IO:

Undesired synchronous I/O can have a major impact on HANA performance, especially restart time and table load time for read I/O, as well as savepoint and write-transaction commit times for write I/O.

HANA uses asynchronous I/O to write to and read from disk. Ideally, asynchronous I/O should have a trigger ratio close to zero. A trigger ratio close to 1 indicates asynchronous I/O that behaves almost like synchronous I/O, that is, triggering an I/O request takes just as long as executing it; such a system is very prone to performance degradation. In such cases we need to activate asynchronous queuing of I/O requests for the particular file system/path.
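The trigger ratio described above can be thought of as the time spent triggering (submitting) the I/O request divided by the total end-to-end I/O time. A minimal sketch, with purely hypothetical timing samples and paths:

```python
# Sketch: computing the trigger ratio from (hypothetical) timing samples.
# trigger_us = time spent submitting the I/O request,
# total_us   = end-to-end time; a ratio near 1 means effectively synchronous I/O.

def trigger_ratio(trigger_us, total_us):
    return trigger_us / total_us if total_us else 0.0

samples = [
    ("/hana/data", 950, 1000),   # ratio 0.95 -> behaves synchronously
    ("/hana/log", 20, 1000),     # ratio 0.02 -> properly asynchronous
]

for path, trig, total in samples:
    ratio = trigger_ratio(trig, total)
    flag = "consider activating async queuing" if ratio > 0.5 else "ok"
    print(f"{path}: trigger ratio {ratio:.2f} ({flag})")
```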

Saturday, 13 August 2016

How to use HDBAdmin to analyze performance traces in SAP HANA

Most of the time, the Plan Visualizer is sufficiently powerful to understand what is going on inside of SAP HANA when you run a query. However, sometimes you need to get to a lower level of detail to understand exactly what is going on in the calculation engine.

It is then possible to use HANA Studio to record performance traces, and analyze them with HDBAdmin. This is a fairly advanced topic, so beware!

First, let's pick a query which runs slowly. This query takes 12 seconds, which is longer than I'd like. Admittedly, it's a tough query, grouping 1.4bn transactions and counting over 2m distinct customers.

Thursday, 11 August 2016

ABAP on HANA - Use Cases

Introduction to ABAP on HANA


Through HANA, SAP has brought forth a high-performing, multi-faceted appliance with rich analytic computational capabilities: in-memory hardware, enhanced compression technology, geospatial capabilities, text analytics and predictive analytics, to name a few. With so powerful a back end, the application layer too had to be revised to fully leverage the enriched capabilities of HANA. CDS views, AMDPs, and enhancements to existing Open SQL are the available solutions that help achieve code push-down – transferring the data-intensive logic to the database, resulting in better performance. AS ABAP 7.4 or above is the required version of the application layer on a HANA database for the features mentioned throughout the document to work.

Wednesday, 10 August 2016

How to Troubleshoot a Failed Statistics Server Migration

Run the following SQL to determine the time at which the migration failed:

SELECT value
  FROM _SYS_STATISTICS.STATISTICS_PROPERTIES
 WHERE key = 'internal.installation.state';

This will return the time at which the switch failed.

Tuesday, 9 August 2016

How to connect Microsoft SSIS with SAP HANA

SSIS (SQL Server Integration Services) is a component of MS SQL Server that can be utilized for various data migration tasks. This blog provides a step-by-step process, with screenshots, to implement a connection between SAP HANA and MS SSIS to perform data transfer.


Tools Required

  1. HANA Studio.
  2. MS Visual Studio 
  3. Business Intelligence tools for Visual Studio   
  4. HANA Client

Monday, 8 August 2016

Saturday, 6 August 2016

Generate Time Data in SAP HANA - Part 2

Procedure:


1. Generate the master data from the specific time frame that you are interested in
  • On the Quick Launch Page > Data > Generate Time Data

Friday, 5 August 2016

Generate Time Data in SAP HANA - Part 1

In this document, I try to explain the "Generate Time Data" functionality, providing the general idea of "Generate Time Data" with calendar type "Gregorian".
In order to better understand how to use "Generate Time Data", we are going to use standard tables for the examples.

Note: While using this option you need to replicate the standard tables T005T, T005U, T009 and T009B into SAP HANA. If these standard tables are not available, you will not be able to use the "Generate Time Data" function.

Thursday, 4 August 2016

Expose Attribute Views as XS OData Service in HANA

1. You need to change the Eclipse IDE perspective to SAP HANA Development. Navigate to Window → Open Perspective → Other to change the perspective of your HANA Studio to SAP HANA Development, or select the SAP HANA Development perspective via the perspective shortcut at the top-right corner of your SAP HANA Studio.
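Once the perspective is set up and the view is in place, the service definition itself is small. A minimal .xsodata sketch (the package and view names are hypothetical; `keys generate local` is used because modeling views do not expose a primary key to OData):

```
service {
    "demo.models::AT_PRODUCT" as "Products"
        keys generate local "GenID";
}
```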


Wednesday, 3 August 2016

Modelling: Column to Row Transpose using Matrix in HANA

Background:


In almost every project, some form of data transformation is required. Most of these transformations are some combination of aggregation, projection, union or join steps, and these transformations can easily be done in HANA models.

Where possible, it may be a good idea to push complex transformations to the data acquisition layer using ETL tools like Data Services.

Tuesday, 2 August 2016

What’s new in SAP HANA SPS12 – SAP HANA Graph Engine

SAP HANA Graph is an integrated part of the SAP HANA core functionality. It extends the relational database management system with native support for graph processing and allows typical graph operations to be executed on all data stored in a SAP HANA system. Execution plans are automatically optimized for highly performant queries, and built-in graph algorithms based on a flexible-schema property graph data model make it possible to traverse relationships without predefined modeling or complex JOIN statements.

Monday, 1 August 2016

Table Valued - UDFs Vs Scripted Calculation Views - Rudimentary Exploration

'Use table functions instead of scripted calculation views (SP09)'

This sounded as if the coveted scripted calculation views, part and parcel of any real-world customer implementation, were facing an extinction threat! Fathoming what that means warranted some investigation, and here's my attempt to get the basics in place with respect to user-defined table-valued functions (a.k.a. TV-UDFs).

Saturday, 30 July 2016

My first experiences with the on premise Web IDE for HANA

Project


Creating an XS Advanced (XSA) project is supported by templates, but currently only one template, Multi-Target Application Project, exists, and it simply creates a Multi-Target Application (MTA) descriptor file. Templates like the Fiori Master-Detail Application, which I like because it fits the requirements of many simple out-of-the-box scenarios, are currently missing, and the MTA descriptor file has to be edited manually for the application configuration.

Friday, 29 July 2016

How to return value XS OData using XSJSLIB and Stored Procedure

Information:


This document contains the code of a small test. The logic allows us to return the Id via XS OData. A stored procedure is used to insert records into a table.
A structure is used specifically as input. This highlights that records are inserted into the table only through the stored procedure; in this way, data can be validated before it is inserted.
An XSJSLIB file is used to generate the new Id. In this post you will see an example of how to combine an XSJSLIB with a stored procedure, including some validation inside the stored procedure.

Thursday, 28 July 2016

How to Consume HANA XS OData in ECC

Introduction:


In this blog we will discuss how we can consume HANA XS OData services in an ECC system. In this scenario we will set up an RFC connection between HANA and ECC so that the XS OData services can be consumed. The data coming from the XS OData services can be populated in an internal table for further processing.

Prerequisites:

  • This scenario has to be implemented in ECC system.
  • An RFC connection should exist between HANA and ECC.

Wednesday, 27 July 2016

Troubleshooting Dynamic Tiering Connections to SAP HANA

Sometimes you may get a connection error when converting a table to extended storage:

ALTER TABLE "DT_SCHEMA"."TABLE_A" USING EXTENDED STORAGE;

[4863] {207439} [95/10731633] 2016-05-06 17:50:26.455132 e FedTrace
odbcaccess.cpp(03672) : ODBC error: connected:  1 state: HY000 cide: -65 [SAP]
[ODBC Driver] Unable to connect to server 'HANA': [SAP AG] [LIBODBCHDB SO] [ HDBODBC]
General error;1033 error while parsing protocol

Tuesday, 26 July 2016

SAP HANA Developer Edition 1.00 SPS11


After several months we finally have the SPS11 SAP HANA Developer Edition live, with the brand-new SAP HANA XSA configured and running. Unfortunately, being early meant we had a few problems along the way, so hopefully you'll all be happy with the results; we certainly are!

Monday, 25 July 2016

All about Joins using SQL in HANA

Let's do it in HANA Studio and cover the points below:
  • Overview of HANA studio
  • Joins concept
  • Creating Schema
  • Creating Tables
  • Insert Values into Tables
  • Using SQL to understand below Joins in HANA Studio
             - Inner Join
             - Left Outer Join
             - Right Outer Join
             - Full Outer Join

Saturday, 23 July 2016

SAP HANA Analytical view

Now we will create an Analytic View which combines purchase order table data with the product Attribute View we created in the previous step.
In your models package, create a new Analytic View.


Friday, 22 July 2016

SAP HANA TDI on Cisco UCS and VMware vSphere - Part 2

ESXi Host


UCS Service Profile

The ESXi host provides the platform that virtual machines run on. The service profile contains the configuration of the hardware in a Cisco UCS environment. Service profile templates or vSphere Auto Deploy can be used to ease the ESXi deployment process. In this example, a standalone service profile creation is shown.

For each vSwitch, it is recommended to configure two uplink interfaces with MTU 9000 as trunk. The VLAN assignment takes place in the port group configuration of the vSwitch.

Thursday, 21 July 2016

Understand HANA SPS and supported Operating Systems

HANA Revision Strategy


SAP ships regular corrections and updates. Corrections are shipped in the form of revisions and support packages of the product’s components. New HANA capabilities are introduced twice a year in the form of SAP HANA Support Package Stacks (SPS). The Datacenter Service Point (DSP) is based upon the availability of a certain SAP HANA revision which has been running internally at SAP for production enterprise applications for at least two weeks before it is officially released.


Wednesday, 20 July 2016

DB Refresh of HANA Based systems using HANA Studio Backup/Recovery without SWPM

DB Refresh using HANA Studio DB Restore  with SID Modification -  NON SAPinst

HANA Studio's DB restore option, together with hdbuserstore, makes the whole process of a system copy a lot simpler. The only limitation is that the target schema will still be named after the source.

The idea of this blog is to explain the process of performing a simple homogeneous system copy of a HANA-based system, using the recovery of the source backup on the target HANA DB via HANA Studio.

In this case, we will consider the situation of copying a production backup and refreshing an already running quality system with it.

Tuesday, 19 July 2016

Execution management by HANA Graphical Calculation View

SAP HANA, an appliance with a set of unique capabilities, provides a wide range of possibilities for end users to perform data modeling. One of them is the graphical calculation view, which helps leverage the full potential of SAP HANA in the data modeling sector.

This document helps in understanding how HANA graphical calculation views manage execution by utilizing the set of properties they allow, so as to reveal how effectively these properties can be handled to manage the execution flow.

First among the property set under discussion is

Monday, 18 July 2016

Smart Data Access- A new feature by HANA

In today's world, companies face challenges in optimizing cost and processes in order to survive grim economic conditions. A business has to be dynamic and agile to keep pace with the market and technology; it has to get information in real time to make quick decisions on time, while keeping IT and technology costs under control.

With this business need in view, SAP has introduced Smart Data Access in SAP HANA, a data virtualization technique. The feature was introduced with SPS 6. Smart Data Access enables SAP HANA to combine data from heterogeneous sources like Teradata, Sybase IQ, SAP Sybase Adaptive Server Enterprise and Hadoop.

Saturday, 16 July 2016

A closer look at tables that contain LOB data type columns

Introduction


Since SPS 06, SAP HANA offers flexibility in how it manages columns of LOB (large object) data types, namely hybrid LOBs. The data types CLOB, NCLOB and BLOB are used to store large amounts of data such as text documents and images. The current maximum size for a LOB in SAP HANA is 2 GB.

In the following blog, we will go through some examples of how hybrid LOBs are defined, specifically around the memory threshold value and its effect on keeping LOB column data on disk versus allowing it to be loaded into memory.
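As a sketch (the schema, table and column names are hypothetical), the threshold is declared per LOB column; values larger than the threshold in bytes remain on disk, while smaller values may be loaded into memory:

```
CREATE COLUMN TABLE "MYSCHEMA"."DOC_STORE" (
    "ID"  INTEGER,
    "DOC" BLOB MEMORY THRESHOLD 1000
);
```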

Friday, 15 July 2016

How to Perform Garbage Collection in HANA

The process defined below can be executed on a daily basis during off-peak hours, or after every 3 days. It can also be repeated on a specific HANA node if memory utilization on that node's server is more than 85%.

The following SQL statement needs to be executed in HANA Studio to check the current resident memory at the start and after each garbage collection run, to verify its effect:

SELECT HOST,
       ROUND(USED_PHYSICAL_MEMORY/1024/1024/1024, 2) AS "Resident GB",
       ROUND((USED_PHYSICAL_MEMORY + FREE_PHYSICAL_MEMORY)/1024/1024/1024, 2) AS "Physical Memory GB"
  FROM PUBLIC.M_HOST_RESOURCE_UTILIZATION;

Thursday, 14 July 2016

Dynamic Currency reporting in HANA

In many reporting scenarios, it is often required to display the financial figures in multiple currencies. For example, most of the systems capture the "Net Sales in the Stores" in the local currency of the Store. For the Country level local reporting, it is often required to report the Net sales in the Local currency. But for the Regional Head Quarter reporting, the Net Sales values need to be converted to the Regional currency. The same figure at the Global Head Quarter might be required in the Global currency.

Wednesday, 13 July 2016

New Hierarchy SQL enablement with Calculation Views in SAP HANA 1.0 SPS 10

SQL enabled hierarchies with SAP HANA Calculation Views

Modeling SAP HANA Calculation Views is the key approach to successfully exploit the power of the SAP HANA Platform and leverage key SAP HANA capabilities. With SAP HANA SPS 10, calculation views provide a deeper integration of hierarchy objects and their exposure for usage within SQL. By leveraging the SQL integration of hierarchy objects, hierarchy-based filters, aggregations and hierarchy-driven analytic privileges are enabled.

Enabling hierarchies for SQL access

Within the SAP HANA calculation view general properties, there is a new checkbox “Enable Hierarchies for SQL access”. With that, hierarchy views from shared dimensions used in a Star Join Calculation View are enabled for SQL access.

Tuesday, 12 July 2016

The Best-run Organizations Prefer SAP HANA

Only recently, a German television magazine revealed that none other than the U.S. security agencies CIA and NSA are using software solutions based on the SAP HANA® platform. This is especially remarkable since most people would expect that particularly U.S. authorities prefer a database made in the U.S. Well – they don’t, for obvious reasons. SAP HANA is a database platform that is currently unparalleled. It enables companies and organizations to analyze large amounts of data, both structured and unstructured, in real time. And consequently, more and more of these companies and organizations replace their legacy databases with SAP HANA, realizing incredible cost and performance benefits while simplifying their IT infrastructure as well as exploring innovative business processes and new business models at the same time. Let me present just a few:

Monday, 11 July 2016

MDC conversion on HANA System Replication configured

An MDC system can only be replicated as a whole system, meaning that the system database and all tenant databases are part of system replication. A takeover happens for the whole HANA database (system database + all tenant databases); it is not possible to take over just a particular container.

In our scenario, we have system replication set up for single-container systems running on revision 112.02, and we decided to convert them into MDC. As primary and secondary must be identical (N+N nodes (except standby) and services) during system replication setup, there is no exception for MDC.

Saturday, 9 July 2016

Cumulative Sum / Running Total in HANA Calculation View

Requirement :

Most of the popular reporting tools can handle cumulative sums or running totals, but there have still been a lot of posts on SCN about whether it is possible to do this in HANA itself. So, I just wanted to see if it can be done easily via a graphical calculation view, and it seems to work.

Note: This might still be easier to do in a scripted view or a reporting tool.

In Figure 1, our base data is in columns 1 and 2 and we want to add column 3 ("CUM_SUM") to the existing data set.

Friday, 8 July 2016

How to schedule an XS job to call stored procedures

This tutorial tries to explain how to schedule an XS job to call a stored procedure.

We need to change the SAP HANA perspective to SAP HANA Development.


We create an XS project with a new package as follows:
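For reference, a minimal .xsjob sketch (all names are hypothetical). The seven xscron fields are year, month, day, day-of-week, hour, minute and second, so the schedule below fires at the top of every hour:

```
{
    "description": "Call a stored procedure (hypothetical names)",
    "action": "demo.jobs:caller.xsjs::callProcedure",
    "schedules": [
        {
            "description": "run at the top of every hour",
            "xscron": "* * * * * 0 0"
        }
    ]
}
```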

Thursday, 7 July 2016

Adjust SLT connection to HANA MDC

After converting HANA to MDC (multitenant), some tasks must be executed in SLT to connect to the MDC database.

In addition to Note 2101084, please check the SAP Note relevant for your DMIS version from the list below and ensure all corrections are applied to both the source and the SLT server where indicated in the note:
  • 2016511 - Installation/Upgrade SLT - DMIS 2011 SP7
  • 1958809 - Installation/Upgrade SLT - DMIS 2011 SP6 / DMIS 2010 SP10
  • 1882433 - Installation/Upgrade SLT - DMIS 2011 SP5
  • 1824710 - Installation/Upgrade SLT - DMIS 2011 SP4 / 2010 SP9
  • 1759156 - Installation/Upgrade SLT - DMIS 2011 SP3 / 2010 SP8
  • 1709225 - Installation/Upgrade SLT - DMIS 2010 SP7 / 2011 SP2
  • 2191214 - Installation/Upgrade SLT - DMIS 2011 SP9 & SP10 

Wednesday, 6 July 2016

IMPLEMENTING AND DISPLAYING STANDARD HIERARCHY WITH SAP HANA

INTRODUCTION

STANDARD HIERARCHY

Definition:
A standard hierarchy is a tree hierarchy used to organize the business processes of a controlling area. The highest node of a standard hierarchy is normally the first business process group. The groups created thereafter make up the remaining nodes of the standard hierarchy.
The standard hierarchy is assigned directly to the controlling area and has itself a set of business process groups assigned to it. This ensures that all business processes belonging to a controlling area are grouped together.
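As a rough sketch of this structure (the table and column names are illustrative, not the actual CO tables), the tree can be represented as a parent-child relation in which the top node has no parent and every node belongs to one controlling area:

```sql
-- Illustrative parent-child representation of a standard hierarchy.
CREATE COLUMN TABLE "BP_STD_HIERARCHY" (
    "NODE"    NVARCHAR(20) PRIMARY KEY, -- business process group
    "PARENT"  NVARCHAR(20),             -- NULL for the highest node
    "CO_AREA" NVARCHAR(4)               -- controlling area
);
```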

Monday, 4 July 2016

[HANA System Replication] end-to-end Client Reconnect

I've seen many posts on how to set up HANA System Replication and its takeover; however, few if any cover client reconnect after sr_takeover.

In order to ensure the client can seamlessly find the active HDB node (no matter whether primary or secondary), we can use either IP redirection or DNS redirection. In this blog, I'll focus on simple IP redirection, as it is easier, faster, and has fewer dependencies compared to DNS redirection.

First of all, we need to identify a virtual hostname/IP and create it in your DNS. Below are the sample virtual and physical hostnames/IPs used:

Virtual IP/Hostname: 10.X.X.50 / hanatest
Primary Physical IP/Hostname: 10.X.X.20 / primary1
Secondary Physical IP/Hostname: 10.X.X.21 / secondary2
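With IP redirection in place, clients are configured against the virtual hostname only, so they follow a takeover without any change on their side. For example, a JDBC client would connect using the virtual hostname; the instance number 00 (and hence port 30015) is an assumption for illustration:

```
jdbc:sap://hanatest:30015
```

After sr_takeover, the virtual IP is moved to the new primary host, and the client simply reconnects to the same address.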

Saturday, 2 July 2016

Predictive analytics using SAP Design Studio and SAP HANA – Part 2

In part 1, we discussed the 101 of predictive analytics in SAP HANA using SAP HANA PAL and HANA flowgraph modelling. The drag-and-drop interface allows us to focus on the analysis without spending too much time writing complex SQL scripts.

In this blog, we will understand how we can integrate flowgraphs with SAP Design Studio and provide a readily consumable output to the decision makers.

Extending our example from the previous blog, we would like to select different store and product combinations, to be able to view the sales forecast at a granular level. This kind of analysis will help in inventory planning, promotional offers for low selling products and so on. We would also like to choose the number of periods of forecast as well as be able to modify the forecast parameters to ensure a good fit.

Friday, 1 July 2016

Predictive analytics using SAP Design Studio and SAP HANA – Part 1

With advanced analytics finding a place in every business function, scripting and programming are too much of a hassle when you are bound by deadlines and tough competition. This is where SAP makes a difference by providing a minimal-scripting approach to predictive analytics.
This is a two-part blog where part 1 discusses the fundamentals of SAP HANA PAL & HANA flowgraphs and part 2 discusses how flowgraphs can be integrated with SAP Design Studio to derive actionable insights on the fly.

SAP HANA PAL, the Predictive Analysis Library, which is part of the HANA AFL (Application Function Library), is a large collection of functions for implementing predictive modelling. These are the same functions made available in SAP Predictive Analytics Expert Mode when connecting to a HANA data source. Previously, these functions could only be used through a several-step process with SQL scripts, manually creating all of the required artefacts.
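As an illustration of that manual process (all schema, type, and procedure names here are assumptions, and the exact generator call depends on your revision), a wrapper procedure had to be generated from a signature table before a PAL function could be called:

```sql
-- Sketch of the manual, SQL-script based PAL workflow (pre-flowgraph).
-- Describe the function signature in a table, then generate a wrapper.
CREATE COLUMN TABLE "PAL_SIGNATURE" (
    "POSITION"       INT,
    "SCHEMA_NAME"    NVARCHAR(256),
    "TYPE_NAME"      NVARCHAR(256),
    "PARAMETER_TYPE" VARCHAR(7)
);
INSERT INTO "PAL_SIGNATURE" VALUES (1, 'MYSCHEMA', 'PAL_DATA_T',    'IN');
INSERT INTO "PAL_SIGNATURE" VALUES (2, 'MYSCHEMA', 'PAL_CONTROL_T', 'IN');
INSERT INTO "PAL_SIGNATURE" VALUES (3, 'MYSCHEMA', 'PAL_RESULT_T',  'OUT');

-- Generate a callable wrapper for, e.g., single exponential smoothing.
CALL SYS.AFLLANG_WRAPPER_PROCEDURE_CREATE(
    'AFLPAL', 'SINGLESMOOTH', 'MYSCHEMA', 'PAL_FORECAST_PROC', PAL_SIGNATURE);
```

The flowgraph editor generates equivalent artefacts behind the scenes, which is exactly the tedium it removes.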

Thursday, 30 June 2016

Persisting output from HANA View

The background is a problem we faced with a HANA calculation view that was not performing as we'd like, and there didn't seem to be a way around the performance issue. The data was not volatile in terms of changes, so we decided the best option would be to store the data on a load schedule in our HANA system. The question was then how to do that in HANA. For background, we are on SPS09.
The process we went with was to create a custom table filled by a stored procedure, scheduled to run at a load frequency through XS Admin. I'll share the steps.
The custom table definition is based on the definitions in the HANA view. You can manually create the table in the schema where you'd like to store it, but an alternative is to run the following script in the SQL console:
create column table <schema>.<table name> as (select * from "_SYS_BIC"."<package>/<HANA view>");
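The refresh logic that the scheduled XS job calls can then be a procedure as simple as the following sketch (the placeholder names mirror the statement above and must be replaced with your own schema, table, package and view):

```sql
-- Sketch: delete-and-reload procedure for the persisted table,
-- intended to be called from a scheduled XS job.
CREATE PROCEDURE "<schema>"."REFRESH_PERSISTED_VIEW"()
    LANGUAGE SQLSCRIPT AS
BEGIN
    DELETE FROM "<schema>"."<table name>";
    INSERT INTO "<schema>"."<table name>"
        (SELECT * FROM "_SYS_BIC"."<package>/<HANA view>");
END;
```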

Wednesday, 29 June 2016

Developer's Journal: ABAP/HANA Connectivity via Secondary Database Connection

Introduction


In this first edition of this HANA Developer's Journey I barely scratched the surface on some of the ways which a developer might begin their transition into the HANA world. Today I want to describe a scenario I've been studying quite a lot in the past few days: accessing HANA from ABAP in the current state.  By this, I mean what can be built today.  We all know that SAP has some exciting plans for ABAP specific functionality on top of HANA, but what everyone might not know is how much can be done today when HANA runs as a secondary database for your current ABAP based systems.  This is exactly how SAP is building the current HANA Accelerators, so it’s worth taking a little time to study how these are built and what development options within the ABAP environment support this scenario.

Tuesday, 28 June 2016

Configuring and performing backups using NetBackup for SAP HANA

The NetBackup for SAP HANA Agent integrates the backint interface for SAP HANA with the backup and recovery management capabilities of NetBackup. The software works in single-node as well as multi-node environments. Below is the list of activities you need to perform in HANA Studio for the NetBackup configuration.

1. Create hdbbackint soft link from /usr/sap/<SID>/SYS/global/hdb/opt/hdbbackint to /usr/openv/NetBackup/bin/hdbbackint_script for every database instance.

2. The parameter file (initSAP.utl) must be specified for data in the SAP HANA database instance configuration. To specify the parameter file, go to
Instance -> Configuration -> global.ini -> data_backup_parameter_file.

Monday, 27 June 2016

Tool for Quick Creation of Virtual Table in HANA

Introduction:

I have seen that there is redundant work involved in creating virtual tables after setting up a remote source connection in HANA. So here is a tool you can create in your schema to create the tables quickly.

My Scenario:

I have a scenario where I have to move an application from an old SP09 landscape to an SP11 landscape, where most of the calculation views use virtual tables. In my scenario we have the same SCHEMA_NAME and the source is also HDB, so I have two scenarios to cover to make the tool more generic.
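For reference, the statement the tool has to generate for each source table is the standard CREATE VIRTUAL TABLE syntax; a sketch with placeholder names (for a HANA remote source, the database part of the four-part remote identifier is `<NULL>`):

```sql
-- One generated statement per source table; all names are placeholders.
CREATE VIRTUAL TABLE "MYSCHEMA"."VT_MYTABLE"
    AT "MY_REMOTE_SOURCE"."<NULL>"."SOURCESCHEMA"."MYTABLE";
```

The tool simply loops over the source catalog and emits one such statement per table, which is exactly the redundant work being automated.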

Saturday, 25 June 2016

Effective Query pruning using Constant Column in UNION node

So far we have seen different modeling techniques for implementing business scenarios that include data from different sources but need to be reported together in one output. The classic example in HANA modeling workshops is displaying "Actual" and "Planned" data, which can be combined in a Calc view using a UNION node.
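In SQL terms, the constant-column technique looks like the following sketch (table and column names are illustrative): each branch of the union contributes a literal marker column, and a filter on that marker lets the engine prune the branch that is not needed:

```sql
-- Constant column in a UNION: a query filtering on SOURCE = 'ACTUAL'
-- only needs the first branch, so the PLAN branch can be pruned.
SELECT "PRODUCT", "AMOUNT", 'ACTUAL' AS "SOURCE" FROM "ACTUALS"
UNION ALL
SELECT "PRODUCT", "AMOUNT", 'PLAN'   AS "SOURCE" FROM "PLANNED";
```

In the graphical view, the same effect is achieved by mapping a constant value per source in the UNION node, as shown in Fig 1.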

Fig 1. Actual vs Planned implementation using Projection nodes in UNION