Saturday, 6 August 2022

Integration of SAP Ariba Sourcing with Qualtrics XM for Suppliers, HANA Cloud Database

In this blog, I will give you an overview of a solution to extract supplier data from a Sourcing event in SAP Ariba Sourcing, and save it in a mailing list in SAP Qualtrics XM for Suppliers, using BTP services.

Process Part 1

First, to extract the information from SAP Ariba Sourcing, I use the Operational Reporting for Sourcing (Synchronous) API to get events by date range, and the Event Management API, which returns supplier bid and invitation information for those sourcing events.
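As a rough sketch, the date-range pull can be prepared like this in Node.js. The host, view name, and filter parameter names below are illustrative placeholders; the real values come from your SAP Ariba developer-portal application:

```javascript
// Sketch: prepare the request for pulling sourcing events by date range.
// Host, path, and parameter names are illustrative placeholders; take the
// real values from your SAP Ariba developer-portal application.
function buildEventsRequest(baseUrl, realm, dateFrom, dateTo, apiKey, token) {
  const url = new URL(baseUrl);
  url.searchParams.set("realm", realm);
  url.searchParams.set(
    "filters",
    JSON.stringify({ updatedDateFrom: dateFrom, updatedDateTo: dateTo })
  );
  return {
    url: url.toString(),
    headers: {
      apiKey,                            // application key from the developer portal
      Authorization: `Bearer ${token}`,  // OAuth token from the Ariba token endpoint
      Accept: "application/json",
    },
  };
}

// Example call (placeholder values)
const req = buildEventsRequest(
  "https://openapi.ariba.com/api/sourcing-reporting-view/v1/prod/views/SourcingEventView",
  "myrealm-T",
  "2022-07-01T00:00:00Z",
  "2022-07-31T23:59:59Z",
  "<apiKey>",
  "<token>"
);
```

In the iFlow, the same request is modeled with an HTTP receiver adapter; the sketch only shows which pieces (realm, date filters, key, token) have to be assembled.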

Then, I store the information I need in a SAP HANA Cloud database. I created 3 tables to hold the data that I will send to SAP Qualtrics XM for Suppliers: RFx header information, invitations, and organization contact data.

Finally, I send the required information to a SAP Qualtrics XM for Suppliers mailing list, which then automatically sends surveys to the suppliers that participated in the Ariba sourcing events.

To send the information to SAP Qualtrics XM for Suppliers, I use the Create Mailing List API.
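A minimal sketch of what the iFlow sends, assuming the Qualtrics XM Directory v3 APIs; the IDs, paths, and field names are placeholders to verify against your own tenant:

```javascript
// Sketch: payloads for the Qualtrics XM Directory (API v3) calls the iFlow
// performs. IDs, paths, and field names are placeholders; verify them in
// your own Qualtrics tenant.
function buildMailingListPayload(name, ownerId) {
  // POST /API/v3/directories/{directoryId}/mailinglists
  return { name, ownerId };
}

function buildContactPayload(row) {
  // POST /API/v3/directories/{directoryId}/mailinglists/{mailingListId}/contacts
  return {
    firstName: row.first_name,
    lastName: row.last_name,
    email: row.email,
    phone: row.phone,
    // Embedded data carries the Ariba context along with the contact
    embeddedData: { rfxId: row.id, awarded: String(row.awarded) },
  };
}

// Example: one row from the rfx_invited_users table becomes one contact
const contact = buildContactPayload({
  id: "Doc123", first_name: "Jane", last_name: "Doe",
  email: "jane.doe@example.com", phone: "+1-555-0100", awarded: true,
});
```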

Integration

All this is orchestrated by the SAP Integration Suite, where I created 2 iFlows:

◉ The first iFlow is to get the information from the SAP Ariba APIs, and store it in the SAP HANA Cloud database.

Get RFx Information from Ariba, and store it in SAP HANA Cloud

◉ The second iFlow is to get the information from the SAP HANA Cloud database, and send it to SAP Qualtrics XM for Suppliers via the Mailing List API.

Get information from HANA Cloud and Send Contacts information to Qualtrics 
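The mapping step in the first iFlow can be sketched as a plain transformation, assuming a simplified shape for the Event Management API response (the real Ariba field names may differ):

```javascript
// Sketch: map a (simplified, assumed) Event Management API response to rows
// for the rfx_invited_users table. Real Ariba field names may differ.
function toInvitedUserRows(eventId, invitees) {
  return invitees.map((u) => ({
    id: eventId,                   // key of the sourcing event
    unique_name: u.uniqueName,     // key of the contact within the event
    full_name: u.fullName,
    first_name: u.firstName,
    last_name: u.lastName,
    email: u.email,
    phone: u.phone || null,
    fax: u.fax || null,
    awarded: Boolean(u.awarded),
  }));
}

// Example with one invitee
const rows = toInvitedUserRows("Doc123", [
  { uniqueName: "sup1@acme.com", fullName: "Ada Lovelace", firstName: "Ada",
    lastName: "Lovelace", email: "sup1@acme.com", awarded: true },
]);
```

In the actual iFlow this transformation lives in a message-mapping or script step; the function just makes the field-to-column correspondence explicit.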

The SAP HANA Cloud database was created with a CAP application developed in Visual Studio Code.

Final Thoughts

By using SAP Ariba and Qualtrics APIs, as well as a couple of BTP services, integration between the two can be achieved in a very simple way with only a few steps.

Process Part 2


There are two methods to create SAP HANA Cloud artifacts:

HANA Database project

In this method, we create database artifacts in a classic database schema using declarative SQL.

For this exercise, I created the project in the SAP Business Application Studio as explained in the tutorial. I used these files to create the tables:

rfx.hdbtable

COLUMN TABLE rfx (
  id NVARCHAR(15) NOT NULL COMMENT 'Internal ID',
  title NVARCHAR(1024) COMMENT 'Title',
  created_at TIMESTAMP COMMENT 'Created at',
  updated_at TIMESTAMP COMMENT 'Updated at',
  event_type NVARCHAR(30) COMMENT 'Event Type',
  event_state NVARCHAR(30) COMMENT 'Event State',
  status NVARCHAR(30) COMMENT 'Event Status',
  PRIMARY KEY(id)
) COMMENT 'RFx Information'
 
rfx_invited_users.hdbtable

COLUMN TABLE rfx_invited_users (
  id NVARCHAR(15) NOT NULL COMMENT 'Internal ID',
  unique_name NVARCHAR(1024) NOT NULL COMMENT 'Contact Unique Name',
  full_name NVARCHAR(1024) COMMENT 'Full Name',
  first_name NVARCHAR(250) COMMENT 'First Name',
  last_name NVARCHAR(250) COMMENT 'Last Name',
  email NVARCHAR(250) COMMENT 'Email',
  phone NVARCHAR(30) COMMENT 'Phone',
  fax NVARCHAR(30) COMMENT 'Fax',
  awarded BOOLEAN COMMENT 'Awarded',
  PRIMARY KEY(id, unique_name)
) COMMENT 'Users invited to RFx'
 
rfx_organizations.hdbtable

COLUMN TABLE rfx_organizations (
  id NVARCHAR(15) NOT NULL COMMENT 'Internal ID',
  item INTEGER COMMENT 'Item No',
  org_id NVARCHAR(15) COMMENT 'Organization ID',
  name NVARCHAR(1024) COMMENT 'Name',
  address NVARCHAR(1024) COMMENT 'Address',
  city NVARCHAR(1024) COMMENT 'City',
  state NVARCHAR(1024) COMMENT 'State',
  postal_code NVARCHAR(10) COMMENT 'Postal Code',
  country NVARCHAR(100) COMMENT 'Country',
  contact_id NVARCHAR(1024) NOT NULL COMMENT 'Contact Unique Name',
  PRIMARY KEY(id, item)
) COMMENT 'RFx Organizations'

After creating all files, the project should look like this:


After deployment, you should see all 3 tables in the SAP HANA Cloud database explorer:


Now you can use JDBC (for example) to access the tables in the SAP HANA Cloud database.
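JDBC is the natural choice from Java; in a Node.js script the same tables can be reached with the @sap/hana-client driver. A minimal sketch of a parameterized UPSERT follows, with the connection calls shown only in comments so the snippet stays self-contained (host and credentials are placeholders):

```javascript
// Sketch: parameterized UPSERT into the rfx table. With @sap/hana-client
// installed, you would execute it like this (placeholder credentials):
//   const hana = require("@sap/hana-client");
//   const conn = hana.createConnection();
//   conn.connect({ serverNode: "<host>:443", uid: "<user>", pwd: "<password>" });
//   conn.exec(sql, params);
//   conn.disconnect();
const sql =
  "UPSERT rfx (id, title, created_at, updated_at, event_type, event_state, status) " +
  "VALUES (?, ?, ?, ?, ?, ?, ?) WITH PRIMARY KEY";
const params = [
  "Doc123",                // id (primary key)
  "Laptops RFP",           // title
  "2022-07-01 10:00:00",   // created_at
  "2022-07-02 12:00:00",   // updated_at
  "RFP",                   // event_type
  "completed",             // event_state
  "closed",                // status
];
```

HANA's `UPSERT ... WITH PRIMARY KEY` inserts the row or updates it if the key already exists, which suits repeated iFlow runs over the same events.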

Multi-target application project, CAP and CDS

In this method we create a multi-target application and use CAP and CDS to generate the SAP HANA database tables, as well as the OData services to access the database.

I used these files for the CDS artifacts:

schema.cds

namespace com.aribaxm.service;

type InternalId : String(15);
type SDate : DateTime;
type XLText : String(2050);
type LText : String(1024);
type SText : String(30);
type MText : String(250);

entity Rfx {
    key id           : InternalId;
        title        : LText;
        createdAt    : SDate;
        updatedAt    : SDate;
        eventType    : SText;
        eventState   : SText;
        status       : SText;
}

entity RfxInvitedUsers {
    key id         : InternalId;
    key uniqueName : LText;
    fullName       : LText;
    firstName      : MText;
    lastName       : MText;
    email          : MText;
    phone          : SText;
    fax            : SText;
    awarded        : Boolean;
}

entity RfxOrganizations {
    key id      : InternalId;
    key item    : Integer;
    orgId       : SText;
    name        : LText;
    address     : LText;
    city        : LText;
    state       : LText;
    postalCode  : SText;
    country     : MText;
    contactId   : LText;
}
 
service.cds

using com.aribaxm.service as aribaxm from '../db/schema';

service CatalogService @(path:'/api/v1') {
    entity Rfx as projection on aribaxm.Rfx;
    entity RfxInvitedUsers  as projection on aribaxm.RfxInvitedUsers;
    entity RfxOrganizations as projection on aribaxm.RfxOrganizations;
}

server.js

"use strict";

const cds = require("@sap/cds");
const cors = require("cors");
//const proxy = require("@sap/cds-odata-v2-adapter-proxy");

cds.on("bootstrap", app => app.use(cors()));

module.exports = cds.server;

The mta.yaml should look something like this:

---
_schema-version: '3.1'
ID: AribaXM
version: 1.0.0
parameters:
  enable-parallel-deployments: true
build-parameters:
  before-all:
    - builder: custom
      commands:
        - npm install --production
        - npx -p @sap/cds-dk cds build --production

modules:
  - name: aribaxm-srv
    type: nodejs
    path: gen/srv
    parameters:
      buildpack: nodejs_buildpack
    build-parameters:
      builder: npm-ci
    provides:
      - name: srv-api # required by consumers of CAP services (e.g. approuter)
        properties:
          srv-url: ${default-url}
    requires:
      - name: aribaxm-db

  - name: aribaxm-db-deployer
    type: hdb
    path: gen/db
    parameters:
      buildpack: nodejs_buildpack
    requires:
      - name: aribaxm-db

resources:
  - name: aribaxm-db
    type: com.sap.xs.hdi-container
    parameters:
      service: hana # or 'hanatrial' on trial landscapes
      service-plan: hdi-shared
    properties:
      hdi-service-name: ${service-name}
 
And the package.json should look something like this:

{
  "name": "aribaxm",
  "version": "1.0.0",
  "description": "A simple CAP project.",
  "repository": "<Add your repository here>",
  "license": "UNLICENSED",
  "private": true,
  "dependencies": {
    "@sap/cds": "^5",
    "express": "^4",
    "@sap/hana-client": "^2",
    "cors": "^2"
  },
  "devDependencies": {
    "sqlite3": "^5",
    "@sap/hdi-deploy": "^4"
  },
  "engines": {
    "node": "^16"
  },
  "scripts": {
    "start": "cds run"
  },
  "eslintConfig": {
    "extends": "eslint:recommended",
    "env": {
      "es2020": true,
      "node": true,
      "jest": true,
      "mocha": true
    },
    "globals": {
      "SELECT": true,
      "INSERT": true,
      "UPDATE": true,
      "DELETE": true,
      "CREATE": true,
      "DROP": true,
      "CDL": true,
      "CQL": true,
      "CXL": true,
      "cds": true
    },
    "rules": {
      "no-console": "off",
      "require-atomic-updates": "off"
    }
  },
  "cds": {
    "requires": {
        "db": {
            "kind": "hana"
        }
    },
    "hana": {
        "deploy-format": "hdbtable"
    }
  }
}
 
After creating all files, the project should look like this:


After deployment, you should see all 3 tables in the SAP HANA Cloud Database Explorer:


And these 2 applications in the BTP space:


Now you can use OData to access the database. In this exercise I didn't add access control, so use your BTP user to execute the OData services from Postman and verify the access.
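For example, a filtered read on the generated service can be built like this (the host is a placeholder for your own BTP application URL):

```javascript
// Sketch: build an OData query against the CAP service mounted at /api/v1.
// The host below is a placeholder; use your own BTP app URL and your BTP
// user credentials (basic auth) when calling it from Postman or code.
const base = "https://<your-app>.cfapps.<region>.hana.ondemand.com/api/v1";
const filter = encodeURIComponent("awarded eq true");
const url = `${base}/RfxInvitedUsers?$filter=${filter}`;
```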

Final Thoughts

Now you have 2 methods of creating SAP HANA Cloud database artifacts, either from SAP Business Application Studio or Visual Studio Code, and you can access the database through JDBC or OData services.
