Schema flexibility is a key SAP HANA capability that relaxes the otherwise rigid column store table definition. A brief introduction with a good example can be found in the article "Getting flexible with SAP HANA".
This document covers the new capabilities of schema flexibility. With SAP HANA SPS11, customers can make use of its increased capabilities; let us walk through them with some examples.
Create a column table that stores employee details (name, ID, and age) while leaving room to add further employee information at run time as needed, using the following syntax:
Create column table employee_Details (emp_id int, emp_name varchar(100), AGE int) with schema flexibility;
Adding the clause WITH SCHEMA FLEXIBILITY during table creation enables dynamic column creation during DML operations such as insert/upsert, update, or delete.
Suppose that, once the base structure of the employee_Details table is in place, a requirement arises to add further details such as employee_salary and employee_department as new columns. This is where the dynamic nature of the table comes in handy: because it was created WITH SCHEMA FLEXIBILITY, instead of altering the table structure we can directly insert whatever data we need, as shown below:
Insert into employee_Details (emp_id, emp_name, AGE, employee_salary, employee_department) values (1, 'RAM', 29, 1000, 'PI_HANA');
The insert statement executes successfully regardless of whether the columns referenced in it already exist: the two new columns (employee_salary and employee_department) are added implicitly to the table's metadata as part of the insert.
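To see that the implicit columns were really added, one way is to query the table's metadata. This is a sketch; it assumes your user can read the SYS.TABLE_COLUMNS catalog view and that the table was created in the current schema:

```sql
-- List all columns of EMPLOYEE_DETAILS, including the dynamic columns
-- created implicitly by the insert above (catalog names are uppercase)
select column_name, data_type_name
from sys.table_columns
where table_name = 'EMPLOYEE_DETAILS';
```

The result set should now show EMPLOYEE_SALARY and EMPLOYEE_DEPARTMENT alongside the three statically defined columns.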
By default, a flexible table creates dynamic columns with the data type NVARCHAR at maximum length (5000). If we do not want this default and instead want dynamic columns to take a data type of our choice, this can now be done at table creation time with HANA SPS11. Say that in our case any dynamic column added to employee_Details must have the data type INTEGER; we can achieve this by writing the create statement as:
Create column table employee_Details (emp_id int, emp_name varchar(100), AGE int) with schema flexibility (DEFAULT DATA TYPE INTEGER);
Now any dynamic column created during an insert or update will take INTEGER as its data type.
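For example, with the DEFAULT DATA TYPE INTEGER option in place, an insert that introduces a new column creates it as INTEGER. The column employee_bonus below is a hypothetical example, not from the original article:

```sql
-- employee_bonus does not exist yet; it is created implicitly here
-- and, because of DEFAULT DATA TYPE INTEGER, its type is INTEGER
Insert into employee_Details (emp_id, emp_name, AGE, employee_bonus) values (2, 'SITA', 31, 500);
```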
If the details added to the employee_Details table are heterogeneous and we want each dynamic column to derive its data type from the inserted value, we can use the following create statement. This is known as 'data type detection':
Create column table employee_Details (emp_id int, emp_name varchar(100), AGE int) with schema flexibility (DEFAULT DATA TYPE *);
Here each dynamic column derives its data type from the value inserted into it.
That is:
Insert into employee_Details (emp_id, emp_name, AGE, employee_department, emp_salary) values (1, 'RAM', 29, 'PI_HANA', 2000);
The last two columns take string and numeric data types respectively, which differs from the default case, where both would become NVARCHAR.
Data type detection behavior is valid for both single-valued and multi-valued entities.
Now consider a case where an employee_rating column is added dynamically to employee_Details and the first year's rating is entered as an integer, so the column's data type is constructed as integer. If in the following year the same column receives a floating-point value such as 3.5, it becomes impossible to capture it. To enable this use case, there is an option at table creation time:
Create column table employee_Details (emp_id int, emp_name varchar(100), AGE int) with schema flexibility (DEFAULT DATA TYPE * AUTO DATA TYPE PROMOTION);
Yes, it is the data type promotion option at creation time that covers this use case: it maintains each dynamic column with the most generic data type the inserted data requires. As an example, the first year's rating is inserted like this:
Insert into employee_Details (emp_id, emp_name, AGE, employee_department, emp_salary, employee_rating) values (1, 'RAM', 29, 'PI_HANA', 2000, 4);
The employee_rating column now takes INTEGER as its data type.
And in the following year, when a floating-point value arrives:
Insert into employee_Details (emp_id, emp_name, AGE, employee_department, emp_salary, employee_rating) values (1, 'RAM', 29, 'PI_HANA', 2000, 4.5);
The data type of employee_rating is automatically promoted to a floating-point type, so the insert succeeds without errors.
Data type promotion follows a set of allowed conversion rules: a column can only be promoted towards a more generic type (for example from an integer to a floating-point or string type), never narrowed.
Another supported case is multi-value promotion. Suppose employee_phone is added to the table as a new detail, with a varchar value holding a phone number, as below:
Insert into employee_Details (emp_id, emp_name, AGE, employee_department, emp_salary, employee_rating, employee_phone) values (1, 'RAM', 29, 'PI_HANA', 2000, 4.56, '01233556589');
The entered input is stored as a single-valued varchar.
When employees start using dual- or triple-SIM phones, there is a need to store multiple character values. Because we have enabled the table with auto data type promotion, it is possible to store the new data set in the same column without altering it. That is:
Insert into employee_Details (emp_id, emp_name, AGE, employee_department, emp_salary, employee_rating, employee_phone) values (1, 'RAM', 29, 'PI_HANA', 2000, 4.56, array('01233556589', '983232131', '324324'));
This converts the employee_phone column into a multi-valued character attribute.
Flexible tables also contribute to better memory management; to support this, there is an operation called 'garbage collection'.
In our case, we decide to normalize the employee_feedback details into a separate table and therefore flush all values from the employee_feedback column of the employee_Details table.
Garbage collection then kicks in implicitly, provided the employee_Details table was enabled for it in the following manner:
Create column table employee_Details (emp_id int, emp_name varchar(100), AGE int) with schema flexibility (RECLAIM);
Enabling the RECLAIM option turns on garbage collection: dynamic columns (in our case employee_feedback) are dropped automatically once no row holds a value in the column.
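As a sketch of what the flush could look like (assuming the feedback values have already been copied into the separate, normalized table):

```sql
-- Clear every value in the dynamic column employee_feedback.
-- With RECLAIM enabled, garbage collection drops the column
-- automatically once no row holds a value in it.
update employee_Details set employee_feedback = NULL;
```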
What if the need for the features discussed above arises only after the table has been created? Should we drop the table and recreate it? The answer is no. And what if, at some later point, we wish to disable these characteristics individually on an existing table? That is possible too, as all the operations discussed above are also supported through ALTER TABLE, as shown below:
1) ALTER TABLE <table name> DISABLE SCHEMA FLEXIBILITY
2) ALTER TABLE <table name> ENABLE SCHEMA FLEXIBILITY [(<options>)]
3) ALTER TABLE <table name> ALTER SCHEMA FLEXIBILITY (<options>)
4) ALTER TABLE <table name> ALTER <column name> [<data type>] DISABLE SCHEMA FLEXIBILITY
5) ALTER TABLE <table name> ALTER <column name> [<data type>] ENABLE SCHEMA FLEXIBILITY
A one-line explanation of each operation follows:
1) All dynamic columns are converted to static columns. If the conversion of any dynamic column fails, the operation fails as a whole and no changes are applied. Normal tables are only allowed a certain number of columns (currently 1,000), so to successfully convert a flexible table into a normal table, its number of columns must not exceed this limit.
2) Turns flexibility of a database table on.
3) Here the option list is mandatory. All schema flexibility options listed in the CREATE TABLE ... WITH SCHEMA FLEXIBILITY section can be used to change one or several options of a flexible table.
4) The specified dynamic column is converted to a static column.
5) The specified static column is converted to a dynamic column.
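Applied to our example table, the ALTER variants could look like this. This is a sketch: the option names mirror those used in the CREATE statements above, and the choice of DOUBLE for employee_rating is an illustrative assumption:

```sql
-- Convert all dynamic columns of employee_Details to static columns
ALTER TABLE employee_Details DISABLE SCHEMA FLEXIBILITY;

-- Turn flexibility back on, with data type detection and promotion
ALTER TABLE employee_Details ENABLE SCHEMA FLEXIBILITY (DEFAULT DATA TYPE * AUTO DATA TYPE PROMOTION);

-- Pin the dynamic column employee_rating as a static DOUBLE column
ALTER TABLE employee_Details ALTER employee_rating DOUBLE DISABLE SCHEMA FLEXIBILITY;
```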
Source: scn.sap.com