Posts

Showing posts from 2014

SAP BI on HANA Testing

Owing to its complex landscape, SAP BI on HANA has just too many integrating components. As shown above, a typical landscape will contain the below components:

1. SAP HANA Box
2. SAP SLT
3. SAP BODS
4. SAP BI
5. SAP ECC/CRM/SRM
6. Legacy System
7. SAP BOBJ Suite

The SAP HANA Box itself has many components (to be continued...)

Ramblings on SAP BW Powered by HANA

Most customers' use case for the HANA appliance is BW on the HANA database (HDB); this is HANA's entry point into most customer landscapes. Here I will quickly try to cover what's new with BW 7.3 and 7.4 on HANA. Migrating BW onto HANA can be seen primarily from two perspectives:

1. Copying data from the legacy DB to HANA
2. Migrating the SAP BW application servers onto HDB

Here I am referring to migration with minimum system downtime: we initially copy the data into HANA and, when everything is ready, take the BI system down and migrate the application servers onto HANA as well. This keeps the service disruption time to a minimum. Database migration: mass copying of data from the legacy DB into HDB will in general be handled by the in-house BASIS team, or your HANA hardware vendor can help with the DB migration. The Software Update Manager (SUM) provides the Database Migration Option (DMO), which makes copying the data footprint easier. However, also note that, due to HANA column stora...

Certification, a good read about SAP BW and its lineage, and a Free SAP PAL Trial!!!

This week started off on a high by clearing the HANA certification exam (C_HANAIMP131). Although the exam is fairly easy, being certified by SAP feels good. I found the below article very engaging; it explores BW's history from its initial jinx (which I had never heard of) through to BW on HANA: BW is Dead – Long Live BW. Currently I'm working on creating some models using SAP PAL; hopefully by next week I can publish some. Also, SAP is giving away a free 30-day trial to explore PAL.

Pick the right Reporting Tool and SAP Lumira, SAP PAL, SAP InfiniteInsight

The SAP BIBO 4.0 portfolio boasts multiple reporting solutions, and at times it can be confusing which tool to pick. Although one tool for all scenarios doesn't exist, a refined set of reporting requirements should make it clear which tool to use for which scenario. Each tool has its own strongholds and weak areas. Crystal Reports: If we want pixel-perfect reports, we go for Crystal Reports, mostly when the requirement is to design reports according to legal requirements or when the company follows a rigid reporting format. Crystal Reports is also ideal if we want to place a lot of data in the report, like year-end results for stockholders, and we can make the content dynamic by using variables. Reports can be broadcast/shared with business users. However, the report format has to be pre-built, and regular business users might find the tool difficult to deal with; we require power users to build t...

SAP HANA Reporting Options

We will have a look at how the BIBO suite reporting tools can interact with data modeled in HANA. Although most of the semantic-layer options (column aliases, filters, etc.) are also available in the HANA modeler's semantics node, building a Universe on HANA gives existing users ease of use: we have a large existing user base who are good at designing Universes, and some options make more sense to create in the Universe rather than in the HANA modeler.

* A Universe still uses a JDBC/ODBC connection to interact with SAP HANA. It does not add any overhead or performance issues while interacting with HANA; it is merely used as a passage.
* Although Explorer can be connected via a Universe, it is recommended to connect it to HANA directly, as Explorer is optimized for HANA (no need to build/refresh indexes, and HANA provides real-time data).
* As Excel uses the MDX query language, we can use it to display hierarchies (parent-child/level) from HANA.
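To make the JDBC/ODBC path concrete, below is a minimal C# sketch that reads from a HANA model over ODBC. This is only an illustration: the DSN, credentials and the view name sales/AN_SALES are hypothetical placeholders, and it assumes the HANA ODBC client is installed.

using System;
using System.Data.Odbc;

class HanaOdbcDemo
{
    static void Main()
    {
        // Hypothetical DSN and credentials; replace with your HANA data source.
        string connStr = "DSN=HANA_DSN;UID=REPORT_USER;PWD=secret";

        using (var conn = new OdbcConnection(connStr))
        {
            conn.Open();

            // Query a hypothetical analytic view activated under the _SYS_BIC schema.
            var cmd = new OdbcCommand(
                "SELECT TOP 10 * FROM \"_SYS_BIC\".\"sales/AN_SALES\"", conn);

            using (OdbcDataReader reader = cmd.ExecuteReader())
            {
                while (reader.Read())
                    Console.WriteLine(reader[0]);
            }
        }
    }
}

This is the same passage a Universe takes: the client speaks plain SQL over the ODBC/JDBC driver, and HANA does the heavy lifting.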

Consuming HANA Models in BW: A New Breed of InfoProviders

BW on HANA: we have 4 ways to make use of HANA data in BW.

1. Virtual providers: Virtual providers, which are actually meant for real-time reporting based directly on data from OLTP systems (the whole chain of loading into the DataSource, DTP execution and loading into the InfoSource happens right when the query is executed, in real time), can be used for consuming analytic/calculation views in BW. When we create a virtual provider we have the new option "Based on HANA", which asks for a schema name and a model name. Once the model is selected, we start the modelling process in the workbench: we select characteristics and key figures for our virtual InfoProvider, then map these to fields from the HANA model (F4 help is provided). By mapping to BW InfoObjects we can make use of BW master data while reporting. That's all, and the virtual provider is ready for consumption using reporting tools.
2. Transi...

SAP V1 V2 V3 JOBS

Before starting with the V-series jobs, let's review some fundamentals. Database LUW: it consists of a sequence of DB operations (INSERT, UPDATE, DELETE) that need to be committed all or none, to keep the DB in a consistent state. COMMIT is executed when all DB operations have completed successfully; once COMMITTED, these changes can't be ROLLED BACK. This is part of the traditional ACID properties of any RDBMS. If a failure happens while executing the LUW (mostly technical, like a memory crunch, an integrity violation, a database crash or a power failure), then when the DB is restarted it will be ROLLED BACK to the previous state, which is nothing but the state at the last COMMIT. In other words, when a ROLLBACK happens the DB is safely brought back to the previous COMMIT state. The DB maintains a log of the actions performed since the last COMMIT and, as part of the recovery process, all these actions are reversed. All RDBMSs have built-in locking/concurrency control such that dirty READ-WRITE ope...
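To make the all-or-none idea concrete, here is a small hedged C# sketch, assuming a hypothetical Accounts table and connection string, in which two dependent UPDATEs form one LUW that either fully commits or fully rolls back:

using System;
using System.Data.SqlClient;

class LuwDemo
{
    static void Main()
    {
        // Hypothetical connection string and table.
        using (var conn = new SqlConnection("Server=.;Database=Demo;Integrated Security=true"))
        {
            conn.Open();
            SqlTransaction tx = conn.BeginTransaction(); // start of the LUW

            try
            {
                new SqlCommand("UPDATE Accounts SET Balance = Balance - 100 WHERE Id = 1",
                               conn, tx).ExecuteNonQuery();
                new SqlCommand("UPDATE Accounts SET Balance = Balance + 100 WHERE Id = 2",
                               conn, tx).ExecuteNonQuery();

                tx.Commit();   // both operations succeeded: changes become durable
            }
            catch
            {
                tx.Rollback(); // any failure: DB returns to the last COMMIT state
                throw;
            }
        }
    }
}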

RETURN EXCEPTION OBJECT FROM WCF SERVICE

While talking about creating WCF services, we always focus on the ABC (Address, Binding, Contract) of the endpoint. When a service executes successfully, we return the corresponding object, which is the return data type of the WCF method. However, when a service results in an exception, we want to throw this error to the caller for better error analysis. To facilitate this, we have the [FaultContract] attribute, to be defined on the method in the service interface. Along with [DataContract], [ServiceContract] and [DataMember], we can define [FaultContract], which specifies the type of fault object to be thrown to the caller. This ensures the caller can do better error handling. Sample code:

public interface IService
{
    [OperationContract]
    [FaultContract(typeof(MathFault))]
    int Divide(int n1, int n2);
}

Corresponding interface implementation:

public int Divide(int n1, int n2)
{
    try
    {
        return n1 / n2;
    }
    catch (DivideByZeroException)
    {
        MathFault...
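The excerpt cuts off inside the catch block; for completeness, here is a hedged sketch of how such an implementation typically finishes, assuming an illustrative MathFault data contract (the member names are hypothetical):

using System;
using System.Runtime.Serialization;
using System.ServiceModel;

[DataContract]
public class MathFault
{
    // Illustrative members; a fault contract can carry any serializable details.
    [DataMember] public string Operation { get; set; }
    [DataMember] public string ProblemType { get; set; }
}

public class Service : IService
{
    public int Divide(int n1, int n2)
    {
        try
        {
            return n1 / n2;
        }
        catch (DivideByZeroException)
        {
            // Wrap the details in the declared fault type so the caller receives
            // a typed FaultException<MathFault> rather than an unhandled CLR exception.
            var fault = new MathFault { Operation = "division", ProblemType = "divide by zero" };
            throw new FaultException<MathFault>(fault, new FaultReason("Attempted to divide by zero."));
        }
    }
}

On the client side, the caller can then catch FaultException<MathFault> and inspect the fault details for its own error handling.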

ETL SYSTEM DESIGN

ETL systems are used to Extract data from heterogeneous sources, Transform the data (merge data from various systems, de-normalize data, create conformed dimensions, compute fact tables), and finally Load the data into a target OLAP system. In other words, they should feed data into OLAP systems such that the content is ready to be modeled according to business needs and query execution is optimized. Data cleansing, validation and data quality should also be ensured by ETL systems. On the whole, ETL systems should take care of the below concerns (a small sketch of points 2 and 3 follows the list):

1. Transform data into conformed dimensions/metrics suitable for modeling according to business needs
2. Validate input data and cleanse it, to ensure quality data input to the OLAP system
3. Merge data from multiple systems, alias data integration. In general, ERP systems will have multiple master data management systems, and different user groups/departments might use multiple systems. Data from all these sources should be conso...
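As a toy illustration of the validation and merge steps above, here is a hedged C# sketch; the record types, sample rows and the country-conforming rule are all hypothetical:

using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical source record and conformed dimension types.
record SourceCustomer(string SourceSystem, string Id, string Name, string Country);
record CustomerDim(string ConformedKey, string Name, string CountryCode);

class EtlSketch
{
    static void Main()
    {
        var crm = new[] { new SourceCustomer("CRM", "C1", "ACME Corp", "United States") };
        var erp = new[] { new SourceCustomer("ERP", "900", "Acme Corp ", "US") };

        List<CustomerDim> dimension = crm.Concat(erp)
            // Validation/cleansing: drop rows with missing mandatory fields.
            .Where(c => !string.IsNullOrWhiteSpace(c.Name))
            // Merging/conforming: normalize the key so both systems agree.
            .GroupBy(c => c.Name.Trim().ToUpperInvariant())
            .Select(g => new CustomerDim(
                ConformedKey: g.Key,
                Name: g.First().Name.Trim(),
                // Hypothetical rule: map long country names to ISO codes.
                CountryCode: g.First().Country.Trim() == "United States"
                    ? "US" : g.First().Country.Trim()))
            .ToList();

        dimension.ForEach(d => Console.WriteLine($"{d.ConformedKey}: {d.CountryCode}"));
    }
}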

SAP HANA RAMBLINGS

SAP HANA is a leading IMDB (in-memory DB, not the movies one :)). It works by keeping all required data in RAM. The idea of developing an in-memory cache started with mounting pressure from BW customers for faster query execution times, and the same was the case with APO CIF queues. SAP answered by developing liveCache for APO and the BW Accelerator, both of which work by keeping data in memory. This prompted the development of a fully operational in-memory DB: SAP HANA. Cheaply available RAM is also a main reason. SAP HANA is built for parallel and distributed processing. When a query is passed to the HANA DB engine, HANA examines the query's generated execution plan and identifies the parts of the query that can be computed in parallel, thereby reducing response time. Also, HANA is deployed on infrastructure that provides disaster tolerance and high availability; SAP teams up with hardware vendors like HP, Hitachi and IBM to develop this hardware infrastructure. These vendo...

Connecting to and Consuming CRM Web Services Using a WCF Component

Being new to the MS CRM environment, and with my naive C# skills, I struggled a bit to create a WCF component to connect to and consume the CRM web services. In this blog I will go through all the required steps from start to end; many resources are available for reference in bits and pieces, so I thought of putting everything in one place. Motive behind developing the WCF component: we have users without much knowledge of CRM and its terminology, and we wanted a simplified, lite web application that our users can use with ease. We also have multiple web components that connect to CRM. Hence we zeroed in on developing a WCF component which consumes the CRM services. By hosting this WCF service, anyone with the corresponding link and credentials can connect to and consume the CRM services without bothering much about the technical details. Also, different user groups should be made to access only specific services. And instead of using the CRM DB context with LINQ queries, the CRM services make the developer's life easier. As...
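To give a flavour of what such a component wraps, here is a hedged C# sketch of connecting to the CRM organization service using the CRM 2011-era SDK assemblies; the service URL and credentials are hypothetical placeholders.

using System;
using System.ServiceModel.Description;
using Microsoft.Xrm.Sdk;
using Microsoft.Xrm.Sdk.Client;

class CrmConnectDemo
{
    static void Main()
    {
        // Hypothetical organization service URL and credentials.
        var serviceUri = new Uri("https://yourorg.crm.dynamics.com/XRMServices/2011/Organization.svc");
        var credentials = new ClientCredentials();
        credentials.UserName.UserName = "user@yourorg.onmicrosoft.com";
        credentials.UserName.Password = "password";

        using (var proxy = new OrganizationServiceProxy(serviceUri, null, credentials, null))
        {
            proxy.EnableProxyTypes();       // enable early-bound entity types
            IOrganizationService service = proxy;

            // Create a sample account record through the organization service.
            var account = new Entity("account");
            account["name"] = "Test Account";
            Guid id = service.Create(account);
            Console.WriteLine("Created account " + id);
        }
    }
}

The WCF component then exposes only the specific operations each user group needs, hiding the proxy setup behind its own service contract.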