Source Pre-load and Source Post-Load in Informatica




We can write stored procedures that carry out an action before the Source Qualifier executes and after the load on the dimension table is done.

Example: creating an index on the source table before the load and dropping the same index after the load on the target is done.

In our example, we have created two procedures in the source schema SCOTT: the first creates an index on the ENAME column of the source table EMP, and the second drops the same index.

CREATE OR REPLACE PROCEDURE P_IN_EMP_NAME
AS
BEGIN
   -- Source pre-load: create the index on EMP(ENAME) before the read
   EXECUTE IMMEDIATE 'CREATE INDEX IND_EN ON EMP (ENAME)';
END;

CREATE OR REPLACE PROCEDURE P_D_IND_EMP_ENAME
AS
BEGIN
   -- Source post-load: drop the same index once the read is finished
   EXECUTE IMMEDIATE 'DROP INDEX IND_EN';
END;
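If you like, you can check that both procedures compiled cleanly before wiring them into the mapping. A quick data-dictionary query (run as SCOTT) is sketched below; STATUS should show VALID for both objects.

-- Optional: confirm both procedures compiled without errors
SELECT object_name, object_type, status
  FROM user_objects
 WHERE object_name IN ('P_IN_EMP_NAME', 'P_D_IND_EMP_ENAME');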

Note: Do not execute these procedures manually. We will do this task in Informatica by calling the procedures one after the other.
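Optionally, the drop procedure can be made safe to re-run: if the index is already gone (for example, after a failed earlier run), a plain DROP INDEX raises ORA-01418 and the session fails. A minimal sketch of such a variant is shown below; it is not required for this example.

CREATE OR REPLACE PROCEDURE P_D_IND_EMP_ENAME
AS
   e_index_missing EXCEPTION;
   PRAGMA EXCEPTION_INIT(e_index_missing, -1418);  -- ORA-01418: specified index does not exist
BEGIN
   EXECUTE IMMEDIATE 'DROP INDEX IND_EN';
EXCEPTION
   WHEN e_index_missing THEN
      NULL;  -- index already absent, nothing to drop
END;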

Steps:

Define a mapping M_SRC_PRE_POST_LOAD.

Drag the source (EMP) and the target DIM_EMP into the Mapping Designer.

Create a Stored Procedure transformation T_SP_PRESOURCE_LOAD.

Import the stored procedure P_IN_EMP_NAME.

Edit the transformation; it has no ports.

In the Properties tab, set Connection Information to MEEN_SRC.

Set Stored Procedure Type to Source Pre-load.

Set the Call Text property to P_IN_EMP_NAME().

Note: Specify the execution order if there is more than one stored procedure in the source pre-load.



Similarly, create a Stored Procedure transformation T_SP_POSTSOURCE_LOAD.

Import the procedure P_D_IND_EMP_ENAME.


Edit the transformation; again, there are no ports.

Set the properties:

Connection Information: MEEN_SRC

Call Text: P_D_IND_EMP_ENAME()

Stored Procedure Type: Source Post-load



Project the required ports from the Source Qualifier (SQ) to the target DIM_EMP.

Save the repository.

Define the workflow.

Assign the connections (source, target, and transformation).


Save the repository.

Start the workflow.

Observe the Session Log.
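The session log should show both stored procedure calls. You can also check the data dictionary as SCOTT after the session completes; a minimal check is sketched below, where no rows returned means the source post-load procedure dropped IND_EN as expected.

-- Run as SCOTT after the session finishes; no rows means
-- the source post-load procedure dropped IND_EN.
SELECT index_name, table_name, status
  FROM user_indexes
 WHERE index_name = 'IND_EN';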
