
How does the Informatica server process work with mapping variables?




Informatica primarily uses the Load Manager and the Data Transformation Manager
(DTM) to perform extraction, transformation, and loading. The Load Manager reads
the parameters and variables related to the session, mapping, and server, and
passes the mapping parameter and variable information to the DTM. The DTM then
uses this information to perform the data movement from source to target.
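A common way to supply those mapping parameter and variable values is through a parameter file that the Load Manager reads when the session starts. A minimal sketch of such a file is shown below; the folder, workflow, session, and variable names here are hypothetical:

```
; Section header identifies folder, workflow, and session
[MyFolder.WF:wf_daily_load.ST:s_m_load_target]
; Mapping parameters/variables are prefixed with $$
$$LastRunDate=2020-01-01
$$SourceSystem=ORDERS
```

The session points at this file, and at run time the DTM substitutes `$$LastRunDate` and `$$SourceSystem` wherever they are referenced in the mapping.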


