
Source System Parameters in DAC
A parameter represents an attribute of the ETL process that is not hard coded for the source tables; instead, it defines values such as the execution period once, for all ETL environments.
Source system parameters apply to all tasks in a source system container.
Source system parameters take priority over global external parameters.
Parameters are used throughout the ETL process, from source tables to target tables, to define the various periods over which data is extracted, giving the flexibility to meet changing business analysis requirements.
In Oracle BI Applications, you can set default parameter values on dimension tables to track the changes that occur in the extracted data over a period of time. If you do not want to track changes on a particular table, change the parameter value from "Y" (yes) to "N" (no).

Where we can see the source system parameters:
In the DAC Client, go to the Design tab and select Source System Parameters.

For a source system parameter we can define the parameter name, its load type, its data type, and the variable name with its value.
Load Type: Defines whether the ETL process is a full load or an incremental load for the source system.
Data Type: Defines the data type of the source system parameter. The data types that can be defined are listed below, with a usage sketch after the list:
Text
Timestamp
SQL
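
To show where such a parameter ends up, here is a minimal sketch, assuming an Informatica-style mapping in which DAC parameters are referenced with a $$ prefix; the parameter, table, and column names are hypothetical:

    -- Hypothetical SQL override in an extract mapping.
    -- $$COUNTRY_CODE is assumed to be a Text source system parameter
    -- defined in DAC; DAC substitutes its value before the task runs.
    SELECT CUSTOMER_ID,
           CUSTOMER_NAME,
           COUNTRY
    FROM   S_CUSTOMER
    WHERE  COUNTRY = '$$COUNTRY_CODE'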

Parameter Value:
We have to define the parameter value, and the value is classified as either static or runtime (dynamic).
Source system and task parameters that have either a Text or Timestamp data type can be specified as either static or runtime.
Static Variable:
Specifies a value that remains constant for all ETL runs. The values of static parameters are user defined; they stay the same from run to run until you change them.
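As a sketch of static usage, assume a static Timestamp parameter named $$INITIAL_EXTRACT_DATE (a common Oracle BI Applications convention) that bounds the full-load extract; the table and column names are illustrative:

    -- Hypothetical full-load extract bounded by a static parameter.
    -- $$INITIAL_EXTRACT_DATE is assumed to hold a fixed, user-defined
    -- value such as 01/01/2005 that DAC substitutes as literal text.
    SELECT *
    FROM   S_ORDER
    WHERE  LAST_UPD >= TO_DATE('$$INITIAL_EXTRACT_DATE', 'MM/DD/YYYY HH24:MI:SS')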
Run Time (Dynamic):
Specifies that the value will be provided by DAC during the execution of the task. The values of runtime parameters relate to runtime data; they are used in the ETL logic but cannot be predicted in advance, such as the process ID, the last refresh timestamp, the ETL start timestamp, and so on.
Some of the runtime variables predefined in DAC for the Text data type are shown below, followed by a usage sketch.
· @DAC_CURRENT_PROCESS_ID. Returns the current process ID.
· @DAC_DATASOURCE_NUM_ID. Returns the data source number ID.
· @DAC_DATATARGET_NUM_ID. Returns the target database number ID.
· @DAC_EXECUTION_PLAN_NAME. Returns the name of the current execution plan.
· @DAC_EXECUTION_PLAN_RUN_NAME. Returns the run name of the current execution plan.
· @DAC_READ_MODE. Returns the "read mode" while running the task. The possible values are FULL and INCREMENTAL.
· @DAC_SOURCE_DBTYPE. Returns the task's primary source database type.
· @DAC_SOURCE_TABLE_OWNER. Returns the table owner of the source database.
· @DAC_SOURCE_PRUNE_DAYS. Returns the number of prune days for the primary source as defined in the execution plan connectivity parameters.
· @DAC_SOURCE_PRUNE_MINUTES. Returns the number of prune minutes for the primary source as defined in the execution plan connectivity parameters.
· @DAC_TARGET_DBTYPE. Returns the task's primary target database type.
· @DAC_TARGET_PRUNE_DAYS. Returns the number of prune days for the primary target as defined in the execution plan connectivity parameters.
· @DAC_TARGET_PRUNE_MINUTES. Returns the number of prune minutes for the primary target as defined in the execution plan connectivity parameters.
· @DAC_TARGET_TABLE_OWNER. Returns the table owner of the target database.
· @DAC_TASK_NAME. Returns the name of the task that is currently running.
· @DAC_TASK_NUMBER_OF_LOOPS. Returns the task's total number of loops as defined in the task's extended property.
· @DAC_TASK_RUN_INSTANCE_NUMBER. Returns the instance number of the task currently running.
· @DAC_TASK_RUN_INSTANCE_NUMBER_DESC. Returns the instance number of the task currently running, in descending order.
· @DAC_TASK_FULL_COMMAND. Returns the name of the Informatica workflow for a task's full load command.
· @DAC_TASK_INCREMENTAL_COMMAND. Returns the name of the Informatica workflow for a task's incremental load command.
· @DAC_WRITE_MODE. Returns the "write mode" while running the task. The possible values are FULL and INCREMENTAL.
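
As a usage sketch for a Text runtime variable, assume a source system parameter named $$DATASOURCE_NUM_ID whose runtime value is @DAC_DATASOURCE_NUM_ID; an extract mapping can then stamp every row with the data source number (table and column names are illustrative):

    -- Hypothetical extract that stamps rows with the data source number.
    -- $$DATASOURCE_NUM_ID is assumed to be defined in DAC as a runtime
    -- Text parameter with the value @DAC_DATASOURCE_NUM_ID.
    SELECT PRODUCT_ID,
           PRODUCT_NAME,
           $$DATASOURCE_NUM_ID AS DATASOURCE_NUM_ID
    FROM   S_PRODUCT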
DAC Variables for the Timestamp Data Type
The following variables are available when you define a runtime parameter of the Timestamp data type; a usage sketch follows the list.
· @DAC_ETL_START_TIME. Returns the timestamp for the start time of the ETL process.
· @DAC_ETL_START_TIME_FOR_SOURCE. Returns the start time of the ETL process, adjusted to the time zone of the source database.
· @DAC_ETL_START_TIME_FOR_TARGET. Returns the start time of the ETL process, adjusted to the time zone of the target database.
· @DAC_ETL_PRUNED_START_TIME. Returns the current execution plan's actual start time minus the prune minutes.
· @DAC_ETL_PRUNED_START_TIME_FOR_SOURCE. Returns the current execution plan's actual start time adjusted to the source database time zone, minus the prune minutes.
· @DAC_ETL_PRUNED_START_TIME_FOR_TARGET. Returns the current execution plan's actual start time adjusted to the target database time zone, minus the prune minutes.
· @DAC_CURRENT_TIMESTAMP. Returns the current timestamp of the DAC Server.
· @DAC_SOURCE_REFRESH_TIMESTAMP. Returns the minimum of the last refresh timestamps of the task's primary and auxiliary source tables.
· @DAC_TARGET_REFRESH_TIMESTAMP. Returns the minimum of the last refresh timestamps of the task's primary and auxiliary target tables.
· @DAC_SOURCE_PRUNED_REFRESH_TIMESTAMP. Returns the minimum of the last refresh timestamps of the task's primary and auxiliary source tables, minus the prune minutes.
· @DAC_TARGET_PRUNED_REFRESH_TIMESTAMP. Returns the minimum of the last refresh timestamps of the task's primary and auxiliary target tables, minus the prune minutes.
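
These Timestamp variables are what drive incremental extracts. As a sketch, assume a source system parameter named $$LAST_EXTRACT_DATE (a common Oracle BI Applications convention) defined as a runtime Timestamp with the value @DAC_SOURCE_PRUNED_REFRESH_TIMESTAMP; an incremental mapping can then pick up only the rows changed since the last refresh (table and column names are illustrative):

    -- Hypothetical incremental extract driven by a runtime Timestamp parameter.
    -- $$LAST_EXTRACT_DATE is assumed to resolve to
    -- @DAC_SOURCE_PRUNED_REFRESH_TIMESTAMP at run time.
    SELECT *
    FROM   S_ORDER
    WHERE  LAST_UPD > TO_DATE('$$LAST_EXTRACT_DATE', 'MM/DD/YYYY HH24:MI:SS')

Because the pruned variant subtracts the prune minutes from the last refresh timestamp, rows committed slightly out of order around the previous extract are still picked up on the next run.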
