
Troubleshooting Normalizer Transformations

I cannot edit the ports in my Normalizer transformation when using a relational source. 

When you create ports manually, you must do so on the Normalizer tab in the transformation, not the Ports tab. 

Importing a COBOL file failed with numerous errors. What should I do? 

Check the file header to see if it follows the COBOL standard, including spaces, tabs, and end-of-line characters. The header should be similar to the following: 

identification division.
program-id. mead.
environment division.
select file-one assign to "fname".
data division.
file section.
fd FILE-ONE.

The import parser does not handle hidden characters or extra spacing well. Use a text-only editor, such as the DOS edit command, to change the COBOL file. Do not use Notepad or WordPad. 
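If you are not sure whether a file contains hidden characters, a short script can flag them before you retry the import. The following Python sketch is only an illustration, not part of PowerCenter; the file name "copybook.cbl" is a placeholder.

# Sketch: flag bytes that commonly break a COBOL copybook import.
# "copybook.cbl" is a placeholder file name.
import sys

SUSPECTS = {0x09: "tab", 0x0D: "carriage return", 0x0C: "form feed"}

def scan(path):
    with open(path, "rb") as f:
        for lineno, raw in enumerate(f.read().split(b"\n"), start=1):
            for col, byte in enumerate(raw, start=1):
                if byte in SUSPECTS:
                    print(f"line {lineno}, col {col}: {SUSPECTS[byte]}")
                elif byte < 0x20 or byte > 0x7E:
                    print(f"line {lineno}, col {col}: hidden or non-ASCII byte {byte:#04x}")
            if raw != raw.rstrip():
                print(f"line {lineno}: trailing whitespace")

if __name__ == "__main__":
    scan(sys.argv[1] if len(sys.argv) > 1 else "copybook.cbl")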

A session that reads binary data completed, but the information in the target table is incorrect. 

Open the session in the Workflow Manager, edit the session, and check that the EBCDIC/ASCII setting in the source file format is correct. The number of bytes to skip between records must be set to 0. 
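When the flag is wrong, the server decodes EBCDIC bytes with an ASCII code page (or the reverse), producing plausible-looking but incorrect values. A minimal Python sketch of the difference, assuming the common cp037 EBCDIC code page and an invented 20-byte fixed record length:

# Sketch: decoding the same fixed-width records as EBCDIC vs. ASCII.
# "cust.dat" and the 20-byte record length are assumptions for illustration.
RECLEN = 20

with open("cust.dat", "rb") as f:
    data = f.read()

# Bytes to skip between records is 0: records sit back to back.
for i in range(0, len(data), RECLEN):
    record = data[i:i + RECLEN]
    print(record.decode("cp037"))                 # EBCDIC: correct text
    # print(record.decode("ascii", "replace"))    # ASCII: garbage for EBCDIC data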

I have a COBOL field description that uses a non-IBM COMP type. How should I import the source? 

In the source definition, clear the IBM COMP option. 

In my mapping, I use one Expression transformation and one Lookup transformation to modify two output ports from the Normalizer transformation. The mapping concatenates them into a single transformation. All the ports are at the same level, which does not violate the Ni-or-1 rule. When I check the data loaded in the target, it is incorrect. Why is that? 

You can concatenate only ports from level one. Remove the concatenation. 
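The underlying problem is row cardinality. A port below level one belongs to a multiple-occurring record, so the Normalizer emits several rows for it per level-one row, and the two branches of the mapping no longer line up row for row. A rough Python illustration with an invented record layout:

# One master row with a detail field that occurs three times.
master = {"cust_id": 101, "name": "SMITH"}
details = ["JAN", "FEB", "MAR"]   # multiple-occurring (level 2) column

# The level-1 ports yield one row...
level1_rows = [(master["cust_id"], master["name"])]
# ...while the level-2 port yields three.
level2_rows = [(master["cust_id"], m) for m in details]

print(len(level1_rows), "vs", len(level2_rows))  # 1 vs 3: cannot be joined row for row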


If you connect the ports directly from the Normalizer transformation to targets, you connect the records from HST_MTH, represented in the Normalizer transformation, to their own target definition, distinct from any other target that may appear in the mapping. 

3. Open the new Normalizer transformation. 

4. Select the Ports tab and review the ports in the Normalizer transformation. 

5. Click the Normalizer tab to review the original organization of the COBOL source. 

This tab contains the same information as the Columns tab of the source definition for this COBOL source. However, you cannot modify the field definitions in the Normalizer transformation. If you need to make modifications, open the source definition in the Source Analyzer. 

6. Select the Properties tab and enter the following settings (a sketch of the Reset and Restart behaviors follows this procedure): 

Reset: Resets the generated key value after the session finishes to its original value.
Restart: Restarts the generated key values from 1 every time you run a session.
Tracing level: Determines the amount of information about this transformation that the PowerCenter Server writes to the session log. You can override this tracing level when you configure a session.


7. Click OK.
8. Connect the Normalizer transformation to the rest of the mapping.
If you have denormalized data for which the Normalizer transformation has created key values, connect the ports representing the repeated data and the output port for the generated keys to a different portion of the data flow in the mapping. Ultimately, you may want to write these values to different targets.
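The Reset and Restart settings in the Properties table above are two policies for the same generated key sequence. The following Python sketch models the distinction; the class and method names are illustrative, not PowerCenter internals.

class GeneratedKeySequence:
    # Illustrative model of the Normalizer's generated key sequence.
    def __init__(self, start=1):
        self._initial = start   # value the sequence held before the session
        self._next = start

    def next_key(self):
        key = self._next
        self._next += 1
        return key

    def session_boundary(self, reset=False, restart=False):
        if restart:
            self._next = 1              # Restart: begin at 1 on every run
        elif reset:
            self._next = self._initial  # Reset: return to the pre-session value

seq = GeneratedKeySequence(start=100)
print([seq.next_key() for _ in range(3)])   # [100, 101, 102]
seq.session_boundary(reset=True)
print([seq.next_key() for _ in range(3)])   # [100, 101, 102] again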

To add a Normalizer transformation to a mapping:
1. In the Mapping Designer, choose Transformation-Create. Select Normalizer transformation. Enter a name for the Normalizer transformation. Click Create.
The naming convention for Normalizer transformations is NRM_TransformationName. The Designer creates the Normalizer transformation.
If your mapping contains a COBOL source, and you do not have the option set to automatically create a source qualifier, the Create Normalizer Transformation dialog box displays. For more information on this option, see “Using the Designer” in the Designer Guide.
2. If the Create Normalizer Transformation dialog box displays, select the Normalizer transformation type.
3. Select the source for this transformation. Click OK.
4. Open the new Normalizer transformation.
5. Select the Normalizer tab and add new output ports.
Add a port corresponding to each column in the source record that contains denormalized data. The new ports allow only the number or string datatypes. You can create new ports only on the Normalizer tab, not the Ports tab.
Using the level controls in the Normalizer transformation, identify which ports belong to the master record and which belong to the detail records. Adjust these ports so that the level setting for detail ports is higher than the level setting for the master record. For example, if ports from the master record are at level 1, the detail ports are at level 2. When you adjust the level setting for the first detail port, the Normalizer transformation creates a heading for the detail record.

Enter the number of times detail records repeat within each master record.
6. After configuring the output ports, click Apply.
The Normalizer transformation creates all the input and output ports needed to connect master and detail records to the rest of the mapping. In addition, the Normalizer transformation creates a generated key column for joining master and detail records. When you run a session, the PowerCenter Server generates unique IDs for these columns. A sketch at the end of this procedure illustrates the resulting master and detail rows.
7. Select the Properties tab and enter the following settings: 

Reset: Resets the generated key sequence values at the end of the session.
Restart: Starts the generated key sequence values from 1.
Tracing level: Determines the amount of information the PowerCenter Server writes to the session log. You can override this tracing level when you configure a session.

8. Click OK.
9. Connect the Normalizer transformation to the rest of the mapping.
10. Choose Repository-Save.
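To picture what the finished transformation does at run time, the sketch below pivots denormalized records into one master row plus one detail row per occurrence, joined on the generated key. The record layout and field names are invented for the example; the real server assigns the key values itself.

from itertools import count

# Invented denormalized records: master fields plus orders occurring up to 3 times.
source_records = [
    {"cust_id": "C01", "name": "SMITH", "orders": [("O1", 50.0), ("O2", 75.0)]},
    {"cust_id": "C02", "name": "JONES", "orders": [("O3", 10.0)]},
]

generated_key = count(1)   # stand-in for the server-generated unique IDs
master_rows, detail_rows = [], []

for rec in source_records:
    gk = next(generated_key)
    master_rows.append((gk, rec["cust_id"], rec["name"]))
    for order_id, amount in rec["orders"]:
        detail_rows.append((gk, order_id, amount))   # same key joins detail to master

print(master_rows)   # [(1, 'C01', 'SMITH'), (2, 'C02', 'JONES')]
print(detail_rows)   # [(1, 'O1', 50.0), (1, 'O2', 75.0), (2, 'O3', 10.0)]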
