
Monday, 9 September 2013

Troubleshooting Normalizer Transformations



________________________________________

I cannot edit the ports in my Normalizer transformation when using a relational source. 

When you create ports manually, you must do so on the Normalizer tab in the transformation, not the Ports tab. 

Importing a COBOL file failed with a lot of errors. What should I do? 

Check the file heading to see whether it follows the COBOL standard, including spaces, tabs, and end-of-line characters. The header should be similar to the following: 

identification division.

program-id. mead.

environment division.

select file-one assign to "fname".

data division.

file section.

fd FILE-ONE.

The import parser does not handle hidden characters or extra spacing well. Be sure to use a text-only editor, such as the DOS edit command, to make changes to the COBOL file. Do not use Notepad or WordPad. 

A session that reads binary data completed, but the information in the target table is incorrect. 

Open the session in the Workflow Manager, edit the session, and check the source file format to verify that EBCDIC or ASCII is set correctly. Also verify that the number of bytes to skip between records is set to 0. 
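As a rough illustration (this sketch is not from the PowerCenter documentation, and the record content is made up), the following Python lines show why a wrong EBCDIC/ASCII setting corrupts every field: the same bytes read back cleanly with an EBCDIC codec but become unreadable when treated as ASCII/ANSI.

# Illustrative only: cp037 is Python's codec for US EBCDIC; the record layout is hypothetical.
record = "CUST0001JOHN SMITH".encode("cp037")   # binary record as a mainframe would write it

print(record.decode("cp037"))    # correct EBCDIC setting: 'CUST0001JOHN SMITH'
print(record.decode("latin-1"))  # read as ASCII/ANSI instead: unreadable characters

# A non-zero "number of bytes to skip between records" has a similar effect:
# every field boundary shifts by that offset, so the target data looks garbled.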

I have a COBOL field description that uses a non-IBM COMP type. How should I import the source? 

In the source definition, clear the IBM COMP option. 

In my mapping, I use one Expression transformation and one Lookup transformation to modify two output ports from the Normalizer transformation. The mapping concatenates them into a single transformation. All the ports are under the same level, which does not violate the Ni-or-1 rule. When I check the data loaded in the target, it is incorrect. Why is that? 

You can only concatenate ports from level one. Remove the concatenation. 


If you connect the ports directly from the Normalizer transformation to targets, you connect the records from HST_MTH, represented in the Normalizer transformation, to their own target definition, distinct from any other target that may appear in the mapping. 

3. Open the new Normalizer transformation. 

4. Select the Ports tab and review the ports in the Normalizer transformation. 

5. Click the Normalizer tab to review the original organization of the COBOL source. 

This tab contains the same information as in the Columns tab of the source definition for this COBOL source. However, you cannot modify the field definitions in the Normalizer transformation. If you need to make modifications, open the source definition in the Source Analyzer. 

6. Select the Properties tab and enter the following settings: 

Reset: Resets the generated key value to its original value after the session finishes.
Restart: Restarts the generated key values from 1 every time you run a session.
Tracing level: Determines the amount of information about this transformation that the PowerCenter Server writes to the session log. You can override this tracing level when you configure a session.


7. Click OK.
8. Connect the Normalizer transformation to the rest of the mapping.
If you have denormalized data for which the Normalizer transformation has created key values, connect the ports representing the repeated data and the output port for the generated keys to a different portion of the data flow in the mapping. Ultimately, you may want to write these values to different targets.
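As a rough sketch of the data flow described above (the column names and the GK_ prefix here are illustrative, not taken from any particular mapping), the master-level columns feed one target while each repeated column becomes its own detail row carrying the generated key, so the two targets can be joined later:

# Hypothetical denormalized source row with four repeated sales columns.
denormalized = {"ACCOUNT": "A-100", "NAME": "Acme",
                "SALES_Q1": 10, "SALES_Q2": 20, "SALES_Q3": 15, "SALES_Q4": 30}

gk_sales = 1  # generated key value assigned to this source row

# Master flow: one row per source record, carrying the generated key.
master_row = {"GK_SALES": gk_sales,
              "ACCOUNT": denormalized["ACCOUNT"],
              "NAME": denormalized["NAME"]}

# Detail flow: one row per repeated column, joined back to the master by the key.
detail_rows = [{"GK_SALES": gk_sales, "QUARTER": q, "SALES": denormalized[f"SALES_Q{q}"]}
               for q in range(1, 5)]

print(master_row)   # goes to the master target
print(detail_rows)  # goes to a separate detail target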

To add a Normalizer transformation to a mapping:
  1. In the Mapping Designer, choose Transformation-Create. Select Normalizer transformation. Enter a name for the Normalizer transformation. Click Create.
The naming convention for Normalizer transformations is NRM_TransformationName. The Designer creates the Normalizer transformation.
If your mapping contains a COBOL source, and you do not have the option set to automatically create a source qualifier, the Create Normalizer Transformation dialog box displays. For more information on this option, see “Using the Designer” in the Designer Guide.
  2. If the Create Normalizer Transformation dialog box displays, select the Normalizer transformation type.
  3. Select the source for this transformation. Click OK.
  4. Open the new Normalizer transformation.
  5. Select the Normalizer tab and add new output ports.
Add a port corresponding to each column in the source record that contains denormalized data. The new ports allow only the number or string datatypes. You can create new ports only on the Normalizer tab, not the Ports tab.
Using the level controls in the Normalizer transformation, identify which ports belong to the master and detail records. Adjust these ports so that the level setting for detail ports is higher than the level setting for the master record. For example, if ports from the master record are at level 1, the detail ports are at level 2. When you adjust the level setting for the first detail port, the Normalizer transformation creates a heading for the detail record.

Enter the number of times detail records repeat within each master record.
  6. After configuring the output ports, click Apply.
The Normalizer transformation creates all the input and output ports needed to connect master and detail records to the rest of the mapping. In addition, the Normalizer transformation creates a generated key column for joining master and detail records. When you run a session, the PowerCenter Server generates unique IDs for these columns (a rough sketch of this output follows the procedure).
  7. Select the Properties tab and enter the following settings:
Reset: Resets the generated key sequence values at the end of the session.
Restart: Starts the generated key sequence values from 1.
Tracing level: Determines the amount of information the PowerCenter Server writes to the session log. You can override this tracing level when you configure a session.

  8. Click OK.
  9. Connect the Normalizer transformation to the rest of the mapping.
  10. Choose Repository-Save.
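To make the level and occurs settings in step 5 more concrete, here is a small Python sketch. It is illustrative only: the record layout is hypothetical, and the GK_/GCID_ names follow the usual PowerCenter convention for generated keys and generated column IDs. A master port at level 1 with a detail port at level 2 that occurs three times yields one output row per occurrence, with the generated column ID recording which occurrence each row came from.

# Hypothetical source record: one master value and a detail value that occurs 3 times.
source_record = {
    "STORE_ID": "S01",              # master port, level 1
    "MONTH_SALES": [100, 120, 90],  # detail port, level 2, occurs 3 times
}

generated_key = 1  # unique ID the PowerCenter Server generates at run time

# One output row per occurrence; the GCID column identifies the occurrence (1..occurs).
output_rows = [
    {"STORE_ID": source_record["STORE_ID"],
     "MONTH_SALES": value,
     "GK_MONTH_SALES": generated_key,
     "GCID_MONTH_SALES": index + 1}
    for index, value in enumerate(source_record["MONTH_SALES"])
]

for row in output_rows:
    print(row)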
