Little tips and tricks I stumbled upon randomly

Friday, May 30, 2014

Informatica Certification - Informatica PowerCenter Data Integration 9.x: Developer Specialist Certification Preparation Guide

Now that I have earned my Informatica PowerCenter Data Integration 9.x: Developer Specialist certificate, I want to share my preparation experience with everyone who wants to take the same path.

First of all, the certification exam is really hard. (According to the official forums, the success rate is about 30%.) For every question there are at least two answers that are very close to each other. Double-check before making your final decision.

Although there are some practical questions, practical knowledge alone is not enough. You need to read and study.

The official skill set is a good place to start: https://community.informatica.com/docs/DOC-2601

Although I am not the original creator, here is my list of important topics:


There are only a few sample questions floating around the web, so choose wisely when to attempt them.
Only if you are not going to take the exam soon, I recommend testing yourself with this practice test (https://community.informatica.com/message/99395#99395).
Your target should be at least 80% there; the actual exam is much harder. If there is not enough time to forget these questions, do not look at them before completing the reading list. You can only fool yourself, not Informatica University.



Finally, my memo sheet is below. Read through it just before the exam. I wish all participants the best.

MEMO SHEET

Active Transformations
  • Change the number of rows that pass through the transformation.
  • Change the transaction boundary. (Transaction Control: commit, roll back)
  • Change the row type (Update strategy)


Custom transformations can be active or passive, but always connected.
               
Unconnected Transformations
  • External Procedure
  • Lookup
  • Stored Procedure


Use local variables in Aggregator, Expression, and Rank transformations to improve performance.

You might use variables to complete the following tasks:
  • Temporarily store data.
  • Simplify complex expressions.
  • Store values from prior rows.
  • Capture multiple return values from a stored procedure.
  • Compare values.
  • Store the results of an unconnected Lookup transformation.
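
As a sketch of the last few uses, the Expression transformation ports below (all names are illustrative) compare each row with the prior row and store an unconnected Lookup result:

```
-- Expression transformation ports, evaluated top to bottom.
-- When v_DELTA is evaluated, v_PREV_SALES still holds the previous row's value.
v_DELTA      (variable) = SALES - v_PREV_SALES
v_PREV_SALES (variable) = SALES                            -- remember current row
v_CUST_NAME  (variable) = :LKP.lkp_customer(CUSTOMER_ID)   -- unconnected Lookup result
o_DELTA      (output)   = v_DELTA
```

Port order matters here: because v_DELTA appears before v_PREV_SALES, it reads the value saved on the previous row before it is overwritten.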


Variable Initialization
The Integration Service does not set the initial value for variables to NULL. Instead, the Integration Service uses the following guidelines to set initial values for variables:
  • Zero for numeric ports
  • Empty strings for string ports
  • 01/01/1753 for Date/Time ports with PMServer 4.0 date handling compatibility disabled
  • 01/01/0001 for Date/Time ports with PMServer 4.0 date handling compatibility enabled


Target
  • You can create targets  with multibyte character sets.
  • Target code pages must be a superset of source code pages when you run a session.


COBOL
  • When the Designer analyzes a COBOL file, it creates a different column for each OCCURS statement in the file.
  • COBOL uses REDEFINES statements to build the description of one record based on the definition of another record. When you import the COBOL source, the Designer creates a single source that includes REDEFINES.

               
DTM
The DTM process is also known as the pmdtm process.

If the workflow uses a parameter file, the PowerCenter Integration Service process sends the parameter file to the DTM when it starts the DTM. The DTM creates and expands session-level, service-level, and mapping-level variables and parameters.            

The DTM creates logs for the session.

The PowerCenter Integration Service verifies that the source query, target query, lookup database query,
and stored procedure call text convert from the source, target, lookup, or stored procedure data code page
to the UCS-2 character set without loss of data in conversion. If the PowerCenter Integration Service encounters an error when converting data, it writes an error message to the session log.

After validating the session code pages, the DTM verifies permissions for connection objects used in the session.

The DTM verifies that the user who started or scheduled the workflow has execute permissions for connection objects associated with the session.

The DTM sends a request to the PowerCenter Integration Service process to start worker DTM processes on other nodes when the session is configured to run on a grid.

After verifying connection object permissions, the DTM runs pre-session shell commands. The DTM then runs pre-session stored procedures and SQL commands.

After initializing the session, the DTM uses reader, transformation, and writer threads to extract, transform, and load data.

After the DTM runs the processing threads, it runs post-session SQL commands and stored procedures.

When the session finishes, the DTM composes and sends email that reports session completion or failure.
If the DTM terminates abnormally, the PowerCenter Integration Service process sends post-session email.

For one partition, the master thread creates one reader, one transformation, and one writer thread to process the data.

Grid
  • To prevent concurrent writes to the Model repository, the master service process runs all jobs that write to the Model repository.
  • The worker service processes run all other types of jobs. If a worker service process is selected to run a job, but all the threads of the node are busy, the next worker service process is selected instead.

Note: The master service process also acts as a worker service process and completes jobs.

Parameters 
The Integration Service looks for the start value of a variable in the following order:
  1. Value in parameter file
  2. Value saved in the repository (if the variable is persistent)
  3. User-specified default value
  4. Datatype default value

 Variables
  1. Value in session parameter file
  2. Value in workflow parameter file
  3. Value in pre-session variable assignment
  4. Value saved in the repository
  5. Initial value
  6. Datatype default value 

The parameter file you set with pmcmd overrides the parameter file in the session or workflow properties.

Variables may change during a session or workflow run. Parameters cannot be changed at run time.

If the variable is persistent, the Integration Service saves the current value of the variable to the repository
at the end of a successful workflow run. If the workflow fails to complete, the Integration Service does not update the value of the variable in the repository.

Worklet variables only persist when you run the same workflow. A worklet variable does not retain its value when you use instances of the worklet in different workflows. For each worklet instance, you can override the initial value of the worklet variable by assigning a workflow variable to it.

Rules and Guidelines for Using Worklet Variables
Use the following rules and guidelines when you work with worklet variables:
  • You cannot use parent workflow variables in the worklet.
  • You can assign the value of a workflow variable to a worklet variable to initialize it.
  • You cannot use user-defined worklet variables in the parent workflow.
  • You can use predefined worklet variables in the parent workflow, just as you use predefined variables for other tasks in the workflow.

You cannot assign parameters and variables in reusable sessions.

Types of parameters and variables include service variables, service process variables, workflow and worklet variables, session parameters, and mapping parameters and variables

The Integration Service ignores lines that are not valid headings and do not contain an equals sign character (=). For example:
; This is a valid comment because this line contains no equals sign.
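
As a sketch (the folder, workflow, session, and parameter names are all made up), a parameter file combines such headings and assignments like this:

```
[Global]
$PMSessionLogCount=5

; lines without an equals sign, like this one, are ignored
[MyFolder.WF:wf_load_sales]
$$LoadDate=05/30/2014

[MyFolder.WF:wf_load_sales.ST:s_m_load_sales]
$InputFile_sales=/data/in/sales.dat
```

Values under a more specific heading (session) take precedence over the workflow-level and [Global] sections for that session.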

Debugger States
  • Initializing State
  • Running State
  • Paused State


Stored Procedure transformation
You might use stored procedures to complete the following tasks:
  • Check the status of a target database before loading data into it.
  • Determine if enough space exists in a database.
  • Perform a specialized calculation.
  • Drop and recreate indexes.


Transaction Control Transformation
expressions:
  • TC_CONTINUE_TRANSACTION (Default Value)
  • TC_COMMIT_BEFORE
  • TC_COMMIT_AFTER
  • TC_ROLLBACK_BEFORE
  • TC_ROLLBACK_AFTER

 If the transaction control expression evaluates to a value other than commit, rollback, or continue, the Integration Service fails the session.
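
A typical transaction control expression is an IIF that returns one of these constants; the port names below are hypothetical:

```
-- Commit everything read so far whenever a new file name arrives;
-- otherwise keep the current transaction open.
IIF(FILE_NAME <> v_PREV_FILE_NAME, TC_COMMIT_BEFORE, TC_CONTINUE_TRANSACTION)
```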

The Transaction Control transformation becomes ineffective for downstream transformations or targets
if you put a transformation that drops incoming transaction boundaries after it.
This includes any of the following active sources or transformations with the All Input level transformation scope:
  • Aggregator transformation
  • Joiner transformation
  • Rank transformation
  • Sorter transformation
  • Custom transformation configured to generate transactions
  • Transaction Control transformation

 Mappings with Transaction Control transformations that are ineffective for targets may be valid or invalid. When you save or validate the mapping, the Designer displays a message indicating which Transaction Control transformations are ineffective for targets.

Transaction Control transformations connected to any target other than relational, XML, or dynamic MQSeries targets are ineffective for those targets.

Java Transformation
A Java transformation always has one input group and one output group. The transformation is not valid if it has multiple input or output groups.
Read the Java transformation API methods twice; they come up in the exam.

Lookup Transformation:
Lookup Policy on Multiple Match
Determines which rows to return when the Lookup transformation finds multiple rows that match the lookup condition. Select one of the following values:
  • Report Error. The Integration Service reports an error and does not return a row.
  • Use First Value. Returns the first row that matches the lookup condition.
  • Use Last Value. Returns the last row that matches the lookup condition.
  • Use All Values. Returns all matching rows.
  • Use Any Value. The Integration Service returns the first value that matches the lookup condition. It creates an index based on the key ports instead of all Lookup transformation ports. 

When you cache the lookup source, the Integration Service generates an ORDER BY clause for each column in the lookup cache to determine the first and last row in the cache. The Integration Service then sorts each lookup source column in ascending order.
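
So for a cached lookup on a hypothetical CUSTOMERS table with two lookup ports, the generated cache query would look roughly like:

```
SELECT CUSTOMER_ID, CUSTOMER_NAME
FROM CUSTOMERS
ORDER BY CUSTOMER_ID, CUSTOMER_NAME
```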

Dynamic Cache (NewLookupRow port):
0 - Do nothing
1 - Insert the row into the cache
2 - Update the row in the cache
THERE IS NO DELETE

WorkFlow Suspension
If one or more tasks are still running in the workflow when a task fails, the Integration Service stops running the failed task and continues running tasks in other paths. The Workflow Monitor displays the status of the workflow as "Suspending"

When you recover a workflow, the Integration Service restarts the failed tasks and continues evaluating the rest of the tasks in the workflow.

Workflow Manager Windows
  • Navigator
  • Workspace
  • Output
  • Overview


Workflow Monitor windows
  • Navigator window. Displays monitored repositories, Integration Services, and repository objects.
  • Output window. Displays messages from the Integration Service and the Repository Service.
  • Properties window. Displays details about services, workflows, worklets, and tasks.
  • Time window. Displays progress of workflow runs.
  • Gantt Chart view. Displays details about workflow runs in chronological (Gantt Chart) format.
  • Task view


Joining sources in the source filter (Source Qualifier) is a homogeneous join; the joined sources must come from the same database.

Mapplet
Each mapplet must contain one or more Output transformations to pass data from the mapplet into the mapping.

You cannot include the following objects in a mapplet:
  • Normalizer transformations
  • COBOL sources
  • XML Source Qualifier transformations
  • XML sources
  • Target definitions
  • Other mapplets


User Defined Functions
When you validate the expression, PowerCenter does not validate the user-defined function. It only validates the expression.

The Integration Service does not save the final current value of a mapping variable to the repository when any of the following conditions are true:
  • The session fails to complete.
  • The session is configured for a test load.
  • The session is a debug session.


Update Strategy
You can configure the Update Strategy transformation to either pass rejected rows to the next transformation or drop them.
By default, the Integration Service forwards rejected rows to the next transformation.
The Integration Service flags the rows for reject and writes them to the session reject file.
If you do not select Forward Rejected Rows, the Integration Service drops rejected rows and writes them to the session log file.

If you put the Update Strategy before the Aggregator transformation, you must consider how the Aggregator transformation handles rows flagged for different operations. In this order, the Integration Service flags rows for insert, update, delete, or reject before it performs the aggregate calculation. How you flag a row determines how the Aggregator transformation treats values in that row used in the calculation. For example, if you flag a row for delete and then use the row to calculate the sum, the Integration Service subtracts the value in this row. If you flag a row for reject and then use the row to calculate the sum, the Integration Service does not include the value in this row. If you flag a row for insert or update and then use the row to calculate the sum, the Integration Service adds the value in this row to the sum.
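
As a sketch, an update strategy expression returns one of the DD_ constants (DD_INSERT = 0, DD_UPDATE = 1, DD_DELETE = 2, DD_REJECT = 3); the port names below are made up:

```
-- Insert unseen rows, update known rows with a valid amount, reject the rest.
IIF(ISNULL(lkp_CUSTOMER_ID), DD_INSERT,
    IIF(AMOUNT >= 0, DD_UPDATE, DD_REJECT))
```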

If you delete a workflow that is running, the Integration Service aborts the workflow.

You can edit the task instance in the Workflow Designer. Changes you make in the task instance exist only in the workflow. The task definition remains unchanged in the Task Developer.

Lookup cache
When configuring a lookup cache, you can configure the following options:
  • Persistent cache
  • Recache from lookup source
  • Static cache
  • Dynamic cache
  • Shared cache
  • Pre-build lookup cache

Note: You can use a dynamic cache for relational or flat file lookups.

The HTTP transformation uses the following forms of authentication:
  • Basic. Based on a non-encrypted user name and password.
  • Digest. Based on an encrypted user name and password.
  • NTLM. Based on encrypted user name, password, and domain.


Configure an HTTP application connection in the following circumstances:
  • The HTTP server requires authentication.
  • You want to configure the connection timeout.
  • You want to override the base URL in the HTTP transformation

37 comments:

  1. Ongun Bey, this is a very useful document for Informatica users. Well done!

    ReplyDelete
  2. what is the cost of the exam?

    ReplyDelete
    Replies
    1. The exam set costs $249 (1 exam and 1 retake within 30 days)

      Delete
  3. Hi Adhi; it is a list to restudy just before the exam. These key points will cover most of the questions.

    ReplyDelete
    Replies
    1. Hi Ongun,
      Thanks for the reply. Will we get a score summary report after the exam? What will it contain: the incorrectly answered questions or the incorrectly answered topics?
      Thanks,

      Delete
    2. Hi Adhi;

      You will only get score for each topic.

      Kind regards
      Ongun

      Delete
    3. Thanks Ongun. Do you have a personal email id?

      Delete
  4. Hi Ongun,

    Are you willing to provide any online training on Informatica certification? I am working as a Software Engineer and I want to take up the Informatica 1st level Developer Certification. Pls help me out.

    Thx,
    Saran

    ReplyDelete
  5. Hi Saranya;

    Unfortunately I am not providing online training, but feel free to ask any question.

    Kind regards
    Ongun

    ReplyDelete
  6. Hi Ongun,

    Thanks for the article.
    Is the certification exam multiple choice? And do we need to design any mapping at the exam center?
    Can I ask how many years of experience you have with Informatica?
    Does the exam have many questions based on real-time scenarios, or mostly from books?
    If we have theory knowledge, do you think we can pass the exam?

    Thanks
    Adr

    ReplyDelete
    Replies
    1. Hi Adriana;

      Yes, the exam is multiple choice and you do not need to design any mapping.
      The exam is 70% books, 30% real-world experience. There are many questions related to less-used transformations.

      It may be hard to pass only with theory.

      Kind regards
      Ongun

      Delete
  7. Exam prep says 5% questions on Administration and Architecture. Is this on domains and nodes? Load balancing, grid, high availability? Do you have any bulletpoints around this? Thanks.

    ReplyDelete
    Replies
    1. Unfortunately this is the hardest part of the exam. Questions were mostly about the grid and services. One question described a domain with services scattered across nodes and asked for the correct sequence of service interactions during login. There are some questions about the Informatica command line interface as well; a little experience with pmcmd is mostly sufficient.

      Delete
  8. I notice something like repository object versioning and check-in/check-out isn't mentioned at all. Obviously it's in the guides but is this an example of something that just isn't asked on the Specialist exam?

    ReplyDelete
  9. There doesn't seem to be much of anything on Partitioning either. Is it unlikely questions would be asked on partitions?

    ReplyDelete
  10. I think they are pretty straightforward topics. Moreover, there were no questions about either topic in my exam.
    On the other hand, if you have time, studying doesn't hurt :)

    ReplyDelete
  11. Btw, I was thinking about slowly changing dimensions or really any insert/update into target table using lookups. It seems like dynamic lookups are the way to go with the cache being updated dynamically alongside the target table. Is there a case for the insert/update and not dynamically looking up the target? Just doing a static lookup?

    ReplyDelete
  12. If the table is loaded only a couple of times per day, I prefer a static lookup with an Update Strategy.

    ReplyDelete
  13. I suppose you could theoretically run into an issue with static lookup where, for example, you have a ProductCode and ProductCategory coming from source to load a Product dimension.
    ProductCode = 'ABC' ProductCategory = '123'
    ProductCode = 'ABC' ProductCategory = '456'

    If the Product Dimension didn't have any ProductCode record yet then static lookup would identify as insert and try to insert both records whereas dynamic lookup would recognize the first record as insert and second record as update.

    I'm not sure where else a dynamic lookup would come in particularly handy?

    Are you saying dynamic lookup could be useful if the data is coming through very frequently?

    ReplyDelete
  14. Hi,

    The link (https://community.informatica.com/message/99395#99395) is not opening. Could you share the exact link, please?

    ReplyDelete
  15. Hi Gopinath;

    Unfortunately Informatica changed its message board since I wrote this guide. I do not think that sample exists any more.

    Kind regards

    ReplyDelete
  16. Hi,

    I'm planning to take the Informatica certification. Are the chapters you mentioned in the index listing enough?


    Thanks,
    Haritha

    ReplyDelete
    Replies
    1. Hi Harita;

      I think these chapters will be enough.

      Kind regards

      Delete
  17. How many exams do we need to clear to get the logo and certificate? Is it after each level, or do we have to clear 2/3 exams?

    ReplyDelete
  23. I took the exam and got 69%; the pass score is 70% :(

    I am planning to retake the exam. Do you think they change the questions totally or just rearrange them with some new ones?

    If possible, how do I know the right answers to the questions in the popular online test guides?

    I badly need help with this :'(

    ReplyDelete