Informatica Interview Questions

What Is Informatica?

Informatica is a software development company which offers data integration products. It offers products for ETL, data masking, data quality, data replication, data virtualization, master data management, etc.

The Informatica PowerCenter ETL/data integration tool is the most widely used of these, and in common usage the term Informatica refers to the Informatica PowerCenter tool for ETL.

Informatica PowerCenter is used for data integration. It offers the capability to connect to and fetch data from different heterogeneous sources and to process that data.

For example, you can connect to both a SQL Server database and an Oracle database and integrate the data into a third system.

The latest version of Informatica PowerCenter available is 9.6.0. The different editions of PowerCenter are:

  • Standard edition
  • Advanced edition
  • Premium edition

Popular clients using Informatica PowerCenter as a data integration tool include the U.S. Air Force, Allianz, Fannie Mae, ING, Samsung, etc. Popular tools that compete with Informatica in the market are IBM DataStage, Oracle OWB, Microsoft SSIS and Ab Initio.


Define Informatica?

Ans: Informatica is a tool supporting all the steps of the Extraction, Transformation and Load process. Nowadays, Informatica is also being used as an integration tool. Informatica is an easy-to-use tool. It has a simple visual interface, like forms in Visual Basic. You just need to drag and drop different objects (known as transformations) and design the process flow for data extraction, transformation and load.

These process flow diagrams are known as mappings. Once a mapping is made, it can be scheduled to run as and when required. In the background, the Informatica server takes care of fetching data from the source, transforming it, and loading it into the target systems/databases.

Why do we need Informatica?

Informatica comes into the picture wherever a data system is available and we want to perform certain operations on the data at the back end. This can be cleaning up the data, modifying it, etc., based on a certain set of rules, or simply loading bulk data from one system to another.

Informatica offers a rich set of features such as row-level operations on data, integration of data from multiple structured, semi-structured or unstructured systems, and scheduling of data operations. It also stores metadata, so information about the process and the data operations is preserved.


What are the advantages of Informatica?

Informatica has some advantages over other data integration systems. A couple of the advantages are:

  • It is faster than the available platforms.
  • You can easily monitor your jobs with Informatica Workflow Monitor.
  • It has made data validation, iteration and project development easier than before.
  • If you experience failed jobs, it is easy to identify the failure and recover from it. The same applies to jobs that are running slowly.


  • It is a GUI tool; coding in a graphical tool is generally faster than hand-coded scripting.
  • It can communicate with all major data sources (mainframe/RDBMS/flat files/XML/VSAM/SAP, etc.).
  • It can handle very large volumes of data effectively.
  • Mappings, extraction rules, cleansing rules, transformation rules, aggregation logic and loading rules are kept in separate objects, so a change to any one object has minimal impact on the others.
  • Objects (transformation rules) are reusable.
  • Informatica has different “adapters” for extracting data from packaged ERP applications (such as SAP or PeopleSoft).
  • Skilled resources are readily available in the market.
  • It can run on both Windows and Unix environments.

In what real situations can Informatica be used?

 Informatica has a wide range of applications covering areas such as:

  • Data migration.
  • Application migration.
  • Data warehousing.

What are some examples of Informatica ETL programs?

Some basic Informatica programs are:

  • Mappings: A mapping is designed in the Designer and defines the whole ETL process. Mappings read data from the original sources, apply transformation logic to that data, and write the transformed data to the targets.
  • Workflows: A workflow is a collection of different tasks that describes the runtime ETL process. Workflows are designed in the Workflow Manager.
  • Tasks: A task is an executable set of actions, commands, or functions. A sequence of different tasks defines how an ETL process behaves at runtime.

Which development components of Informatica have the highest usage?

There are many development components in Informatica. However, these are the most widely used of them:

  • Expression: This is used to transform data row by row using functions.
  • Lookups: They are extensively used to join data.
  • Sorter and Aggregator: These are the right tools for sorting data and aggregating it.
  • Java transformation: Java transformation is the choice of developers who want to invoke variables, Java methods, third-party APIs and built-in Java packages.
  • Source qualifiers: Many people use this component to convert source data types to the equivalent Informatica data types.
  • Transaction control: If you want to create transactions and have absolute control over rollbacks and commits, count on this component to bail you out.

What are the uses of ETL tools?

ETL tools are quite different from other tools. They are used for performing some actions such as:

  • Extracting data from sources such as database tables or files. Notable sources also include SAP solutions, Teradata, and web services.
  • Transforming the data received from the different sources in an organized way.
  • Loading the transformed data into a destination system, known as the target, such as a data warehouse.

This questions & answers part contains questions and answers about:

  • Types of data warehouses
  • ETL tools

and the definitions of:

  • Data mart
  • Star schema
  • Snowflake schema

Define Enterprise Data Warehousing?

When an organization's data is consolidated at a single point of access, it is known as enterprise data warehousing.

Differentiate between a database, and data warehouse?

A database holds a focused set of useful information and is small in size compared to a data warehouse, whereas a data warehouse stores every kind of data, whether currently useful or not, and data is extracted from it according to the customer's requirements.

What do you understand by a term domain?

A domain is the collection of all interlinked relationships and nodes administered from a single organizational point.

Differentiate between a repository server and a powerhouse?

The repository server mainly guarantees the repository's reliability and consistency, while the powerhouse server handles the execution of the various processes between the components of the server's database repository.

In Informatica WorkFlow Manager, how many repositories can be created?

It mainly depends on the number of ports required, but in general there can be any number of repositories.

Write the advantages of partitioning a session?

The main advantage of partitioning a session is to improve the server's performance and efficiency. Another advantage is that it lets the individual pipeline sequences within the session run in parallel.

How can we create indexes after completing the load process?

We can create indexes after the load process with the help of a command task at the session level.


Define sessions in Informatica ETL.

A session is a set of instructions that tells the server how to move data from a source to a target.

How many sessions can we have in one batch?

We can have any number of sessions, but it is advisable to have fewer sessions in a batch because migration becomes easier.

Differentiate between mapping parameter and mapping variable?

Values that change during the session's execution are known as mapping variables, whereas values that do not change during the session's execution are called mapping parameters.
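Informatica mappings are built in a GUI, so there is no real code to show; the following Python sketch (the names `$$LOAD_DATE` and `$$MAX_ID` are hypothetical analogs) just illustrates the distinction: the parameter stays constant for the whole run, while the variable is updated as rows flow through, much as SETMAXVARIABLE would update it.

```python
# Conceptual sketch (not Informatica code): a mapping parameter stays
# constant for the whole session run, while a mapping variable can be
# updated as rows are processed (e.g. tracking the maximum id seen).

def run_session(rows, load_date):
    # $$LOAD_DATE acts like a mapping parameter: fixed for the session.
    param_load_date = load_date
    # $$MAX_ID acts like a mapping variable: updated while rows flow.
    var_max_id = 0
    out = []
    for row in rows:
        var_max_id = max(var_max_id, row["id"])
        out.append({**row, "load_date": param_load_date})
    return out, var_max_id

rows = [{"id": 3}, {"id": 7}, {"id": 5}]
loaded, max_id = run_session(rows, "2024-01-01")
```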

What are the features of complex mapping?

The features of complex mapping are:

  • A large number of transformations
  • Tricky requirements
  • Compound business logic

How can we check whether a mapping is correct without connecting a session?

With the help of the debugging option, we can check whether a mapping is correct without connecting a session.

Can we use mapping parameters or variables developed in one mapping in any other reusable transformation?

Yes, we can use mapping parameters or variables in any other reusable transformation, because a reusable transformation does not contain any mapplet.

What is the use of the aggregator cache file?

When extra memory is needed, the aggregator creates cache files to hold the transformation values. It also stores the intermediate values that are held in the local buffer memory.

What is lookup transformation?

The transformation that has access rights to an RDBMS is known as a lookup transformation.

What do you understand by term role playing dimension?

Dimensions that play diversified roles while remaining in the same database domain are known as role-playing dimensions.

How can we access repository reports without SQL or other transformations?

We can access repository reports by using the Metadata Reporter. There is no need for SQL or other transformations, since it is a web application.

What types of metadata are stored in the repository?

The types of metadata stored in the repository are target definitions, source definitions, mapplets, mappings, and transformations.

What is code page compatibility?

When data is transferred from one code page to another, no data loss occurs provided both code pages contain the same character set.

Can we validate all mappings in the repository simultaneously?

No. We can validate only one mapping at a time, so mappings cannot be validated simultaneously.

Define Aggregator transformation?

It is different from the expression transformation, in which calculations are done row by row: here we can do aggregate calculations over groups of rows, such as averages, sums, etc.
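As a conceptual sketch only (Informatica configures this in the Designer, not in code), the aggregator's behavior with a group-by port can be pictured in Python like this; the `region`/`amount` data is made up for illustration.

```python
# Conceptual sketch: an aggregator-style transformation groups rows by
# a key port and computes aggregate values (SUM, AVG) per group.
from collections import defaultdict

def aggregate(rows, group_key, value_key):
    groups = defaultdict(list)
    for row in rows:
        groups[row[group_key]].append(row[value_key])
    return {
        key: {"sum": sum(vals), "avg": sum(vals) / len(vals)}
        for key, vals in groups.items()
    }

sales = [
    {"region": "EU", "amount": 100},
    {"region": "EU", "amount": 300},
    {"region": "US", "amount": 50},
]
result = aggregate(sales, "region", "amount")
```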

What is Expression transformation?

It is used for performing non-aggregated, row-level calculations. We can also test conditional statements before output results are passed to the target tables.
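To make the contrast with the aggregator concrete, here is a hedged Python sketch of an expression-style, per-row calculation, including a conditional check analogous to Informatica's IIF function; the column names and salary threshold are hypothetical.

```python
# Conceptual sketch: an expression-style transformation works on one
# row at a time, deriving new output ports from the input ports.

def expression_transform(row):
    full_name = f"{row['first']} {row['last']}".strip()
    # Analog of IIF(SALARY > 50000, 'HIGH', 'LOW'):
    band = "HIGH" if row["salary"] > 50000 else "LOW"
    return {"full_name": full_name, "band": band}

out = [expression_transform(r) for r in [
    {"first": "Ada", "last": "Lovelace", "salary": 90000},
    {"first": "Bob", "last": "Smith", "salary": 40000},
]]
```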

Define filter transformation?

A filter transformation is a way of filtering rows in a mapping. All its ports are input/output, and only rows that match the filter condition pass through.
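A minimal Python sketch of the idea, with a made-up salary condition: rows that fail the condition are simply dropped from the pipeline, which is exactly what the filter transformation does.

```python
# Conceptual sketch: a filter-style transformation passes through only
# the rows that satisfy the filter condition; the rest are dropped.

def filter_transform(rows, condition):
    return [row for row in rows if condition(row)]

rows = [{"dept": "HR", "salary": 30000},
        {"dept": "IT", "salary": 80000}]
passed = filter_transform(rows, lambda r: r["salary"] > 50000)
```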

Define Joiner transformation?

It combines two related heterogeneous sources located in different locations, whereas a source qualifier transformation can only combine data originating from a common source.
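As a hedged illustration (the customer/order data is invented, and a real joiner distinguishes master and detail pipelines with several join types), the heterogeneous-source join can be sketched as a simple keyed inner join in Python:

```python
# Conceptual sketch: a joiner-style transformation matches rows from a
# master source and a detail source on a join key; imagine one source
# came from Oracle and the other from a flat file.

def joiner(master, detail, key):
    index = {row[key]: row for row in master}  # build cache on master
    return [
        {**index[row[key]], **row}             # merge matched rows
        for row in detail
        if row[key] in index                   # inner join semantics
    ]

customers = [{"cust_id": 1, "name": "Acme"}, {"cust_id": 2, "name": "Globex"}]
orders = [{"cust_id": 1, "total": 250}, {"cust_id": 3, "total": 99}]
joined = joiner(customers, orders, "cust_id")
```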

What do you mean by Lookup transformation?

A lookup transformation is used to look up data in a relational table through a mapping. We can use multiple lookup transformations in a mapping.

How can we use the Union transformation?

The Union transformation is a multiple-input-group transformation that is used to combine data from different sources.
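A brief Python sketch of the concept, with invented sales data: like SQL UNION ALL, the Union transformation concatenates rows from input groups that share the same column layout, without removing duplicates.

```python
# Conceptual sketch: a union-style transformation concatenates rows
# from several input groups with matching ports (UNION ALL semantics:
# duplicate rows are kept).

def union_transform(*input_groups):
    out = []
    for group in input_groups:
        out.extend(group)
    return out

us_sales = [{"region": "US", "amount": 10}]
eu_sales = [{"region": "EU", "amount": 20}, {"region": "EU", "amount": 20}]
combined = union_transform(us_sales, eu_sales)
```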

Define Incremental Aggregation?

Incremental aggregation is performed whenever a session is created for a mapping that aggregates data: after the first run, the session processes only the new source data and applies it to the saved aggregate results.
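To make the idea concrete, here is a hedged Python sketch (PowerCenter actually persists the historical results in aggregate cache files; the dictionary below stands in for that cache): each run folds only the new rows into the previously saved totals.

```python
# Conceptual sketch: incremental aggregation keeps the prior aggregate
# results and applies only the new rows on each subsequent run, instead
# of re-reading all historical source data.

def incremental_aggregate(cache, new_rows, key, value):
    for row in new_rows:
        cache[row[key]] = cache.get(row[key], 0) + row[value]
    return cache

cache = {}  # first run starts with an empty aggregate cache
cache = incremental_aggregate(cache, [{"dept": "IT", "amt": 100}], "dept", "amt")
# second run: only the incremental rows are processed
cache = incremental_aggregate(cache, [{"dept": "IT", "amt": 50}], "dept", "amt")
```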

Differentiate between a connected look up and unconnected look up?

A connected lookup takes its inputs directly from other transformations in the pipeline. An unconnected lookup does not take inputs directly from other transformations; instead, it can be used in any transformation and invoked as a function using the :LKP expression.

The differences are summarized below:

  • Connected lookup participates in the data flow and receives input directly from the pipeline; unconnected lookup receives input values from the result of a :LKP expression in another transformation.
  • Connected lookup can use both a dynamic and a static cache; an unconnected lookup cache cannot be dynamic.
  • Connected lookup can return more than one column value (output port); unconnected lookup can return only one column value, i.e. the return port.
  • Connected lookup caches all lookup columns; unconnected lookup caches only the lookup output ports used in the lookup conditions and the return port.
  • Connected lookup supports user-defined default values (the value to return when the lookup condition is not satisfied); unconnected lookup does not support user-defined default values.
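The two call styles above can be sketched in Python (the lookup table and column names are invented for illustration; real lookups are configured in the Designer): the connected form enriches the row in the pipeline with every matched column, while the unconnected form is called like a function and returns a single value.

```python
# Conceptual sketch: connected vs unconnected lookup call styles.
LOOKUP_TABLE = {101: {"name": "Acme", "country": "US"}}

def connected_lookup(row):
    # In the pipeline: merge all matched lookup columns into the row.
    match = LOOKUP_TABLE.get(row["cust_id"], {})
    return {**row, **match}

def unconnected_lookup(cust_id):
    # Invoked on demand (like :LKP.lkp_name(...)); one return value.
    return LOOKUP_TABLE.get(cust_id, {}).get("name")

enriched = connected_lookup({"cust_id": 101, "amount": 5})
name = unconnected_lookup(101)
```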

Define mapplet?

A mapplet is a reusable object that is created using the Mapplet Designer.

What is reusable transformation?

A reusable transformation can be used multiple times in a mapping. It is stored as metadata and remains distinct from the mappings that use it.

Define update strategy.

Update strategy is used whenever a row has to be inserted or updated based on some condition. The condition must be specified in advance so that each processed row is flagged as an insert or an update.
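A Python sketch of the flagging behavior, using Informatica's documented DD_ constant values; the per-row condition (existing key means update, negative amount means reject) is a made-up example.

```python
# Conceptual sketch: an update-strategy-style transformation flags each
# row before it reaches the target. Informatica's numeric codes:
DD_INSERT, DD_UPDATE, DD_DELETE, DD_REJECT = 0, 1, 2, 3

def update_strategy(row, existing_keys):
    if row.get("amount", 0) < 0:
        return DD_REJECT          # bad data goes to the reject file
    if row["id"] in existing_keys:
        return DD_UPDATE          # key already in target: update it
    return DD_INSERT              # new key: insert it

existing = {1, 2}
flags = [update_strategy(r, existing) for r in [
    {"id": 1, "amount": 10},
    {"id": 9, "amount": 5},
    {"id": 3, "amount": -1},
]]
```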

Explain the scenario which compels informatica server to reject files?

When the server encounters DD_Reject in an update strategy transformation, it sends the row to the reject file.