Top Informatica interview questions and answers

Safalta Expert | Published by: Aryan Rana | Updated Tue, 13 Sep 2022 12:02 PM IST

Highlights

Informatica is used by over 21,000 enterprises in the United States alone, according to iDatalabs. Its widespread use in sectors like healthcare, finance, insurance, and non-profits has led to an increase in demand for Informatica specialists.

Top answers to Informatica interview questions


Informatica is an ETL (extract, transform, and load) tool frequently used by businesses to build data warehouses. It is one of the most in-demand career options because it is used by more than 21,000 firms in the United States alone, according to iDatalabs. The demand for Informatica specialists is only expected to increase as a result of its widespread use in a variety of sectors, including healthcare, finance, insurance, and non-profit organisations.


 

1. What is Informatica PowerCenter?


Informatica PowerCenter is an ETL/data integration tool with numerous applications.


It allows users to connect to, retrieve, and process data from several heterogeneous sources.

Users can, for instance, connect to an Oracle or SQL Server database, or both, and combine the data from these two databases into a third system.
 

2. Mention a few common Informatica use cases.


Informatica has a wide range of applications, but these are the most common:
  • When businesses migrate from old legacy systems to new database systems
  • When businesses build their data warehouses
  • When integrating data from diverse heterogeneous systems, such as multiple databases and file-based systems
  • For data cleansing


3. How does Informatica filter rows?


There are two approaches to filtering rows using Informatica transformations, and they are as follows:
  • Source Qualifier Transformation: This operation filters rows as a relational data source is read.

    Filtering at the source reduces the number of rows passed through the mapping, which improves performance. The filter condition is executed in the database as standard SQL.
  • Filter Transformation: It filters rows of mapped data from any source. It is placed close to the source in order to filter out unwanted data early and enhance performance. It evaluates a condition to true or false for each row.
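Informatica itself is configured through its client tools rather than code, but the two filtering strategies above can be sketched in plain Python. All names and data here are illustrative only:

```python
# Illustration only: contrast filtering while reading the source (Source
# Qualifier style) with filtering mid-pipeline (Filter Transformation style).

rows = [
    {"id": 1, "country": "US", "amount": 120},
    {"id": 2, "country": "IN", "amount": 80},
    {"id": 3, "country": "US", "amount": 45},
]

# Source Qualifier style: filter at read time, so fewer rows enter the mapping.
# (In Informatica this becomes a WHERE clause pushed down to the database.)
def read_source(source, condition):
    return [row for row in source if condition(row)]

# Filter Transformation style: every row flows in, then is tested per row.
def filter_transform(pipeline, condition):
    for row in pipeline:
        if condition(row):   # the condition evaluates to true/false per row
            yield row

at_source = read_source(rows, lambda r: r["country"] == "US")
in_pipeline = list(filter_transform(iter(rows), lambda r: r["country"] == "US"))
assert at_source == in_pipeline  # same result; the difference is where the work happens
```

Both approaches keep the same rows; the Source Qualifier variant is preferable because unwanted rows never enter the pipeline at all.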


4. How many repositories may be created in Informatica Workflow Manager?


Repositories are created based on the number of ports required; in general, though, there is no limit to the number of repositories that can be created.
 

5. What variations of lookup transformation are there?


Four distinct lookup transformations are available:
  • Relational or flat-file lookup: a lookup performed on relational tables or on flat files.
  • Pipeline lookup: a lookup performed on application sources.
  • Connected or unconnected lookup: a connected lookup transformation takes data from the source, runs a lookup, and sends the results back to the pipeline, whereas an unconnected lookup is not connected to the pipeline and returns one column to the calling transformation.
  • Cached or uncached lookup: the lookup transformation can be configured to cache lookup data, or it can query the lookup source directly each time a lookup is requested.


6. How do shell pre- and post-session commands work?


For a session task, a command task may be referred to as a pre-session or post-session shell command. It can be used as a pre-session command, post-session command for success, or post-session command for failure. The way shell commands are applied can be modified or changed depending on use cases.
 

7. What can we do to enhance Informatica Aggregator Transformation's performance?


If records are sorted before being passed to the aggregator and the "sorted input" checkbox is selected in the Aggregator Properties window, aggregator performance improves significantly. On the columns that will be used in the Group By action, the record set should be sorted. When possible, it is wise to sort the record set at the database level, for example, inside a source qualifier transformation, unless there is a danger that the previously sorted records from the source qualifier could revert to their original state before reaching the aggregator.
 

8. How can a record in the target table be updated without using Update Strategy?


Without employing an "Update Strategy," a target table can be updated by first defining its key at the Informatica level and then connecting that key to the field in the mapping target that needs to be updated. At the session level, set the target property to "Update as Update" and select the "Update" checkbox.

Assume that we have a target table called "Customer" that contains the fields "Customer ID," "Customer Name," and "Customer Address." If we want to update "Customer Address" without using an update strategy, we must define "Customer ID" as the primary key in Informatica, and we must connect the "Customer ID" and "Customer Address" fields in the mapping. The mapping will only update the "Customer Address" field for all matching customer IDs if the session attributes are correctly configured as mentioned above.
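The keyed update that the session performs in the example above can be sketched with SQLite in Python. The table and column names below simply mirror the hypothetical "Customer" example:

```python
import sqlite3

# Hypothetical "Customer" target table, mirroring the example above.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE customer (customer_id INTEGER PRIMARY KEY, name TEXT, address TEXT)"
)
conn.execute("INSERT INTO customer VALUES (1, 'Asha', 'Old Street')")

# With customer_id defined as the key, an "Update as Update" session
# effectively issues a keyed UPDATE like this for each incoming row.
incoming = {"customer_id": 1, "address": "New Street"}
conn.execute(
    "UPDATE customer SET address = ? WHERE customer_id = ?",
    (incoming["address"], incoming["customer_id"]),
)

row = conn.execute("SELECT address FROM customer WHERE customer_id = 1").fetchone()
print(row[0])  # New Street
```

Only the address changes; the key column is used solely to locate the matching row, which is exactly what defining the key at the Informatica level achieves.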
 

9. Why do we employ mapping variables and parameters?


The values in mappings and mapplets are essentially represented by mapping parameters and mapping variables.

Mapping Parameters
  • Before starting a session, constant values are defined as mapping parameters.
  • Parameters are created, then they show up in Expression Editor.
  • These options can be used for overriding, user-defined joins, and source qualifier filters.

Mapping Variables
  • Unlike mapping parameters, mapping variables' values can change throughout a session.
  • At the end of each successful session, the Integration Service saves the most recent value of a mapping variable to the repository. Saved values can be overridden with parameter files.
  • In essence, mapping variables are utilised to read data sources incrementally.
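As an illustration of the parameter files mentioned above: a PowerCenter parameter file assigns values to `$$` mapping parameters and variables under a heading that names the folder, workflow, and session. The folder, workflow, session, and parameter names below are hypothetical:

```
[Sales_Folder.WF:wf_daily_load.ST:s_m_load_orders]
$$LastLoadDate=2022-09-01
$$SourceSystem=ORACLE_CRM
```

At run time, the Integration Service reads the parameter file specified in the session or workflow properties and substitutes these values before the mapping executes.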


10. What is the surrogate key?


In a database, modelled entities or objects are generally identified by a surrogate key, which functions as an identifier. Surrogate keys may or may not be used as primary keys because they are not derived from any other data in the database.

Essentially, it is a unique sequential number. A surrogate key is used when the modelled entity or object also exists outside the database; in such situations, an internal surrogate key is generated for that modelled entity or object.
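Since a surrogate key is just a unique sequential number, the idea can be sketched in a few lines of Python. The records and field names are illustrative only:

```python
from itertools import count

# A surrogate key is a unique sequential number, independent of the business
# data. Sketch: assign one to each incoming dimension record.
next_key = count(start=1)

customers = [{"natural_key": "CUST-9001"}, {"natural_key": "CUST-9002"}]
for rec in customers:
    rec["customer_sk"] = next(next_key)   # surrogate key, not derived from the data

print([r["customer_sk"] for r in customers])  # [1, 2]
```

The natural key ("CUST-9001") still identifies the external object, while the surrogate key gives the warehouse a compact, stable identifier it fully controls.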
 

11. Sessions should be explained, along with how batches are utilised to combine executions.


A session is nothing more than a set of instructions that must be followed to transfer data from a source to a target. Users must use the Session Manager or the pmcmd command to run sessions. Batch execution is used to combine sessions in either a parallel or serial fashion. Sessions of any size can be combined into batches for migration.
 

12. How can duplicate rows be removed from flat files?


Using the sorter transformation and choosing the distinct option, we may remove duplicate rows from flat files. The duplicate rows will be removed if this option is chosen.
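What the Sorter transformation's distinct option does can be sketched in Python: sort the rows so duplicates become adjacent, then keep one copy of each. The data here is illustrative:

```python
# Sketch of the Sorter transformation with the "Distinct" option selected.
rows = [("A", 1), ("B", 2), ("A", 1), ("C", 3), ("B", 2)]

def sorter_distinct(rows):
    deduped, previous = [], object()
    for row in sorted(rows):          # sorting places duplicates next to each other
        if row != previous:           # keep only the first of each run of duplicates
            deduped.append(row)
        previous = row
    return deduped

print(sorter_distinct(rows))  # [('A', 1), ('B', 2), ('C', 3)]
```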
 

13. What attributes does Informatica Developer 9.1.0 have?


Some of the new features in Informatica Developer 9.1.0 are listed below from the perspective of an Informatica Developer:
  • In the updated version, lookup can be set up as an active transformation that, in the event of a successful match, can return several rows.
  • We can now also write SQL overrides for uncached lookup. Prior to now, we could only use a cached lookup.
  • Control over our session log's file size: In a real-time environment, we have some degree of control over the size or duration of our session log files.
  • Database deadlock resistance feature: By using this, we may prevent an immediate session failure in the event of a database deadlock. The operation will be tried again. The number of retry attempts is configurable.


14. What benefits do Informatica's ETL tools have over Teradata?


Firstly, Teradata is an MPP database with some scripting and quick data transfer capabilities, whereas Informatica is a data integration tool.

Informatica has advantages over Teradata.
  • It serves as a metadata library for the ETL ecosystem of the company. Worklets and workflows in folders can be used to logically organise Informatica jobs (sessions). It results in an environment that is quicker to evaluate and improve for architects and analysts.
  • The Informatica Workflow Monitor makes job monitoring and recovery simple. It is easy to locate failed or slow-running jobs and to recover from them, with the option to restart from the failed row or step and resume.
  • The Informatica Market Place is a one-stop shop for several tools and accelerators that can speed up the software development life cycle and enhance application support.
  • It makes it possible for a large number of developers on the market to communicate and collaborate.
  • In addition to the standard (and slow) ODBC drivers, there are several connectors for different databases, including support for Teradata MLoad, TPump, FastLoad, and Parallel Transporter.
  • It may be quicker to generate surrogate keys using Informatica's shared sequence generators than inside the database.
  • If a firm decides to switch from Teradata to another solution, vendors like Infosys can carry out migration projects to migrate the data and modify the ETL code to work with the new database quickly, precisely, and efficiently.
  • The database's data can be processed via pushdown optimization.
  • It provides the ability to programme ETL so that the workload is evenly distributed between the ETL server and the database box. This is helpful if the database box is old or if the ETL server has a fast disc, adequate RAM, or a powerful enough CPU to handle some tasks better than the database.
  • Processes can be published as web services using this capability.

Teradata has advantages over Informatica.
  • Cheaper upfront: there are no initial licence fees for an ETL tool, and OPEX is reduced because annual Informatica support is not needed.
  • An excellent option if all the data to be loaded is available as structured files, which can then be processed inside the database after an initial stage load.
  • A wise choice for an ecosystem with low complexity.
  • Only Teradata developers, or resources with solid ANSI/Teradata SQL/BTEQ experience, are needed to build and enhance the system.


15. Define OLAP. What are the many types of OLAP? 


A particular type of software called OLAP, or online analytical processing, enables users to evaluate data from numerous database systems at once. Analysts can extract and examine company data from various sources or points of view using OLAP.

Forms of OLAP
  • ROLAP: ROLAP is an OLAP server that converts multidimensional operations into conventional relational operations.
  • MOLAP: MOLAP, or Multidimensional OLAP, creates multidimensional views of data using array-based multidimensional storage engines. To manage dense and sparse datasets, many MOLAP servers use two tiers of data storage representation.
  • HOLAP: For faster calculation and greater data scalability, hybrid OLAP combines ROLAP and MOLAP.


16. What is the target load order? How do you set it?


Users of Designer are able to specify the target load order for each source that is relevant to a mapplet when using one in mapping. Users can specify the target load order in Designer, which controls how Integration Service sends rows to targets within the mapping. In essence, a target load order group is a grouping of source qualifiers, transformations, and targets that are connected by a mapping. When using tables with primary and secondary keys, the target load order can be configured to maintain referential integrity.


Setting the Target Load Order: Steps

Step 1: Construct a mapping with numerous target load order groups.
Step 2: Select Target Load Plan after clicking on Mappings.
Step 3: All Source Qualifier transformations and the targets that they send data to are listed in the Target Load Plan dialogue box.
Step 4: Pick a source qualifier, then use the Up and Down arrows to adjust its location.
Step 5: If you want to reorganise the source qualifiers, repeat steps three and four.
Step 6: When you're finished, click OK.


17. Define Target Designer.


When performing ETL operations, we require source data, target tables, and the necessary transformations. Using Informatica's Target Designer, we can build target tables and change the pre-existing target definitions.

Flat files, relational databases, XML definitions, Excel workbooks, and other sources can all be used to import target definitions.

Select the Target Designer option from the Tools menu to launch Target Designer.
 

18. What are Informatica's benefits?


The benefits of Informatica are as follows:
  • It is a GUI tool: Hand-coding scripts takes longer than coding in any graphical tool.
  • It may exchange data with any known data source, including mainframes, RDBMS, flat files, XML, VSM, SAP, and others.
  • It is quite effective at handling very large volumes of data.
  • An ETL tool has separate objects for aggregate logic and loading rules as well as mappings, extract rules, cleansing rules, transformation rules, and rules the user can apply. Any modification to one object will have a minimal effect on the others.
  • The item can be used again (Transformation Rules).
  • Different "adapters" from packaged ERP programmes are available in Informatica (such as SAP or PeopleSoft).
  • There are resources on the market.
  • Both Windows and Unix environments support its use.
  • It makes it simple to track down failed jobs, identify delayed jobs, and monitor jobs.
  • It comes with a wide range of powerful capabilities, such as database information, data validation, project conversion from one database to another, etc.


19. List a few PowerCenter client programmes together with an explanation of what they do.

  • Repository Manager: an administrative tool used to manage repository folders, objects, groups, etc.
  • Administration Console: used to carry out service operations.
  • PowerCenter Designer: comes with a number of design tools, such as the Source Analyzer, Target Designer, Mapplet Designer, Mapping Designer, etc.
  • Workflow Manager: defines the instructions needed to carry out mappings.
  • Workflow Monitor: used to monitor workflows and tasks.


20. How do sessions work? List each of their attributes.


Sessions are available in the Workflow Manager and can be configured by creating a session task. A workflow may contain several sessions, which can be either reusable or non-reusable.

Sessions' Characteristics
  • Depending on the situation, session tasks may run simultaneously or sequentially.
  • They can be set up to analyse performance.
  • Log files, test loads, error handling, commit intervals, target characteristics, etc. are all included in sessions.


21. What types of transformations are available in Informatica?


The numerous transformations include:
  • Aggregator Transformation
  • Expression Transformation
  • Normalizer Transformation
  • Rank Transformation
  • Filter Transformation
  • Joiner Transformation
  • Lookup Transformation
  • Stored procedure Transformation
  • Sorter Transformation
  • Update Strategy Transformation
  • XML Source Qualifier Transformation
  • Router Transformation
  • Sequence Generator Transformation


22. What qualities does connected lookup possess?


The following are some of the features of connected lookup:
  • It takes input directly from the pipeline.
  • It participates actively in the data flow and can use both dynamic and static caches.
  • It caches all lookup columns, and when the lookup condition is not met, default values are returned as the result.
  • The output port can receive more than one column value.
  • User-defined default values are supported.


23. Define junk dimensions.


Junk attributes, such as random codes or flags, are grouped together to form structures known as junk dimensions. Instead of constructing many tables for the same thing, they create a framework for storing related codes for a particular dimension in a single location.
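The idea can be sketched in Python: enumerate every combination of the low-cardinality flags and give each combination one row (and one surrogate key) in a single junk dimension table. The flag names and codes below are illustrative:

```python
from itertools import product

# Sketch: a junk dimension collects unrelated low-cardinality flags/codes
# into one table, with a surrogate key per combination.
flags = {
    "is_gift": ["Y", "N"],
    "payment_code": ["CARD", "CASH"],
}

junk_dim = [
    {"junk_sk": sk, **dict(zip(flags, combo))}
    for sk, combo in enumerate(product(*flags.values()), start=1)
]

# 2 x 2 flag combinations -> 4 rows in a single junk dimension table
print(len(junk_dim))  # 4
```

Fact rows then carry a single `junk_sk` foreign key instead of one column (or one mini-table) per flag.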
 

24. What purpose does Rank Transformation serve?


The Rank transformation is active and connected; it sorts and ranks a set of records either from the top or from the bottom. It is also used to select the data with the largest or smallest numeric value based on a specific port.
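A top-N rank over a numeric port can be sketched in Python with the standard library. The records and field names are illustrative:

```python
import heapq

# Sketch of a Rank transformation: pick the top-N rows by a numeric port.
orders = [
    {"order_id": 10, "amount": 250},
    {"order_id": 11, "amount": 900},
    {"order_id": 12, "amount": 400},
]

# Top 2 rows by "amount" (use heapq.nsmallest for a bottom-N rank instead).
top2 = heapq.nlargest(2, orders, key=lambda r: r["amount"])
print([r["order_id"] for r in top2])  # [11, 12]
```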
 

25. Explain transformation for the sequence generator.


The Sequence Generator transformation is passive and connected, and it produces primary keys or sequences of numbers for calculations or processing. It has two output ports that can be connected to several transformations in a mapping. They are as follows:

NEXTVAL: This port can be connected to multiple transformations to generate a unique value for every row or transformation.
CURRVAL: This port is connected when NEXTVAL is already connected to some other transformation within the mapping.
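A simplified Python sketch of the two ports (it does not reproduce every property of the real transformation, such as the configurable increment or cycle options):

```python
# Simplified sketch of the Sequence Generator's two ports. NEXTVAL hands out
# a new value per row; CURRVAL exposes the value most recently generated.
class SequenceGenerator:
    def __init__(self, start=1):
        self._current = start - 1

    def nextval(self):
        self._current += 1
        return self._current

    def currval(self):
        return self._current

seq = SequenceGenerator()
keys = [seq.nextval() for _ in range(3)]   # one surrogate value per row
print(keys, seq.currval())  # [1, 2, 3] 3
```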
 

26. What serves as the INITCAP function's purpose?


The INITCAP function, when used, turns all other letters in a string to lowercase while capitalising the first letter of each word.

Syntax:

INITCAP(string_name)
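A rough Python equivalent shows the behaviour (the real INITCAP also treats non-alphanumeric characters as word delimiters; this sketch splits on spaces only):

```python
# Rough equivalent of INITCAP: capitalise the first letter of each word
# and lowercase the rest.
def initcap(s):
    return " ".join(word.capitalize() for word in s.split(" "))

print(initcap("hELLO iNFORMATICA world"))  # Hello Informatica World
```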
 

27. What is an enterprise data warehouse?


Enterprise data warehousing is the process of creating a single point of access for all of an organization's data.
 

28. What makes a data warehouse different from a database?


In contrast to a data warehouse, a database holds a compact collection of related data. A data warehouse holds sets of every type of data, whether immediately useful or not, and data is extracted from it according to customer needs.
 

29. What do you mean when you say "domain"?


All interconnected relationships and nodes that are managed by a single organisational point are referred to as "domains."


30. Distinguish between a powerhouse and a repository server.


A powerhouse server handles the execution of several processes amongst the components of the server's database repository, whereas a repository server primarily ensures the consistency and reliability of the repository.
 

31. How do we build indices after the load operation is finished?


After the loading process, we can generate indexes with the aid of the command task at the session level.
 

32. In Informatica ETL, define sessions.


A session is a set of instructions that describes how data must be moved from a source to a target.
 

33. What is the maximum number of sessions per group?


Although there is no limit to the number of sessions, it is recommended to have fewer sessions in a batch so that migrating is simpler.
 

34. Explain the differences between a mapping parameter and a mapping variable.


Mapping variables are the values that change while the session is implemented, whereas mapping parameters are the values that remain the same.
 

35. Mention the benefits of dividing a session.


Partitioning a session has the primary benefit of improving the server's efficiency and performance, since data is processed in parallel through the session's individual partitions.
 

36. What characteristics distinguish complicated mapping?


The following are some characteristics of complicated mapping:
  • It contains more transformations than usual.
  • It employs intricate business logic.


37. Without a connected session, how can we tell if a mapping is correct or not?


Without connecting sessions, we may determine whether or not a mapping is proper with the use of the debugging option.
 

38. Can we apply mapping variables or parameters established in one mapping to another reusable transformation?


Yes. Because a reusable transformation is not tied to any particular mapplet or mapping, mapping parameters or variables defined in one mapping can be used in any other reusable transformation.
 

39. What use does the aggregator cache file serve?


When additional memory is required, the aggregator provides extra cache files for storing transformation values. It also keeps intermediate values in the local buffer memory.
 

40. Describe lookup transformation.


Lookup transformation refers to the transformation that has direct access to RDBMS.
 

41. The phrase "role-playing dimension" means what to you?


Role-playing dimensions are those that are applied to the performance of multiple roles inside the same database domain.
 

42. Without using SQL or any other modifications, how can we obtain repository reports?


Using a metadata reporter, we may retrieve repository reports. Since it is a web application, there is no requirement for SQL or any other transformation.


43. List the various forms of metadata that are kept in repositories.


The several kinds of metadata that are kept in the repository include Target definition, Source definition, Mapplet, Mappings, and Transformations.


44. Why is code page compatibility important?


Because data is transferred between code pages that share the same character set, no data loss or failure occurs.


45. How are we able to simultaneously confirm every mapping in the repository?


One mapping at a time can be validated. Thus, simultaneous validation of mapping is not possible.
 

46. Define the Aggregator transformation.


The Aggregator transformation differs from the Expression transformation in that it can perform aggregate calculations such as averages, sums, and so forth.
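An Aggregator with a "Group By" port can be sketched in Python; here a SUM of salaries grouped by department, with illustrative data:

```python
from collections import defaultdict

# Sketch of an Aggregator with "Group By" on department: SUM(salary) per group.
rows = [("sales", 100), ("hr", 80), ("sales", 50)]

totals = defaultdict(int)
for dept, salary in rows:
    totals[dept] += salary   # accumulate the aggregate per group key

print(dict(totals))  # {'sales': 150, 'hr': 80}
```

If the input arrives already sorted on the group-by columns (the "sorted input" option discussed in question 7), the aggregator can emit each group as soon as its key changes instead of caching all groups.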
 

47. What does expression transformation entail?


Nonaggregated calculations can be done with it. Before moving the output results to the destination tables, conditional statements can be tested.


48. Definition of filter transformation.


A Filter transformation can filter rows in a mapping. All of its ports are input/output ports, and only the rows that meet the filter condition pass through it.


49. What is Joiner transformation?


Unlike a Source Qualifier transformation, which can only join data coming from a single source, a Joiner transformation joins two related heterogeneous sources residing in different locations.
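A join of two heterogeneous sources can be sketched in Python; imagine one input coming from a database and the other from a flat file. The data and key names are illustrative:

```python
# Sketch of a Joiner transformation: an inner join of two heterogeneous
# sources (say, one relational table and one flat file) on a common key.
orders = [{"cust_id": 1, "amount": 100}, {"cust_id": 2, "amount": 70}]
customers = {1: "Asha", 2: "Ben"}   # the smaller "master" source, keyed for lookup

joined = [
    {**o, "name": customers[o["cust_id"]]}   # enrich each detail row
    for o in orders
    if o["cust_id"] in customers             # inner join: keep matches only
]
print([r["name"] for r in joined])  # ['Asha', 'Ben']
```

Keying the smaller source mirrors the usual Joiner practice of designating the smaller input as the master to keep the join cache small.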
 

50. Why do you use the term "lookup transformation"?


Data maintenance in a relational table through mapping is done through lookup transformation. In a mapping, many lookup transformations are possible.
 

51. Establish a distinction between linked and unconnected lookups.


A connected lookup takes its input directly from other transformations in the pipeline. An unconnected lookup does not take inputs directly from other transformations; instead, it can be used in any transformation and invoked as a function using the :LKP expression.
 

52. What is a mapplet?


A mapplet is a reusable object created in the Mapplet Designer.

 

53. Reusable transformation: what is it?


This transformation can be used numerous times in a mapping. It is stored as metadata, which keeps it separate from the mappings that use it.


54. Define update strategy.

An update strategy is used whenever a row has to be inserted or updated depending on some sequence. In this case, conditions must be specified before the processed row is flagged as an Update or Insert.
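An Update Strategy expression can be sketched in Python. Informatica flags each row with constants such as DD_INSERT and DD_UPDATE (numerically 0 and 1); the rule here — update if the key already exists in the target, otherwise insert — and all data are illustrative:

```python
# Sketch of an Update Strategy expression: flag each processed row as an
# update or an insert based on whether its key exists in the target.
DD_INSERT, DD_UPDATE = 0, 1   # Informatica's numeric flag constants

existing_keys = {101, 102}    # keys already present in the target table

def flag_row(row):
    return DD_UPDATE if row["id"] in existing_keys else DD_INSERT

rows = [{"id": 101}, {"id": 999}]
print([flag_row(r) for r in rows])  # [1, 0]
```

The Integration Service then routes each row to an UPDATE or INSERT against the target according to its flag.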
 

55. Describe the circumstance that forces the Informatica server to reject files.


When the Update Strategy transformation encounters DD_Reject, the server sends the rows to the reject file.


56. Why is a surrogate key used?


It serves as an alternative to the natural primary key. It acts as a unique identifier for every row of a table.
 

57. Describe the steps necessary to perform session partitioning.


To perform session partitioning, one must configure the session to partition source data and then install multiple CPUs on the Informatica server machine.
 

58. What files are created on the Informatica server during a session run?


The following file types are produced during a session run:

Error log
Bad (reject) file
Workflow log
Session log


       
