Sharing Knowledge With The World…

Category: Oracle

Source Target Level Commit in Informatica

In this post we will discuss how Source Target Level Commit in Informatica works.

First of all, what is a commit to a database? A commit is an instruction to the database server that tells it to save the state of the database session at that point in time.

Informatica supports two commit categories/levels:

1) Source Level Commit

2) Target Level Commit

Source Level Commit :- In this commit type, the Integration Service issues a commit to the underlying database (Oracle, SQL Server, etc.) once a particular number of rows has been read from the source database, irrespective of the number of rows inserted into the target.

Let’s Take an Example : –

Suppose an Oracle source has been read up to 505 rows, 300 rows have reached the target so far, the remaining rows are still being processed within the mapping, and in the session properties you have selected source level commit with a commit interval of 510. As soon as the source reads the 510th row, the Integration Service issues a commit to the underlying database. Since only 300 rows are in the target at that moment, only those 300 rows are committed; the remaining rows are committed either after another 510 source rows have been read or at the completion of the session.

The number of source rows read is what matters in this case.

Target Level Commit :- In this commit type, the Integration Service issues a commit to the database once a particular number of rows has been inserted into the target.

Let’s Take an Example : –

Suppose your source has read 500 rows so far, and in the session properties you selected target level commit with a commit interval of 300 rows. With 250 rows in the target, no commit has fired yet; by the time your source has read 800 rows and the 300th row reaches the target, a commit fires on the target database immediately.

The number of target rows written is what matters in this case.
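The difference between the two commit types can be summarized as pseudocode (a simplified sketch of the behaviour described above; the real Integration Service also buffers rows between transformations):

```
-- Source Level Commit: count rows READ from the source
rows_read = 0
for each row read from source:
    rows_read = rows_read + 1
    send row through the mapping
    if rows_read mod commit_interval == 0:
        COMMIT on target database   -- commits whatever rows have reached the target so far

-- Target Level Commit: count rows WRITTEN to the target
rows_written = 0
for each row written to target:
    rows_written = rows_written + 1
    if rows_written mod commit_interval == 0:
        COMMIT on target database
```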

Author : – Ankit Kansal & Nayan Naik




Call DB Sequence From Talend

Hello Readers,
Today we will discuss how to call a DB sequence from Talend while loading data. For demo purposes we have taken Oracle as our database.
STEPS:
1) Create a sequence in the database using:
create sequence test_seq start with 1 increment by 1;
The above statement will create a sequence in the Oracle database.
select test_seq.nextval from dual;
This statement is very important as it will initialize your sequence.
2) Take a source file/database and connect it to your Oracle output component, with the full schema defined at the target level.

call db sequence from Talend

3) Go to the Advanced Settings tab of the Oracle output component and change the settings as shown in the screenshot.

call db sequence from Talend

sequence_name.nextval is the command which you have to write in the SQL Expression field, so that this expression calls your database-level sequence for every row.
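In effect, with the SQL Expression set this way, each row inserted by the Oracle output component behaves like the statement below (the target table and column names here are illustrative assumptions, not taken from the demo):

```
-- sequence created in step 1
CREATE SEQUENCE test_seq START WITH 1 INCREMENT BY 1;

-- what the output component effectively runs for every incoming row
INSERT INTO emp_target (emp_id, emp_name)
VALUES (test_seq.NEXTVAL, :emp_name);
```

The sequence lives entirely in the database, so the generated keys stay unique even if several jobs load the same table.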

Thanks For Reading……

Talend Interview Questions

Author:- Ankit Kansal & Nayan Naik

Multiple Columns From UnConnected Lookup

Hello Readers,

                      Today we will be discussing a scenario of getting multiple columns from an unconnected lookup in Informatica.
Normally an unconnected lookup returns only a single column from the lookup source it refers to. But you can return multiple columns from an unconnected lookup, if your lookup source is a TABLE, by using the SQL query override feature.

Overall Job Desc:- Multiple Columns From UnConnected Lookup

Now, in the Lookup transformation, use the SQL query override feature, create an additional port which is going to be the return port, and override the SQL query as given below.

Multiple Columns From UnConnected Lookup


SELECT S_ULOOKUP_REAL.NAME AS NAME
     , S_ULOOKUP_REAL.ID AS ID
     , name||','||address||','||age AS DATA
  FROM S_ULOOKUP_REAL

The last column (alias DATA) contains all the data: it concatenates all the ports which have to be returned from the unconnected lookup. Make sure the port you created shares the same name (DATA in our case) and that it is set as the return port.

Now call your lookup using :LKP.LKPTRANS(ID), and then you can separate your column values using string functions like SUBSTR(), INSTR() and many more…
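As a sketch (port names here are illustrative), the concatenated value returned by the lookup can be split back apart in an Expression transformation using Informatica's SUBSTR() and INSTR() functions:

```
v_DATA    = :LKP.LKPTRANS(ID)                     -- e.g. 'john,delhi,30'
o_NAME    = SUBSTR(v_DATA, 1, INSTR(v_DATA, ',', 1, 1) - 1)
o_ADDRESS = SUBSTR(v_DATA, INSTR(v_DATA, ',', 1, 1) + 1,
            INSTR(v_DATA, ',', 1, 2) - INSTR(v_DATA, ',', 1, 1) - 1)
o_AGE     = SUBSTR(v_DATA, INSTR(v_DATA, ',', 1, 2) + 1)
```

INSTR(string, search, start, occurrence) locates the nth delimiter, so each SUBSTR() carves out one field of the concatenated string.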

So, this is how you can return multiple columns from an unconnected lookup…

Thanks For Reading…..
Author:- Ankit Kansal & Nayan Naik


Incremental Load Using Talend

Hi Readers,
                  Incremental load is a way of loading your target tables in a data warehouse such that only new or updated data from your source system affects your target system.
It's pretty similar to reading a novel: the first day you read some pages, then the next day you start reading from the point where you left off, and so on.
So, today we will discuss one implementation technique for incremental loading, using Talend Open Studio.

Overall Job Desc:-


Execution Steps:

1. Create two context variables. One will hold the directory path of your parameter file (the file stores your last-run-date info); specify a default value, which you can later change through a properties file at deployment. The other variable is used within the job to hold the last run date. Load the last-run context variable using tContextLoad.
2. Create a SQL query which will hit your source database and pull down data as per your needs. In my case the query has the condition tdate < sysdate and tdate > '"+context.Last_Load_Date+"'.
As in most cases reporting is done on data up to the previous day, and tdate is a column which holds each record's insert/update date, we pull data as per the given condition.
Total data fetched = data greater than the last run date + data less than the current date.
3. Simply update the parameter file that holds your previous run date with current_date - 1:
"select 'Last_Load_Date;'||to_char(sysdate-1,'dd-mon-yyyy') as prev_day from dual"
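Putting step 2 together, the query string built inside the source input component might look like the following (the table name src_orders and the date format are illustrative assumptions; context.Last_Load_Date is the variable loaded in step 1):

```
"SELECT *
   FROM src_orders
  WHERE tdate > TO_DATE('" + context.Last_Load_Date + "', 'dd-mon-yyyy')
    AND tdate < TRUNC(SYSDATE)"
```

The upper bound TRUNC(SYSDATE) excludes today's partially loaded data, matching the previous-day reporting convention described above.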

Thanks for reading..:)  

Talend Interview Questions                                                               

Author: Ankit Kansal & Nayan Naik

Dynamic Column Ordering From Source File In Informatica

Hello Readers,
                      Today we will be discussing a scenario which commonly occurs during ETL development. Let's say we have a source file which contains employee information, as described in the following text file.
SCENARIO:- The source file will be available to you on a daily basis, but the order of its columns varies daily, so the column ordering does not remain constant. Today we have ename in the first position; the next day it may appear in the last position. Hence you cannot implement the logic straight away, since in Informatica you must know your column ordering to implement any further logic.
So, to overcome this situation, we have developed a mapping which provides a sure-shot solution, and you can modify this mapping as per your business needs.



Take as many mapping variables as there are columns in your source file. In my case I have three columns, so I have defined three variables of Integer data type, named $$var1, $$var2 and $$var3, with their initial values set to 0.
Actual Implementation will be done in Expression Transformation. 
So carefully look into the given snapshot for expression transformation.
Now, in the Expression transformation, create four expression variables; they are used for setting the values of the mapping variables. var_count is used for setting the values of all the mapping variables the first time.
Now, for the var_ename variable, use the code given below.
The code checks, the first time only (i.e. when the count is 1, the header row), whether a column contains the data 'ename'. If the first column contains 'ename', then SETVARIABLE($$var1,1) fires; similarly, it checks all the columns for 'ename' and sets whichever $$var satisfies the condition.
The same code is repeated for all the column names; only the checking condition differs. For deptno, it checks whether a column contains 'deptno' and, if yes, fires SETVARIABLE($$var1,2), and so on.
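Since the expression screenshot is not reproduced here, the var_ename logic can be sketched roughly as follows (COL1..COL3 are illustrative input port names; each $$var records which field its column holds, 1 meaning ename):

```
var_count = var_count + 1
var_ename = IIF(var_count = 1,
            IIF(COL1 = 'ename', SETVARIABLE($$var1, 1),
            IIF(COL2 = 'ename', SETVARIABLE($$var2, 1),
            IIF(COL3 = 'ename', SETVARIABLE($$var3, 1), 0))))
```

The corresponding var_deptno expression is identical except that it compares against 'deptno' and assigns the value 2.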
*Remember to read the data from the first row of the file; don't skip the first line, otherwise this will not work. However, you can filter out the first row, once the columns have been recognized, using a Filter in the next transformation.

Now, finally, create three output ports which will forward the correctly ordered data to your next transformation. So again, look at the code given below.
Use a similar condition in all your output ports. For ename, whichever variable holds your ename data will satisfy the condition; you can define the rest of your ports similarly. Only the condition changes: instead of $$var1 = 1 you have to use $$var1 = 2 or 3, whatever you have assigned.
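The three output ports can then be sketched as follows (again with illustrative names; 1 = ename, 2 = deptno, 3 = the remaining column):

```
o_ename  = IIF($$var1 = 1, COL1, IIF($$var2 = 1, COL2, COL3))
o_deptno = IIF($$var1 = 2, COL1, IIF($$var2 = 2, COL2, COL3))
o_col3   = IIF($$var1 = 3, COL1, IIF($$var2 = 3, COL2, COL3))
```

Each output port simply routes whichever input column was tagged with its position code, so downstream transformations always see a fixed column order.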

Thank you for reading…


Date Dimension Using Talend

Hello Readers,
                      In our previous post we discussed the use of a Date-Time dimension in a warehousing environment and showed a demo implementing a Date-Time dimension using Informatica.

So, in this post we will demonstrate the same implementation using Talend Open Studio.


1) Oracle DB (tOracleConnection)
2) tMap (for data manipulation)
3) tConvertType (to change the data type)
4) tLogRow (displaying the result)



         In the tOracleInput component, define two columns as the schema, both of Date data type, and in the SQL editor write the query given below.

"select sysdate+level, trunc(sysdate+level,'q') from dual connect by level < 366"

The first column will return dates serially starting from the current date, and the second column will give you the first date of every quarter.

Most of the implementation is completed using existing  components.
To compute monthOfQuarter:
var monthOfQuarter = Mathematical.MOD(TalendDate.getPartOfDate("MONTH", row1.ora_date) + 1, 3)

To pull out Weekday in character format use–

To calculate days from first date of quarter–
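The two expressions above are not reproduced in this extract; in Talend they can be sketched as follows (assuming row1.ora_date is the serial date column and row1.qtr_date is the first date of its quarter, per the query output):

```
// weekday in character format, e.g. "Monday"
TalendDate.formatDate("EEEE", row1.ora_date)

// number of days since the first date of the quarter
TalendDate.diffDate(row1.ora_date, row1.qtr_date, "dd")
```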

So, this is how you can implement a basic date-time dimension. Beyond this you can alter this current implementation as per your requirement.

Thanks for giving your precious time…. Kindly leave a comment in case of any queries.

Talend Interview Questions

Author : Ankit Kansal & Nayan Naik


Date Dimension Using Informatica

Hello Readers,
                       Today we will be discussing one of the most important topics, widely used in warehouse implementations: the Time/Date dimension using Informatica.
It is also widely used as a conformed dimension (a conformed dimension is a table whose attributes are widely referenced in multiple tables, as it can serve multiple fact tables).
The Date-Time dimension is used to analyze/report data in terms of date and time down to its granular level.
Now, in our implementation we have taken Oracle as a Database and Informatica as a tool to achieve the basic functionality.



We have taken s_date_source as our source; this source may not be present in your database. We have used this source to pull data from Oracle using a SQL query.
SELECT sysdate+level, trunc(sysdate+level,'q') FROM dual CONNECT BY level < 366;
In our case we are loading our date-time dimension from the current date to the next complete year.
*trunc(sysdate+level,'q') returns the first date of every quarter, which can further be used to evaluate additional columns.
Now to implement real logic we need to pull out one expression transformation from the palette.
IIF(TO_CHAR(ora_date,'DAY') = 'Saturday' OR TO_CHAR(ora_date,'DAY') = 'Sunday', 'Weekend', 'Weekday')
Try to use the basic functions given by Informatica (IIF, TO_CHAR, etc.) to achieve your functionality.


This is one way to implement a basic Date-Time dimension; you can further use the various functions given by the tool to achieve more functionality as per your requirements.