What is Oracle Data Integrator (ODI)?
Oracle acquired Sunopsis, along with its ETL tool “Sunopsis Data Integrator”, and renamed it Oracle Data Integrator (ODI). ODI is an E-LT (Extract, Load and Transform) tool used for high-speed data movement between disparate systems.
The latest version, Oracle Data Integrator Enterprise Edition (ODI-EE 12c) brings together “Oracle Data Integrator” and “Oracle Warehouse Builder” as separate components of a single product with a single licence.
Explain what ODI is. Why is it different from other ETL tools?
ODI stands for Oracle Data Integrator. It differs from other ETL tools in that it uses an E-LT approach rather than an ETL approach. This eliminates the need for a dedicated transformation server between the source and target data servers: the power of the target data server is used to transform the data, i.e. the target data server acts as the staging area in addition to its role as the target database.
The transformation logic is implemented while loading the data into the target database (from the staging area). An appropriate CKM (Check Knowledge Module) can also be used at this point to implement data quality requirements.
What is E-LT? Or What is the difference between ODI and other ETL Tools?
E-LT is an innovative approach to extracting, loading and transforming data. Typically, ETL vendors have relied on a costly, heavyweight mid-tier server to perform the transformations required when moving large volumes of data around the enterprise.
ODI delivers unique, next-generation Extract, Load and Transform (E-LT) technology that improves performance and reduces data integration costs, even across heterogeneous systems, by pushing the required processing down to the typically large and powerful database servers already in place within the enterprise.
What are the components of Oracle Data Integrator?
“Oracle Data Integrator” comprises:
– Oracle Data Integrator + Topology Manager + Designer + Operator + Agent
– Oracle Data Quality for Data Integrator
– Oracle Data Profiling
What systems can ODI extract and load data into?
ODI brings true heterogeneous connectivity out of the box: it can connect natively to Oracle, Sybase, MS SQL Server, MySQL, LDAP, DB2, PostgreSQL and Netezza.
It can also connect to any data source that supports JDBC; it is even possible to use the Oracle BI Server as a data source via the JDBC driver that ships with BI Publisher.
What are Knowledge Modules?
Knowledge Modules form the basis of ‘plug-ins’ that allow ODI to generate the relevant execution code, across technologies, to perform tasks in one of six areas. The six types of knowledge module are:
- Reverse-engineering knowledge modules are used for reading the table and other object metadata from source databases
- Journalizing knowledge modules record the new and changed data within either a single table or view or a consistent set of tables or views
- Loading knowledge modules are used for efficient extraction of data from source databases for loading into a staging area (database-specific bulk unload utilities can be used where available)
- Check knowledge modules are used for detecting errors in source data
- Integration knowledge modules are used for efficiently transforming data from staging area to the target tables, generating the optimized native SQL for the given database
- Service knowledge modules provide the ability to expose data as Web services
ODI ships with many knowledge modules out of the box. They are also extensible and can be modified within the ODI Designer module.
Does my ODI infrastructure require an Oracle database?
No. The ODI modular repositories (one Master and one or more Work repositories) can be installed on any database engine that supports ANSI ISO 89 syntax, such as Oracle, Microsoft SQL Server, Sybase AS Enterprise, IBM DB2 UDB and IBM DB2/400.
Does ODI support web services?
Yes, ODI is ‘SOA’ enabled and its web services can be used in 3 ways:
- The Oracle Data Integrator Public Web Service, that lets you execute a scenario (a published package) from a web service call
- Data Services, which provide a web service over an ODI data store (i.e. a table, view or other data source registered in ODI)
- The ODIInvokeWebService tool that you can add to a package to request a response from a web service
What is the ODI Console?
The ODI Console is a web-based navigator for accessing the Designer, Operator and Topology components through a browser.
Suppose I have 6 interfaces and, while running them, the 3rd one fails. How do I run the remaining interfaces?
If you are running a sequential load, the failure stops the remaining interfaces: go to Operator, right-click the failed interface and click Restart. If you are running all the interfaces in parallel, only the failed interface stops; the others finish.
What are load plans, and what are the types of load plans?
A load plan is a process that runs or executes multiple scenarios as a sequential, parallel or condition-based execution. Accordingly, there are three types of load plan: sequential, parallel and condition-based.
What is a profile in ODI?
A profile is a set of object-wise privileges. Profiles are assigned to users, and users inherit their privileges from their profiles.
How do you write sub-queries in ODI?
Use a yellow interface with the sub-query (sub-select) option, or use a VIEW, or call direct database queries from an ODI procedure.
How do you remove duplicates in ODI?
Enable the DISTINCT option at the IKM level; it removes duplicate rows while loading into the target.
Suppose the source has both unique and duplicate records, and I want to load the unique records into one table and the duplicates into another?
Create two interfaces, or one procedure with two queries: one for the unique values and one for the duplicate values.
How do you implement data validations?
Use filters in the mapping area. For data-quality validations based on constraints, use a CKM with Flow Control enabled.
How do you handle exceptions?
Exceptions can be handled in the Advanced tab of package steps and in the Exception tab of load plans.
In a package, one interface failed. How do you find out which interface failed if you have no access to Operator?
Set up a mail alert, or check the SNP_SESS_LOG repository tables for the session log details.
How do you implement logic in a procedure so that data deleted on the source side is also deleted from the target table?
Use this query in the Command on Target:
Delete from Target_table where not exists (Select 'X' From Source_table Where Source_table.ID = Target_table.ID)
If the source has 15 records in total, of which 2 records were updated and 3 records were newly inserted, and at the target side we have to load only the changed and newly inserted records?
Use an IKM Incremental Update knowledge module, which handles both insert and update operations.
Can we use a package within another package?
Yes, we can call one package from another package.
How do you load data from one flat file and one RDBMS table using a join?
Drag and drop both the file and the table into the source area and define the join; it will be executed in the staging area.
If the source and target are both Oracle technology, what is the process (interfaces, KMs, models) to achieve this requirement?
Use LKM SQL to SQL or LKM SQL to Oracle, with IKM Oracle Incremental Update or IKM Oracle Control Append.
What do we specify in the XML data server, and which parameters connect it to an XML file?
Two parameters: f (the file name with its location) and s (the schema name).
How do you reverse-engineer views (i.e. load data from views)?
In the model, go to the Reverse Engineer tab and select VIEW as the type of object to reverse-engineer.
More questions and answers will be added in the next series.