FDMEE Automated Batch Loads Time Out


In Financial Data Quality Management, Enterprise Edition (FDMEE), automated batch loads time out.

The two most likely causes of this issue are:

  • The batch timeout setting is not defined, or is set too low, in the system and/or application settings in the FDMEE application
  • The Stuck Thread timeout setting in the ErpIntegrator0 managed server (WebLogic) needs to be increased

To increase the batch timeout setting in the FDMEE application:

  1. Navigate to Data Management from EPM workspace.
  2. On the Setup tab, under Configure, select System Settings.
  3. In System Settings, from Profile Type, select Other.
  4. Define a timeout value for the “Batch timeout in minutes” setting.

To increase the Stuck Thread timeout setting in the ErpIntegrator0 managed server in WebLogic:

  1. Navigate to the WebLogic Administration Console.
  2. Under Domain Structure, expand Environment and choose Servers.
  3. Choose ErpIntegrator0 from the servers list.
  4. Under Configuration, choose Tuning.
  5. Choose Lock & Edit to modify the setting value.
  6. Increase the Stuck Thread Max Time value. The value is in seconds; by default it is 600 seconds (10 minutes).
  7. Choose Activate Changes so that the settings take effect.
  8. Restart the FDMEE service.
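
If you prefer to script the change instead of using the console, the same setting can be adjusted with WLST (WebLogic's Jython-based scripting tool). The sketch below is a hedged example, not a definitive procedure: the admin URL, credentials and the new timeout value are placeholders, and it assumes cmo.getTuning() as the ServerMBean accessor for the tuning settings, so verify against your WebLogic version before use.

# Hedged WLST sketch - connection details and the timeout value are placeholders
connect('weblogic', '<password>', 't3://adminhost:7001')
edit()
startEdit()
cd('/Servers/ErpIntegrator0')
# The server tuning MBean holds Stuck Thread Max Time (in seconds)
cmo.getTuning().setStuckThreadMaxTime(1800)   # e.g. 30 minutes instead of the 600-second default
save()
activate(block='true')
disconnect()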

Hope this helps.

Thanks,

~KKT~

FDQM/EE Loading Metadata to on-premise and cloud EPM Applications – Part-2


Import Format

The import format maps the columns from the source file to the dimensions in the custom application, or sets defaults for properties that are the same for every member. In other cases, some of the members are set to the text “BLANK” so that mapping will succeed; values for the BLANK members may then be set (or not) by the event script. The use of the BLANK text is specifically required when using the output .dat file to generate the metadata file. If pulling directly from TDATASEG, this method may not be needed.

Please also note that the AMOUNT field should be set to any valid numeric amount so that the input file is processed successfully. In addition, a SKIP row may be required if the input file includes a header record.
For this example, the Import Format is defined as follows:

Event Scripts
In this example, an AftExportToDat and a BefLoad script were used to process the file and then push the metadata to HFM. Everything could also be done in a single script, and alternate implementations are possible. The script details are as follows:

AftExportToDat – Generate the metadata file in the format required by HFM

import shutil
# Open the .dat file generated by FDMEE
datFileName = str(fdmContext["OUTBOXDIR"].replace("\\","/")) + "/" + str(fdmContext["TARGETAPPNAME"]) + "_" + str(fdmContext["LOADID"]) + ".dat"
datFile = open(datFileName,"r")
# Open the .app file to be created for HFM
appFileName = str(fdmContext["OUTBOXDIR"].replace("\\","/")) + "/" + str(fdmContext["TARGETAPPNAME"]) + "_" + str(fdmContext["LOADID"]) + ".app"
appFile = open(appFileName,"w")
# Create the .app file header lines
appFile.write("!FILE_FORMAT=11.12\n!VERSION=11.1.5250\n!CUSTOM_ORDER=Movement;CostCenter;Location;Nature\n\n")
appFile.write("!MEMBERS=Entity\n'Label;DefCurrency;AllowAdjs;IsICP;AllowAdjFromChildren;SecurityClass;UserDefined1;UserDefined2;UserDefined3;HoldingCompany;SecurityAsPartner;Default Parent;Description(s)\n")
# Counter used to skip the header record in the .dat file generated by FDMEE
i = 0
# Parse the records in the .dat file and replace BLANKs with an empty string.
# (FDMEE puts BLANK in empty columns by default, which might be rejected by the target EPM application.)
for memberLine in datFile:
    i = i + 1
    if i > 1:
        memberLine = memberLine.replace("BLANK","")
        memberLine = memberLine.split(",")
        # Derive the Default Parent from the deepest populated level
        if memberLine[15] != "":
            DefParent = memberLine[14]
        elif memberLine[14] != "":
            DefParent = memberLine[13]
        else:
            DefParent = '#root'
        # Write the .app file record for the member using the format below.
        appFile.write("%s;%s;%s;%s;%s;%s;%s;%s;%s;%s;%s;DefaultParent=%s;English=%s\n" % (memberLine[0],memberLine[1],memberLine[2],memberLine[3],memberLine[4],memberLine[5],memberLine[6],memberLine[7],memberLine[8],memberLine[9],memberLine[10],DefParent,memberLine[12]))
# Create another header record in the .app file for the hierarchy section.
appFile.write("\n!HIERARCHIES=Entity\n")
# Close the .dat file
datFile.close()
# Open the .dat file again to parse the records for the hierarchy section.
datFile = open(datFileName,"r")
# Track records already written so hierarchy sections are not repeated in the .app file.
j = 0
seen = set()
for hierarchyLine in datFile:
    j = j + 1
    if j > 1:
        hierarchyLine = hierarchyLine.replace("BLANK","")
        hierarchyLine = hierarchyLine.split(",")
        gen1 = ";%s\n" % (hierarchyLine[13])
        gen2 = "%s;%s\n" % (hierarchyLine[13],hierarchyLine[14])
        gen3 = "%s;%s\n" % (hierarchyLine[14],hierarchyLine[15])
        if gen1 not in seen and hierarchyLine[13] != "":
            seen.add(gen1)
            appFile.write(gen1)
        if gen2 not in seen and hierarchyLine[14] != "":
            seen.add(gen2)
            appFile.write(gen2)
        if gen3 not in seen and hierarchyLine[15] != "":
            seen.add(gen3)
            appFile.write(gen3)
# Close the .app file and .dat file.
appFile.close()
datFile.close()

BefLoad – Load the file to HFM

# Import required HFM and Java libraries
from java.util import Locale
from java.io import File
import shutil
from oracle.epm.fm.hssservice import HSSUtilManager
from oracle.epm.fm.domainobject.application import SessionOM
from oracle.epm.fm.domainobject.loadextract import LoadExtractOM
from oracle.epm.fm.common.datatype import transport
# HSS username / password
UserName = "admin"
Password = "Password"
# Target HFM connection properties
clusterName = "HFMCluster"
application = "DEVHFM"
# Authenticate the user and create an HFM session
ssoToken = HSSUtilManager.getSecurityManager().authenticateUser(UserName,Password)
hfmsession = SessionOM().createSession(ssoToken, Locale.ENGLISH, clusterName, application)
# Load members to HFM
loadOM = LoadExtractOM(hfmsession)
appFileName = str(fdmContext["OUTBOXDIR"].replace("\\","/")) + "/" + str(fdmContext["TARGETAPPNAME"]) + "_" + str(fdmContext["LOADID"]) + ".app"
metadata_file = File(appFileName)
memberLoadOptions = transport.MetadataLoadOptions()
memberLoadOptions.setUseReplaceMode(False)
memberLoadOptions.setEntities(True)
memberLoadOptions.setDelimiter(";")
setCustomDims = [False,False,False,False]
memberLoadOptions.setCustomDims(setCustomDims)
loadInfo = loadOM.loadMetadata(metadata_file, memberLoadOptions)
# Copy the HFM load log to the FDMEE outbox
logFile = loadInfo.getLogFile()
logFileName = str(fdmContext["OUTBOXDIR"].replace("\\","/")) + "/" + str(fdmContext["TARGETAPPNAME"]) + "_" + str(fdmContext["LOADID"]) + ".log"
shutil.copy(logFile.getPath().replace("\\","/"),logFileName)
SessionOM().closeSession(hfmsession)
fdmAPI.showCustomMessage("Metadata Loaded Successfully")
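
One design note on the script above: the session is only closed if the load succeeds. A minimal, more defensive variant, using only the calls already shown, wraps the load in try/finally so the HFM session is always released:

# Minimal sketch: guarantee session cleanup even if the load fails
hfmsession = SessionOM().createSession(ssoToken, Locale.ENGLISH, clusterName, application)
try:
    loadOM = LoadExtractOM(hfmsession)
    loadInfo = loadOM.loadMetadata(metadata_file, memberLoadOptions)
finally:
    # Always release the HFM session, even if loadMetadata raises an error
    SessionOM().closeSession(hfmsession)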

Continue – Part-3

FDQM/EE Loading Metadata to on-premise and cloud EPM Applications – Part-1


On-premise FDMEE will be used as the tool that facilitates loading metadata to the target EPM application. The high-level process to use on-premise FDMEE to load metadata is as follows:

1) Create a target custom application in FDMEE that includes a column for each of the attributes that the user wishes to load. Some of these attributes will be in the source file, and some may be defaulted during the import process. For example, the source file may contain the member name, description, parent, IsICP flag, SecurityClass, ValidForPlanType, etc.

2) Define an import format in FDMEE to map the columns in the source file to the dimensions in the target custom application. For attributes not in the source file, define an optional default in the import format.

3) Enter any necessary mapping rules for any required transformations.

4) When running the integration with FDMEE, clicking the Export button creates a .dat file in the outbox directory with all the columns specified in the custom application definition. This file cannot be loaded directly into the target EPM application; it requires further processing to match the required format of the target EPM application.

5) Create an AftExportToDat event script that will be used to format the data loaded to FDMEE into the file format required to load data to the target application. An additional script (BefLoad) will also be used to call the necessary API to load the file to the desired target application. In some cases file headers, section headers, etc. may be required as part of the metadata load file.

Please note that there is more than one way to generate a metadata file from FDMEE that can be loaded to a target application. Some users prefer to load data to TDATASEG, and then query TDATASEG to select the necessary data. Other users will scan the system generated file from the export process to generate the file to load rather than selecting data from the TDATASEG table.

Both methods are valid; the choice between them is simply user preference. Once these artifacts are defined in FDMEE, users can begin to load metadata from a file using FDMEE. It is assumed that the reader of this document is familiar with FDMEE and Jython scripting.
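
For readers who prefer the TDATASEG route, a minimal sketch of what such a query might look like in an event script is shown below. It assumes the fdmAPI.executeQuery and fdmAPI.closeResultSet methods from the FDMEE scripting API; the columns selected are illustrative and should be matched to your own custom application definition.

# Minimal sketch, assuming fdmAPI.executeQuery/closeResultSet; columns are illustrative
query = "SELECT ACCOUNT, DESC1, UD1, UD2 FROM TDATASEG WHERE LOADID = ?"
params = [fdmContext["LOADID"]]
rs = fdmAPI.executeQuery(query, params)
while rs.next():
    memberName = rs.getString("ACCOUNT")
    # ...format the member record and write it to the metadata file here...
fdmAPI.closeResultSet(rs)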

Custom Application Setup
The first step is to create a custom application in FDMEE that can be used as a dummy target for setting up the FDMEE artifacts. The custom application will contain a column for each attribute that the user intends to load as part of the metadata import process. In addition, the customer can also add description and attribute columns to the import format to extend the number of attributes beyond what is defined in the custom application. Please note that the customer is constrained by the TDATASEG table definition with regard to the number of columns that can be used to support the desired set of attributes. The set of columns that can be used includes Account, Entity, ICP, UD1-20, ATTR1-14 and DESC1-2, for a total of 39 columns available to define the metadata to be loaded. (DATAX is not used because custom applications only support numeric data loads.)

Import Format
The import format is used to map the data from the input file to the dimensions in the target application, and is also a way to set defaults for attributes that are not included in the source file. As of the date of publication, custom applications only allow loading of numeric data, so the user will need to define the source file type as “Delimited – Numeric Data” and provide a number as a default in the expression field for the amount. This does not impact the metadata load process, but is required to successfully load the file to FDMEE, as non-numeric amounts are rejected while performing a numeric data load.

Additional FDMEE Setup
Users will perform the rest of the required setup like a regular data load process and define the required periods, categories, mapping rules and data load rules. Artifacts like periods and categories are not required to load metadata, but are required in FDMEE to successfully process a data load file and will need to be set up. A generic category/period mapping, for example Metadata / Metadata_Period, can be created for this purpose.

Event Script
An event script is used to either extract data from the TDATASEG table and reformat it into a text file, or to process the data file that is generated from the custom application export process. In addition to reformatting the data into the correct format for the target application, the event script must also call the appropriate API or command for the target application. For HFM, the script needs to call the HFM Java API. For on-premise Planning, the script must call the Outline Load Utility or the REST API (11.1.2.4 version). For the cloud, the script must call the REST API or EPM Automate.
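
As an illustration of the on-premise Planning case, the sketch below shows one way an event script might shell out to the Outline Load Utility. This is a hedged example only: the path, application name, credentials and file names are placeholders, the switches should be verified against the Planning administration guide for your release, and it assumes the fdmAPI.logError logging method from the FDMEE scripting API.

# Hypothetical sketch - paths, application name and credentials are placeholders
import subprocess
olu = "E:/Oracle/Middleware/user_projects/epmsystem1/Planning/planning1/OutlineLoad.cmd"
metadataFile = str(fdmContext["OUTBOXDIR"]) + "/entity_metadata.csv"   # placeholder file name
args = [olu, "/A:PLANAPP", "/U:admin", "/I:" + metadataFile,
        "/D:Entity", "/L:outlineload.log", "/X:outlineload.exc"]
rc = subprocess.call(args)
if rc != 0:
    fdmAPI.logError("Outline Load Utility returned " + str(rc))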

Load to on-premise Financial Management (HFM)
For this blog, we are assuming that the user is loading metadata to HFM version 11.1.2.4.x and will be using the Java API to load the metadata file. The unique aspect of loading metadata to HFM is that the metadata load file includes multiple sections, and the event script that outputs the metadata and loads the file must take these different sections into consideration. This example shows how to load a data file in a specific format to an HFM application. Users may need to load a metadata file with additional sections not included in this example, and should reference the HFM administration guide to understand the format required for any additional information.

Please note that the process to generate the metadata file to load to HFM is as follows:

1. Import file into FDMEE.
2. Perform mapping.
3. Process the *.dat file from the export step to generate member data and hierarchy data. The output file is read once to generate the individual member data, and once more to generate the hierarchy section of the file.
4. Any logic to determine the “Default Parent” needs to be defined accordingly.

Source File
For this example, the following file was used:
EntityName,Description,IsICP,SecurityClass,HFMLevel1,HFMLevel2,HFMLevel3
World,Earth,N,OPWP_90,World,,
Asia,Asian Countries,N,OPWP_90,World,Asia,
Europe,European Countries,N,OPWP_90,World,Europe,
North America,Northern America,N,OPWP_90,World,North America,
India,India,Y,OPWP_90,World,Asia,India,Asia
China,China,Y,OPWP_90,World,Asia,China,Asia
United Kingdom,England Wales & Scotland,Y,OPWP_90,World,Europe,United Kingdom
Canada,Canada,Y,OPWP_90,World,North America,Canada

The input file includes a header row which documents the layout of the file. Loading metadata to HFM also requires a hierarchy, and this data is included in the metadata file in the last 3 positions, labeled HFMLevel1, HFMLevel2 and HFMLevel3.
The intended member output for the load file is defined as the following:

'Label;DefCurrency;AllowAdjs;IsICP;AllowAdjFromChildren;SecurityClass;UserDefined1;UserDefined2;UserDefined3;HoldingCompany;SecurityAsPartner;Default Parent;Description(s)

Some of the output is not included in the input file, and these items will be added as defaults from the import format, or as part of the event script that generates the output file.

Custom Application Definition
The custom application used in this example uses the following dimensions:

Dimension Name        Dimension Class   Data Table Column
AllowAdjFromChildren  Generic           UD5
AllowAdjs             Generic           UD3
DefaultParent         Generic           UD12
DefCurrency           Generic           UD2
Description           Generic           UD13
HoldingCompany        Generic           UD10
IsICP                 Generic           UD4
Level1                Generic           UD14
Level2                Generic           UD15
Level3                Generic           UD16
MemberName            Generic           ACCOUNT
SecurityAsPartner     Generic           UD11
SecurityClass         Generic           UD6
UserDefined1          Generic           UD7
UserDefined2          Generic           UD8
UserDefined3          Generic           UD9

Please note that this is just an example, and the custom application used for this process can use other dimensions as required – you are not locked into this definition. Below is a partial screen shot of the custom target application definition in FDMEE. (Please note that ACCOUNT is a required dimension in a custom application.)

Continue – Part-2

FDMEE: Where Are The Logs For FDMEE


What are the logs for FDMEE and where are they stored?

 

When providing logs for support, please ensure you have set the logging level to 5. This can be done in the System Settings inside FDMEE.

 

11.1.2.3

1) ERPIntegrator0.log

Stored under: \Oracle\Middleware\user_projects\domains\domain\servers\ErpIntegrator0\logs

Often referred to as the FDMEE or ERPI log

 

2) Process Log  (<EPM Application Name>_<processid>.log)

Stored under: FDMEE Root Folder\Outbox\Logs  (to see the location of the Root Folder please check the System Settings or Application Settings within FDMEE)

This log can also be viewed from within FDMEE on the Process Details screen. Check the Show Log option for the process.

 

3) ODI Agent Log

Stored under: \Oracle\Middleware\user_projects\domains\EPMSystem\servers\ErpIntegrator0\logs\oracledi

 

11.1.2.4

1) aif-webapp.log

Stored under: \Oracle\Middleware\user_projects\domains\domain\servers\ErpIntegrator0\logs

 

2) Process Log  (<EPM Application Name>_<processid>.log)

Stored under: FDMEE Root Folder\Outbox\Logs  (to see the location of the Root Folder please check the System Settings or Application Settings within FDMEE)

This log can also be viewed from within FDMEE on the Process Details screen. Check the Show Log option for the process. Event scripts can write their own messages to this log, as shown in the sketch below.

 

3) ODI Agent Log

Stored under: \Oracle\Middleware\user_projects\domains\EPMSystem\servers\ErpIntegrator0\logs\oracledi
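
For completeness, custom event scripts can add their own entries to the process log via the fdmAPI logging methods; which messages actually appear is governed by the log level set in System Settings. A minimal sketch, assuming the standard fdmAPI logging calls:

# Messages land in the process log (<EPM Application Name>_<processid>.log)
fdmAPI.logInfo("Starting custom processing for load " + str(fdmContext["LOADID"]))
fdmAPI.logDebug("Outbox directory: " + str(fdmContext["OUTBOXDIR"]))   # only emitted at higher log levels
fdmAPI.logError("Record rejected by target application")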

Thanks,

~KKT~

FDM Error “Unable To Load The JVM” When Importing Data From ERPI


This error generally occurs after applying a patch to upgrade EPM to 11.1.2.2.500.

In Financial Data Quality Management (FDM) the following error message is displayed during import from ERPI:

Unable to load the JVM.

The data load in ERPi runs without issue.

The main reason it occurs is that, during the patch installation, the Java Virtual Machine (JVM) path to the ERPICOMJNIBridge is not updated correctly.

To fix this issue, reconfigure the FDM Load Balancer. This will configure and register the JVM path.

 

Load Data From FDMEE To ARM Error: “Export terminated due to mismatched currency buckets.”


When loading data to ARM from FDMEE, the following error message appears:
Export terminated due to mismatched currency buckets “XXXXXX”. Ensure that Financial Data Quality Management, Enterprise Edition Category names match ARM currency/buckets.


To fix:

Verify that the currency buckets in ARM match the currencies in FDMEE.

Check the currency mappings in FDMEE:

1. Log into Workspace.

2. Navigate > Administer > Setup > Category Mapping > Application Mapping.

3. Ensure that the Target Application is “Account Reconciliation Manager”.

4. Make sure that you have the appropriate categories set up.


Thanks,

~KKT~

FDMEE – LCM Migration Failing With Error: “Service unavailable”


Hi,

I was working on FDMEE and ran into this issue; here is a reference for you –

Error:

Users receive a “Service unavailable” error when attempting to migrate FDMEE Planning application metadata via LCM.
[2015-05-21T14:08:12.625-05:00] [FoundationServices1] [TRACE] [EPMLCM-54037] [oracle.EPMLCM] [tid: 156] [userId: ] [ecid: 00iTJ48khon13j15nvK6yZ0001Sg000MOa,0:1:4:3:1:3:3:3:3:4:3:3:4:4:4:4:3:3:4:4:3:3:3:4:3:4:3:3:4:3:3:4:4:3:3:3:3:3:3:3:4:3:3:3:3:4:4:4:4:3:4:4:4:3:3:3:3:3:3:1:1:1] [APP: SHAREDSERVICES#11.1.2.0] [URI: /workspace/logon] [SRC_CLASS: com.hyperion.lcm.handler.util.status.TaskStatus] [SRC_METHOD: getMessageContext:69] A task type error has occurred when performing the operation for task 0 defined in the migration definition file.
[2015-05-21T14:08:12.625-05:00] [FoundationServices1] [NOTIFICATION] [EPMLCM-13000] [oracle.EPMLCM] [tid: 156] [userId: ] [ecid: 00iTJ48khon13j15nvK6yZ0001Sg000MOa,0:1:4:3:1:3:3:3:3:4:3:3:4:4:4:4:3:3:4:4:3:3:3:4:3:4:3:3:4:3:3:4:4:3:3:3:3:3:3:3:4:3:3:3:3:4:4:4:4:3:4:4:4:3:3:3:3:3:3:1:1:1] [APP: SHAREDSERVICES#11.1.2.0] [URI: /workspace/logon] Service currently not available.
[2015-05-21T14:08:12.625-05:00] [FoundationServices1] [ERROR] [EPMLCM-37066] [oracle.EPMLCM] [tid: 156] [userId: ] [ecid: 00iTJ48khon13j15nvK6yZ0001Sg000MOa,0:1:4:3:1:3:3:3:3:4:3:3:4:4:4:4:3:3:4:4:3:3:3:4:3:4:3:3:4:3:3:4:4:3:3:3:3:3:3:3:4:3:3:3:3:4:4:4:4:3:4:4:4:3:3:3:3:3:3:1:1:1] [APP: SHAREDSERVICES#11.1.2.0] [URI: /workspace/logon] [SRC_CLASS: com.hyperion.lcm.common.LCMLogger] [SRC_METHOD: logMessages:939] Status document parsing failed. Nested exception is [[
org.xml.sax.SAXParseException: The reference to entity “Prom” must end with the ‘;’ delimiter.
at com.sun.org.apache.xerces.internal.util.ErrorHandlerWrapper.createSAXParseException(ErrorHandlerWrapper.java:195)
at com.sun.org.apache.xerces.internal.util.ErrorHandlerWrapper.fatalError(ErrorHandlerWrapper.java:174)
at com.sun.org.apache.xerces.internal.impl.XMLErrorReporter.reportError(XMLErrorReporter.java:388)
at com.sun.org.apache.xerces.internal.impl.XMLScanner.reportFatalError(XMLScanner.java:1427)
at com.sun.org.apache.xerces.internal.impl.XMLScanner.scanAttributeValue(XMLScanner.java:881)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanAttribute(XMLDocumentFragmentScannerImpl.java:1547)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanStartElement(XMLDocumentFragmentScannerImpl.java:1320)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl$FragmentContentDriver.next(XMLDocumentFragmentScannerImpl.java:2756)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentScannerImpl.next(XMLDocumentScannerImpl.java:647)
at com.sun.org.apache.xerces.internal.impl.XMLDocumentFragmentScannerImpl.scanDocument(XMLDocumentFragmentScannerImpl.java:511)
at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:808)
at com.sun.org.apache.xerces.internal.parsers.XML11Configuration.parse(XML11Configuration.java:737)
at com.sun.org.apache.xerces.internal.parsers.XMLParser.parse(XMLParser.java:119)
at com.sun.org.apache.xerces.internal.parsers.DOMParser.parse(DOMParser.java:232)
at com.sun.org.apache.xerces.internal.jaxp.DocumentBuilderImpl.parse(DocumentBuilderImpl.java:284)
at weblogic.xml.jaxp.RegistryDocumentBuilder.parse(RegistryDocumentBuilder.java:163)
at com.hyperion.lcm.handler.Task.updateStatusDocument(Task.java:422)
at com.hyperion.lcm.handler.util.Migration._importGroupingUnit(Migration.java:681)
at com.hyperion.lcm.handler.util.Migration.migrate(Migration.java:303)
at com.hyperion.lcm.handler.TaskHandler.processMigrationQueue(TaskHandler.java:609)
at com.hyperion.lcm.handler.TaskHandler.runTasks(TaskHandler.java:420)
at com.hyperion.lcm.handler.TaskHandler.execute(TaskHandler.java:86)
at com.hyperion.lcm.clu.async.AsyncMigrator.run(AsyncMigrator.java:54)
at java.lang.Thread.run(Thread.java:662)

 

In some cases the LCM export does not check that all dependencies have been met when exporting, so you can get failures when importing it again.

Workaround:
——————-

Set Skip Validation to Yes.

Navigate to Administration > Migration Options > Import Options.

ERP Integrator – Skip Validation (Yes/No): set to Yes.

Thanks,

~KKT~

FDM Classic: Time To Say ‘Goodbye’


Dear all,

I was personally a very strong fan of FDM, and doing stuff with VB was real magic, with all sorts of mappings etc.

But now FDMEE is the future, and we have to move ahead with Jython.

FDM Classic will be supported through 11.1.2.3.500, and afterwards through patch .700 as well. After that version there will be no further bug fixes.

So starting from 11.1.2.4, only FDMEE is available, and a migration from FDM to FDMEE is required.

Thanks,

~KKT~

FDMEE Open Interface Adapter – Updating The ODI Project


Looking for a rundown of how to update the Open Interface project in order to add custom interfaces to load the AIF_OPEN_INTERFACE table.

All documentation says to:
– copy the ODI project and rename it
– update the link in FDMEE for project code

All of the variables in the copied project fail since they still reference the original project. Are there instructions on how to successfully update the copied project?

Before following these steps, perform a backup of your existing work repository. Also back up your custom project and model.

1) Create a new work repository with a new ID, say 900. This will be your development repository where you will perform all customizations. If you already maintain a separate repository for customization, skip to step 4.
2) Export the (Open Interface) Model and Project from the FDMEE repository.
3) Import the above Model into your dev repository (ID=900) in INSERT mode, followed by an import of your Project.
4) Renumber your dev repository from 900 (or the existing ID) to, say, 901 (Topology -> Repository -> right-click to see the option). This is needed to have all objects tagged with ID 901. This will be your final dev repository ID where you will perform customization work.
5) Export the Model and Project from dev repository 901 as and when needed, e.g. when certain customization work is complete.
6) Import the Model and then the Project in INSERT mode into your target repository, e.g. the FDMEE repository where you want the customization to run.

Please perform any further customization in your dev repository (ID=901) and export to the target/FDMEE repository as mentioned above.

Thanks,

~KKT~

ODI – View FDMEE Scheduled Tasks


Follow the steps below to view FDMEE scheduled tasks.

1.) Open ODI Studio and click Connect To Repository.
2.) From the login screen, click the green Plus sign.
3.) Enter a name for the Login (can be anything).
4.) ODI Connection User is SUPERVISOR.
5.) Enter ODI Connection Password.
6.) Master Repository User is the FDMEE database username.
7.) Choose Microsoft SQL Server DataDirect Driver from the Driver List dropdown.
8.) Enter the URL: jdbc:weblogic:sqlserver://<db host>;databaseName=<database name>.
9.) Select the Work Repository radio button, click the magnifying glass icon, and then select the desired FDMEE repository.
10.) Click the Test button (if you do not see “Successful Connection” the information provided is incorrect). Click OK, and then OK one more time.
11.) The login screen should be visible again at this point. Select the Login Name created and press OK to log in to your FDMEE Work Repository.
12.) Click on the Operator tab in ODI Studio.
13.) Expand the Scheduling selection in the Operator tab.
14.) Expand the All Schedules selection to view scheduled tasks in ODI.
15.) Expand Scheduling under your scheduled batch.
16.) Right click OracleDIAgent and click Open.
17.) This screen shows the scheduled dates/times on which the batch will run.

Thanks,

~KKT~