How To Purge FDMEE Files


Under the FDMEE instance there are data and outbox directories that are getting bigger (80 GB). How do we clean up those directories? What is the standard process? There are also some other files, such as .ind and .dat, directly under the FDMEE instance. How do we clean those up?

In order to purge the FDMEE files, there is an option under Workflow: go to Script Execution, click System Maintenance Tasks, and there you have the scripts to purge files.

These scripts run per application; you need to select the start and end periods along with the category.

This will help you to purge files.

Maintain Application Folder
======================

The Maintain Application Folder process purges files from the inbox, outbox, and data folder directories. Oracle Hyperion Financial Data Quality Management, Enterprise Edition (FDMEE) accepts a separate Days to Keep parameter for each of the folders. If the value is not specified for a specific folder, FDMEE skips that folder.

In addition, FDMEE checks the inbox and outbox sub-directories under the respective folders and deletes any files there as well. In the data folder, FDMEE skips the scripts directory because it holds customer scripts.
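For illustration only, the purge logic resembles the standalone Jython sketch below; the folder path and retention value are assumptions, and the supported approach remains the Maintain Application Folder script itself.

import os, time

# Assumed example values; substitute your FDMEE application folder and retention policy
root = "/u01/FDMEE/MYAPP"
daysToKeep = 30
cutoff = time.time() - daysToKeep * 86400

for dirpath, dirnames, filenames in os.walk(root):
    # Skip the scripts directory because it holds customer scripts
    dirnames[:] = [d for d in dirnames if d.lower() != "scripts"]
    for name in filenames:
        path = os.path.join(dirpath, name)
        if os.path.getmtime(path) < cutoff:
            os.remove(path)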

Maintain Process Table
======================

This process maintains the following execution tables:

  AIF_PROCESSES

  AIF_PROCESS_DETAILS

  AIF_PROCESS_LOGS

  AIF_PROCESS_PARAMETERS

  AIF_PROCESS_PERIODS

  AIF_PROCESS_STEPS

  AIF_BAL_RULE_LOADS

  AIF_BAL_RULE_LOAD_PARAMS

  AIF_BATCH_JOBS

  AIF_BATCH_LOAD_AUDIT

  AIF_TEMP

It accepts the number of days to keep as a parameter.
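Conceptually, the purge deletes rows older than the cutoff from these tables, child tables first. A minimal hedged sketch of the idea from an FDMEE custom script follows; the CREATED date column and the 30-day retention are illustrative assumptions, so prefer the shipped Maintain Process Table script for real cleanup.

# Hedged sketch only; delete detail rows before parent process rows
daysToKeep = 30
for table in ["AIF_PROCESS_DETAILS", "AIF_PROCESS_LOGS", "AIF_PROCESSES"]:
    query = "DELETE FROM " + table + " WHERE CREATED < (SYSDATE - ?)"
    fdmAPI.executeDML(query, [daysToKeep], True)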

Thanks,

~KKT~


FDMEE Runs Out of Resources


Dears,

Since last month we have noticed that the FDMEE software has been going down multiple times a day. Below is the error message we are receiving in the logs:

Error Message: weblogic.jdbc.extensions.PoolLimitSQLException: weblogic.common.resourcepool.ResourceLimitException: No resources currently available in pool aif_datasource to allocate to applications, please increase the size of the pool and retry..

Increase the connection pool size for the JDBC data source (aif_datasource) from the default of 120 to 200, and then to 300 if needed.

We checked and found that the maximum JVM heap size is currently set to a large value, but the minimum JVM heap size does not match the maximum. Follow the steps below to update it.

A) Update the min and max JVM heap sizes to match each other (for example, increase -Xms128m to -Xms5120m to match a 5 GB maximum):

Start > Run > RegEdit

HKEY_LOCAL_MACHINE > Software > Hyperion Solutions > ERPintegrator0 > HyS9AIFWeb_epmsystem1

Locate the JVMOption value for -Xmsxxxm

Locate the JVMOption value for -Xmx5120m

Make sure the two match

B) Verify that the AIF_Datasource JDBC connection pool maximum is set to a large enough value (this depends on the number of users)

C) Verify that the ODIMASTER JDBC connection pool maximum is set to the same value as the AIF_Datasource JDBC connection pool (a WLST sketch for checking and setting pool sizes follows these steps)

D) Recycle the FDMEE Web Application
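Steps B and C can also be applied from a WLST (Jython) session instead of the admin console. Below is a minimal sketch, assuming the default data source name aif_datasource, an admin URL of t3://localhost:7001, and a target of 300 connections; adjust all of these for your environment.

# Minimal WLST sketch; the credentials, URL, and pool size are illustrative assumptions
connect('weblogic', 'password', 't3://localhost:7001')
edit()
startEdit()
# Raise the maximum connection pool size for the FDMEE data source
cd('/JDBCSystemResources/aif_datasource/JDBCResource/aif_datasource/JDBCConnectionPoolParams/aif_datasource')
set('MaxCapacity', 300)
save()
activate()
disconnect()

Repeat the cd/set pair for the ODI master data source so both pools match.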

Thanks,

~KKT~

FDMEE Error – Essbase Error(1003007): Data Value [X] Encountered Before All Dimensions Selected, [1] Records Completed


  • Dimension details in the Target Application screen do not look like what is in Planning.
  • When setting up a new Data Load Rule, not all of the available Plan Types are shown.
  • When trying to run a Data Load Rule, the following error occurs: “Essbase Error(1003007): Data Value [X] Encountered Before All Dimensions Selected, [1] Records Completed.”
  • Mappings are not being applied to all dimensions even though mappings exist.

There are situations where the metadata that FDMEE has stored for a Planning application (including within PBCS) differs from the actual Planning metadata. If changes are made via LCM, or simply through the Planning application, the metadata may no longer match what FDMEE captured when the application was registered.

Use the following steps to resolve the issue:

  • Before trying anything else, go into the Target Application screen, choose the Planning application, click the “Refresh Metadata” button, and see whether the issue is resolved.
  • If the issue persists, back up all of the artifacts for this target application using LCM.
  • Go into the Data Load Mappings screen and, for each location, export the mappings for all dimensions so that you have a backup of your mappings.
  • Go to the Target Application screen, highlight the target application, and click the red X to delete it.
  • Navigate to another screen within FDMEE, then back to the Target Application screen. The problem target application should no longer exist. Register the Planning application again.
  • Restore your artifacts using LCM.
  • Restore your mappings for each location in the Data Load Mappings screen.

Thanks,

~KKT~

FDMEE – Unable To See Newly Created PBCS Plan Type When Creating Data Load Rule


After creating a new Plan Type in PBCS, it cannot be used as a Target Plan Type in Data Management. When creating a Data Load Rule in Data Management (FDMEE), the Target Plan Type drop-down does not show the newly created Plan Type.

Follow the steps below to see the newly created Plan Type under Target Plan Type in Data Management.

– Add a new Cube at Administration > Manage > Cubes
– Refresh the database as usual
– Check that the new cube appears in the Planning application window
– Refresh Metadata and Members in the Data Management Target Application window
– Click Save
– Check that the new cube is now shown in the Plans list when creating a Data Load Rule

Thanks,

~KKT~

FDMEE Automated Batch Loads Time Out


In Financial Data Quality Management Enterprise Edition (FDMEE) automated batch loads time out.

The two most likely causes of this issue are:

  • The batch timeout setting is not defined, or is too low, in the system and/or application settings of the FDMEE application
  • The Stuck Thread timeout setting needs to be increased on the ErpIntegrator0 managed server (WebLogic)

To increase the batch timeout setting in the FDMEE application:

  1. Navigate to Data Management from EPM workspace.
  2. From the Setup tab, under Configure, select System Settings.
  3. In System Settings, from Profile Type, select Other.
  4. Define a timeout value for the “Batch timeout in minutes” setting.

To increase the Stuck Thread timeout setting on the ErpIntegrator0 managed server in WebLogic:

  1. Navigate to the WebLogic admin console.
  2. Under Domain Structure, expand Environment and choose Servers.
  3. Choose ErpIntegrator0 from the servers list.
  4. Under Configuration, choose Tuning.
  5. Choose Lock & Edit to modify the setting value.
  6. Increase the Stuck Thread Max Time value. The value is in seconds; by default it is 600 seconds (10 minutes).
  7. Choose Activate Changes so that the settings take effect.
  8. Restart the FDMEE service.

Hope this helps.

Thanks,

~KKT~

FDQM/EE Loading Metadata to on-premise and cloud EPM Applications – Part-2


Import Format

The import format maps the columns from the source file to the dimensions in the custom application, or sets defaults for properties that are the same for every member. In other cases, some of the members are set to the text “BLANK” so that mapping will succeed; values for the BLANK members may or may not then be set by the event script. The use of the BLANK text is specifically required when using the output .dat file to generate the metadata file. If pulling directly from TDATASEG, the user may not need to use this method.

Please also note that the AMOUNT field should be set to any valid numeric amount so that the input file will be successfully processed. In addition, a SKIP row may be required if the input file includes a header record.
For this example, the Import Format is defined as follows:

Event Scripts
In this example, AftExportToDat and BefLoad scripts were used to process the file and then push the metadata to HFM. It is possible to perform everything in a single script if desired, and alternate implementations are possible. The script details are as follows:

AftExportToDat – Generate the metadata file in the format required by HFM

# Open the .dat file generated by FDMEE
datFileName = str(fdmContext["OUTBOXDIR"].replace("\\","/")) + "/" + str(fdmContext["TARGETAPPNAME"]) + "_" + str(fdmContext["LOADID"]) + ".dat"
datFile = open(datFileName,"r")
# Open the .app file to be created for HFM
appFileName = str(fdmContext["OUTBOXDIR"].replace("\\","/")) + "/" + str(fdmContext["TARGETAPPNAME"]) + "_" + str(fdmContext["LOADID"]) + ".app"
appFile = open(appFileName,"w")
# Create the .app file header lines
appFile.write("!FILE_FORMAT=11.12\n!VERSION=11.1.5250\n!CUSTOM_ORDER=Movement;CostCenter;Location;Nature\n\n")
appFile.write("!MEMBERS=Entity\n'Label;DefCurrency;AllowAdjs;IsICP;AllowAdjFromChildren;SecurityClass;UserDefined1;UserDefined2;UserDefined3;HoldingCompany;SecurityAsPartner;Default Parent;Description(s)\n")
# Counter used to skip the header record in the .dat file generated by FDMEE
i = 0
# Parse the records in the .dat file and replace BLANK with an empty string.
# (FDMEE puts BLANK in empty columns by default, which might be rejected by the target EPM application.)
for memberLine in datFile:
    i = i + 1
    if i > 1:
        memberLine = memberLine.replace("BLANK","")
        memberLine = memberLine.split(",")
        # Derive the Default Parent from the deepest populated hierarchy level
        if memberLine[15] != "":
            DefParent = memberLine[14]
        elif memberLine[14] != "":
            DefParent = memberLine[13]
        else:
            DefParent = "#root"
        # Write the .app file record for the member using the format below
        appFile.write("%s;%s;%s;%s;%s;%s;%s;%s;%s;%s;%s;DefaultParent=%s;English=%s\n" % (memberLine[0],memberLine[1],memberLine[2],memberLine[3],memberLine[4],memberLine[5],memberLine[6],memberLine[7],memberLine[8],memberLine[9],memberLine[10],DefParent,memberLine[12]))
# Create another header record in the .app file for the hierarchy section
appFile.write("\n!HIERARCHIES=Entity\n")
# Close the .dat file
datFile.close()
# Open the .dat file again to parse the records for the hierarchy section
datFile = open(datFileName,"r")
# Track hierarchy records already written so they are not repeated in the .app file
j = 0
seen = set()
for hierarchyLine in datFile:
    j = j + 1
    if j > 1:
        hierarchyLine = hierarchyLine.replace("BLANK","")
        hierarchyLine = hierarchyLine.split(",")
        gen1 = ";%s\n" % (hierarchyLine[13])
        gen2 = "%s;%s\n" % (hierarchyLine[13],hierarchyLine[14])
        gen3 = "%s;%s\n" % (hierarchyLine[14],hierarchyLine[15])
        if gen1 not in seen and hierarchyLine[13] != "":
            seen.add(gen1)
            appFile.write(gen1)
        if gen2 not in seen and hierarchyLine[14] != "":
            seen.add(gen2)
            appFile.write(gen2)
        if gen3 not in seen and hierarchyLine[15] != "":
            seen.add(gen3)
            appFile.write(gen3)
# Close the .app and .dat files
appFile.close()
datFile.close()

BefLoad – Load the file to HFM

# Import required HFM and Java libraries
from java.util import Locale
from java.io import File
import shutil
from oracle.epm.fm.hssservice import HSSUtilManager
from oracle.epm.fm.domainobject.application import SessionOM
from oracle.epm.fm.domainobject.loadextract import LoadExtractOM
from oracle.epm.fm.common.datatype import transport
# HSS username / password
UserName = "admin"
Password = "Password"
# Target HFM connection properties
clusterName = "HFMCluster"
application = "DEVHFM"
# Authenticate the user and create an HFM session
ssoToken = HSSUtilManager.getSecurityManager().authenticateUser(UserName,Password)
hfmsession = SessionOM().createSession(ssoToken, Locale.ENGLISH, clusterName, application)
# Load members to HFM
loadOM = LoadExtractOM(hfmsession)
appFileName = str(fdmContext["OUTBOXDIR"].replace("\\","/")) + "/" + str(fdmContext["TARGETAPPNAME"]) + "_" + str(fdmContext["LOADID"]) + ".app"
metadata_file = File(appFileName)
memberLoadOptions = transport.MetadataLoadOptions()
memberLoadOptions.setUseReplaceMode(False)
memberLoadOptions.setEntities(True)
memberLoadOptions.setDelimiter(";")
setCustomDims = [False,False,False,False]
memberLoadOptions.setCustomDims(setCustomDims)
loadInfo = loadOM.loadMetadata(metadata_file, memberLoadOptions)
# Copy the HFM load log to the FDMEE outbox
logFile = loadInfo.getLogFile()
logFileName = str(fdmContext["OUTBOXDIR"].replace("\\","/")) + "/" + str(fdmContext["TARGETAPPNAME"]) + "_" + str(fdmContext["LOADID"]) + ".log"
shutil.copy(logFile.getPath().replace("\\","/"), logFileName)
# Close the HFM session and notify the user
SessionOM().closeSession(hfmsession)
fdmAPI.showCustomMessage("Metadata Loaded Successfully")

Continue – Part-3

FDQM/EE Loading Metadata to on-premise and cloud EPM Applications – Part-1


On-premise FDMEE will be used as the tool that facilitates loading metadata to the target EPM application. The high-level process for using on-premise FDMEE to load metadata is as follows:

1) Create a target custom application in FDMEE that includes a column for each of the attributes that the user wishes to load. Some of these attributes will be in the source file, and some may be defaulted during the import process. For example, the source file may contain the member name, description, parent, IsICP flag, SecurityClass, ValidForPlanType, etc.

2) Define an import format in FDMEE to map the columns in the source file to the dimensions in the target custom application. For attributes not in the source file, define an optional default in the import format.

3) Enter any necessary mapping rules for any required transformations.

4) When the Export button is clicked while running the integration, FDMEE creates a .dat file in the outbox directory with all of the columns specified in the custom application definition. This file cannot be loaded directly into the target EPM application; it requires further processing to match the format required by the target EPM application.

5) Create an AftExportToDat event script to format the data loaded to FDMEE into the file format required by the target application. An additional script (BefLoad) is used to call the necessary API to load the file to the desired target application. In some cases file headers, section headers, and so on may be required as part of the metadata load file.

Please note that there is more than one way to generate a metadata file from FDMEE that can be loaded to a target application. Some users prefer to load data to TDATASEG and then query TDATASEG to select the necessary data. Other users scan the system-generated file from the export process to generate the load file rather than selecting data from the TDATASEG table.
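For the TDATASEG route, a minimal hedged sketch from an event script might look like the following; the column list is illustrative and depends on the custom application definition, and it assumes the fdmAPI.executeQuery call available to FDMEE scripts.

# Hedged sketch: read the mapped rows for the current load from TDATASEG.
# The selected columns are illustrative assumptions.
query = "SELECT ACCOUNTX, UD1X, UD2X FROM TDATASEG WHERE LOADID = ?"
rs = fdmAPI.executeQuery(query, [fdmContext["LOADID"]])
while rs.next():
    # Build each metadata record from the mapped (X) columns
    memberName = rs.getString("ACCOUNTX")
    # ... format and write the record to the metadata file here ...
rs.close()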

Both methods are valid; which one to use is really the user’s preference. Once these artifacts are defined in FDMEE, users can begin to load metadata from a file using FDMEE. It is assumed that the reader is familiar with FDMEE and Jython scripting.

Custom Application Setup
The first step is to create a custom application in FDMEE that can be used as a dummy target for setting up the FDMEE artifacts. The custom application will contain a column for each attribute that the user intends to load as part of the metadata import process. In addition, the customer can add description and attribute columns to the import format to extend the number of attributes beyond what is defined in the custom application. Please note that the customer is constrained by the TDATASEG table definition with regard to the number of columns that can be used to support the desired set of attributes. The usable columns are Account, Entity, ICP, UD1-20, ATTR1-14, and DESC1-2, for a total of 39 columns that can be used to define the metadata to be loaded. (DATAX is not used because custom applications only support numeric data loads.)

Import Format
The import format is used to map the data from the input file to the dimensions in the target application, and it is also a way to set defaults for attributes that are not included in the source file. As of the date of publication, custom applications only allow loading of numeric data, so the user will need to define the source file type as “Delimited – Numeric Data” and provide a number as a default in the expression field for the amount. This does not impact the metadata load process, but it is required to load the file to FDMEE successfully, as non-numeric amounts are rejected during a numeric data load.

Additional FDMEE Setup
Users will perform the rest of the required setup as in a regular data load process and define the required periods, categories, mapping rules, and data load rules. Artifacts like periods and categories are not required to load metadata, but they are required for FDMEE to successfully process a data load file, so they will need to be set up. A generic category/period mapping, such as Metadata / Metadata_Period, can be created for this purpose.

Event Script
An event script is used to either extract data from the TDATASEG table and reformat it into a text file, or to process the data file that is generated by the custom application export process. In addition to reformatting the data into the correct format for the target application, the event script must also call the appropriate API or command for the target application. For HFM, the script needs to call the HFM Java API. For on-premise Planning, the script must call the Outline Load utility or the REST API (in version 11.1.2.4). For the cloud, the script must call the REST API or EPM Automate.
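As a sketch of the on-premise Planning option, an event script could shell out to the Outline Load utility. All paths, names, and switches below are illustrative assumptions; confirm the exact syntax in the Planning administration guide.

import subprocess

# Hedged sketch: call the Planning OutlineLoad utility to load Entity members.
cmd = ["D:/Oracle/Middleware/user_projects/epmsystem1/Planning/planning1/OutlineLoad.cmd",
       "/A:PLANAPP", "/U:admin", "/I:D:/loads/entities.csv", "/D:Entity",
       "/L:D:/loads/outlineload.log", "/X:D:/loads/outlineload.exc"]
rc = subprocess.call(cmd)
if rc != 0:
    fdmAPI.logError("OutlineLoad returned a nonzero exit code: " + str(rc))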

Load to on-premise Financial Management (HFM)
For this blog, we are assuming that the user is loading metadata to HFM version 11.1.2.4.x and will be using the Java API to load the metadata file. The unique aspect of loading metadata to HFM is that the metadata load file includes multiple sections, and the event script that outputs the metadata and loads the file needs to take these different sections into consideration. This example will show how to load a data file in a specific format to an HFM application. It is possible that users may need to load a metadata file with additional sections not included in this example, and the user should reference the HFM administration guide to understand the format required for any additional information.

Please note that the process to generate the metadata file to load to HFM is as follows:

1. Import file into FDMEE.
2. Perform mapping.
3. Process *.dat file from the export step to generate member data and hierarchy data. The output file will be read one time to generate the individual member data, and one time to generate the hierarchy section of the file.
4. Any logic to determine the “Default Parent” needs to be defined accordingly.

Source File
For this example, the following file was used:
EntityName,Description,IsICP,SecurityClass,HFMLevel1,HFMLevel2,HFMLevel3

World,Earth,N,OPWP_90,World,,
Asia,Asian Countries,N,OPWP_90,World,Asia,
Europe,European Countries,N,OPWP_90,World,Europe,
North America,Northern America,N,OPWP_90,World,North America,
India,India,Y,OPWP_90,World,Asia,India,Asia
China,China,Y,OPWP_90,World,Asia,China,Asia
United Kingdom,England Wales & Scotland,Y,OPWP_90,World,Europe,United Kingdom
Canada,Canada,Y,OPWP_90,World,North America,Canada

The input file includes a header row which documents the layout of the file. Loading metadata to HFM also requires a hierarchy; this data is included in the metadata file in the last three positions, labeled HFMLevel1, HFMLevel2 and HFMLevel3.
The intended member output for the load file is defined as the following:
'Label;DefCurrency;AllowAdjs;IsICP;AllowAdjFromChildren;SecurityClass;UserDefined1;UserDefined2;UserDefined3;HoldingCompany;SecurityAsPartner;Default Parent;Description(s)

Some of the output is not included in the input file, and these items will be added as defaults from the import format, or as part of the event script that generates the output file.

Custom Application Definition
The custom application used in this example uses the following dimensions:

Dimension Name         Dimension Class   Data Table Column
AllowAdjFromChildren   Generic           UD5
AllowAdjs              Generic           UD3
DefaultParent          Generic           UD12
DefCurrency            Generic           UD2
Description            Generic           UD13
HoldingCompany         Generic           UD10
IsICP                  Generic           UD4
Level1                 Generic           UD14
Level2                 Generic           UD15
Level3                 Generic           UD16
MemberName             Generic           ACCOUNT
SecurityAsPartner      Generic           UD11
SecurityClass          Generic           UD6
UserDefined1           Generic           UD7
UserDefined2           Generic           UD8
UserDefined3           Generic           UD9

Please note that this is just an example, and the custom application used for this process can use other dimensions as required – you are not locked into this definition. Below is a partial screen shot of the custom target application definition in FDMEE. (Please note that ACCOUNT is a required dimension in a custom application.)

Continued in Part-2.

FDMEE: Where Are The Logs For FDMEE


What are the logs for FDMEE and where are they stored?


When providing logs for support, please ensure you have set the logging level to 5. This can be done in FDMEE under System Settings.

11.1.2.3

1) ERPIntegrator0.log

Stored under: \Oracle\Middleware\user_projects\domains\domain\servers\ErpIntegrator0\logs

Often referred to as the FDMEE or ERPI log.

2) Process Log  (<EPM Application Name>_<processid>.log)

Stored under: FDMEE Root Folder\Outbox\Logs (to see the location of the Root Folder, check the System Settings or Application Settings within FDMEE)

This log can also be viewed from within FDMEE on the Process Details screen; check the Show Log option for the process.

3) ODI Agent Log

Stored under: \Oracle\Middleware\user_projects\domains\EPMSystem\servers\ErpIntegrator0\logs\oracledi

11.1.2.4

1) aif-webapp.log

Stored under: \Oracle\Middleware\user_projects\domains\domain\servers\ErpIntegrator0\logs

2) Process Log  (<EPM Application Name>_<processid>.log)

Stored under: FDMEE Root Folder\Outbox\Logs (to see the location of the Root Folder, check the System Settings or Application Settings within FDMEE)

This log can also be viewed from within FDMEE on the Process Details screen; check the Show Log option for the process.

3) ODI Agent Log

Stored under: \Oracle\Middleware\user_projects\domains\EPMSystem\servers\ErpIntegrator0\logs\oracledi
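As a related tip, messages written from event or custom scripts with the fdmAPI logging calls land in the process log listed above (subject to the log level set in System Settings), which makes failures much easier to trace. An illustrative snippet:

# These messages appear in the process log (<EPM Application Name>_<processid>.log)
fdmAPI.logInfo("Starting custom metadata export")
fdmAPI.logWarn("Row skipped: empty entity column")
fdmAPI.logError("Load failed; see the target application log for details")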

Thanks,

~KKT~

FDM Error “Unable To Load The JVM” When Importing Data From ERPI


This error generally occurs after applying a patch to upgrade EPM to 11.1.2.2.500.

In Financial Data Quality Management (FDM) the following error message is displayed during import from ERPI:

Unable to load the JVM.

The data load in ERPI itself runs without issue.

The main reason this occurs is that, during the patch installation, the Java Virtual Machine (JVM) path for the ERPICOMJNIBridge was not updated correctly.

To fix this issue, reconfigure the FDM Load Balancer. This will configure and register the JVM path.

Load Data From FDMEE To ARM Error: “Export terminated due to mismatched currency buckets.”


When loading data to ARM from FDMEE, the following error message appears:

Export terminated due to mismatched currency buckets “XXXXXX”. Ensure that Financial Data Quality Management, Enterprise Edition Category names match ARM currency/buckets.


To fix this:

Verify that the currency buckets in ARM match the currencies in FDMEE:

Check the currency mappings in FDMEE:

1. Log into Workspace

2. Navigate > Administer > Setup > Category Mapping > Application Mapping

3. Ensure that Target Application is “Account Reconciliation Manager”.

4. Make sure that you have the appropriate categories set up.
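To cross-check the category names outside the UI, the hedged Jython sketch below queries the TPOVCATEGORY table (the FDMEE category table); compare the returned names with the ARM currency bucket labels.

# Hedged sketch: list FDMEE category names for comparison with ARM buckets
rs = fdmAPI.executeQuery("SELECT CATKEY, CATNAME FROM TPOVCATEGORY", [])
while rs.next():
    fdmAPI.logInfo("Category: " + rs.getString("CATNAME"))
rs.close()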


Thanks,

~KKT~