In the past, integrating data into EPM Cloud has been limited. Unless you were integrating with an Oracle Cloud source such as Financials, HCM or NetSuite, you were restricted to generating a file in the correct format and uploading it. If on-premise FDMEE is an option, its hybrid functionality is a workaround for these limitations. However, with the newly released EPM Integration Agent these restrictions are a thing of the past.

Note: The EPM Integration Agent requires Java Runtime Environment (JRE) 8 or later.

Downloading Agent

We begin by downloading the EPM Integration Agent from Data Integration within Oracle EPM Cloud. To do this, go to Data Exchange and select Actions, as shown below.

Creating Supporting Directories

Once downloaded, extract the zip and make note of the location, as it will be used in later steps. For this post, I extracted it to my local C drive, in a folder I called EPMAGENT1.

Within the bin folder there are scripts available for both Windows and *nix type platforms. This is where we will start to configure and run the EPM Integration Agent. First, we need to execute “createAppFolder”. This script creates the application folder structure where we are able to store settings for our environment.

Note: If you want to set up multiple EPM Integration Agents, you will want to create multiple application folders. For example, multiple environments hosted on the same VM would each use their own “app” folder.

For the sake of this demo, I will be using one application folder, calling it “appdata”.
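Assuming the agent was extracted to C:\EPMAGENT1 as above, the command would look something like this on Windows (the paths are from my setup; adjust them for yours):

```bat
cd C:\EPMAGENT1\bin
createAppFolder.bat C:\EPMAGENT1\appdata
```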

Once the script is complete, you will find the folder structure below within your new folder called “appdata” – or whatever you chose to call it.

Configure INI File

Config

Within the config directory you will find the initialization (ini) file holding the start parameters for the EPM Integration Agent. This file must be completed before the agent can be started.

Below is an example of a completed ini file. Stored with the EPM Integration Agent, this file connects it to the EPM Cloud instance via the CLOUD_URL, CLOUD_DOMAIN, CLOUD_USER and CLOUD_PASSWORD parameters.
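As a sketch, a completed ini file might look like the following, using only the parameters discussed in this post (the URL, domain, user and values are placeholders, not real values or documented defaults):

```ini
CLOUD_URL=https://example-planning.oraclecloud.com
CLOUD_DOMAIN=exampledomain
CLOUD_USER=cloud.user@example.com
CLOUD_PASSWORD=<encrypted value produced by the encryptpassword script>
CLUSTER=EPMAGENT_ASYNC
LOG_LEVEL=ALL
POLL_INTERVAL=60
REGISTER_WITH_IP=N
EXECUTION_POOL_SIZE=2
JYTHON_HOME=C:\jython2.7.1\
GROOVY_HOME=
CUSTOM_MEM_ARGS=-Xms128m -Xmx512m
CUSTOM_JAVA_OPTIONS=
CUSTOM_CLASS_PATH=
CUSTOM_INTERFACE_CLASS_NAME=
```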

The CLOUD_PASSWORD must be encrypted using the “encryptpassword” script in the EPM Integration Agent bin directory. Once encrypted, copy and paste the value into your ini file. Note: Every time the password expires or is changed, it will need to be re-encrypted and the ini file updated.
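On Windows, running the script looks something like this (the path is from my setup); it prompts for the cloud password and outputs the encrypted value to paste into CLOUD_PASSWORD:

```bat
cd C:\EPMAGENT1\bin
encryptpassword.bat
```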

LOG_LEVEL can be ALL, INFO or ERROR.

POLL_INTERVAL applies to asynchronous mode only; it is the time interval, in seconds, at which the EPM Integration Agent checks whether a process has been initiated in the cloud.

REGISTER_WITH_IP defines whether to register the EPM Integration Agent against the host name or IP address.

EXECUTION_POOL_SIZE is the number of executions that the EPM Integration Agent can run in parallel.

JYTHON_HOME and GROOVY_HOME are optional; they specify the installation locations. Earlier I installed Jython to C:\jython2.7.1\, so this is used as the JYTHON_HOME parameter value.

CUSTOM_MEM_ARGS sets the minimum and maximum JVM memory sizes. The default values are just a guide and can be tuned after monitoring memory usage from running integrations.

CUSTOM_JAVA_OPTIONS allows you to add additional Java parameters, for example if the EPM Integration Agent needs to go through a proxy to reach EPM Cloud.
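For instance, to route the agent through a proxy, the standard JVM proxy system properties can be passed here (host and ports below are placeholders):

```ini
CUSTOM_JAVA_OPTIONS=-Dhttp.proxyHost=proxy.example.com -Dhttp.proxyPort=80 -Dhttps.proxyHost=proxy.example.com -Dhttps.proxyPort=443
```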

CUSTOM_CLASS_PATH provides the option to add additional jar files to the Java classpath, for example extra JDBC drivers.

CUSTOM_INTERFACE_CLASS_NAME is only required if custom Java classes are implemented to override default functionality through events.

Data

The data directory holds the data files that are created after extracting from the source. Those files are then uploaded to EPM Cloud as part of the integration process. Some housekeeping will be needed to maintain this directory, as files are not deleted automatically.

Logs

The logs directory contains a log for each EPM Integration Agent process. These logs are also appended and uploaded to the main EPM Cloud process log.

Scripts

The scripts directory can contain any of the four available event scripts. These are optional and may be covered in a later post.

Creating Clusters

You must create the cluster referenced in the ini file; in our case, this is EPMAGENT_ASYNC. Below are the steps to create the cluster. Within Data Integration, navigate to the EPM Integration Agent.

Within the EPM Integration Agent page there is a pre-created cluster called EPMCLUSTER; you can configure this one to your liking or create a new one. I have created EPMAGENT_ASYNC for this demo, using asynchronous mode.

Save Certificate

Before we can kick off the EPM Integration Agent, there is one more task that is required. The EPM Cloud SSL certificate needs to be exported and copied to the EPM Integration Agent cert folder. I am using Chrome, but the process is similar for other browsers. Log into your EPM Cloud instance and click on the padlock icon.

Select site settings > Certificate.

The certificate will open.

Click the Details tab and select ‘Copy to File’.

Save the file to the EPM Integration Agent cert folder. This cert will expire, so make note of the valid date and know that you will need to replace it close to the expiration date or the EPM Integration Agent will stop working.

Starting EPM Integration Agent

Now we are ready to start the EPM Integration Agent. We will be using the scripts in the bin folder.

The syntax is:

startagent

or

epmagent
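On Windows, I am assuming the ini file created in the application folder earlier is passed as a parameter; a sketch of the start command using the paths from my setup (the ini file name is an assumption, use whatever yours is called):

```bat
cd C:\EPMAGENT1\bin
epmagent.bat C:\EPMAGENT1\appdata\config\agentparams.ini
```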

When you go back to Data Integration and select EPM Integration Agent, you will see that the cluster you identified in the ini file has been activated by the EPM Integration Agent.

Now that the EPM Integration Agent is running, it polls the cloud to check whether any processes have been initiated. In the example below it is checking every 60 seconds – the interval set in the ini file.
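Conceptually, the asynchronous polling cycle works like this (a minimal Python sketch; the function names are illustrative, not the agent's actual API):

```python
import time

def poll_cloud(check_for_job, run_job, interval=60, max_polls=3):
    """Hypothetical sketch of the agent's asynchronous polling cycle:
    every `interval` seconds, ask the cloud whether a process has been
    initiated and, if so, execute it."""
    completed = []
    for _ in range(max_polls):
        job = check_for_job()          # e.g. an HTTPS call to EPM Cloud
        if job is not None:            # a process was initiated in the cloud
            completed.append(run_job(job))
        time.sleep(interval)           # POLL_INTERVAL from the ini file
    return completed
```

With POLL_INTERVAL=60 the real agent sleeps 60 seconds between checks; everything else here is simplified for illustration.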

Configure Data Management

Create Query

Now that the EPM Integration Agent is all set up, let’s begin setting up the Data Management integration. To start, we are going to define the SQL query that will pull our data. This query will be executed by the EPM Integration Agent to extract the data.

Data Integration > Actions > Query

Within the Query page you can add, edit, and delete queries. Below are the two examples we have set up for the purposes of this demo. The name of the query is important to note, as we will reference it later in the Target Application setup.

When creating a new query, give the query a name and input the query string to be executed by the EPM Integration Agent.
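As a sketch, a query might look like the following (the table and column names are hypothetical; I am assuming bind variables wrapped in tildes, such as ~PERIOD~, which can be populated from filters at run time):

```sql
SELECT ACCOUNT,
       ENTITY,
       PRODUCT,
       PERIOD,
       AMOUNT
FROM   V_SALES_ACTUALS
WHERE  PERIOD = ~PERIOD~
```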

Create Target Application

Navigate to Data Management > Target Application. Add a new target application using the Data Source select option.

From there, the drop-down menu has all of the options for Data Source. We will be using “On Premise Database”.

When you select that option, a window to the inbox will pop up. Here, you will need to import a file that has all the columns of your table. See the example below.

NOTE: The name of the file becomes the name of the Target Application.

Below is an example of the file I used to import.

Once imported, the Target Application is created, and the dimension details contain the columns from the file we imported. These columns can be used as source columns for any of your planning dimensions.

Next, on the Application Filters tab, there are options to configure the connection to the data source.

Data Extract Query – This is the name of the SQL query to execute. It can be defined at the application or rule level.

Delimiter – This will be the delimiter that is used when the EPM Integration Agent creates the data extract file.

Credential Store – This provides the option to store the database connection information either in the cloud or in a file. If “Cloud” is selected, the connection information is entered in the application filters. If “File” is chosen, a file will need to be created in the EPM Integration Agent’s application data config directory, in the same location as the agent initialization file. The database user’s password in that file must be encrypted using the “encryptpassword” script.

For this example, I am going to go for the “Cloud” option and will cover “File” in the next example.

JDBC Driver – This is the driver for connecting to the source database. Currently only “Oracle” and “Microsoft SQL Server” are available, but I am sure this will increase in future releases. That doesn’t mean you can connect only to those database types, though; with the use of event scripting you can connect to pretty much any source.

JDBC URL – This is the database connection string. The Oracle syntax I will be using is:

jdbc:oracle:thin:@dbhost:port/service
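For reference, the equivalent Microsoft SQL Server connection string follows the standard Microsoft JDBC driver format (host, port and database name below are placeholders):

```
jdbc:sqlserver://dbhost:1433;databaseName=sourcedb
```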

Username/Password are for the user you want to connect to the database with.

Fetch Size – This is the number of records that the JDBC driver fetches with each call to the database when executing the SQL query.

Create Import Format

As you continue through the process of your data management set up, you will notice that your import format will look a little different than what you are used to seeing. The source columns are now using the column headers that we uploaded in the file during initial target application set up. These become a drop-down option for you to select for each dimension you are mapping.

Create Location

Nothing changes with the way that you would set up your location. You are simply connecting the import format to the location and all other functionality here remains the same.

Create Data Load Rule

Moving to the data load rule, here is where you connect the mappings to the source data using data mappings.

Create Data Load Mappings

Using the data load mappings, you define the dimensionality of the source data. You can use many different mapping types to get as creative or as explicit as you’d like.

Executing Data Load Rule

When mappings are complete, you can kick off your load rule. Within Data Integration, you’ll see the queue for the EPM Integration Agent, showing completed, processing, and failed jobs. You can go into Process Details and review the logs to help diagnose any errors.

Once complete and successful, you’ll see 3 goldfish within data management. The data was successfully loaded into the application.

This new functionality makes flat-file loads a thing of the past. The EPM Integration Agent will change how many clients use Data Management today and can help streamline processes straight from the source.