HDL (Inbound) Integration
Requirement:
Automating Job creation in Oracle HCM Cloud using external CSV data.
High-Level Integration Flow
We developed an end-to-end integration that pulls CSV files from a remote server, transforms them into the mandatory HDL .dat structure, and stages them on an FTP server.
The final process picks up these files and pushes them into Oracle Fusion HCM, ensuring seamless job record synchronization.
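For reference, an HDL Job file (Job.dat) follows a pipe-delimited METADATA/MERGE structure. The sketch below is illustrative only; the exact attribute list should come from the Job business object definition in your own pod:

```text
METADATA|Job|SourceSystemOwner|SourceSystemId|EffectiveStartDate|EffectiveEndDate|JobCode|SetCode|Name|ActiveStatus
MERGE|Job|EBS|JOB_1001|2024/01/01|4712/12/31|DEV01|COMMON|Software Developer|A
```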
Integration Metadata
Source: FTP Server
Target: Oracle HCM Cloud Fusion Application
Direction: Inbound
Pattern: Schedule-based Orchestration
Implementation Strategy
Before developing the integration logic, we must establish secure connectivity between the source and target systems. This involves:
FTP Connection: Configuring the SFTP/FTP adapter to access the remote data server.
HCM Cloud Connection: Setting up the Oracle HCM Cloud adapter to facilitate data loading and web service invocations.
Log in with an HCM user only, as HDL works only with HCM objects.
Test the connection and save it.
Now, open FileZilla.
Click the Connect button to access the remote FTP server.
Inside the remote server site, right-click and choose Create directory.
Type HDL in place of New Directory.
Place the job.zip file from the local site to the remote site under the path: A2CF77/HDL.
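If you prefer to script this step instead of using FileZilla, a minimal Python sketch with the paramiko library could look like the following; the host and credentials are placeholders for your own server:

```python
# Upload job.zip to the remote HDL directory over SFTP.
# Host, port, and credentials are placeholders -- adjust for your server.
import paramiko

HOST = "ftp.example.com"   # hypothetical SFTP host
USERNAME = "oic_user"      # hypothetical credentials
PASSWORD = "********"

transport = paramiko.Transport((HOST, 22))
transport.connect(username=USERNAME, password=PASSWORD)
sftp = paramiko.SFTPClient.from_transport(transport)
try:
    # Create the HDL directory if it does not already exist.
    try:
        sftp.stat("A2CF77/HDL")
    except IOError:
        sftp.mkdir("A2CF77/HDL")
    # Copy the zipped HDL payload from the local site to the remote site.
    sftp.put("job.zip", "A2CF77/HDL/job.zip")
finally:
    sftp.close()
    transport.close()
```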
Now, start the integration flow.
Choose the Schedule style of integration and click Create.
The first step is to list the files using the FTP adapter's List Files operation.
Choose the List Files operation.
The pattern *.zip matches all files ending with .zip.
Since there is only one file in this example, a wildcard is used; if there are multiple files and only a specific one should be processed, specify its exact name here.
Now, use a For-Each action; it will loop over all the listed files.
Click Save.
Parallel Processing is left unchecked, so the files are processed one at a time.
Since HDL handles large volumes of data, use the Download File operation instead of Read File.
Read File supports files only up to 10 MB, whereas the Download operation supports files up to 1 GB.
Choose Download and save the downloaded file in the temp directory (a scripted sketch of the list/loop/download steps follows below).
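Outside of OIC, the same list/loop/download pattern can be sketched in Python. This mirrors what the FTP adapter does and is not the adapter itself; host and credentials are placeholders:

```python
# Mirror of the OIC steps: list *.zip files, loop over them, download to a temp dir.
import fnmatch
import os
import tempfile

import paramiko

HOST, USERNAME, PASSWORD = "ftp.example.com", "oic_user", "********"  # placeholders

transport = paramiko.Transport((HOST, 22))
transport.connect(username=USERNAME, password=PASSWORD)
sftp = paramiko.SFTPClient.from_transport(transport)
try:
    # List Files: every file in the HDL directory matching *.zip.
    zip_files = fnmatch.filter(sftp.listdir("A2CF77/HDL"), "*.zip")

    # For-Each: process the files one at a time (no parallelism).
    temp_dir = tempfile.mkdtemp()
    for name in zip_files:
        # Download (not Read): suitable for large HDL payloads.
        local_path = os.path.join(temp_dir, name)
        sftp.get(f"A2CF77/HDL/{name}", local_path)
        print(f"Downloaded {name} to {local_path}")
finally:
    sftp.close()
    transport.close()
```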
Click on the mapper.
Now, connect to HCM Cloud: click the + symbol and choose the HCM Cloud connection.
Choose Import Bulk Data using HDL.
Choose the first option, and then the second one.
We are not receiving any encrypted files in this scenario. If we were, we would need to decrypt them first.
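If encrypted files ever do arrive, decryption can also be handled outside OIC. A minimal sketch using the python-gnupg package, assuming a PGP-encrypted job.zip.pgp and a private key already imported into the local keyring:

```python
# Minimal PGP decryption sketch using python-gnupg (assumes the private key
# is already imported into the local GnuPG keyring).
import gnupg

gpg = gnupg.GPG()
with open("job.zip.pgp", "rb") as encrypted:
    result = gpg.decrypt_file(encrypted, output="job.zip", passphrase="********")
print("Decrypted OK" if result.ok else f"Failed: {result.status}")
```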
In the mapping from OIC to HCM Cloud:
File Reference is mandatory when dealing with bulk data.
File Name is also mandatory.
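Behind the adapter's Import Bulk Data operation sits Oracle's HCMDataLoader SOAP service; the adapter handles the file transfer and this call for you. Purely as a rough illustration, a direct invocation with the zeep library might look like the sketch below. The pod URL and content ID are hypothetical, and the content ID is assumed to reference a job.zip already uploaded to WebCenter Content (UCM):

```python
# Rough sketch: submit an HDL import directly via the HCMDataLoader SOAP service.
# In the OIC flow, the HCM Cloud adapter makes this call for you.
from requests import Session
from requests.auth import HTTPBasicAuth
from zeep import Client
from zeep.transports import Transport

HOST = "https://myfusionpod.fa.ocs.oraclecloud.com"   # placeholder pod URL
WSDL = f"{HOST}/hcmService/HCMDataLoader?WSDL"

session = Session()
session.auth = HTTPBasicAuth("hcm_integration_user", "********")  # HCM user, per the connection setup
client = Client(WSDL, transport=Transport(session=session))

# content_id is assumed to reference job.zip already uploaded to UCM.
content_id = "UCMFA00012345"   # hypothetical
# Arguments are (ContentId, Parameters) per the HDL web service documentation.
process_id = client.service.importAndLoadData(content_id, "")
print(f"Submitted HDL job, process id: {process_id}")
```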
We have submitted the job and now need to know its status.
Since the process takes some time to run, let us wait: choose the Wait option under Actions in the integration.
Click + and add the Wait action, set to 2 minutes.
Click + and add the HCM Cloud connection again, this time to get the status of the HDL job.
Previously we submitted the data; now choose the operation that queries the status of the job.
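The Wait and Get Status pair can likewise be sketched as a polling loop. This assumes the same HCMDataLoader service and content ID from the submission sketch above; the completion-string check is a heuristic, not the adapter's actual logic:

```python
# Poll the HDL job status, mirroring the Wait + query-the-status steps.
import time

from requests import Session
from requests.auth import HTTPBasicAuth
from zeep import Client
from zeep.transports import Transport

HOST = "https://myfusionpod.fa.ocs.oraclecloud.com"   # placeholder pod URL
session = Session()
session.auth = HTTPBasicAuth("hcm_integration_user", "********")
client = Client(f"{HOST}/hcmService/HCMDataLoader?WSDL",
                transport=Transport(session=session))

content_id = "UCMFA00012345"   # hypothetical, from the submission step
while True:
    time.sleep(120)  # Wait action: 2 minutes between polls
    # getDataSetStatus takes a filter string identifying the data set.
    status = str(client.service.getDataSetStatus(f"ContentId={content_id}"))
    print(status)  # Logger: record the status for auditing
    if "COMPLETED" in status or "ERROR" in status:  # heuristic terminal check
        break
```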
Click on the mapper.
Click + and add a Logger.
The Logger helps report whether the run succeeded or failed.
Choose Logger; under Status, we can see the job status.
To recap, the integration performs the following steps:
List Files: Identify all available files in the remote FTP directory.
Looping Mechanism: Use a For-Each action to process multiple files in a single run.
Download to Staging: Transfer files from the FTP to the OIC temporary directory.
Submit HDL Job: Trigger the Bulk Data Import in HCM Cloud to process the .dat files.
Process Monitoring: Implement a Wait action followed by a Get Status call to monitor the import progress.
Result Logging: Log the final outcome (Success/Failure) for auditing and troubleshooting.
Now, click the (i) symbol at the top and enable the Business Identifiers.
Click Save to apply the Business Identifiers.
Activate Integration
Run the integration
Navigating to HDL in the Fusion Application
Go to My Client Groups --> Data Exchange --> HDL.
In Setup and Maintenance, search for the Manage Common Lookups task.
There we can find the source system owner values (typically defined under the lookup type HRC_LOADER_HELPER_INTEGRATION_OWNER).
Update the correct source system owner and then run the integration again.
Note that whenever you make changes to an integration, you need to deactivate it and then reactivate it.
Remember to compress the .dat file into a .zip archive and place it in the remote HDL directory (A2CF77/HDL).
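Creating the zip can also be scripted. A small sketch with Python's standard zipfile module, assuming the HDL file is named Job.dat:

```python
# Compress the HDL .dat file into job.zip before placing it on the FTP server.
import zipfile

with zipfile.ZipFile("job.zip", "w", compression=zipfile.ZIP_DEFLATED) as zf:
    zf.write("Job.dat")  # the .dat file name must match the HDL business object file
```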