
Sep 10, 2023

High-Level Process Flow to Extract Data from Oracle Financial/SCM and WMS Cloud and Load into an On-Premises or Cloud Data Warehouse

There are many ways to extract data from the Oracle SaaS cloud into a local on-premises or cloud data warehouse. In this blog I have highlighted one such approach (at a very high level); for additional details, feel free to reach me at

Step #1

Data Extraction

  1. SCM Cloud
  2. Financial Cloud
For data extraction from the SCM/Financial Cloud I have configured BICC (BI Cloud Connector). BICC is very easy to configure, it is part of your SaaS subscription (no additional cost involved), and the configuration is quick:
  1. Metadata as well as sample data for each extract is available.
  2. At any point you can switch back to a full load (just in case there is any data corruption in your warehouse).
The only challenge with BICC is that we need to do additional configurations for EFF (Extensible Flexfield) extractions, and incremental load is not available for some of the extracts (as of now).
For delivery of the extracts, I have configured a UCM server with File Age = 3 days.

On completion, the BICC extracts will create 2 files in the UCM server:

  1. Manifest file (.mf)
  2. Data files in .zip format
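Before anything downstream runs, each extract's data .zip needs to be matched with its .mf manifest in the UCM listing. A minimal sketch of that pairing logic in Python (the file names below are illustrative, not the exact BICC naming convention):

```python
from collections import defaultdict
from os.path import splitext

def pair_extract_files(filenames):
    """Group UCM file names so each extract's .zip is paired with its .mf manifest.

    Returns {extract_name: {".zip": ..., ".mf": ...}} for complete pairs only,
    so an orphan data file without a manifest is never processed.
    """
    groups = defaultdict(dict)
    for name in filenames:
        base, ext = splitext(name)
        if ext in (".zip", ".mf"):
            groups[base][ext] = name
    return {base: files for base, files in groups.items()
            if ".zip" in files and ".mf" in files}

# Example with hypothetical UCM file names:
listing = ["finextractam_apbiccextractam-batch100.zip",
           "finextractam_apbiccextractam-batch100.mf",
           "orphan-batch101.zip"]
pairs = pair_extract_files(listing)
# The orphan .zip without a manifest is skipped
```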
For WMS Cloud
For the WMS Cloud, I have configured Data Extracts (as part of Job Schedules) and configured an SFTP server as the destination for the data files.

On completion, the WMS data extracts create .csv files in the SFTP server.
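Each scheduled pull from the SFTP server only needs the .csv files that arrived since the last run. Here is a sketch of that selection logic in Python; the connection helper assumes the third-party paramiko package and uses placeholder host/credential/path values, not the actual WMS configuration:

```python
import posixpath

def select_new_csvs(entries, last_run_epoch):
    """Pick .csv files modified after the last successful run.

    `entries` is a list of (filename, mtime_epoch) tuples, e.g. built
    from paramiko's SFTPClient.listdir_attr() results.
    """
    return sorted(name for name, mtime in entries
                  if name.lower().endswith(".csv") and mtime > last_run_epoch)

def download_new_wms_files(host, username, password, remote_dir, last_run_epoch):
    """Connect to the SFTP server and download the new .csv extracts.

    Requires the third-party `paramiko` package; all connection details
    are placeholders for your own WMS SFTP configuration.
    """
    import paramiko
    transport = paramiko.Transport((host, 22))
    transport.connect(username=username, password=password)
    sftp = paramiko.SFTPClient.from_transport(transport)
    try:
        entries = [(e.filename, e.st_mtime) for e in sftp.listdir_attr(remote_dir)]
        for name in select_new_csvs(entries, last_run_epoch):
            sftp.get(posixpath.join(remote_dir, name), name)  # save locally
    finally:
        transport.close()
```

Tracking `last_run_epoch` (for example, in a small control table in the warehouse) is what makes the pull incremental rather than re-downloading every file.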

Data File Consumption

On successful completion, the BICC extracts create files in the UCM server. Once the BICC extracts are completed, a scheduled OIC job runs; on execution this OIC job will:
  1. Search the UCM server for the latest extract files
  2. Download all the .zip files listed there from the UCM server
  3. Unzip the .zip files (to extract the .csv files) onto the SFTP server
  4. Archive the .zip and .mf files
  5. No such step is required for the WMS extracts, as WMS always sends .csv files directly to the SFTP server
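The unzip-then-archive steps above can be sketched in a few lines of Python using only the standard library; the directory names are assumptions for illustration, and in the actual flow this work is done inside the OIC integration:

```python
import shutil
import zipfile
from pathlib import Path

def unzip_and_archive(zip_path, extract_dir, archive_dir):
    """Extract the .csv files from a BICC .zip, then move the .zip to an archive folder."""
    zip_path = Path(zip_path)
    extract_dir = Path(extract_dir)
    archive_dir = Path(archive_dir)
    extract_dir.mkdir(parents=True, exist_ok=True)
    archive_dir.mkdir(parents=True, exist_ok=True)
    with zipfile.ZipFile(zip_path) as zf:
        zf.extractall(extract_dir)                 # the .csv data files land here
    shutil.move(str(zip_path), str(archive_dir / zip_path.name))
```

Archiving (rather than deleting) the .zip and .mf files keeps a fallback copy in case a load has to be replayed.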
Data File Load
This is the last step. As part of this step I have multiple OIC jobs (one for each module: DOO/AP/AR/WMS/PO/Product, etc.).
These are scheduled jobs; on execution, each job picks up its respective files and loads the data into the data warehouse. For example, the OIC job for DOO will:
  1. Pick the Order Header/Order Line/Fulfillment Line/EFF files from the SFTP server.
  2. Read these files.
  3. Load the data into the specific DOO tables created in the data warehouse.
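The read-and-load steps can be sketched with Python's csv module and a DB-API connection; sqlite3 stands in here for the actual warehouse, and the table and column names are hypothetical:

```python
import csv
import sqlite3

def load_csv(conn, table, csv_path):
    """Bulk-load one extract file into its warehouse table via executemany."""
    with open(csv_path, newline="") as f:
        reader = csv.reader(f)
        header = next(reader)                      # first row = column names
        placeholders = ", ".join("?" * len(header))
        sql = f"INSERT INTO {table} ({', '.join(header)}) VALUES ({placeholders})"
        conn.executemany(sql, reader)              # remaining rows = data
    conn.commit()

# Example with sqlite3 standing in for the warehouse (hypothetical DOO table):
import os, tempfile
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE doo_order_header (order_id TEXT, status TEXT)")
path = os.path.join(tempfile.mkdtemp(), "order_header.csv")
with open(path, "w", newline="") as f:
    f.write("order_id,status\n1001,OPEN\n1002,CLOSED\n")
load_csv(conn, "doo_order_header", path)
```

A real warehouse load would also handle truncate-and-reload for full extracts versus merge/upsert for incremental ones, which is why each module gets its own OIC job.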
To handle errors at various levels, I have designed and developed an error-handling mechanism at the OIC level.
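OIC fault handlers are configured inside the integration itself, but the general idea (retry transient failures, log each attempt, and re-raise so the scheduler can flag the run) can be sketched generically in Python:

```python
import logging
import time

log = logging.getLogger("dw_loader")

def with_retry(step, attempts=3, delay_seconds=0):
    """Run one pipeline step, retrying transient failures before giving up.

    `step` is any zero-argument callable; the final exception is re-raised
    so the scheduler can mark the job as failed and alert someone.
    """
    for attempt in range(1, attempts + 1):
        try:
            return step()
        except Exception as exc:
            log.warning("attempt %d/%d failed: %s", attempt, attempts, exc)
            if attempt == attempts:
                raise
            time.sleep(delay_seconds)
```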
