In modern enterprise resource planning, data is often siloed across different cloud modules. To achieve a truly unified view of your business, you must move beyond simple point-to-point integrations. This use case explores a robust architecture that uses Oracle BICC for Fusion SaaS, out-of-the-box (OOB) extracts for WMS Cloud, and Oracle Integration Cloud (OIC) as the conductor to stage and load data into Oracle ADW.
1. The Strategy: Custom Orchestration vs. Oracle FDI
A common question is: "Why not use Oracle Fusion Data Intelligence (FDI)?" While FDI is an excellent "out-of-the-box" tool for standard Fusion KPIs, this custom architecture is preferred when:
- Deep Integration is Required: You need to tightly join non-Fusion data (like WMS Cloud) with SCM records.
- Proprietary Logic: You have specialized business rules or packaging tool data that standard FDI schemas don't support.
- Cost Efficiency: You want a consumption-based model using existing OIC and ADW licenses.
2. Phase 1: High-Volume Extraction
Oracle BICC schedules bulk extracts of Fusion SaaS data, producing compressed CSV files together with a manifest describing each batch. In parallel, the WMS Cloud OOB extracts deliver .csv files to an SFTP server. Both feeds are built for volume; no transformation happens at this stage.
3. Phase 2: The Orchestration Layer (OIC)
Oracle Integration Cloud (OIC) acts as the "Conductor," not a "Data Mover." Its role is restricted to two critical tasks:
1. Centralizing the Data Lake: OIC picks up the .csv files from the WMS SFTP server and moves them to the OCI Object Storage bucket. This ensures all enterprise data is staged in a single high-performance OCI location.
2. Parsing the BICC Manifest: BICC generates a Manifest.mf file listing all data ZIPs in a batch. OIC reads this manifest, identifies the specific data files, and maps them to the correct target tables in ADW.
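One way to keep the manifest-to-table mapping out of hard-coded OIC logic is a small lookup table in ADW that OIC queries for each manifest entry. The table and the sample View Object name below are illustrative, not part of the original design:

```sql
-- Hypothetical mapping table: OIC resolves each BICC data file from the
-- manifest to its target ADW staging table via a lookup here.
CREATE TABLE biz_extract_map (
  vo_name       VARCHAR2(200) PRIMARY KEY, -- BICC View Object name from the manifest
  target_table  VARCHAR2(128) NOT NULL,    -- staging table in ADW
  enabled_flag  CHAR(1) DEFAULT 'Y'        -- allows pausing a feed without code changes
);

-- Illustrative entry (VO and table names are examples only)
INSERT INTO biz_extract_map (vo_name, target_table)
VALUES ('FscmTopModelAM.InvTransactionAM.InvTransactionPVO', 'STG_INV_TRANSACTIONS');
```

New extracts can then be onboarded with a single row insert rather than a change to the integration flow.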
4. Phase 3: High-Speed Ingestion via DBMS_CLOUD
The core technical upgrade is moving away from row-by-row processing. Instead of OIC "pushing" data, ADW "pulls" it: OIC triggers a PL/SQL API in ADW that utilizes the DBMS_CLOUD.COPY_DATA procedure.
- Performance: It uses ADW’s parallel processing to ingest millions of rows in minutes.
- Native Compression: ADW natively unzips BICC files directly from Object Storage, eliminating the need for OIC to handle large file sizes.
- Security: It uses secure OCI credential objects, leveraging the encrypted OCI backbone for data transfers.
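The ADW-side call can be sketched as follows. Table, credential, bucket, and file names are illustrative; the sketch assumes gzip-compressed CSV extracts and a credential already created with DBMS_CLOUD.CREATE_CREDENTIAL:

```sql
-- Minimal sketch of the load OIC triggers in ADW (all names are examples).
BEGIN
  DBMS_CLOUD.COPY_DATA(
    table_name      => 'STG_INV_TRANSACTIONS',   -- pre-created staging table
    credential_name => 'OCI_OBJ_STORE_CRED',     -- OCI credential object in ADW
    file_uri_list   => 'https://objectstorage.us-ashburn-1.oraclecloud.com'
                       || '/n/mytenancy/b/landing/o/batch_01/*.csv.gz',
    format          => JSON_OBJECT(
                         'type'        VALUE 'csv',
                         'skipheaders' VALUE 1,      -- ignore the header row
                         'compression' VALUE 'gzip', -- decompressed by ADW in place
                         'rejectlimit' VALUE 100)    -- tolerate a few bad rows
  );
END;
/
```

Because the data flows directly from Object Storage into ADW over the OCI backbone, OIC never touches the file payload itself.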
5. Governance: Audit, Error Handling, and Archiving
A production-grade pipeline must be "self-healing" and transparent.
- Audit Trail: Every load is logged in a BIZ_DATA_LOAD_LOG table in ADW. Functional users can verify data freshness and row counts via a simple dashboard.
- Error Handling: We implement a multi-tier approach. OIC validates file integrity and manifest logic, while DBMS_CLOUD automatically manages "Reject Tables" for any data-level failures.
- Data Archiving: To keep the landing bucket clean, OIC moves processed files to an /archive/[timestamp]/ folder with a defined retention policy (e.g., 30 days), allowing for easy reprocessing if needed.
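The audit and reject-table bullets can be wired together through USER_LOAD_OPERATIONS, the view DBMS_CLOUD maintains for each COPY operation. The BIZ_DATA_LOAD_LOG column names below are illustrative:

```sql
-- Record the outcome of the most recent COPY_DATA run in the audit table.
-- (BIZ_DATA_LOAD_LOG columns are examples; USER_LOAD_OPERATIONS is the
-- standard DBMS_CLOUD view in ADW.)
INSERT INTO biz_data_load_log (load_id, target_table, status, rows_loaded, logged_at)
SELECT id, table_name, status, rows_loaded, SYSTIMESTAMP
  FROM user_load_operations
 WHERE type = 'COPY'
   AND id = (SELECT MAX(id) FROM user_load_operations WHERE type = 'COPY');

-- Locate the reject ("bad file") table for the same run, if any rows failed.
SELECT logfile_table, badfile_table
  FROM user_load_operations
 WHERE id = (SELECT MAX(id) FROM user_load_operations WHERE type = 'COPY');
```

A freshness dashboard then only needs to query BIZ_DATA_LOAD_LOG, while support teams can drill into the named log and bad-file tables for root-cause analysis.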
By centralizing your SCM and WMS data in OCI Object Storage and leveraging the power of DBMS_CLOUD, you transition from a slow, fragile integration to a robust, high-performance data lake that scales with your business.