
Jun 27, 2025

Powering Scalable Outbound Item Integrations from Oracle PDH to Oracle ATP Using OIC

In today's dynamic cloud-driven landscape, integrating core product data between Oracle SaaS and autonomous databases is essential for building scalable enterprise data hubs. One of our recent success stories involved designing and implementing a robust, scalable outbound integration from Oracle Product Hub Cloud (PDH) to Oracle Autonomous Transaction Processing (ATP) using Oracle Integration Cloud (OIC).

The Business Challenge

Our customer needed to extract and synchronize high-volume item master data (Item details, Item Relationships, Item Categories, and Item Structures) from Oracle PDH to a custom application layer built on Oracle ATP.

Given the volume and hierarchical nature of item data, a direct real-time query approach was not optimal. Instead, we architected a high-throughput batch integration using Oracle’s BIP reports.

The Solution: OIC at the Core

We leveraged the strength and flexibility of Oracle Integration Cloud (OIC) to orchestrate the integration in a scheduled fashion, ensuring accuracy, performance, and resilience.

Step 1: Data Extraction via Oracle BIP Report

We designed a single, unified BIP (BI Publisher) Report in Oracle Fusion SaaS to extract:

  • Item Master details
  • Item Category Assignments
  • Item Relationships
  • Item Structures

This BIP report generates a large CSV output file, which becomes the input to our OIC integration.

OIC Scheduled Integration – Designed for High Volume

We created a Scheduled Orchestration Integration in OIC with the following core components:

Invoke BIP Report

  • Dynamically triggers the BIP report and retrieves the report output URL.
  • Supports BIP authentication and payload control.
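
For illustration, here is a minimal Python sketch of this invocation outside OIC, calling BI Publisher's ExternalReportWSSService over SOAP. The host, credentials, and report path are hypothetical placeholders; in the actual integration, OIC's SOAP adapter performs this call.

```python
import base64
import requests

# Hypothetical values -- replace with your Fusion pod host, credentials,
# and the catalog path of your BIP report.
FUSION_HOST = "https://your-pod.fa.ocs.oraclecloud.com"
REPORT_PATH = "/Custom/PDH/ItemExtract.xdo"
USER, PASSWORD = "integration_user", "secret"

# runReport request for the ExternalReportWSSService (WS-Security username token).
SOAP_BODY = f"""<soapenv:Envelope
    xmlns:soapenv="http://schemas.xmlsoap.org/soap/envelope/"
    xmlns:pub="http://xmlns.oracle.com/oxp/service/PublicReportService">
  <soapenv:Header>
    <wsse:Security xmlns:wsse="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-wssecurity-secext-1.0.xsd">
      <wsse:UsernameToken>
        <wsse:Username>{USER}</wsse:Username>
        <wsse:Password Type="http://docs.oasis-open.org/wss/2004/01/oasis-200401-wss-username-token-profile-1.0#PasswordText">{PASSWORD}</wsse:Password>
      </wsse:UsernameToken>
    </wsse:Security>
  </soapenv:Header>
  <soapenv:Body>
    <pub:runReport>
      <pub:reportRequest>
        <pub:attributeFormat>csv</pub:attributeFormat>
        <pub:reportAbsolutePath>{REPORT_PATH}</pub:reportAbsolutePath>
        <pub:sizeOfDataChunkDownload>-1</pub:sizeOfDataChunkDownload>
      </pub:reportRequest>
    </pub:runReport>
  </soapenv:Body>
</soapenv:Envelope>"""

resp = requests.post(
    f"{FUSION_HOST}/xmlpserver/services/ExternalReportWSSService",
    data=SOAP_BODY,
    headers={"Content-Type": "text/xml; charset=utf-8", "SOAPAction": "runReport"},
    timeout=300,
)
resp.raise_for_status()

# The CSV payload comes back base64-encoded in the <reportBytes> element.
start = resp.text.find("<reportBytes>") + len("<reportBytes>")
end = resp.text.find("</reportBytes>")
with open("item_extract.csv", "wb") as f:
    f.write(base64.b64decode(resp.text[start:end]))
```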

Read File in Segments – Using ‘Stage File’ + Looping Logic

Given the large size of the BIP output file, we avoided loading it fully into memory.

OIC’s “Stage File” action lets us:

  • Read files in segments or chunks
  • Process them in a looped fashion, ensuring scalability and fault tolerance

This avoids the timeouts and memory constraints typical of large-file processing.
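
Purely as an analogy to how the Stage File action consumes the file, the Python sketch below reads the CSV in fixed-size segments so that only one segment is ever in memory; the file name and segment size are illustrative.

```python
import csv

BATCH_SIZE = 500  # rows per segment; Stage File uses a similar fixed segment size

def read_in_segments(path, batch_size=BATCH_SIZE):
    """Yield lists of rows so that only one segment is in memory at a time."""
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)
        batch = []
        for row in reader:
            batch.append(row)
            if len(batch) == batch_size:
                yield batch
                batch = []
        if batch:  # flush the final partial segment
            yield batch

for segment in read_in_segments("item_extract.csv"):
    # transform + upsert each segment here (see the following steps)
    print(f"processing {len(segment)} rows")
```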

Parse & Transform

  • Using XSLT mappings and switch logic, we parsed each record segment into normalized item rows.
  • This included lookups, cleansing, and key mapping as needed.
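
In OIC these mappings live in XSLT; purely as an analogy, the same normalization logic could look like the Python sketch below, where the lookup table and column names are hypothetical.

```python
# Hypothetical lookup: source UOM codes -> target codes kept in ATP.
UOM_LOOKUP = {"Ea": "EACH", "Bx": "BOX"}

def normalize(raw):
    """Map one raw CSV record to a normalized item row (illustrative columns)."""
    return {
        "item_number": raw["ITEM_NUMBER"].strip().upper(),              # cleansing
        "organization": raw["ORG_CODE"],
        "uom": UOM_LOOKUP.get(raw["PRIMARY_UOM"], raw["PRIMARY_UOM"]),  # lookup
        "category": raw.get("CATEGORY_NAME") or "UNASSIGNED",           # defaulting
    }
```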

Insert to ATP – Using OIC Database Adapter

We leveraged OIC’s Oracle Database Adapter to perform conditional upserts based on item identifiers.
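
The adapter ultimately issues set-based DML against ATP. Here is a minimal sketch of the equivalent upsert using the python-oracledb driver; the connection details, table, and columns are hypothetical.

```python
import oracledb

# Hypothetical ATP connection; in practice a wallet or TLS connect string is required.
conn = oracledb.connect(user="item_hub", password="secret", dsn="myatp_tp")

MERGE_SQL = """
MERGE INTO items t
USING (SELECT :item_number AS item_number, :organization AS organization,
              :uom AS uom, :category AS category FROM dual) s
ON (t.item_number = s.item_number AND t.organization = s.organization)
WHEN MATCHED THEN UPDATE SET t.uom = s.uom, t.category = s.category
WHEN NOT MATCHED THEN INSERT (item_number, organization, uom, category)
VALUES (s.item_number, s.organization, s.uom, s.category)
"""

def upsert_segment(rows):
    """Conditionally upsert one segment of normalized rows keyed by item identifiers."""
    with conn.cursor() as cur:
        cur.executemany(MERGE_SQL, rows)  # rows: list of dicts matching the bind names
    conn.commit()  # commit per segment so a restart can resume from the next one
```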

Key OIC Capabilities Demonstrated

  • Stage File: Process the large BIP output file in manageable segments
  • Read File in Segments: Controlled looping to ensure high-volume processing without timeouts
  • Database Adapter: Seamless connectivity to Oracle ATP with insert/upsert support
  • Scheduled Integrations: Automate data sync based on business timing requirements
  • SOAP/REST Support: Trigger and fetch reports securely from Oracle SaaS

Outcome

  • Improved performance by chunking data intelligently
  • High fault tolerance: restarts can resume from the last successful segment
  • Secure, scalable data sync from Oracle PDH SaaS to Oracle ATP
  • Easy to extend and monitor with built-in logging and error handling


Seamless Sales Order Integration: Oracle APEX + Oracle Fusion Order Management with OIC

In today’s digital enterprise landscape, seamless integration across cloud-native platforms is the key to operational efficiency and enhanced user experience. This blog outlines a real-world integration scenario where Oracle APEX is integrated with Oracle Fusion Cloud Order Management using Oracle Integration Cloud (OIC). This integration empowers users to create, manage, and synchronize sales orders in real time across platforms.

Use Case Overview

Business Need
A mid-size enterprise wanted to provide their sales team with a simplified and user-friendly front-end interface for creating and managing sales orders—without sacrificing the robustness and compliance of Oracle Fusion Order Management. To address this, a custom UI was built in Oracle APEX (running on Oracle ATP), and Oracle Integration Cloud was used to orchestrate the interaction between APEX and Fusion.

Key Features of the Solution

  • Custom Sales Order UI in Oracle APEX (powered by Oracle ATP)

  • Real-Time Order Creation using Oracle Fusion REST APIs via OIC

  • Two-Way Synchronization of order changes through OIC event subscriptions

  • Support for Order Line Additions, Cancellations, and Modifications directly from APEX

  • Event-driven updates from Fusion Order Management to keep APEX in sync

Technical Architecture

Components Involved:

  • Oracle APEX (on ATP):

    • Custom UI to create and manage sales orders, backed by custom database tables

    • Sales data is persisted in Oracle Autonomous Transaction Processing (ATP)

  • Oracle Integration Cloud (OIC):

    • Receives and transforms data from APEX

    • Calls Fusion Order Management REST APIs

    • Subscribes to Order Change Events and syncs data with APEX

  • Oracle Fusion Order Management:

    • Authoritative source for all Sales Order processing and status updates

End-to-End Flow

  1. Order Creation via APEX UI

    • User enters customer, item, and order details in the APEX UI.

    • On clicking "Submit", an HTTPS POST request is sent to an OIC endpoint.

  2. Order Processing in OIC

    • OIC parses and validates the payload.

    • Invokes Oracle Fusion REST API to create the Sales Order.

    • Response is returned to APEX, and ATP is updated with status.

  3. Real-Time Event Sync

    • OIC subscribes to Oracle Fusion's Order Change Events.

    • On event trigger, OIC pushes the update back to the ATP tables to keep the UI current.

  4. Order Maintenance from APEX

    • Users can add lines, cancel orders, or modify lines.

    • Each action triggers a call to OIC, which updates Fusion accordingly. A minimal sketch of the order-creation call from step 2 follows below.
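
For illustration, here is a Python sketch of the core step OIC performs in step 2: posting a new order to Fusion Order Management's salesOrdersForOrderHub REST resource. The host, credentials, and payload values are hypothetical, and real orders carry many more attributes.

```python
import requests

FUSION_HOST = "https://your-pod.fa.ocs.oraclecloud.com"
AUTH = ("integration_user", "secret")  # hypothetical credentials

# Minimal illustrative payload keyed by the source (APEX) transaction identifiers.
order = {
    "SourceTransactionNumber": "APEX-1001",
    "SourceTransactionSystem": "APEX_ORDERS",  # hypothetical source system name
    "SourceTransactionId": "APEX-1001",
    "BusinessUnitName": "US Business Unit",
    "BuyingPartyNumber": "10001",
    "TransactionTypeCode": "STD",
    "lines": [{
        "SourceTransactionLineNumber": "1",
        "SourceTransactionLineId": "1",
        "SourceTransactionScheduleId": "1",
        "SourceScheduleNumber": "1",
        "ProductNumber": "AS54888",
        "OrderedQuantity": 2,
        "OrderedUOMCode": "Ea",
    }],
}

resp = requests.post(
    f"{FUSION_HOST}/fscmRestApi/resources/11.13.18.05/salesOrdersForOrderHub",
    json=order,
    auth=AUTH,
    timeout=120,
)
resp.raise_for_status()
# The response (including the Fusion order number) is returned to APEX
# and persisted in the ATP status tables.
print("Fusion order:", resp.json().get("OrderNumber"))
```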

Highlighted OIC Capabilities

  • REST Adapter: Used to connect securely with Oracle Fusion SaaS APIs

  • Event Subscription Framework: Enables OIC to react to Fusion-side changes

  • ATP Integration: Writes statuses and responses from OIC to custom tables in ATP

  • Rich Mapping Engine: Handles JSON/XML to structured data transformations

  • Robust Error Handling: Ensures resiliency and alerts for integration issues

Benefits Achieved

  • Unified Order Experience across custom APEX UI and Oracle Fusion

  • Near real-time synchronization between ATP and Fusion OM

  • Minimal manual intervention, with automated error handling

  • Scalable and Secure architecture with low-code development


Jan 5, 2025

Data Integration from Oracle, SQL Server, and Excel to ADW using ODI Marketplace

 

The objective of this use case is to create a data integration pipeline using Oracle Data Integrator (ODI) Marketplace that extracts data from Oracle Database, SQL Server, and Microsoft Excel, and loads it into Autonomous Data Warehouse (ADW). The solution performs a full load initially and incremental loads subsequently.

Architecture Overview

Source Systems:

  1. Oracle Database
  2. SQL Server
  3. Microsoft Excel file (hosted on a shared file system)

Target System:

  • Autonomous Data Warehouse (ADW)

Tools:

  • ODI Marketplace

Implementation Steps

Step 1: Set Up the ODI Marketplace Environment

  1. Create physical and logical schemas for the source and target systems in ODI.
  2. Configure agents and ensure connectivity to Oracle, SQL Server, and Excel sources, as well as the ADW target.

Step 2: Create Models and Datastores

  1. Create Models for each source (Oracle, SQL Server, Excel) and the target (ADW).
  2. Reverse Engineer the tables and data structures from the source systems into ODI.
  3. Create datastores representing the source and target tables.

Step 3: Design Interfaces (Mappings) in ODI

Full Load Process

  1. Oracle to ADW:

    • LKM Oracle to Oracle (DBLINK)
    • IKM Oracle Incremental Update
    • CKM Oracle
  2. SQL Server to ADW:

    • LKM SQL Server to Oracle (JDBC)
    • IKM Oracle Incremental Update
    • CKM Oracle
  3. Excel to ADW:

    • LKM File to SQL
    • IKM Oracle Incremental Update
    • CKM Oracle

Step 4: Full Load Execution

  1. Run the mappings for all sources to load data into ADW.
  2. Validate the data in ADW to ensure successful full load.



Step 5: Incremental Load Process

Oracle to ADW

  1. Use Last Update Timestamp to capture incremental changes.
  2. Modify the mapping to use IKM Oracle Incremental Update.
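
The timestamp-based pattern reduces to a filtered extract like the Python sketch below; the table and column names are illustrative, and ODI generates the equivalent SQL from the mapping.

```python
import oracledb

# Hypothetical source connection.
conn = oracledb.connect(user="src_user", password="secret", dsn="src_db")

def extract_changes(last_run_ts):
    """Pull only rows changed since the previous successful run."""
    with conn.cursor() as cur:
        cur.execute(
            "SELECT order_id, status, last_update_date FROM orders "
            "WHERE last_update_date > :last_run",
            last_run=last_run_ts,
        )
        return cur.fetchall()

# After a successful load, persist MAX(last_update_date) as the next run's watermark.
```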

SQL Server to ADW

  1. Implement CDC using SQL Server's built-in CDC functionality.
  2. Use LKM SQL Server to Oracle (JDBC) and IKM Oracle Incremental Update.
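
As a sketch of the CDC side (shown here via pyodbc, with hypothetical database and table names), SQL Server exposes changed rows through generated table-valued functions once CDC is enabled:

```python
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};SERVER=src-host;"
    "DATABASE=sales;UID=etl_user;PWD=secret"  # hypothetical connection
)
cur = conn.cursor()

# One-time setup: enable CDC on the database and on the source table.
cur.execute("EXEC sys.sp_cdc_enable_db")
cur.execute(
    "EXEC sys.sp_cdc_enable_table @source_schema = N'dbo', "
    "@source_name = N'orders', @role_name = NULL"
)
conn.commit()

# Per run: read all changes captured between the low and high watermarks.
cur.execute("""
    DECLARE @from_lsn binary(10) = sys.fn_cdc_get_min_lsn('dbo_orders'),
            @to_lsn   binary(10) = sys.fn_cdc_get_max_lsn();
    SELECT * FROM cdc.fn_cdc_get_all_changes_dbo_orders(@from_lsn, @to_lsn, N'all');
""")
changes = cur.fetchall()
```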

Excel to ADW

  1. Track changes using versioning or timestamps in the Excel file.
  2. Use LKM File to SQL and IKM Oracle Incremental Update for loading.
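
For the Excel source, a timestamp column in the sheet can drive the same watermark pattern; here is a pandas sketch with a hypothetical LAST_MODIFIED column:

```python
import pandas as pd

def read_excel_changes(path, last_run_ts):
    """Return only the sheet rows stamped after the previous successful run."""
    df = pd.read_excel(path, parse_dates=["LAST_MODIFIED"])  # hypothetical column
    return df[df["LAST_MODIFIED"] > last_run_ts]

changes = read_excel_changes("items.xlsx", pd.Timestamp("2025-01-01"))
```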

Step 6: Schedule Jobs

  • Use ODI Scheduler to automate the full and incremental load processes.
  • Set up email notifications for job success/failure.

Dec 2, 2024

ETL vs. E-LT Architectures


Understanding the ETL Bottleneck

The traditional ETL (Extract, Transform, Load) architecture often faces performance challenges due to the following factors:

  1. Compute-Intensive Transformations: The ETL engine, a specialized tool, performs data transformations row-by-row. This can be inefficient, especially for large datasets.
  2. Network Bottlenecks: Data is moved multiple times over the network, increasing latency and potential for errors.
  3. Referential Integrity Checks: These checks can be resource-intensive, especially when data needs to be fetched from the target database for comparison.

The E-LT Paradigm: A Shift in Approach

E-LT (Extract, Load, Transform) is a newer architectural approach that addresses the limitations of ETL by shifting the transformation step to the target database. Here's how E-LT works:

  1. Extract: Data is extracted from source systems.
  2. Load: The extracted data is loaded into the target database.
  3. Transform: Data transformations are performed using native SQL or other target database-specific languages.
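
The difference is visible in code: after a bulk load into a staging table, the transformation is a single set-based SQL statement executed inside the target database, rather than row-by-row logic in a middle-tier engine. Here is a minimal sketch against an Oracle target, where the staging and target table names are illustrative:

```python
import oracledb

# Hypothetical target warehouse connection.
conn = oracledb.connect(user="dw_user", password="secret", dsn="target_dw")

with conn.cursor() as cur:
    # The transform runs where the data already lives: one set-based
    # statement, with no per-row round trips over the network.
    cur.execute("""
        INSERT INTO sales_fact (order_id, customer_key, amount_usd)
        SELECT s.order_id,
               d.customer_key,                -- surrogate-key lookup via join
               s.amount * s.fx_rate           -- transformation in native SQL
        FROM   stg_sales s
        JOIN   dim_customer d ON d.customer_id = s.customer_id
    """)
conn.commit()
```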

Key Advantages of E-LT:

  • Improved Performance:
    • By leveraging the native processing capabilities of the target database, E-LT can significantly improve performance, especially for complex transformations.
    • Reduced network traffic as data is moved only once.
  • Enhanced Scalability:
    • The target database can handle large datasets and complex transformations more efficiently.
  • Leveraging Existing Skills:
    • Database administrators and SQL developers can directly work on data transformations, reducing the need for specialized ETL tools and expertise.
  • Flexibility:
    • Greater flexibility in customizing transformations and optimizing performance.
  • Reduced Tool Dependency:
    • Less reliance on proprietary ETL tools, potentially lowering licensing costs.

However, it's important to note that E-LT is not a one-size-fits-all solution. It's best suited for scenarios where:

  • The target database has powerful processing capabilities.
  • The transformations are relatively simple or can be efficiently implemented using SQL.
  • Data quality checks can be performed after the load, or by the source system itself.

For complex transformations and data quality requirements, a hybrid approach combining ETL and E-LT might be more suitable.