
Aug 12, 2025

Seamless Integration between APEX Order Entry and Oracle Fusion Order Management Using Oracle Integration Cloud

 Overview

In modern enterprise order processing, organizations often require a hybrid architecture that combines custom-built applications with enterprise SaaS solutions. In this project, we implemented an end-to-end integration between an APEX-based custom Order Entry application and Oracle Fusion Order Management (OM) Cloud using Oracle Integration Cloud (OIC).

The integration enables users to perform various order management actions—such as creating new orders, adding lines, deleting orders or lines, updating order details, and applying or releasing holds—directly from the APEX interface. OIC validates, processes, and synchronizes these transactions with Oracle Fusion OM, ensuring real-time visibility and seamless business operations.






Key Business Requirements

  1. Real-Time Order Processing: Allow APEX users to trigger order-related actions in Fusion OM without manual re-entry.
  2. Validation Before Processing: Run basic business validations in OIC before sending requests to Fusion OM.
  3. Bidirectional Synchronization: Update APEX application whenever relevant events occur in Fusion OM.
  4. Error Handling & Feedback Loop: Return success or detailed error messages to APEX for user visibility.
  5. Event-Driven Updates: Subscribe to Fusion OM events and push updates to the APEX application automatically.

High-Level Process Flow

The integration is built around two-way communication between APEX, OIC, and Fusion OM.



1. User Action in APEX

From the APEX UI, a user can perform:

  • Create New Order
  • Add Lines to Existing Order
  • Update Order/Order Lines
  • Delete Order/Order Lines
  • Apply Hold
  • Release Hold

Once the user saves their changes, APEX triggers an OIC job, passing:

  • Order Number or Order Line reference
  • Action to be performed
  • Required order details
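The trigger call from APEX can be pictured as a small payload builder. A minimal sketch in Python, assuming illustrative field names rather than the actual OIC trigger contract:

```python
# Build the request payload APEX sends when triggering the OIC job.
# Field names here are illustrative; the real contract is whatever the
# OIC integration's REST trigger defines.
def build_oic_request(order_number, action, order_details):
    valid_actions = {
        "CREATE_ORDER", "ADD_LINES", "UPDATE_ORDER",
        "DELETE_ORDER", "APPLY_HOLD", "RELEASE_HOLD",
    }
    if action not in valid_actions:
        raise ValueError(f"Unsupported action: {action}")
    return {
        "orderNumber": order_number,     # order or order-line reference
        "action": action,                # action to be performed
        "orderDetails": order_details,   # required order details
    }
```

Keeping the action list explicit on the APEX side means an unsupported request fails fast, before OIC is ever invoked.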

2. OIC Request Processing

When OIC receives the request:

  1. Read Order/Line Data from the APEX tables via the Oracle ATP Adapter.
  2. Run Custom Validations:
    • Ensure mandatory attributes are present.
    • Check business rules (e.g., pricing, product availability).
  3. Validation Outcome:
    • Fail → Update APEX request status to E (Error) with error messages.
    • Pass → Update APEX request status to S (Submitted) and proceed.
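The validation step above can be sketched as a function that returns the request status together with any error messages. The attribute names and rules are examples only, not the real rule set:

```python
# Minimal sketch of the OIC validation step: check mandatory attributes
# and simple business rules, then decide the request status.
# 'S' = Submitted (proceed to Fusion OM), 'E' = Error (return to APEX).
MANDATORY = ("customer_id", "item_number", "quantity", "unit_price")

def validate_order(order):
    errors = []
    for attr in MANDATORY:
        if not order.get(attr):
            errors.append(f"Missing mandatory attribute: {attr}")
    if order.get("quantity") is not None and order["quantity"] <= 0:
        errors.append("Quantity must be positive")
    if order.get("unit_price") is not None and order["unit_price"] < 0:
        errors.append("Unit price cannot be negative")
    status = "E" if errors else "S"
    return status, errors
```

Returning the full list of errors, rather than stopping at the first one, lets APEX show the user everything that needs fixing in a single round trip.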

3. Interaction with Oracle Fusion OM

OIC invokes Oracle Fusion OM REST APIs such as salesOrdersForOrderHub to:

  • Create new sales orders.
  • Add or update lines.
  • Apply or release holds.
  • Cancel orders or lines.

If the API call succeeds:

  • Return Status = S to APEX with the Fusion Order Number.

If it fails:

  • Return Status = E with detailed error information.
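The order-create payload that OIC maps to the Fusion OM REST API can be sketched as below. The attribute names follow the general shape of the salesOrdersForOrderHub resource but are illustrative, as is the `APEX_OE` source-system code; verify both against the Fusion Order Management REST API documentation for your release:

```python
# Sketch of the payload OIC posts to Fusion OM when creating a sales
# order. Attribute names are illustrative; check them against the
# Fusion OM REST API docs before relying on them.
def build_fusion_order_payload(source_order_number, buying_party, lines):
    return {
        "SourceTransactionNumber": source_order_number,
        "SourceTransactionId": source_order_number,
        "SourceTransactionSystem": "APEX_OE",   # assumed source system code
        "BuyingPartyNumber": buying_party,
        "lines": [
            {
                "SourceTransactionLineNumber": str(i + 1),
                "ProductNumber": line["item"],
                "OrderedQuantity": line["qty"],
                "OrderedUOMCode": line.get("uom", "Ea"),
            }
            for i, line in enumerate(lines)
        ],
    }
```

Carrying the APEX order number through as the source transaction reference is what lets the event-driven updates later tie a Fusion status change back to the originating APEX record.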

4. Event Subscriptions for Status Updates

OIC also subscribes to Fusion OM business events like:

  • Line Status Change
  • Pick Wave Creation
  • Pick Confirmation
  • Shipment Advice

When these events occur:

  1. OIC fetches the latest order status from Fusion OM.
  2. OIC updates the corresponding order in APEX.
  3. The APEX user always has real-time visibility into the order lifecycle.

Key Integration Highlights

  • Hybrid Processing: Combines synchronous request/response for order actions with asynchronous event-driven updates from SaaS.
  • Data Consistency: Ensures both APEX and Fusion OM reflect the same order status at all times.
  • Validation Layer: Prevents bad data from entering Fusion OM, reducing downstream exceptions.
  • Reusable Framework: The OIC flows are modular, making it easy to add new order actions in the future.

Benefits

  1. Faster Order Turnaround: Orders move from APEX to Fusion OM instantly.
  2. Reduced Errors: Pre-validation in OIC prevents common mistakes.
  3. Operational Efficiency: Eliminates manual re-keying of orders.
  4. Real-Time Updates: Business events keep both systems in sync.
  5. Scalable Design: Can easily accommodate future expansion of order workflows.


 

Aug 10, 2025

Building a Robust Data Integration Pipeline with Oracle Data Integrator (ODI)

 Introduction

In large-scale analytics and reporting environments, organizations often need to integrate data from multiple heterogeneous source systems into a centralized dimensional model.
This requires an efficient, traceable, and error-resilient ETL/ELT process.




The architecture shown in the diagram below illustrates how Oracle Data Integrator (ODI) can be designed to extract, stage, transform, and load data into a data warehouse while ensuring data quality, auditing, and monitoring at every step.

High-Level Data Flow


The integration follows a two-step ODI process:

Step 1: Source to Stage

  • Sources: Multiple source systems (ERP, CRM, on-premise databases, cloud applications, flat files, etc.)

  • ODI Extraction: ODI extracts data from each source system using the appropriate technology adapter (JDBC, File, REST, etc.).

  • Staging Layer: Each source system has its own dedicated staging tables to store extracted data, with a one-to-one mapping between source and staging tables:

    • Stage Table for Source System 1

    • Stage Table for Source System 2

    • Stage Table for Source System N

  • Error Handling During Stage Load:

    • E$ Tables: Capture records failing flow control or data quality rules.

    • C$ Tables: Temporary tables used during data loading.

    • I$ Tables: Temporary integration tables used during transformation.

    • Custom Audit Tables: Log the run history, row counts, and status.

    • Error Notifications: Email or alert messages sent for failed records.
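The stage-load error-handling pattern above can be illustrated in plain Python: rows failing a quality rule are diverted to an E$-style error list, clean rows go to the stage table, and an audit record captures the counts. The rule and column names are examples, not a real mapping:

```python
# Sketch of the staging-load pattern: divert bad rows, stage good rows,
# and record an audit entry, mirroring ODI's E$ and audit tables.
def load_to_stage(source_rows, quality_rule):
    stage_rows, error_rows = [], []
    for row in source_rows:
        ok, reason = quality_rule(row)
        if ok:
            stage_rows.append(row)
        else:
            error_rows.append({**row, "ERR_REASON": reason})  # E$-style row
    audit = {  # custom audit-table record: run counts and status
        "rows_read": len(source_rows),
        "rows_staged": len(stage_rows),
        "rows_errored": len(error_rows),
        "status": "WARNING" if error_rows else "SUCCESS",
    }
    return stage_rows, error_rows, audit

# Example quality rule: customer rows must carry a non-empty CUST_ID.
def cust_id_present(row):
    return (True, None) if row.get("CUST_ID") else (False, "CUST_ID is null")
```

In ODI itself, the CKM applies such rules declaratively and populates the E$ table for you; the point of the sketch is that errored rows never block the clean ones.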

Step 2: Stage to Dimensional Model

  • Transformation: ODI applies business rules, data cleansing, and mapping logic.

  • Loading into Dimensional Model: Data is loaded into the target star/snowflake schema for analytics.

  • Error Handling During Target Load:

    • Same E$, C$, and I$ tables capture any transformation or constraint violations.

    • Audit tables and notifications ensure transparency and quick resolution.

Key Benefits of This Architecture

  1. Isolation of Issues
    By staging data from each source system separately, you can reprocess failed data without impacting other sources.

  2. Improved Performance
    Bulk loading and ELT optimization reduce transformation time and database overhead.

  3. End-to-End Data Quality Checks
    Flow control and static checks prevent bad data from propagating into the warehouse.

  4. Auditability & Compliance
    Custom audit and history tables provide a full trail of load execution, satisfying compliance requirements.

  5. Proactive Monitoring
    Error notifications allow support teams to respond quickly before downstream processes are impacted.



Best Practices for Implementing This Flow

  • Use Separate Staging Schemas for each source to avoid data collisions.

  • Leverage ODI Knowledge Modules (IKM, LKM, CKM) for reusability and performance tuning.

  • Implement Incremental Loads to reduce processing time and resource usage.

  • Automate Error Handling so failed records are logged and can be reprocessed without manual intervention.

  • Maintain Comprehensive Documentation of mappings, rules, and dependencies.

This ODI architecture offers a scalable, maintainable, and transparent approach to data integration.
By combining staging layers, robust error handling, and auditing mechanisms, you can ensure that data from multiple sources flows reliably into your dimensional model, ready for analytics and reporting.

Whether you’re integrating ERP and CRM systems, consolidating regional databases, or blending on-premises and cloud data, this design pattern provides a proven foundation for enterprise data warehousing.

Streamlining Item & Structure Management: Oracle APEX + OIC + Oracle PDH Integration

Introduction

In many organizations, Product Data Hub (PDH) serves as the single source of truth for product master data. However, business users often need simplified, custom UIs to create or maintain items and their structures — without navigating the complexity of PDH’s native UI.

In this implementation, we combined Oracle APEX, Oracle Integration Cloud (OIC), and Oracle PDH REST APIs to create a seamless, reliable workflow for item and structure creation/updates.





Business Requirement

  • Custom UI in Oracle APEX for creating/updating:

    • Items

    • Item Structures

  • Real-time integration with Oracle PDH

  • Robust error handling with feedback stored in APEX

  • Support for both create and update requests

  • Granular record processing (one record at a time to isolate failures)

High-Level Architecture

Actors:

  • APEX Application → Front-end data entry

  • Oracle ATP (Autonomous Transaction Processing) → Backend DB for APEX

  • Oracle Integration Cloud (OIC) → Orchestration and API integration layer

  • Oracle PDH (Product Data Hub) → Final system of record

Architecture Flow Diagram

  1. User Action in APEX

    • User creates or updates an Item or Structure

    • Record is inserted into an APEX backend table with a status_flag = 'N' (New)

  2. OIC Trigger

    • OIC integration is invoked (either on-demand or scheduled)

    • Reads one unprocessed record at a time from the APEX table

  3. Business Logic in OIC

    • Determine request type:

      • Create New Item

      • Update Existing Item

      • Create New Structure

      • Update Existing Structure

    • Route flow accordingly using OIC orchestration

  4. Calling PDH REST APIs

    • Use PDH REST endpoints to:

      • Create/Update Item

      • Create/Update Structure

    • Map APEX table fields to PDH payload structure

  5. Response Handling

    • If success → Update status_flag = 'S' in APEX table

    • If error → Update status_flag = 'E' and store error message for user review


OIC Orchestration Details

Integration Type: Orchestration (Application-driven or Scheduled)

Key Steps:

  1. Read Next Record from APEX

    • OIC uses ATP Database Adapter to query 1 row with status_flag = 'N'

  2. Identify Action

    • Conditional branches:

      • IF Request Type = ITEM_CREATE → Call PDH Item Create API

      • IF Request Type = ITEM_UPDATE → Call PDH Item Update API

      • IF Request Type = STRUCTURE_CREATE → Call PDH Structure Create API to process the structure item as well as its component items

      • IF Request Type = STRUCTURE_UPDATE → Call PDH Structure Update API to process the structure item as well as its component items

  3. Invoke PDH REST API

    • Use REST Adapter with proper authentication

  4. Update Status

    • Success → Set status_flag = 'S' in the ATP database (APEX or custom application)

    • Failure → Set status_flag = 'E' and store error_message (APEX or custom application)

  5. Loop Until No Records Left

    • Repeat until all new records are processed
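The orchestration steps above can be sketched as a plain Python loop: read one record with status_flag = 'N' at a time, branch on request type, call the matching PDH operation, and write the outcome back. The injected functions stand in for the ATP database adapter and the REST adapter; they are assumptions for illustration, not real adapter APIs:

```python
# Sketch of the OIC orchestration loop. fetch_next, pdh_calls, and
# update_record are injected stand-ins for the ATP and REST adapters.
def process_pending(fetch_next, pdh_calls, update_record):
    handled = 0
    while True:
        record = fetch_next()            # next row with status_flag = 'N'
        if record is None:
            break                        # loop until no records left
        call = pdh_calls.get(record["request_type"])
        if call is None:
            update_record(record["id"], "E",
                          f"Unknown request type: {record['request_type']}")
        else:
            try:
                call(record)             # invoke the matching PDH REST API
                update_record(record["id"], "S", None)
            except Exception as exc:     # failure stays isolated to this record
                update_record(record["id"], "E", str(exc))
        handled += 1
    return handled
```

Because each record is fetched, processed, and statused independently, one bad payload marks only its own row 'E' and the loop moves on, which is exactly the granular-processing requirement.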

Error Handling Approach

Failed records are flagged with status_flag = 'E', and the error message returned by PDH is stored alongside the record in the APEX table, so users can review, correct, and resubmit them without affecting other records.

Benefits of the Solution

  • User-Friendly → APEX UI is much easier for business users than native PDH forms.
  • Decoupled Architecture → OIC handles all PDH API logic; APEX only stores and forwards requests.
  • Resilient → One-record-at-a-time processing ensures no mass failure due to one bad record.
  • Traceable → status_flag and error messages give full transparency to users and support team.
  • Extensible → New request types (e.g., Item Attribute Updates) can be added easily.



Aug 6, 2025

Why Use Oracle Data Integrator (ODI) Instead of Writing SQL Queries

Recently, someone asked why we should use ODI to load data from source systems into a data warehouse, rather than just writing SQL scripts or PL/SQL APIs.

In the world of enterprise data integration, simplicity is key, but so are scalability, governance, and performance. One common question data architects ask is:

Why should we use Oracle Data Integrator (ODI)? Why not just write SQL queries and create views to move data from source to target?

While SQL views and manual scripting work for some lightweight scenarios, they quickly hit walls in larger, more complex environments. This post highlights 10 compelling reasons to choose ODI for modern data integration.




1. Built for Heterogeneous Data Sources

ODI connects out-of-the-box with databases, files, APIs, and cloud applications. Writing SQL views across heterogeneous systems is painful and error-prone.

2. Declarative Design: Focus on What, Not How

ODI uses mappings and declarative logic to describe what needs to happen. It generates optimized code using Knowledge Modules (KMs).

3. Reusable and Modular Components

Reuse transformations with Reusable Mappings, variables, filters, and Load Plans. SQL scripts often become unmanageable at scale.

4. Built-in Logging, Auditing, and Error Handling

ODI auto-generates E$, C$, and I$ tables during execution. Logs and error handling are built-in. SQL-based ETL lacks this robustness.

5. Native Support for Incremental Loads (CDC)

ODI includes CDC, SCD, and delta load strategies without complex scripting.
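To see why this matters, here is what a hand-rolled delta load looks like: filter rows changed since the previous run's watermark, then advance the watermark. ODI's CDC/journalizing does this declaratively; the sketch below just illustrates the idea, using ISO-8601 timestamp strings (which compare correctly as text):

```python
# Sketch of an incremental (delta) extract driven by a last-updated
# watermark. In ODI, journalizing/CDC replaces this manual logic.
def incremental_extract(rows, last_watermark):
    delta = [r for r in rows if r["last_update_date"] > last_watermark]
    new_watermark = max((r["last_update_date"] for r in delta),
                        default=last_watermark)
    return delta, new_watermark
```

Every hand-written pipeline ends up re-implementing this watermark bookkeeping, plus its failure and restart cases, which is exactly the scripting ODI's CDC support removes.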

6. Built-in Job Scheduling and Dependency Management

Define dependencies, schedule jobs, and manage parallelism all within ODI.

7. Performance Optimizations with Push-Down Technology

ODI executes code natively on the database leveraging partitioning and parallelism. SQL Views don't optimize as efficiently.

8. Cloud-Ready & API-Friendly

ODI can call REST/SOAP APIs, load data into ATP/ADW, and integrate with Oracle SaaS and OIC.

9. Data Governance and Security

Role-based access control, metadata tracking, and environment promotions make ODI enterprise-grade.

10. Maintainability at Scale

Centralized, visual, and debuggable designs make ODI ideal for teams and growing ecosystems.

Conclusion

If your integration needs are growing, ODI is not just a better tool; it's the right architecture.

| Feature | Raw SQL & Views | Oracle Data Integrator (ODI) |
| --- | --- | --- |
| Multi-source Integration | Manual | Native support |
| Incremental Load Support | Custom Logic | Built-in CDC |
| Governance / Audit | Missing | Automated |
| Performance Tuning | Manual Indexes | Push-down Optimization |
| Job Scheduling & Workflow | Cron or Scripts | Integrated Load Plans |
| Cloud/API Support | Complex | Native REST Adapter |


Looking Ahead: If you want to modernize your data pipeline, reduce technical debt, and ensure auditability and compliance, ODI is a strategic move.