Creating Custom Batching Logic Against an Array of Records Using OIC Integration
In many Oracle Integration Cloud, or OIC, implementations, we receive a large array of records that cannot always be processed in one shot. These records may represent invoices, purchase orders, employee details, files, payslips, or any other business object.
A common requirement is:
OIC receives an array of records, and the records need to be split into multiple batches based on a configurable “records per batch” count.
This article explains how we can implement custom batching logic inside OIC using Assign actions, Stage File JSON Write, and mapper logic.
Business Requirement
Assume OIC receives an array of file information records.
For example, the incoming array may contain 45 file records.
The requirement is to split this source array into multiple batches based on a configurable batch size.
For example: Total records: 45; Batch size: 20; Number of batches: 3
The expected batching would be:
Batch 1: records 1 to 20
Batch 2: records 21 to 40
Batch 3: records 41 to 45
This allows downstream processing to happen batch by batch instead of processing all records together.
High-Level Flow
The integration flow can be designed as follows:
Step 1: OIC Receives an Array of Records
The source payload may contain an array like this:
{
"FileInfo": [
{
"FileName": "Employee_01.csv",
"Directory": "/oic/stage",
"FileSize": "1234"
},
{
"FileName": "Employee_02.csv",
"Directory": "/oic/stage",
"FileSize": "1234"
}
]
}
The same approach can be applied to invoices, purchase orders, employee records, or any array-based payload.
Step 2: Calculate Total Records, Batch Size, and Number of Batches in Assign Action
Create an Assign action and define the following variables.
Total Number of Records
TotalNoOfRecords = fn:number(fn:count(TriggerArray))
This counts the total number of records received in the input array.
Batch Size
BatchSize = 20 (ideally fetched from a Lookup)
Using a Lookup gives flexibility: if the business wants to change the batch size from 20 to 50, we can update the Lookup value without modifying the integration logic.
Number of Batches
NoOfBatches = fn:ceiling(fn:number($TotalNoOfRecords) div fn:number($BatchSize))
The ceiling function is important because the last batch may not be fully filled.
Total Records = 45; Batch Size = 20; 45 div 20 = 2.25; ceiling(2.25) = 3, so OIC creates three batches.
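The batch-count calculation can be sketched outside OIC as well; here is a minimal Python illustration (variable names mirror the Assign action and are only illustrative):

```python
import math

total_no_of_records = 45  # fn:count() of the trigger array
batch_size = 20           # ideally read from a Lookup

# fn:ceiling is needed because the last batch may be partially filled
no_of_batches = math.ceil(total_no_of_records / batch_size)
print(no_of_batches)  # → 3
```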
Step 3: Define the Target Batch JSON/Sample XML Structure
The target JSON structure can be created using a Stage File action with JSON Write (a Sample XML structure can also be used).
Sample Target JSON structure:
{
"BatchSet": [
{
"BatchNumber": "Batch_001",
"FileInfo": [
{
"FileName": "Employee_Details.csv",
"Directory": "/oic/stage",
"SerialNumber": "1",
"FileSize": "1234"
}
]
}
]
}
Here, BatchSet holds one entry per batch, and FileInfo holds the source trigger-array records assigned to that batch.
Step 4: Mapper Logic for Creating Batches and Sorting by File Name
The core logic is implemented using nested for-each loops.
<nstrgmpr:Write xml:id="id_12">
<ns28:request-wrapper>
<xsl:for-each select="$ListPayslipPdfFiles/nsmpr0:ListFileResponse/ns19:ListResponse/ns19:FileList/ns19:File[number(position()) &lt;= number($NoOfBatches)]">
<xsl:variable name="CurrentBatchCounters" select="number(position())"/>
<xsl:variable name="BatchStart" select="((number($CurrentBatchCounters) - 1) * number($BatchSize)) + 1"/>
<xsl:variable name="BatchEnd" select="number($CurrentBatchCounters) * number($BatchSize)"/>
<ns28:BatchSet>
<ns28:BatchNumber>
<xsl:value-of select="concat('Batch_', format-number(number($CurrentBatchCounters), '000'))"/>
</ns28:BatchNumber>
<xsl:for-each select="$ListPayslipPdfFiles/nsmpr0:ListFileResponse/ns19:ListResponse/ns19:FileList/ns19:File[number(position()) &gt;= number($BatchStart) and number(position()) &lt;= number($BatchEnd)]">
<xsl:sort select="ns19:filename" data-type="text" order="ascending"/>
<ns28:FileInfo>
<ns28:FileName>
<xsl:value-of select="ns19:filename"/>
</ns28:FileName>
<ns28:Directory>
<xsl:value-of select="ns19:directory"/>
</ns28:Directory>
<ns28:SerialNumber>
<xsl:value-of select="number(position()) + ((number($CurrentBatchCounters) - 1) * number($BatchSize))"/>
</ns28:SerialNumber>
</ns28:FileInfo>
</xsl:for-each>
</ns28:BatchSet>
</xsl:for-each>
</ns28:request-wrapper>
</nstrgmpr:Write>
Outer Loop
The outer loop runs once per batch. It iterates over the same source array, but the predicate limits the number of iterations to the number of batches:
<xsl:for-each select="$ListPayslipPdfFiles/nsmpr0:ListFileResponse/ns19:ListResponse/ns19:FileList/ns19:File[number(position()) &lt;= number($NoOfBatches)]">
This loop creates one BatchSet entry for each batch.
Batch Counter
<xsl:variable name="CurrentBatchCounters" select="number(position())"/>
This variable stores the current batch number.
Batch Start Position
<xsl:variable name="BatchStart" select="((number($CurrentBatchCounters) - 1) * number($BatchSize)) + 1"/>
This calculates the first record position for the current batch.
For example, if the batch size is 20:
Batch 1 Start = ((1 - 1) * 20) + 1 = 1
Batch 2 Start = ((2 - 1) * 20) + 1 = 21
Batch 3 Start = ((3 - 1) * 20) + 1 = 41
Batch End Position
<xsl:variable name="BatchEnd" select="number($CurrentBatchCounters) * number($BatchSize)"/>
This calculates the last record position for the current batch.
Batch 1 End = 1 * 20 = 20
Batch 2 End = 2 * 20 = 40
Batch 3 End = 3 * 20 = 60
Even if the last batch has fewer records, the inner loop will only pick available records from the source array.
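The start/end arithmetic can be sketched in Python (illustrative only; the min() clamp makes explicit what the position() predicate does implicitly for the final batch):

```python
batch_size = 20
total_records = 45
no_of_batches = 3

for batch in range(1, no_of_batches + 1):
    batch_start = (batch - 1) * batch_size + 1          # first 1-based record position
    batch_end = min(batch * batch_size, total_records)  # last position, clamped for the final batch
    print(f"Batch {batch}: records {batch_start}-{batch_end}")
# Batch 1: records 1-20
# Batch 2: records 21-40
# Batch 3: records 41-45
```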
Step 5: Generate Batch Number
Each batch can be assigned a formatted batch number.
<ns28:BatchNumber> <xsl:value-of select="concat('Batch_', format-number(number($CurrentBatchCounters), '000'))"/> </ns28:BatchNumber>
This generates values like:
Batch_001 Batch_002 Batch_003
This is useful for logging, auditing, error handling, and downstream tracking.
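format-number($CurrentBatchCounters, '000') zero-pads the counter to three digits; the equivalent in Python, purely for illustration:

```python
# Zero-pad the batch counter to three digits, like format-number(..., '000')
for batch in range(1, 4):
    print(f"Batch_{batch:03d}")
# Batch_001
# Batch_002
# Batch_003
```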
Step 6: Inner Loop to Assign Records to Each Batch
The inner loop filters the source records based on the current batch range. An xsl:sort can be added if the file names need to be sorted.
<xsl:for-each select="$ListPayslipPdfFiles/nsmpr0:ListFileResponse/ns19:ListResponse/ns19:FileList/ns19:File[number(position()) &gt;= number($BatchStart) and number(position()) &lt;= number($BatchEnd)]">
<xsl:sort select="ns19:filename" data-type="text" order="ascending"/>
This means only the records whose position falls between BatchStart and BatchEnd are selected for the current batch.
Then map the required fields.
<ns28:FileInfo>
<ns28:FileName>
<xsl:value-of select="ns19:filename"/>
</ns28:FileName>
<ns28:Directory>
<xsl:value-of select="ns19:directory"/>
</ns28:Directory>
<ns28:SerialNumber>
<xsl:value-of select="number(position()) + ((number($CurrentBatchCounters) - 1) * number($BatchSize))"/>
</ns28:SerialNumber>
</ns28:FileInfo>
The SerialNumber logic ensures that the numbering continues across batches.
For example:
Batch_001: 1, 2, 3, ... 20
Batch_002: 21, 22, 23, ... 40
Batch_003: 41, 42, 43, ... 45
Complete Sample Mapper Logic
<nstrgmpr:Write xml:id="id_12">
<ns28:request-wrapper>
<xsl:for-each select="$ListPayslipPdfFiles/nsmpr0:ListFileResponse/ns19:ListResponse/ns19:FileList/ns19:File[number(position()) &lt;= number($NoOfBatches)]">
<xsl:variable name="CurrentBatchCounters" select="number(position())"/>
<xsl:variable name="BatchStart" select="((number($CurrentBatchCounters) - 1) * number($BatchSize)) + 1"/>
<xsl:variable name="BatchEnd" select="number($CurrentBatchCounters) * number($BatchSize)"/>
<ns28:BatchSet>
<ns28:BatchNumber>
<xsl:value-of select="concat('Batch_', format-number(number($CurrentBatchCounters), '000'))"/>
</ns28:BatchNumber>
<xsl:for-each select="$ListPayslipPdfFiles/nsmpr0:ListFileResponse/ns19:ListResponse/ns19:FileList/ns19:File[number(position()) &gt;= number($BatchStart) and number(position()) &lt;= number($BatchEnd)]">
<xsl:sort select="ns19:filename" data-type="text" order="ascending"/>
<ns28:FileInfo>
<ns28:FileName>
<xsl:value-of select="ns19:filename"/>
</ns28:FileName>
<ns28:Directory>
<xsl:value-of select="ns19:directory"/>
</ns28:Directory>
<ns28:SerialNumber>
<xsl:value-of select="number(position()) + ((number($CurrentBatchCounters) - 1) * number($BatchSize))"/>
</ns28:SerialNumber>
</ns28:FileInfo>
</xsl:for-each>
</ns28:BatchSet>
</xsl:for-each>
</ns28:request-wrapper>
</nstrgmpr:Write>
Sample Batched Response
Sample batched JSON structure (a Sample XML structure can be used if you prefer):
{
"BatchSet": [
{
"BatchNumber": "Batch_001",
"FileInfo": [
{
"FileName": "Employee_Details.csv",
"Directory": "/oic/stage",
"SerialNumber": "1",
"FileSize": "1234"
},
{
"FileName": "Employee_Details2.csv",
"Directory": "/oic/stage",
"SerialNumber": "2",
"FileSize": "1234"
}
]
},
{
"BatchNumber": "Batch_002",
"FileInfo": [
{
"FileName": "Employee_Details3.csv",
"Directory": "/oic/stage",
"SerialNumber": "3",
"FileSize": "1234"
},
{
"FileName": "Employee_Details4.csv",
"Directory": "/oic/stage",
"SerialNumber": "4",
"FileSize": "1234"
}
]
}
]
}
Each batch is a separate object inside the BatchSet array.
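The whole pattern can be summarized in a short Python sketch that produces the same BatchSet shape. This is illustrative only: the build_batch_set name and the sample record fields are assumptions, and in OIC the actual work is done in the mapper, not in code.

```python
import math

def build_batch_set(records, batch_size):
    """Split a list of record dicts into the BatchSet structure shown above."""
    no_of_batches = math.ceil(len(records) / batch_size)
    batch_set = []
    for batch in range(1, no_of_batches + 1):
        start = (batch - 1) * batch_size  # 0-based slice offset for this batch
        chunk = records[start:start + batch_size]
        batch_set.append({
            "BatchNumber": f"Batch_{batch:03d}",
            # SerialNumber continues across batches: batch offset + position in batch
            "FileInfo": [dict(rec, SerialNumber=str(start + i + 1))
                         for i, rec in enumerate(chunk)],
        })
    return {"BatchSet": batch_set}

files = [{"FileName": f"Employee_{n:02d}.csv", "Directory": "/oic/stage",
          "FileSize": "1234"} for n in range(1, 6)]
result = build_batch_set(files, batch_size=2)
print([b["BatchNumber"] for b in result["BatchSet"]])  # ['Batch_001', 'Batch_002', 'Batch_003']
```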
How to Use the Batched Output
Once the batch JSON/XML is generated, it can be used for further processing.
A common approach is to loop over the BatchSet array with a For-Each action and process one batch per iteration, for example by invoking a child integration or writing each batch out through a Stage File action.
Why This Approach Is Useful
This custom batching approach is useful when a single large payload cannot be processed in one shot, when downstream systems accept only a limited number of records per call, or when the batch size must remain configurable.
Instead of building multiple integrations or depending on external preprocessing, OIC itself can create a clean batch structure using mapper logic.
Best Practices
Here are a few recommendations when implementing this pattern: keep the batch size in a Lookup so it can be changed without modifying the integration, and validate the mapping with sample payloads before activation.
Note: The mapper's Designer view will not render this mapping because it uses XSLT variables. Pay attention to the mapped fields and run a demo test in the mapper with sample inputs.
Conclusion
Custom batching logic in Oracle Integration Cloud is a very effective design pattern when handling large arrays of business records.
By calculating the total record count, deriving the number of batches, and using nested mapper logic, we can split a single source array into multiple structured batches. This improves scalability, control, retry handling, and downstream processing efficiency.
This pattern can be reused across many integration scenarios, including employee file processing, invoice integrations, purchase order processing, payroll files, and other bulk data flows.
#OracleIntegrationCloud #OIC #Gen3 #CustomBatching #Batching #OracleCloud #Integration #Middleware #OracleIntegration #CloudIntegration #APIs #Automation #EnterpriseIntegration