Creating Custom Batching Logic Against an Array of Records Using OIC Integration


In many Oracle Integration Cloud (OIC) implementations, we receive a large array of records that cannot always be processed in one shot. These records may represent invoices, purchase orders, employee details, files, payslips, or any other business object.

A common requirement is:

OIC receives an array of records, and the records need to be split into multiple batches based on a configurable “records per batch” count.

This article explains how we can implement custom batching logic inside OIC using Assign actions, Stage File JSON Write, and mapper logic.


Business Requirement

Assume OIC receives an array of file information records.

For example, the incoming array may contain:

  • Employee files
  • Invoice files
  • Purchase order records
  • Payroll files
  • Any structured business payload


The requirement is to split this source array into multiple batches.



For example: total records = 45, batch size = 20, number of batches = 3.

The expected batching would be:

  • Batch_001 contains records 1 to 20
  • Batch_002 contains records 21 to 40
  • Batch_003 contains records 41 to 45

This allows downstream processing to happen batch by batch instead of processing all records together.


High-Level Flow

The integration flow can be designed as follows:

  1. OIC receives an array of records.
  2. Use an Assign action to calculate the total number of records, the batch size, and the number of batches.
  3. Use Stage File / Stage JSON Write to create a new batched JSON structure.
  4. Use nested looping logic in the mapper: the outer loop creates the batches, and the inner loop assigns source records to the correct batch.
  5. Use the generated batch array for downstream processing.
  6. Use For Each on BatchSet, and inside each batch process the child FileInfo array.


Step 1: OIC Receives an Array of Records

The source payload may contain an array like this:

{
	"FileInfo": [
		{
			"FileName": "Employee_01.csv",
			"Directory": "/oic/stage",
			"FileSize": "1234"
		},
		{
			"FileName": "Employee_02.csv",
			"Directory": "/oic/stage",
			"FileSize": "1234"
		}
	]
}        

The same approach can be applied to invoices, purchase orders, employee records, or any array-based payload.


Step 2: Calculate Total Records, Batch Size, and Number of Batches in Assign Action

Create an Assign action and define the following variables.

Total Number of Records

TotalNoOfRecords = fn:number(fn:count(TriggerArray))

This counts the total number of records received in the input array.

Batch Size

BatchSize = 20

Rather than hardcoding the value, maintain it in a Lookup. If the business wants to change the batch size from 20 to 50, we can update the Lookup value without modifying the integration logic.

Number of Batches

NoOfBatches = fn:ceiling(fn:number($TotalNoOfRecords) div fn:number($BatchSize))

The ceiling function is important because the last batch may not be completely filled. For example, with 45 total records and a batch size of 20, 45 div 20 = 2.25 and ceiling(2.25) = 3, so OIC creates three batches.
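The same arithmetic can be sanity-checked outside OIC with a short script (Python here, purely for illustration; the function name is hypothetical, not part of OIC):

```python
import math

def number_of_batches(total_records, batch_size):
    """Round up so a partially filled final batch is still counted,
    mirroring fn:ceiling(total div batchSize) in the Assign action."""
    return math.ceil(total_records / batch_size)

# 45 records with a batch size of 20 -> 3 batches (the last holds 5 records)
print(number_of_batches(45, 20))  # → 3
```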


Step 3: Define the Target Batch JSON/Sample XML Structure

The target JSON structure can be created using Stage File / Stage JSON Write (a sample XML can also be used).

Sample Target JSON structure:

{
	"BatchSet": [
		{
			"BatchNumber": "Batch_001",
			"FileInfo": [
				{
					"FileName": "Employee_Details.csv",
					"Directory": "/oic/stage",
					"SerialNumber": "1",
					"FileSize": "1234"
				}
			]
		}
	]
}

  • BatchSet holds all generated batches.
  • Each BatchSet item contains one batch.
  • BatchNumber identifies the batch.
  • FileInfo contains the records assigned to that batch.


Step 4: Mapper Logic for Creating Batches and Sorting by File Name

The core logic is implemented using nested for-each loops.

<nstrgmpr:Write xml:id="id_12">
  <ns28:request-wrapper>
    <xsl:for-each select="$ListPayslipPdfFiles/nsmpr0:ListFileResponse/ns19:ListResponse/ns19:FileList/ns19:File[number(position()) &lt;= number($NoOfBatches)]">
      <xsl:variable name="CurrentBatchCounters" select="number(position())"/>
      <xsl:variable name="BatchStart" select="((number($CurrentBatchCounters) - 1) * number($BatchCount)) + 1"/>
      <xsl:variable name="BatchEnd" select="number($CurrentBatchCounters) * number($BatchCount)"/>
      <ns28:BatchSet>
        <ns28:BatchNumber>
          <xsl:value-of select="concat('Batch_', format-number(number($CurrentBatchCounters), '000'))"/>
        </ns28:BatchNumber>
        <xsl:for-each select="$ListPayslipPdfFiles/nsmpr0:ListFileResponse/ns19:ListResponse/ns19:FileList/ns19:File[number(position()) &gt;= number($BatchStart) and number(position()) &lt;= number($BatchEnd)]">
          <xsl:sort select="ns19:FileName" data-type="text" order="ascending"/>
          <ns28:FileInfo>
            <ns28:FileName>
              <xsl:value-of select="ns19:filename"/>
            </ns28:FileName>
            <ns28:Directory>
              <xsl:value-of select="ns19:directory"/>
            </ns28:Directory>
            <ns28:SerialNumber>
              <xsl:value-of select="number(position()) + ((number($CurrentBatchCounters) - 1) * number($BatchCount))"/>
            </ns28:SerialNumber>
          </ns28:FileInfo>
        </xsl:for-each>
      </ns28:BatchSet>
    </xsl:for-each>
  </ns28:request-wrapper>
</nstrgmpr:Write>


Outer Loop

The outer loop runs once per batch. Since the mapper has no native counter loop, the same source array is reused as the iteration driver, with a position() predicate limiting it to the first $NoOfBatches nodes.

<xsl:for-each select="$ListPayslipPdfFiles/nsmpr0:ListFileResponse/ns19:ListResponse/ns19:FileList/ns19:File[number(position()) &lt;= number($NoOfBatches)]">        

This loop creates one BatchSet entry for each batch.

Batch Counter

<xsl:variable name="CurrentBatchCounters" select="number(position())"/>        

This variable stores the current batch number.

Batch Start Position

<xsl:variable name="BatchStart" select="((number($CurrentBatchCounters) - 1) * number($BatchCount)) + 1"/>        

This calculates the first record position for the current batch.

For example, if the batch size is 20:

Batch 1 Start = ((1 - 1) * 20) + 1 = 1

Batch 2 Start = ((2 - 1) * 20) + 1 = 21

Batch 3 Start = ((3 - 1) * 20) + 1 = 41

Batch End Position

<xsl:variable name="BatchEnd" select="number($CurrentBatchCounters) * number($BatchCount)"/>        

This calculates the last record position for the current batch.

Batch 1 End = 1 * 20 = 20

Batch 2 End = 2 * 20 = 40

Batch 3 End = 3 * 20 = 60

Even if the last batch has fewer records, the inner loop will only pick available records from the source array.
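The start/end arithmetic above can be expressed as a small helper (a Python sketch for illustration; the clamping to the total mirrors how the inner loop only picks records that actually exist):

```python
def batch_range(batch_number, batch_size, total_records):
    """Return 1-based (start, end) record positions for a given batch.
    The end is clamped so the last batch only covers real records."""
    start = (batch_number - 1) * batch_size + 1
    end = min(batch_number * batch_size, total_records)
    return start, end

for b in (1, 2, 3):
    print(b, batch_range(b, 20, 45))  # → (1, 20), (21, 40), (41, 45)
```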


Step 5: Generate Batch Number

Each batch can be assigned a formatted batch number.

<ns28:BatchNumber>
  <xsl:value-of select="concat('Batch_', format-number(number($CurrentBatchCounters), '000'))"/>
</ns28:BatchNumber>

This generates values like:

Batch_001, Batch_002, Batch_003

This is useful for logging, auditing, error handling, and downstream tracking.
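The zero-padded numbering is equivalent to this snippet (Python, for illustration only):

```python
def batch_label(batch_number):
    # f-string zero padding matches format-number(..., '000') in XSLT
    return f"Batch_{batch_number:03d}"

print(batch_label(1), batch_label(2), batch_label(3))  # → Batch_001 Batch_002 Batch_003
```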


Step 6: Inner Loop to Assign Records to Each Batch

The inner loop filters the source records based on the current batch range. Add an xsl:sort if the records need to be ordered by file name.

<xsl:for-each select="$ListPayslipPdfFiles/nsmpr0:ListFileResponse/ns19:ListResponse/ns19:FileList/ns19:File[number(position()) &gt;= number($BatchStart) and number(position()) &lt;= number($BatchEnd)]">
  <xsl:sort select="ns19:FileName" data-type="text" order="ascending"/>

This means:

  • Pick records whose position is greater than or equal to BatchStart
  • Pick records whose position is less than or equal to BatchEnd

Then map the required fields.

<ns28:FileInfo>
  <ns28:FileName>
    <xsl:value-of select="ns19:filename"/>
  </ns28:FileName>
  <ns28:Directory>
    <xsl:value-of select="ns19:directory"/>
  </ns28:Directory>
  <ns28:SerialNumber>
    <xsl:value-of select="number(position()) + ((number($CurrentBatchCounters) - 1) * number($BatchCount))"/>
  </ns28:SerialNumber>
</ns28:FileInfo>

The SerialNumber logic ensures that the numbering continues across batches.

For example:

Batch_001: 1, 2, 3, ... 20

Batch_002: 21, 22, 23, ... 40

Batch_003: 41, 42, 43, ... 45
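As a cross-check, the nested loops can be modeled outside the mapper. This is an illustrative Python equivalent, not OIC code; the field names simply mirror the target structure used in this article:

```python
import math

def split_into_batches(records, batch_size):
    """Group records into batches; SerialNumber keeps counting
    across batch boundaries, as in the mapper logic."""
    batches = []
    for b in range(math.ceil(len(records) / batch_size)):
        chunk = records[b * batch_size:(b + 1) * batch_size]
        batches.append({
            "BatchNumber": f"Batch_{b + 1:03d}",
            "FileInfo": [
                {**rec, "SerialNumber": str(b * batch_size + i + 1)}
                for i, rec in enumerate(chunk)
            ],
        })
    return batches

files = [{"FileName": f"Employee_{n:02d}.csv"} for n in range(1, 46)]
result = split_into_batches(files, 20)
# 45 records, batch size 20 -> 3 batches; the first record of Batch_003 is serial 41
print(len(result), result[2]["FileInfo"][0]["SerialNumber"])  # → 3 41
```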


Complete Sample Mapper Logic

<nstrgmpr:Write xml:id="id_12"> <ns28:request-wrapper> <xsl:for-each select="$ListPayslipPdfFiles/nsmpr0:ListFileResponse/ns19:ListResponse/ns19:FileList/ns19:File[number(position()) &lt;= number($NoOfBatches)]"> <xsl:variable name="CurrentBatchCounters" select="number(position())"/> <xsl:variable name="BatchStart" select="((number($CurrentBatchCounters) - 1) * number($BatchCount)) + 1"/> <xsl:variable name="BatchEnd" select="number($CurrentBatchCounters) * number($BatchCount)"/> <ns28:BatchSet> <ns28:BatchNumber> <xsl:value-of select="concat('Batch_', format-number(number($CurrentBatchCounters), '000'))"/> </ns28:BatchNumber> <xsl:for-each select="$ListPayslipPdfFiles/nsmpr0:ListFileResponse/ns19:ListResponse/ns19:FileList/ns19:File[number(position()) &gt;= number($BatchStart) and number(position()) &lt;= number($BatchEnd)]"> <xsl:sort select="number(ns19:FileName)" data-type="text" order="ascending"/><ns28:FileInfo> <ns28:FileName> <xsl:value-of select="ns19:filename"/> </ns28:FileName> <ns28:Directory> <xsl:value-of select="ns19:directory"/> </ns28:Directory> <ns28:SerialNumber> <xsl:value-of select="number(position()) + ((number($CurrentBatchCounters) - 1) * number($BatchCount))"/> </ns28:SerialNumber> </ns28:FileInfo> </xsl:for-each> </ns28:BatchSet> </xsl:for-each> </ns28:request-wrapper> </nstrgmpr:Write>


Sample Batched Response

Sample batched JSON output (use a sample XML if you prefer):

{
	"BatchSet": [
		{
			"BatchNumber": "Batch_001",
			"FileInfo": [
				{
					"FileName": "Employee_Details.csv",
					"Directory": "/oic/stage",
					"SerialNumber": "1",
					"FileSize": "1234"
				},
				{
					"FileName": "Employee_Details2.csv",
					"Directory": "/oic/stage",
					"SerialNumber": "2",
					"FileSize": "1234"
				}
			]
		},
		{
			"BatchNumber": "Batch_002",
			"FileInfo": [
				{
					"FileName": "Employee_Details3.csv",
					"Directory": "/oic/stage",
					"SerialNumber": "3",
					"FileSize": "1234"
				},
				{
					"FileName": "Employee_Details4.csv",
					"Directory": "/oic/stage",
					"SerialNumber": "4",
					"FileSize": "1234"
				}
			]
		}
	]
}

Each batch is a separate object inside the BatchSet array.


How to Use the Batched Output

Once the batch JSON/XML is generated, it can be used for further processing.

A common approach is:

  1. Use a For Each action on the BatchSet array.
  2. Inside each batch, use another For Each action on the child FileInfo array.
  3. Apply the required business logic.

For example:

  • Process one batch at a time.
  • Send each batch to an external system.
  • Create separate files for each batch.
  • Invoke child integrations per batch.
  • Add error handling and retry logic batch-wise.
  • Track batch status in logs or a database.
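The nested For Each pattern can be sketched as follows (an illustrative Python model of the loop structure; `handler` is a hypothetical placeholder for the per-record business logic):

```python
def process_batches(batch_set, handler):
    """Outer loop over BatchSet, inner loop over each batch's
    FileInfo records, mirroring the two For Each actions."""
    processed = []
    for batch in batch_set:
        for rec in batch["FileInfo"]:
            handler(batch["BatchNumber"], rec)  # per-record business logic
            processed.append((batch["BatchNumber"], rec["FileName"]))
    return processed

sample = [{"BatchNumber": "Batch_001",
           "FileInfo": [{"FileName": "Employee_01.csv"}]}]
print(process_batches(sample, lambda batch_no, rec: None))
```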


Why This Approach Is Useful

This custom batching approach is useful when:

  • The source system sends a large number of records.
  • The target system has record count limits.
  • Processing all records together may cause timeout issues.
  • The integration needs better control over retries.
  • Batch-level tracking is required.
  • Business users want to configure batch size dynamically.

Instead of building multiple integrations or depending on external preprocessing, OIC itself can create a clean batch structure using mapper logic.


Best Practices

Here are a few recommendations when implementing this pattern:

  • Maintain BatchSize in an OIC Lookup instead of hardcoding it.
  • Use clear batch numbering like Batch_001, Batch_002, and so on.
  • Log total records, batch size, and number of batches.
  • Add validation for empty arrays.
  • Always wrap values in the number() function for numeric comparisons; implicit XSLT type conversions can sometimes fail.
  • Track failures at batch level, not only at integration level.
  • Keep the generated batch JSON simple and reusable.
  • Reuse the same pattern for invoices, POs, employees, files, or other array-based objects.


Note: The Designer view of the mapper will not render this mapping because it uses XSLT variables. Pay close attention to the mapped fields and test the mapper with sample inputs before activation.

Conclusion

Custom batching logic in Oracle Integration Cloud is a very effective design pattern when handling large arrays of business records.

By calculating the total record count, deriving the number of batches, and using nested mapper logic, we can split a single source array into multiple structured batches. This improves scalability, control, retry handling, and downstream processing efficiency.

This pattern can be reused across many integration scenarios, including employee file processing, invoice integrations, purchase order processing, payroll files, and other bulk data flows.

#OracleIntegrationCloud #OIC #Gen3 #CustomBatching #Batching #OracleCloud #Integration #Middleware #OracleIntegration #CloudIntegration #APIs #Automation #EnterpriseIntegration


More articles by Shashikumar D.
