BatchJob Serialization in Salesforce

Batch Jobs in Dynamic Programmatic Situations:

Batch Jobs execute asynchronously. In some situations, we need to use dynamic Apex to provision or revoke Permission Set access, or to use dynamic Apex sharing reasons to provision or revoke record access.
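
As a minimal illustration (the permission set name and target user here are hypothetical, not from this article), provisioning permission set access programmatically comes down to inserting a PermissionSetAssignment record, and revoking it comes down to deleting that record:

    // Hypothetical target user and permission set, used purely for illustration.
    Id targetUserId = UserInfo.getUserId();
    PermissionSet ps = [SELECT Id FROM PermissionSet WHERE Name = 'Sales_Access' LIMIT 1];

    // Provision: assign the permission set to the user.
    insert new PermissionSetAssignment(AssigneeId = targetUserId, PermissionSetId = ps.Id);

    // Revoke: delete the existing assignment for that user and permission set.
    delete [SELECT Id FROM PermissionSetAssignment
            WHERE AssigneeId = :targetUserId AND PermissionSetId = :ps.Id];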

Rule of Thumb: Follow this when designing batch jobs that handle dynamic scenarios involving Salesforce sharing access.

Fundamentally, we need to ensure that batch jobs are never invoked from Apex Triggers on the Salesforce Platform. Apex Triggers are designed for synchronous execution, whereas batch jobs execute asynchronously, so it never makes sense for a synchronous, trigger-based execution path to invoke a Batch Apex job that runs asynchronously.

How do Batch Jobs Work In Salesforce?

A typical Salesforce Batch Apex job has 3 methods, listed below (a minimal skeleton follows the list):

  1. start( ) – This is the first method invoked when a Batch Job is initiated in Salesforce. The initial query that fetches all the records to be processed is issued here.
  2. execute( ) – This method is executed multiple times, depending on the batch size. Moreover, by default a batch job is designed to run in parallel mode of execution, meaning that if there are 1000 records to be processed and each batch holds 200 records, then ideally 5 batches will be executed in parallel.
  3. finish( ) – This method is executed only once, as the last step of the batch job execution.
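
A minimal skeleton of such a Batch Apex class might look like the following (the class name and query are placeholders assumed for illustration):

global class AccountProcessingBatch implements Database.Batchable<sObject> {

    // start( ): issues the initial query that defines the full record set.
    global Database.QueryLocator start(Database.BatchableContext bc) {
        return Database.getQueryLocator('SELECT Id, Name FROM Account');
    }

    // execute( ): called once per chunk of records (up to the batch size, e.g. 200).
    global void execute(Database.BatchableContext bc, List<sObject> scope) {
        // Process the current chunk of records here.
    }

    // finish( ): called once, after all chunks have been processed.
    global void finish(Database.BatchableContext bc) {
        // Post-processing, notifications, chaining of further jobs, etc.
    }
}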

What needs to be done to make Batch Jobs run serially?

On the batch class, we must implement the Database.Stateful interface in addition to Database.Batchable:

global class <Class Name> implements Database.Batchable<sObject>, Database.Stateful{ }

In some situations, we may use the batch class constructor to initialize the collections we intend to work with, as shown below (the collections are declared as instance members so that Database.Stateful preserves them across executions). These collections help control and consolidate the state of each batch execution in the serial processing context.

    global Set<Id> abcdSet;
    global Map<String, List<sObject>> abcdMap;
    global List<sObject> abcdList;

    // The constructor receives the Ids to be processed and initializes the
    // stateful collections that carry state across the batch executions.
    global <Class Name>(Set<Id> abcdIdSet){
        abcdSet = abcdIdSet;
        abcdMap = new Map<String, List<sObject>>();
        abcdList = new List<sObject>();
    }

 

The execute( ) method processes each batch of records. When we are dealing with Share records or PermissionSetAssignment records, the records identified during each execute( ) are serially collected into the stateful collections, and the DML such as the update or delete is then performed on that collected set in the finish( ) method.
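
The following sketch shows that pattern end to end (the object, the RowCause filter, and the class name are assumptions for illustration): the stateful list is filled serially in execute( ), and the single delete happens in finish( ).

global class ShareCleanupBatch implements Database.Batchable<sObject>, Database.Stateful {

    // Stateful collection: its contents survive across all execute( ) invocations.
    global List<AccountShare> sharesToDelete = new List<AccountShare>();

    global Database.QueryLocator start(Database.BatchableContext bc) {
        // Hypothetical query: manually created Account shares that are candidates for removal.
        return Database.getQueryLocator('SELECT Id FROM AccountShare WHERE RowCause = \'Manual\'');
    }

    global void execute(Database.BatchableContext bc, List<sObject> scope) {
        // Serially collect the records; no delete DML is issued in this method.
        sharesToDelete.addAll((List<AccountShare>) scope);
    }

    global void finish(Database.BatchableContext bc) {
        // Perform the deletion once, on the full serially collected set (watch DML row limits).
        if (!sharesToDelete.isEmpty()) {
            delete sharesToDelete;
        }
    }
}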

Parallel Mode of Execution of the Batch

Total Records: 1000          Batch Size: 200

  start( )      execute( )          finish( )
                batch-1 (200)
                batch-2 (200)
                batch-3 (200)
                batch-4 (200)
                batch-5 (200)

 

Serial Mode of Execution of the Batch

Total Records: 1000          Batch Size: 200

  start( )      batch-1    batch-2    batch-3    batch-4    batch-5
  execute( )    200        200        200        200        200
  finish( )     Final processing is done here, such as totaling all the records or provisioning/removing permissions.

 

In the parallel mode of execution, each invocation of execute( ) runs independently of the others (i.e. mutually exclusive of one another), whereas when the batch job is executed serially, each batch is executed one after the other, i.e. the next execute( ) starts only after the previous execute( ) has finished.
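
For completeness, a job such as the hypothetical ShareCleanupBatch sketched above is typically launched with an explicit scope size of 200, which produces the five execute( ) invocations described in the tables:

    // 1000 matching records with a scope size of 200 yields 5 execute( ) invocations.
    Id jobId = Database.executeBatch(new ShareCleanupBatch(), 200);
    System.debug('Batch job enqueued with Id: ' + jobId);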

Note:

When inserting and deleting PermissionSetAssignment records or Share records, we need to ensure that the deletions are processed in the finish( ) method, while the provisioning can happen during the execute( ) context of the batch job.

 

Article Contributor:

Rakesh RamaSwamy

LinkedIn: https://www.linkedin.com/in/rakesh-ramaswamy-38062385/

 

 
