Hello Trailblazers,
Recently, I faced this issue in one of the projects I was working on: UNABLE_TO_LOCK_ROW, unable to obtain exclusive access to this record or N records. Here, N is just a placeholder; in your case it could be 10, 20, 30, or even 100 or 200 records. Let's have a look at the use case first, and then we'll discuss the solution.
Use Case
I had two objects connected with a master-detail relationship. You can think of it like this: Object A is the parent object and Object B is the child object.
Object B (Child) -> Object A (Parent)
To simplify the terminology, let's say our custom Object B is "Product Part" and our custom Object A is "Product". Each product part must be linked to a single product, and one product can have multiple parts. So, our relationship simplifies to:
Product Part (Child) -> Product (Parent)
Let's say we have a field named Product__c on the Product Part object, which is the master-detail relationship field connecting product parts to products.
Now, there were around 1.5 million Product Part records, and because of the master-detail relationship, each Product Part record was guaranteed to be linked to a Product record. In our use case, we're updating the count of each product part on a daily basis using a batch class.
So, you can consider that 1.5 million records are updated daily. Even if we use the maximum batch size of 2,000, that's around 750 batches running every day to update all the records.
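For reference, the scope size is passed when launching the batch; with the maximum of 2,000 it would look like this (ProductPartCountUpdateBatch is the batch class named in the error message later in this post):

// Launch the batch with a scope size of 2,000 records per chunk
// (2,000 is the maximum when start() returns a QueryLocator)
Database.executeBatch(new ProductPartCountUpdateBatch(), 2000);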
This is what the start method of my batch class looked like:
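Simplified a bit (the real method may select more fields, but the shape was essentially this, with Product_Part__c as the child object):

public Database.QueryLocator start(Database.BatchableContext bc) {
    // Return every product part record; note there is no ORDER BY here yet
    return Database.getQueryLocator('SELECT Id FROM Product_Part__c');
}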
As you can see above, I am simply returning all the records of my Product_Part__c object, as I need to update the count on all of them. Let's have a quick look at the execute method as well.
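Again simplified, it was something along these lines (Count__c is a stand-in for the actual count field name):

public void execute(Database.BatchableContext bc, List<Product_Part__c> scope) {
    // Set the count on every product part in this chunk, then update them in one DML call
    for (Product_Part__c part : scope) {
        part.Count__c = 100; // Count__c is an assumed field name
    }
    update scope;
}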
As you can see above, I am simply updating the count of all the product parts to 100. There was nothing in my finish() method, so I am not showing it here; it was an empty method.
So, let's say that for about 1.5M product parts, we have around 15,000 products to which these product parts are linked. When I ran my batch class over these records, I got the error below:
Developer script exception from SFDC Stop : 'ProductPartCountUpdateBatch' for job id '1024K0000OVBnlO' : Update failed. First exception on row 0 with id o1g4N00002wG5R4MBT; first error: UNABLE_TO_LOCK_ROW, unable to obtain exclusive access to this record or 100 records: 025A4m00003uCJs5QBL,025LO00002eCVhoAGH,.....
So, my batch class failed, and I wasn't able to find any issue in the code. Let's first understand why it failed.
Why did it fail?
I noticed that the ids in my error message were not actually the ids of my Product Part records; they were the ids of my Product records. Interesting, right?
Continuing my investigation, I found this in the Salesforce documentation:
If a record on the master side of a master-detail relationship has too many child records (thousands) you are likely to encounter these errors as well, as every time you edit the detail record, the master record is locked.
So, now we know why the batch was failing: I was querying the records in no particular order, so child records of the same product were scattered across batches, and multiple batches running in parallel were trying to lock the same parent (product) record again and again. Let's have a look at the resolution now.
How to resolve it?
To resolve this issue, I just ordered my query in the start method as shown below:
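The only change is the ORDER BY clause on the master-detail field (simplified sketch, same assumptions as above):

public Database.QueryLocator start(Database.BatchableContext bc) {
    // Ordering by the master-detail field keeps child records of the same
    // Product together, so they land in the same (or adjacent) batch chunks
    return Database.getQueryLocator('SELECT Id FROM Product_Part__c ORDER BY Product__c');
}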
As you can see above, I just ordered my product part records by Product__c, which is nothing but the master-detail field connecting product and product parts. Due to this small change, the product part records belonging to the same Product were clubbed together and processed together as well.
There was no record locking issue after that, as all the product part records linked to the same product were processed together, and the same product record was no longer locked again and again by child records spread across different batches.
Easy, right?
So, the next time you face this UNABLE_TO_LOCK_ROW issue, make sure to have a look at your object relationships and try to minimize the possibility of the error by making small enhancements to your query/code. Here we resolved a big issue just by ordering the records!
Happy Trailblazing..!!
Comments
Good stuff! Any workaround for this while trying to import data using SF Inspector or Data Loader?
The inspector and Data Loader already work in batches of 200. I think we can export the records ordered by the parent field, like we did in the case of the batch, and then import them as we usually do for an update.
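For example (using the field names from the article), the export query could be something along these lines:

SELECT Id, Product__c FROM Product_Part__c ORDER BY Product__c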
👍
Good. I haven't encountered this issue yet, but if I do in the future, I will definitely apply your suggestion.
Hi Rahul, great article. Thanks!
Any suggestions on how to resolve this matter when using Flow automation instead?
Thanks in advance to anyone who has some valuable input on this.
I think you can try sorting the records while querying them in flow if that helps.
Thanks Rahul. I managed to fix the issue by adding an assignment in between instead of directly using the record variable to update the record.
Great. Thanks for sharing the solution :)
Wonderful explanation of how to resolve this error, thanks Rahul for posting such articles!
A simple improvement to the query, and that solves the issue. Good to know.