What is the Batch Size setting in Data Loader? How can you use it to load data most effectively and efficiently?
So many data loader settings, so little time to figure them all out, amiright?

Learn how to use the Batch Size data loader setting to ensure you are loading data accurately and efficiently every time.
I’ll also sprinkle in some ‘best practices’ I’ve accumulated based on my 13+ years of experience loading client data.
*Note – On this site, a ‘best practice’ is labeled as such because at its origin is a deeply painful/embarrassing/time consuming mistake that has been made and learned from.
**Therefore, I do not regret the mistakes I have made, as the experiences learned from them have led me to be the super amazing data loading professional I am today.
Let’s get crackin’.
Setting: Batch Size

How does the batch size affect API calls?
Wait, what? They’re related?
Yup – see here for the official API request limits and allocations doc.
Every time a batch runs, it places one call to the API, no matter how many records are in the batch. If your batch size is 200, that's one call per 200 records. If your batch size is 1, that's one call per record.
REAL WORLD EXAMPLE
You need to insert 10,000 person accounts but you want automation to run for each record (such as when you’re inserting into Financial Services Cloud). You will need to use a batch size of 1 so that the automation doesn’t break the load.
10,000 records / batch size 1 = 10,000 API calls
Next, you need to update those 10,000 newly created records with additional information, but this process doesn’t require automation to run, so you can adjust the batch size setting back to 200.
10,000 records / batch size 200 = 50 API calls
You’re now at 10,050 API calls for the 24-hour period.
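If you want to sanity-check the math before a load, it’s just the record count divided by the batch size, rounded up. A quick back-of-the-napkin sketch in Python, using the numbers from the example above:

    import math

    def api_calls_needed(record_count, batch_size):
        # Each batch is one API call, so round the division up.
        return math.ceil(record_count / batch_size)

    print(api_calls_needed(10_000, 1))    # 10000 calls for the batch-size-1 insert
    print(api_calls_needed(10_000, 200))  # 50 calls for the batch-size-200 update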
If you’re on Enterprise Edition or Professional Edition with API access enabled, you are allocated 15,000 API calls plus 1,000 calls per Salesforce license per 24-hour period.
REAL WORLD EXAMPLE
You are in EE (Enterprise Edition) with 24 licensed users.
24 users x 1,000 API calls = 24,000 API calls per 24 hours
Plus, you get 15,000 as a baseline of API calls.
15,000 baseline + 24,000 user-based allocation = 39,000 API calls per 24 hours
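The allocation side is the same napkin arithmetic. A tiny sketch, assuming the EE formula above (15,000 baseline plus 1,000 per license):

    def daily_api_allocation(licenses, baseline=15_000, per_license=1_000):
        # EE/PE-with-API formula: baseline + per-license allocation.
        return baseline + licenses * per_license

    print(daily_api_allocation(24))  # 39000 calls per 24-hour period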
PREPARING FOR DATA MIGRATIONS
If a data migration is going to push you past your daily limit of API calls, log a ticket with Salesforce Support requesting a temporary increase. Do this a few days in advance so they have time to fulfill it.
HOW DO I CHECK MY API CALL USAGE?
- Go to Setup
- Type ‘Company’ into the Quick Find box
- Navigate to Company Information
- Look at the field called “API Requests, Last 24 Hours” – it shows your current usage and the maximum you’re allowed

API Calls Setting in Salesforce Setup
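If you’d rather check usage programmatically than click through Setup, the REST API exposes the same numbers through its limits resource. Here’s a minimal sketch in Python – the instance URL and token are hypothetical placeholders, and you should confirm the API version (v59.0 below) against your org:

    import requests

    # Hypothetical placeholders – swap in your own org's values.
    INSTANCE_URL = "https://yourInstance.my.salesforce.com"
    ACCESS_TOKEN = "your-session-or-oauth-token"

    resp = requests.get(
        f"{INSTANCE_URL}/services/data/v59.0/limits",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    )
    resp.raise_for_status()

    # The response includes DailyApiRequests with Max and Remaining values.
    daily = resp.json()["DailyApiRequests"]
    used = daily["Max"] - daily["Remaining"]
    print(f"API requests, last 24 hours: {used} of {daily['Max']}")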
If you’ve got integrations with other systems or you’re using certain installed apps, those will also be making API calls. Keep that in mind when requesting your temporary increase.
Gotchas – Read These First
The batch size setting does not automatically revert to the default of 200
So don’t forget to change the setting back, then log out of Data Loader and log back in
When you have code/automation running on the object(s) that you’re inserting/upserting/updating, you will have to lower your batch size significantly, typically to 1
Lowering your batch size to 1 means you’re making more API calls, not to mention it’s more time consuming, so keep that in mind when you’re timing your migration schedule
HOW TO USE BULK API
The Bulk API setting in Data Loader allows you to load data in batches of up to 10,000 records. Benefits of this setting:
- Fewer API calls
- Faster processing time
Potential cons of using Bulk API:
- The ‘Insert Null Values’ setting cannot be enabled while Bulk API is enabled, because empty field values are ignored when you update records using the Bulk API
- The ‘Allow Field Truncation’ setting cannot be enabled while Bulk API is enabled, because the load operation fails for the row if a value is specified that is too large for the field
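If you run Data Loader in batch (command-line) mode, these same settings live in its configuration file rather than the GUI. A sketch of the relevant keys, assuming a standard config.properties setup – double-check the exact key names against the Data Loader documentation for your version:

    # Bulk API on, with the larger batch size it allows
    sfdc.useBulkApi=true
    sfdc.loadBatchSize=10000

    # These two conflict with Bulk API (see the cons above) – leave them off
    sfdc.insertNulls=false
    sfdc.truncateFields=false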
COMMON BATCH SIZE RELATED ERRORS
- Java Heap Space
Problem: If you’re using the Bulk API setting, it’s possible you’ll encounter Java Heap Space error messages. Data Loader runs on Java, and the error occurs when the Java Virtual Machine runs out of the heap memory allocated to it.
Solution: Newer Data Loader versions (45 and up) that use Zulu Java manage heap size differently and shouldn’t hit this error, so make sure you’re using the most recent version of Data Loader. (If you’re stuck on an older version, see the heap-size sketch after this list.)
- System.Exception: Too many SOQL queries
Problem: There’s a trigger on the object you’re inserting/updating that’s exceeding the Apex governor limits – for example, the cap of 100 SOQL queries per synchronous transaction, which a 200-record batch with a query-per-record trigger blows right past. Who is this governor and why are her limits so stingy? Who knows.
Solution: Lower your batch size, probably to 1, so the records process one at a time.
- Not processed due to batch error: Invalid Batch : Failed to parse CSV header
Problem: This error only occurs when the ‘Use Bulk API’ setting is enabled.
Solution: Make sure that at least one field is mapped or the field mapping step is not skipped.
- AccountAfterInsert: execution of AfterUpdate
Problem: An active trigger on the Account object is causing Apex to fail during your data operation.
Solution: Lower. Your. Batch. Size. To. 1.
- Your success and error files return 0 errors and 0 successes in the following scenarios
Problem 1: There is no batch size selected in Data Loader settings.
Solution 1: Select a batch size, dumb dumb.
** The above error can also be caused by issues unrelated to batch size:
Potentially: The Data Loader setting “Start at Row” is set higher than the number of rows in the CSV file.
Solution: Set the “Start at Row” value to “0”.
Potentially: There are improperly formatted date or date/time columns within the CSV file.
Solution: Find ’em, fix ’em.
Potentially: Data storage is full.
Solution: Option 1: hard delete unneeded data. Option 2: contact Salesforce to increase your storage limit.
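Back to the Java Heap Space error above: if you can’t upgrade Data Loader right away, the standard JVM stopgap is to raise the maximum heap with the -Xmx flag wherever Data Loader is launched. An illustrative example only – the jar name and launch script vary by version and operating system:

    # Example only – adjust the path/jar name to match your install.
    java -Xmx1024m -jar dataloader-45.0.0-uber.jar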
CONCLUSION
- Using a batch size of 1 can be really helpful to insert/update records and take advantage of automation in the system.
- Using a batch size of 200 is my gold standard in migration situations where automation is off or not applicable, even though Salesforce recommends 100.
- Using the Bulk API loads data much faster and logs fewer API calls, which is great for loading large, simple files, like historical tasks.
Blog photo of camel by Daniela Castro on Unsplash, edited by me.