What is Heap Size Limit Exceeded in Salesforce?

Heap size refers to the amount of memory allocated for storing objects and variables during the runtime of your Apex code. When you write Apex code that processes data or executes transactions, memory is used to store lists, maps, objects, and other variables. This memory usage is measured as heap size.

In Salesforce, each Apex transaction is assigned a specific heap size limit. Exceeding this limit can result in the “Heap Size Limit Exceeded” error. Since Salesforce is a multitenant platform, these limits are in place to prevent any one user or transaction from consuming too much memory, which could impact the performance of other tenants.

Solving the ‘Heap Size Limit Exceeded’ Problem

The “Heap Size Limit Exceeded” error occurs when the memory consumed by your code exceeds Salesforce’s allotted heap size. Each Apex execution has a specific memory limit, and exceeding this threshold stops the code from executing. Salesforce enforces these limits to ensure stability and consistency across its platform.

You may see this error in various scenarios, such as querying large datasets, running recursive operations, or handling complex data structures. The typical error message looks like this:

System.LimitException: Apex heap size too large: 6000000

Understanding what causes this error is essential to prevent it and ensure that your applications run smoothly.

Causes of Heap Size Limit Exceeded Error

The following factors can contribute to the “Heap Size Limit Exceeded” error:

  1. Large Data Queries: Querying too many records or fetching unnecessary fields from large datasets.
  2. Inefficient Data Structures: Storing large collections (e.g., lists, maps) without efficient handling.
  3. Recursive Operations: Operations that loop through data recursively without limit or optimization.
  4. Complex Object Handling: Managing large, nested, or complex SObjects in memory.
  5. Heavy String Manipulations: Manipulating large strings or concatenating them repeatedly without optimization.

Knowing these common causes helps developers design more efficient Apex code to stay within heap size limits.

Salesforce Heap Size Limits for Different Contexts

Salesforce enforces heap size limits based on the context in which the code is executed, and these limits differ between synchronous and asynchronous operations. Synchronous Apex executions (triggers, controllers, anonymous Apex) are limited to 6 MB of heap, while asynchronous executions, including future methods, Queueable Apex, Batch Apex, and Scheduled Apex, are allowed 12 MB. Because each execute() call of a Batch Apex job runs asynchronously with its own heap, Batch Apex is well suited to processing large data sets in smaller chunks. Understanding these limits is crucial for developers to plan and execute their code efficiently, particularly when handling large datasets or complex processes.
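
If you want to confirm the limit that applies in a given context, you can log Limits.getLimitHeapSize() at runtime. The sketch below (the class name HeapLimitProbe is illustrative, not a standard class) prints the limit from a synchronous call and from a Queueable execution, so you can see the difference in your debug logs.

public class HeapLimitProbe implements Queueable {
    // Illustrative probe class. Call synchronously (e.g., from anonymous Apex)
    // to log the synchronous heap limit in bytes.
    public static void logSynchronousLimit() {
        System.debug('Synchronous heap limit: ' + Limits.getLimitHeapSize());
    }

    // Enqueue with System.enqueueJob(new HeapLimitProbe()) to log the asynchronous limit.
    public void execute(QueueableContext context) {
        System.debug('Asynchronous heap limit: ' + Limits.getLimitHeapSize());
    }
}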

How to Debug and Monitor Heap Size

Salesforce provides tools to help developers monitor heap size usage and prevent exceeding limits. Below are some techniques for debugging heap size:

Using System.debug() to Monitor Heap Size

You can use the System.debug() method to monitor how much heap memory your code is consuming:

System.debug('Heap size: ' + Limits.getHeapSize());
System.debug('Max heap size: ' + Limits.getLimitHeapSize());

This helps in tracking heap usage at different points in the execution and allows you to identify potential issues before they lead to errors.

Heap Dumps

Heap dumps, captured at checkpoints that you set in the Salesforce Developer Console, provide detailed memory information. A heap dump breaks down memory usage by object, helping you identify memory-heavy data structures.

How to Avoid Heap Size Limit Exceeded Errors

Adopting best practices can help prevent heap size errors in Salesforce. Here are some recommended techniques:

Optimize SOQL Queries

Retrieve only the necessary fields and limit the number of records to reduce memory consumption.

List<Account> accList = [SELECT Id, Name FROM Account LIMIT 100];

Use Pagination

Instead of querying all records at once, paginate your queries to load data in smaller chunks.

List<Account> accList = [SELECT Id, Name FROM Account LIMIT :pageSize OFFSET :offset];

Leverage SOQL For Loops

SOQL for loops process records in smaller batches (retrieved 200 at a time), preventing large result sets from consuming too much heap memory.

for (Account acc : [SELECT Id, Name FROM Account]) {
    // Business logic here
}

Clear Unnecessary Data

If large collections or variables are no longer needed, set them to null to free up memory.

myList = null; // clears memory

Use Custom Iterators

Instead of loading large datasets at once, process them using custom iterators or Batch Apex to divide the workload into smaller parts.
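
Below is a minimal sketch of such an iterator; the class name, page size, and queried fields are illustrative assumptions. It keeps only one page of Accounts on the heap at a time, but note that every page costs one SOQL query and that SOQL OFFSET is capped at 2,000 rows, so Batch Apex remains the better choice for very large volumes.

public class AccountPageIterator implements Iterator<Account> {
    // Illustrative custom iterator that pages through Accounts in small chunks.
    private List<Account> currentPage = new List<Account>();
    private Integer pageIndex = 0;
    private Integer offsetSoFar = 0;
    private final Integer pageSize = 200;

    public Boolean hasNext() {
        if (pageIndex >= currentPage.size()) {
            // Load the next page and let the previous one be reclaimed from the heap
            currentPage = [SELECT Id, Name FROM Account LIMIT :pageSize OFFSET :offsetSoFar];
            offsetSoFar += pageSize;
            pageIndex = 0;
        }
        return !currentPage.isEmpty();
    }

    public Account next() {
        return currentPage[pageIndex++];
    }
}

Usage is a simple while loop:

AccountPageIterator it = new AccountPageIterator();
while (it.hasNext()) {
    Account acc = it.next();
    // Business logic on one record at a time
}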

Examples to Handle Heap Size Limit Exceeded

The following examples show how developers can manage heap size effectively in Salesforce:

Efficient Query with Limited Fields

List<Contact> contacts = [SELECT Id, FirstName, LastName FROM Contact LIMIT 100];

SOQL for Loops for Batch Processing

for (Account acc : [SELECT Id, Name FROM Account]) {
    // Process records in smaller chunks
}

Using Apex Pagination

Integer pageSize = 100;
Integer offset = 0;
List<Account> accList = [SELECT Id, Name FROM Account LIMIT :pageSize OFFSET :offset];

Reducing Heap Usage by Clearing Collections

List<Account> accList = [SELECT Id, Name FROM Account LIMIT 1000];
// Process accounts
accList = null; // Free heap memory

Batch Apex for Large Data Sets

public class AccountBatch implements Database.Batchable<SObject> {
    public Database.QueryLocator start(Database.BatchableContext BC) {
        return Database.getQueryLocator([SELECT Id FROM Account]);
    }

    public void execute(Database.BatchableContext BC, List<Account> scope) {
        // Process accounts
    }

    public void finish(Database.BatchableContext BC) {
        // Final steps
    }
}

Limiting Data Retrieved in Triggers

trigger AccountTrigger on Account (before update) {
    // Trigger.newMap is only available once records have Ids (update and after-insert contexts)
    List<Account> accList = [SELECT Id, Name FROM Account WHERE Id IN :Trigger.newMap.keySet()];
}

Efficiently Processing Large Collections

public void processAccounts(List<Account> accList) {
    Integer chunkSize = 100;
    List<Account> chunk = new List<Account>();
    for (Account acc : accList) {
        chunk.add(acc);
        if (chunk.size() == chunkSize) {
            // Business logic for this chunk
            chunk.clear(); // release the processed chunk from the heap
        }
    }
    // Business logic for any remaining records in chunk
}

Clearing Unused Variables to Free Memory

List<Contact> contactList = [SELECT Id, FirstName, LastName FROM Contact LIMIT 1000];
contactList = null; // Free memory

Optimizing Recursive Functions

public void recursiveFunction(Integer level) {
    if (level > 10) return;
    recursiveFunction(level + 1);
}

Using Maps to Efficiently Handle Large Data Sets

Map<Id, Account> accMap = new Map<Id, Account>([SELECT Id, Name FROM Account LIMIT 500]);

Using Chunked Queries for Data Loading

Integer totalRecords = [SELECT COUNT() FROM Contact];
Integer chunkSize = 100;
for (Integer i = 0; i < totalRecords; i += chunkSize) {
    // Note: SOQL OFFSET supports a maximum of 2,000 rows
    List<Contact> contacts = [SELECT Id, Name FROM Contact LIMIT :chunkSize OFFSET :i];
}

Managing Heap Size with Lists and Sets

Set<Id> accountIds = new Set<Id>();
List<Account> accList = [SELECT Id, Name FROM Account];
for (Account acc : accList) {
    accountIds.add(acc.Id);
}

Managing Large Data Updates

public void updateLargeData(List<Account> accounts) {
    Integer batchSize = 200;
    for (Integer i = 0; i < accounts.size(); i += batchSize) {
        List<Account> batch = new List<Account>();
        for (Integer j = i; j < Math.min(i + batchSize, accounts.size()); j++) {
            batch.add(accounts[j]);
        }
        update batch; // Update records in smaller batches (each update counts toward the DML statement limit)
    }
}

Processing Large Datasets in Queueable Apex

public class MyQueueableClass implements Queueable {
    public void execute(QueueableContext context) {
        List<Account> accList = [SELECT Id, Name FROM Account LIMIT 200];
        // Process data asynchronously
    }
}

Lazy Loading Records to Manage Heap Usage

public class LazyLoader {
    public List<Contact> loadContacts(Id accountId) {
        return [SELECT Id, FirstName, LastName FROM Contact WHERE AccountId = :accountId];
    }
}

Leveraging @future Annotation for Asynchronous Processing

@future
public static void processLargeDataSet() {
    List<Account> accounts = [SELECT Id, Name FROM Account LIMIT 500];
    // Process data in future method
}

Recursive Apex Processing

public void processInBatches(List<Account> accList, Integer index) {
    if (index >= accList.size()) return;
    // Process up to 100 records starting at index, then recurse into the next batch
    processInBatches(accList, index + 100);
}

Batch Apex for Large Record Updates

global class LargeUpdateBatch implements Database.Batchable<SObject> {
    global Database.QueryLocator start(Database.BatchableContext BC) {
        return Database.getQueryLocator([SELECT Id, Name FROM Account]);
    }

    global void execute(Database.BatchableContext BC, List<SObject> scope) {
        update scope;
    }

    global void finish(Database.BatchableContext BC) {
        // Final steps
    }
}

Efficient String Handling to Avoid Heap Overload

String largeString = 'Very large string to process';
if (largeString.length() > 1000) {
    // Split or truncate string to avoid heap overload
}

Scenarios and Solutions

To further understand the “Heap Size Limit Exceeded” error, here are some real-world scenarios and their potential solutions:

Querying Large Datasets

When querying a large dataset, it’s common to hit the heap size limit if all records are loaded into memory at once.

Solution: Use SOQL for loops or chunking with pagination.

for (Account acc : [SELECT Id, Name FROM Account LIMIT 10000]) {
    // Process in chunks to avoid heap overflow
}

Processing Large Lists in Triggers

When a trigger processes a large number of records, especially in bulk data operations, it may exceed the heap size.

Solution: Optimize the trigger by processing records in smaller chunks or using Batch Apex for larger datasets.

trigger AccountTrigger on Account (before update) {
    // Trigger.newMap is only available once records have Ids (update and after-insert contexts)
    List<Account> accList = [SELECT Id, Name FROM Account WHERE Id IN :Trigger.newMap.keySet()];
    // Handle only necessary records
}

Recursive Operations Leading to Heap Exhaustion

Recursively processing a large dataset without adequate break conditions can cause excessive memory usage.

Solution: Limit recursion depth and use techniques like bulk processing.

public void recursiveFunction(Integer level) {
    if (level > 10) return;
    // Business logic
    recursiveFunction(level + 1);
}

Advanced Optimization Strategies

In addition to the basic strategies covered so far, there are some advanced techniques that developers can leverage to further optimize memory management and prevent heap size issues.

Streamline Object Data

When handling complex objects, be mindful of which fields you need. Loading unnecessary fields can cause excessive memory usage, especially with large datasets.

List<Contact> contacts = [SELECT Id, FirstName FROM Contact LIMIT 100];

Avoid Large String Manipulations

Handling large strings, such as concatenating or splitting them multiple times, can exhaust heap size. Always use efficient string-handling methods and avoid redundant operations.

String longText = 'This is a large string that we want to process.';
if (longText.length() > 1000) {
    longText = longText.substring(0, 1000); // Truncate large strings
}
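
When a large string has to be built up from many pieces, it also helps to collect the pieces in a List<String> and join them once at the end rather than concatenating inside a loop, since each concatenation creates a new intermediate string. A small sketch of that pattern:

List<String> lines = new List<String>();
for (Account acc : [SELECT Id, Name FROM Account LIMIT 100]) {
    lines.add(acc.Name);
}
// One final allocation instead of one per loop iteration
String report = String.join(lines, '\n');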

Externalizing Data Processing

If you anticipate processing very large datasets, consider externalizing this processing outside of Salesforce using APIs or external systems, which reduces the memory burden on the platform itself.
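
As a rough illustration, the sketch below sends record Ids to a purely hypothetical external endpoint and keeps only the compact response on the heap; in a real org the URL would come from a Named Credential or Remote Site Setting, and the callout would typically run from an asynchronous context.

public class ExternalProcessor {
    // Illustrative helper: offloads heavy processing to an external service
    public static String offloadProcessing(List<Id> recordIds) {
        HttpRequest req = new HttpRequest();
        req.setEndpoint('https://example.com/api/process'); // hypothetical endpoint
        req.setMethod('POST');
        req.setHeader('Content-Type', 'application/json');
        req.setBody(JSON.serialize(recordIds));
        HttpResponse res = new Http().send(req);
        return res.getBody(); // only the small summary response stays in memory
    }
}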

Benefits of Understanding and Managing Heap Size in Salesforce

Understanding how to manage heap size in Salesforce offers numerous benefits, enhancing both your application’s performance and scalability.

Improved Performance and Efficiency

When heap size is properly managed, Salesforce applications run more efficiently. By breaking data into manageable chunks, you reduce the risk of memory-related errors, ensuring smoother processing of complex operations.

Reduced Risk of Governor Limit Exceptions

Exceeding Salesforce’s memory limits can halt code execution, leading to transaction failures. Efficient heap size management minimizes the risk of triggering these errors, ensuring uninterrupted operations.

Enhanced User Experience

Heap size issues can lead to slower response times, incomplete processes, or failed actions for users. By keeping your application within memory limits, you provide a more seamless and reliable user experience.

Scalability

As your Salesforce instance grows, with more users and data, efficient heap size management ensures that your applications can scale without degrading performance.

Cost-Effective Resource Utilization

Salesforce is a multi-tenant platform where system resources are shared among all users. By managing heap size efficiently, you optimize your system’s resource usage, which can help reduce costs over time.

Easier Maintenance and Debugging

Code optimized for heap size is generally easier to maintain and debug. Memory-heavy sections of your code are more easily identified and fixed, resulting in fewer unexpected issues.

Key Features of Heap Size Management in Salesforce

Salesforce offers several features that help developers manage heap size efficiently:

SOQL For Loops

A SOQL for loop retrieves records from the database in batches of 200 rather than loading the entire result set into memory, reducing the heap used at any one time. This helps in situations where you are querying large datasets.

for (Account acc : [SELECT Id, Name FROM Account]) {
    // Efficient data handling
}
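
The loop above hands you one record at a time, but the records are still fetched in batches of 200 behind the scenes. If you prefer to work on a whole batch at once, the list form of the SOQL for loop (sketched below) gives you up to 200 records per iteration while still keeping only one batch on the heap at a time.

for (List<Account> accBatch : [SELECT Id, Name FROM Account]) {
    // accBatch holds up to 200 records; only this batch is on the heap right now
    System.debug('Processing ' + accBatch.size() + ' accounts, heap used: ' + Limits.getHeapSize());
}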

Batch Apex

Batch Apex is essential for handling large data volumes without exceeding heap size limits. By processing data in smaller batches, Batch Apex keeps memory usage low.

global class MyBatchClass implements Database.Batchable<SObject> {
    global Database.QueryLocator start(Database.BatchableContext BC) {
        return Database.getQueryLocator([SELECT Id FROM Account]);
    }

    global void execute(Database.BatchableContext BC, List<SObject> scope) {
        // Process each batch
    }

    global void finish(Database.BatchableContext BC) {
        // Final steps
    }
}

Queueable Apex

Queueable Apex lets developers run complex processing asynchronously, where the larger asynchronous heap limit applies. By chaining jobs, you can split a large task into smaller chunks and queue each one for later processing (a chaining sketch follows the class shell below).

public class MyQueueableJob implements Queueable {
    public void execute(QueueableContext context) {
        // Asynchronous processing
    }
}
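
A common chaining pattern is to process one slice of data per execution and enqueue the next slice from execute(), so no single transaction ever holds the full data set. The sketch below is illustrative: the class name, pageSize, and Id-based watermark are assumptions, not a prescribed API.

public class AccountSliceJob implements Queueable {
    private Id lastProcessedId; // where the previous job stopped

    public AccountSliceJob(Id lastProcessedId) {
        this.lastProcessedId = lastProcessedId;
    }

    public void execute(QueueableContext context) {
        Integer pageSize = 200; // illustrative slice size
        List<Account> slice = (lastProcessedId == null)
            ? [SELECT Id, Name FROM Account ORDER BY Id LIMIT :pageSize]
            : [SELECT Id, Name FROM Account WHERE Id > :lastProcessedId ORDER BY Id LIMIT :pageSize];

        for (Account acc : slice) {
            // Business logic for this slice only
        }

        if (!slice.isEmpty()) {
            // Chain the next slice; each chained job starts with a fresh heap
            System.enqueueJob(new AccountSliceJob(slice[slice.size() - 1].Id));
        }
    }
}

Start the chain with System.enqueueJob(new AccountSliceJob(null)).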

Heap Dumps for Debugging

Salesforce offers the Developer Console Heap Dump feature, which provides insights into memory usage. This tool helps developers analyze heap consumption and optimize memory usage accordingly.

Limits Class for Monitoring Heap Size

Using Salesforce’s Limits class, you can monitor the current heap size usage and compare it with the maximum allowed heap size, helping you proactively optimize your code.

System.debug('Heap size: ' + Limits.getHeapSize());
System.debug('Max heap size: ' + Limits.getLimitHeapSize());
