[excel] How to clear memory to prevent an "out of memory" error in Excel VBA?

I am running VBA code on a large spreadsheet. How do I clear the memory between procedures/calls to prevent an "out of memory" issue occurring?

Thanks

This question is related to: excel, vba, memory

The answer is


If you operate on a large dataset, it is very likely that arrays will be used. In my case, creating a few arrays from a 500,000-row by 30-column worksheet caused this error. I solved it simply by using the line below to get rid of an array that is no longer necessary to me, before creating another one:

Erase vArray

Also, if only 2 columns out of 30 are used, it is a good idea to create two 1-column arrays instead of one with 30 columns. It doesn't affect speed, but there will be a difference in memory usage.
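As an illustration, a minimal sketch of that approach (the sheet name and the columns read here are just placeholders): each needed column is read into its own array and erased as soon as it is no longer needed.

Sub ProcessTwoColumns()
    Dim lastRow As Long
    Dim vColA As Variant, vColD As Variant

    With ThisWorkbook.Worksheets("Data")    'placeholder sheet name
        lastRow = .Cells(.Rows.Count, "A").End(xlUp).Row

        'read only the columns actually needed, one array each,
        'instead of one big 30-column array
        vColA = .Range(.Cells(2, "A"), .Cells(lastRow, "A")).Value
        vColD = .Range(.Cells(2, "D"), .Cells(lastRow, "D")).Value
    End With

    '... work with vColA ...
    Erase vColA    'release it before building the next array

    '... work with vColD ...
    Erase vColD
End Sub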


Found this thread while looking for a solution to my problem. Mine required a different solution that I figured out, which might be of use to others. My macro was deleting rows, shifting up, and copying rows to another worksheet. Memory usage was exploding to several gigs and causing "out of memory" after processing only around 4,000 records. What solved it for me?

Application.ScreenUpdating = False

I added that at the beginning of my code (be sure to set it back to True at the end). I knew it would make the macro run faster, which it did, but I had no idea about the memory effect.

After making this small change, memory usage didn't exceed 135 MB. Why did that work? No idea, really. But it's worth a shot and might apply to you.
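For reference, a minimal sketch of that pattern; the error handler is only there to make sure the setting gets restored even if the macro fails partway through:

Sub ProcessRecords()
    On Error GoTo CleanUp
    Application.ScreenUpdating = False

    '... delete rows, shift up, copy rows to the other worksheet ...

CleanUp:
    Application.ScreenUpdating = True    'be sure to turn it back on at the end
End Sub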


I had a similar problem that I resolved myself... I think it was partially my code hogging too much memory while too many "big things" were open at once.

In my application, the workbook goes out and grabs another department's "daily report", and I extract all the information our team needs (to minimize mistakes and data entry).

I pull in their sheets directly... but I hate the fact that they use merged cells, which I get rid of (ie unmerge, then find the resulting blank cells and fill them with the values from above).

I made my problem go away by:

a) unmerging only the "used cells", rather than merely attempting to do the entire column, ie finding the last used row in the column and unmerging only this range (there are literally 1000s of rows on each of the sheets I grab);

b) knowing that undo only keeps track of the last ~16 events, between each "unmerge" I put 15 events which clear out what is stored in "undo", to minimize the amount of memory held up (ie go to some cell with data in it and copy // paste special values). I was GUESSING that the accumulated sum of 30 sheets, each with 3 columns' worth of data, might be taxing the memory set aside for undoing.

Yes, it doesn't allow for any chance of an Undo, but the entire purpose is to purge the old information and pull in the new, time-sensitive data for analysis, so it wasn't an issue.

Sounds corny, but my problem went away.
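For what it's worth, a rough sketch of step a) plus filling the resulting blanks from the cell above; the worksheet passed in and the column holding the merged cells are assumptions:

Sub UnmergeUsedCells(ws As Worksheet)
    Dim lastRow As Long
    Dim rng As Range
    Dim cell As Range

    'only touch the rows actually in use, not the entire column
    lastRow = ws.Cells(ws.Rows.Count, "A").End(xlUp).Row
    Set rng = ws.Range(ws.Cells(1, "A"), ws.Cells(lastRow, "A"))

    rng.UnMerge

    'fill the blanks left behind with the value from the cell above
    For Each cell In rng
        If IsEmpty(cell.Value) And cell.Row > 1 Then
            cell.Value = cell.Offset(-1, 0).Value
        End If
    Next cell
End Sub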


The answer is that you can't explicitly, but you should be freeing memory in your routines.

Some tips, though, to help with memory:

  • Make sure you set objects to Nothing before exiting your routine.
  • Ensure you call Close on objects if they require it.
  • Don't use global variables unless absolutely necessary

I would recommend checking the memory usage after performing the routine again and again; you may have a memory leak.
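For the first two points, a minimal sketch of what that cleanup can look like (the file path is just a placeholder):

Sub PullDailyReport()
    Dim wb As Workbook
    Set wb = Workbooks.Open("C:\reports\daily.xlsx")    'placeholder path

    '... copy what you need out of wb ...

    wb.Close SaveChanges:=False    'call Close on objects that support it
    Set wb = Nothing               'release the reference before exiting
End Sub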


The best way to help memory to be freed is to nullify large objects:

Sub Whatever()
    Dim someLargeObject As SomeObject

    'expensive computation

    'release the reference so the memory can be reclaimed
    Set someLargeObject = Nothing
End Sub

Also note that global variables remain allocated from one call to another, so if you don't need persistence you should either not use global variables or nullify them when you don't need them any longer.

However this won't help if:

  • you need the object after the procedure (obviously)
  • your object does not fit in memory

Another possibility is to switch to a 64-bit version of Excel, which should be able to use more RAM before crashing (32-bit versions are typically limited to around 1.3 GB).


I've found a workaround. At first it seemed it would take up more time, but it actually makes everything work smoother and faster due to less swapping and more memory being available. This is not a scientific approach, and it may need some testing before it works for you.

In the code, make Excel save the workbook every now and then. I had to loop through a sheet with 360,000 lines and it choked badly. After every 10,000 rows I made the code save the workbook, and now it works like a charm, even on 32-bit Excel.

If you start Task Manager at the same time you can see the memory utilization go down drastically after each save.
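A minimal sketch of the idea, with the sheet name as a placeholder and a save every 10,000 rows:

Sub LoopWithPeriodicSave()
    Dim r As Long
    Dim lastRow As Long

    With ThisWorkbook.Worksheets("Data")    'placeholder sheet name
        lastRow = .Cells(.Rows.Count, "A").End(xlUp).Row
        For r = 2 To lastRow
            '... process row r ...

            If r Mod 10000 = 0 Then
                ThisWorkbook.Save    'saving now and then lets Excel release memory
            End If
        Next r
    End With
End Sub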


I was able to fix this error by simply initializing a variable that was being used later in my program. At the time, I wasn't using Option Explicit in my class/module.
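A tiny sketch of what that boils down to (the names are placeholders): turn on Option Explicit and give the variable a value before it is used.

Option Explicit    'forces every variable to be declared

Sub Example()
    Dim total As Long
    total = 0    'initialize before it is used later on
    '...
End Sub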

