[r] R memory management / cannot allocate vector of size n Mb

I am running into issues trying to use large objects in R. For example:

> memory.limit(4000)
> a = matrix(NA, 1500000, 60)
> a = matrix(NA, 2500000, 60)
> a = matrix(NA, 3500000, 60)
Error: cannot allocate vector of size 801.1 Mb
> a = matrix(NA, 2500000, 60)
Error: cannot allocate vector of size 572.2 Mb # Can't go smaller anymore
> rm(list=ls(all=TRUE))
> a = matrix(NA, 3500000, 60) # Now it works
> b = matrix(NA, 3500000, 60)
Error: cannot allocate vector of size 801.1 Mb # But that is all there is room for

I understand that this is related to the difficulty of obtaining contiguous blocks of memory (from here):

Error messages beginning cannot allocate vector of size indicate a failure to obtain memory, either because the size exceeded the address-space limit for a process or, more likely, because the system was unable to provide the memory. Note that on a 32-bit build there may well be enough free memory available, but not a large enough contiguous block of address space into which to map it.

How can I get around this? My main difficulty is that I get to a certain point in my script and R can't allocate 200-300 Mb for an object... I can't really pre-allocate the block because I need the memory for other processing. This happens even when I diligently remove unneeded objects.

EDIT: Yes, sorry: Windows XP SP3, 4Gb RAM, R 2.12.0:

> sessionInfo()
R version 2.12.0 (2010-10-15)
Platform: i386-pc-mingw32/i386 (32-bit)

locale:
[1] LC_COLLATE=English_Caribbean.1252  LC_CTYPE=English_Caribbean.1252   
[3] LC_MONETARY=English_Caribbean.1252 LC_NUMERIC=C                      
[5] LC_TIME=English_Caribbean.1252    

attached base packages:
[1] stats     graphics  grDevices utils     datasets  methods   base

Tags: r, matrix, vector, memory-management, r-faq

Answers:


The save/load method mentioned above works for me. I am not sure how, or whether, gc() defragments memory, but this sequence seems to work:

# defrag memory 
save.image(file="temp.RData")
rm(list=ls())
load(file="temp.RData")

Here is a presentation on this topic that you might find interesting:

http://www.bytemining.com/2010/08/taking-r-to-the-limit-part-ii-large-datasets-in-r/

I haven't tried these approaches myself, but the bigmemory package seems very useful.
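
For what it's worth, a minimal sketch of a file-backed matrix with bigmemory might look like the following; the dimensions and file names are placeholders, and this only illustrates the package's idea of keeping the data on disk instead of in one in-RAM vector:

library(bigmemory)

# file-backed big.matrix: the data live on disk, so R does not need one huge
# contiguous block of address space for them
x <- big.matrix(nrow = 3500000, ncol = 60, type = "double",
                backingfile = "big.bin", descriptorfile = "big.desc")
x[1, 1] <- 42                        # ordinary matrix indexing still works
y <- attach.big.matrix("big.desc")   # re-attach later, even from a new R session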


For Windows users, the following helped me a lot to understand some memory limitations:

  • before opening R, open the Windows Resource Monitor (Ctrl-Alt-Delete / Start Task Manager / Performance tab / click the 'Resource Monitor' button at the bottom / Memory tab)
  • you will see how much RAM is already in use before you open R, and by which applications. In my case, 1.6 GB of the total 4 GB was used, so I could only get about 2.4 GB for R, but it gets worse...
  • open R and create a data set of 1.5 GB, then reduce its size to 0.5 GB; the Resource Monitor shows my RAM at nearly 95% usage.
  • use gc() to do garbage collection => it works: I can see memory use go down to 2 GB (a minimal sketch of this workflow follows the list)
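
A minimal sketch of that workflow (the sizes are approximate and the numbers purely illustrative):

x <- matrix(0, 2e7, 10)               # ~1.6 GB of doubles
x <- x[1:7e6, ]                       # shrink to ~0.56 GB; the old copy becomes garbage
gc()                                  # collect the garbage so the memory is actually freed
print(object.size(x), units = "Mb")   # check what is left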


Additional advice that works on my machine:

  • prepare the features, save them as an RData file, close R, re-open R, and load the training features. The Resource Monitor typically shows lower memory usage, which means that even gc() does not recover all possible memory, and closing/re-opening R is the best way to start with the maximum memory available (a sketch of this save/reload workflow follows the list).
  • the other trick is to load only the training set for training (do not load the test set, which can typically be half the size of the training set). The training phase can use memory to the maximum (100%), so anything available is useful. All of this is to be taken with a grain of salt, as I am still experimenting with R's memory limits.
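
A hedged sketch of that save/close/reload trick; train_features is a placeholder name for whatever object holds your prepared training data:

save(train_features, file = "train_features.RData")  # persist only what training needs
q(save = "no")                                        # quit R so all of its memory is released
# --- start a fresh R session, then: ---
load("train_features.RData")                          # only the training features come back
gc()                                                  # reports (and tidies) current memory use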

I encountered a similar problem, and I used two flash drives as ReadyBoost. The two drives gave an additional 8 GB of memory (for caching), which solved the problem and also increased the speed of the system as a whole. To use ReadyBoost, right-click the drive, go to Properties, select the 'ReadyBoost' tab, select the 'Use this device' radio button, and click Apply or OK to configure it.


I went to the help page of memory.limit() (which is available only on Windows) and found out that on my computer R can by default use up to ~1.5 GB of RAM, and that the user can increase this limit. Using the following code,

> memory.limit()
[1] 1535.875
> memory.limit(size=1800)

helped me to solve my problem.


The simplest way to sidestep this limitation is to switch to 64-bit R.
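
One quick way to check which build you are running: .Machine$sizeof.pointer is 8 on a 64-bit build and 4 on a 32-bit build.

> .Machine$sizeof.pointer
[1] 8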


If you are running your script in a Linux environment on a cluster managed by the LSF scheduler, you can use this command:

bsub -q server_name -R "rusage[mem=requested_memory]" "Rscript script_name.R"

and the server will allocate the requested memory for you (within the server's limits; on a good server, huge files can be handled).

