[javascript] Maximum size of an Array in JavaScript

Context: I'm building a little site that reads an RSS feed and updates/checks the feed in the background. I have one array to store data for display, and another that stores the IDs of records that have already been shown.

Question: How many items can an array hold in JavaScript before things start getting slow or sluggish? I'm not sorting the array, but I am using jQuery's inArray function to do comparisons.

The website will be left running and updating, and it's unlikely that the browser will be restarted or refreshed very often.

If I should think about clearing some records from the array, what is the best way to remove them once a limit is reached, say 100 items?

This question is related to: javascript, arrays



I have shamelessly pulled some pretty big datasets into memory, and although it did get sluggish, it took upwards of 15 MB of data with pretty intense calculations on the dataset before that happened. I doubt you will run into memory problems unless you have intense calculations on the data and many, many rows. Profiling and benchmarking with different mock result sets will be your best bet for evaluating performance.
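For instance, a rough benchmark of the lookup the question describes might look like this (a minimal sketch: the mock IDs and the 10,000-item size are made up, and Array.prototype.indexOf stands in for jQuery's $.inArray, which performs the same linear scan):

// build a mock list of record IDs, like the question's "shown IDs" array
var shownIds = [];
for (var i = 0; i < 10000; i++) {
  shownIds.push("record-" + i);
}

// time a batch of membership checks - the same linear scan $.inArray does
var start = performance.now();
for (var j = 0; j < 1000; j++) {
  shownIds.indexOf("record-" + (j * 7)); // returns the index, or -1 if missing, same contract as $.inArray
}
console.log("1000 lookups over 10000 items: " +
            (performance.now() - start).toFixed(1) + "ms");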


You could try something like this to test and trim the length:

http://jsfiddle.net/orolo/wJDXL/

var longArray = [1, 2, 3, 4, 5, 6, 7, 8];

if (longArray.length >= 6) {
  longArray.length = 3;
}

alert(longArray); // 1, 2, 3
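Note that setting length keeps the oldest entries and throws away the newest. If you want the opposite (drop the oldest once you pass a cap, as the question suggests with 100 items), one sketch using Array.prototype.splice, with a hypothetical addRecord helper, would be:

var records = [];
var LIMIT = 100; // the cap from the question

function addRecord(record) {
  records.push(record);
  // once over the limit, drop the oldest entries from the front
  if (records.length > LIMIT) {
    records.splice(0, records.length - LIMIT);
  }
}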


It will be very browser-dependent. 100 items doesn't sound like a large number; I expect you could go a lot higher than that. Thousands shouldn't be a problem. What may be a problem is the total memory consumption.
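If memory is the worry, Chrome (though not every browser) exposes a non-standard performance.memory object you can use for a rough check (a sketch, assuming a Chrome-like browser; heap readings are approximate since GC can run between reads):

// non-standard, Chrome-only: usedJSHeapSize is in bytes
if (performance.memory) {
  var before = performance.memory.usedJSHeapSize;
  var big = new Array(1e6).fill("some feed item");
  var after = performance.memory.usedJSHeapSize;
  console.log("approx " + ((after - before) / 1048576).toFixed(1) + " MB used");
}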


Like @maerics said, your target machine and browser will determine performance.

But for some real-world numbers, on my 2017 enterprise Chromebook, running the operation:

// x is the element count, e.g. var x = 5e4;
console.time();
Array(x).fill(0).filter(n => n < 6).length;
console.timeEnd();
  • x=5e4 takes 16ms, good enough for 60fps
  • x=4e6 takes 250ms, which is noticeable but not a big deal
  • x=3e7 takes 1300ms, which is pretty bad
  • x=4e7 takes 11000ms and allocates an extra 2.5GB of memory

So around 30 million elements is a hard upper limit, because the JavaScript VM falls off a cliff at 40 million elements and will probably crash the process.


EDIT: In the code above, I'm actually filling the array with elements and looping over them, simulating the minimum of what an app might want to do with an array. If you just run Array(2**32-1) you're creating a sparse array that's closer to an empty JavaScript object with a length, like {length: 4294967295}. If you actually tried to use all those 4 billion elements, you'll definitely your user's javascript process.
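A quick way to see the difference in a console (constructing the sparse array is nearly free because no elements are stored yet):

var sparse = Array(2 ** 32 - 1); // maximum allowed array length
console.log(sparse.length);      // 4294967295
console.log(0 in sparse);        // false - no index has actually been set
// sparse.fill(0) is the step that would try to materialize all ~4 billion slots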


I have built a performance framework that manipulates and graphs millions of datasets, and even then, the JavaScript calculation latency was on the order of tens of milliseconds. Unless you're worried about going over the array size limit, I don't think you have much to worry about.


No need to trim the array; simply address it as a circular buffer (index % maxlen). That will ensure it never goes over the limit: implementing a circular buffer means that once you get to the end you wrap around to the beginning again, so it's not possible to overrun the end of the array.

For example:

var container = [];
var maxlen = 100;
var index = 0;

// 'store' 1538 items (only the last 'maxlen' items are kept)
for (var i = 0; i < 1538; i++) {
   container[index++ % maxlen] = "storing" + i;
}

// get the 11th-oldest element still held in the buffer
var eleventh = container[(index + 11) % maxlen];

// get the 35th-oldest element still held in the buffer
var thirtyfifth = container[(index + 35) % maxlen];

// print out all 100 elements that we have left in the array; note
// that it doesn't matter if we address past 100 - circular buffer,
// so we'll simply get back to the beginning if we do that.
for (i = 0; i < 200; i++) {
   document.write(container[(index + i) % maxlen] + "<br>\n");
}
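If you'd rather hide the modular arithmetic, the same idea wraps naturally in a small helper (a sketch; the RingBuffer name and its methods are just for illustration):

function RingBuffer(maxlen) {
  this.maxlen = maxlen;
  this.items = new Array(maxlen);
  this.index = 0;
}

// store an item, silently overwriting the oldest once the buffer is full
RingBuffer.prototype.push = function (item) {
  this.items[this.index++ % this.maxlen] = item;
};

// get the i-th oldest item still held (valid once the buffer has filled)
RingBuffer.prototype.get = function (i) {
  return this.items[(this.index + i) % this.maxlen];
};

var buf = new RingBuffer(100);
for (var n = 0; n < 1538; n++) buf.push("storing" + n);
console.log(buf.get(0)); // "storing1438" - the oldest item still kept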