[python] Find out how much memory is being used by an object in Python

How would you go about finding out how much memory is being used by an object? I know it is possible to find out how much is used by a block of code, but not by an instantiated object (at any time during its life), which is what I want.

Tags: python, performance, memory-profiling

Answers:


I don't have any personal experience with either of the following, but a simple search for a "Python [memory] profiler" yields:

  • PySizer, "a memory profiler for Python," found at http://pysizer.8325.org/. However, the page seems to indicate that the project hasn't been updated in a while, and refers to...

  • Heapy, "support[ing] debugging and optimization regarding memory related issues in Python programs," found at http://guppy-pe.sourceforge.net/#Heapy.
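If you try Heapy, basic usage looks roughly like this (assuming the guppy package that ships it is installed; on Python 3 the maintained port is guppy3):

from guppy import hpy

h = hpy()
print(h.heap())  # snapshot of all reachable objects, broken down by type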

Hope that helps.


Try this:

import sys
sys.getsizeof(object)

From the docs: getsizeof() returns the size of an object in bytes. It calls the object’s __sizeof__ method and adds an additional garbage collector overhead if the object is managed by the garbage collector.
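For example, to check a list (the result is in bytes and varies by platform and Python version):

import sys

x = [1, 2, 3]
print(sys.getsizeof(x))  # size of the list object itself, not of the integers inside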

Note that getsizeof does not follow references; to count everything an object refers to as well, you need a recursive recipe.
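A minimal sketch of such a recipe (the helper name total_size is my own; it descends into dicts, common containers, and instance __dict__ attributes, and tracks already-seen ids so shared objects and cycles are counted only once):

import sys

def total_size(obj, seen=None):
    """Recursively approximate the memory footprint of obj and its contents."""
    if seen is None:
        seen = set()
    if id(obj) in seen:        # already counted: shared object or cycle
        return 0
    seen.add(id(obj))

    size = sys.getsizeof(obj)
    if isinstance(obj, dict):
        size += sum(total_size(k, seen) + total_size(v, seen)
                    for k, v in obj.items())
    elif isinstance(obj, (list, tuple, set, frozenset)):
        size += sum(total_size(item, seen) for item in obj)
    if hasattr(obj, '__dict__'):   # include instance attributes
        size += total_size(vars(obj), seen)
    return size

Pympler's asizeof offers a more thorough, maintained implementation of the same idea.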


Another approach is to use pickle: the length of an object's pickled representation can serve as a rough proxy for its size. See this answer to a duplicate of this question.
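A rough sketch of the pickle trick; unlike getsizeof, it follows references, but the serialized size can differ a lot from the in-memory size, and not every object is picklable:

import pickle

obj = {'name': 'example', 'values': list(range(1000))}
print(len(pickle.dumps(obj)))  # bytes in the serialized form, not live memory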


This must be used with care, because an override of the object's __sizeof__ method might be misleading.

In some tests using the bregman.suite, sys.getsizeof reported a copy of an array attribute (data) of an object instance as being bigger than the object itself (mfcc).

>>> mfcc = MelFrequencyCepstrum(filepath, params)
>>> data = mfcc.X[:]
>>> sys.getsizeof(mfcc)
64
>>> sys.getsizeof(mfcc.X)
80
>>> sys.getsizeof(data)
80
>>> mfcc
<bregman.features.MelFrequencyCepstrum object at 0x104ad3e90>
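This is less surprising once you remember that sys.getsizeof on an instance measures only the instance itself, not the attributes it references; the effect is easy to reproduce with any plain class (hypothetical Holder below):

import sys

class Holder:
    def __init__(self):
        self.data = list(range(100000))

h = Holder()
print(sys.getsizeof(h))        # small: just the instance shell
print(sys.getsizeof(h.data))   # far larger: the list object it refers to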

For big objects you may use a somewhat crude but effective method: check how much memory your Python process occupies in the system, then delete the object and compare.

This method has many drawbacks, but it gives a very fast estimate for very big objects.
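A sketch of that approach, assuming the third-party psutil package is available (pip install psutil). Keep in mind that CPython does not always hand freed memory back to the operating system, so the before/after difference is only an estimate:

import gc
import os

import psutil

def rss_bytes():
    """Resident set size of the current process, in bytes."""
    return psutil.Process(os.getpid()).memory_info().rss

baseline = rss_bytes()
big = [0] * 10_000_000          # stand-in for the big object under test
print('after allocation:', rss_bytes() - baseline, 'bytes')

del big
gc.collect()                    # encourage collection before re-measuring
print('after deletion:', rss_bytes() - baseline, 'bytes')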