joerick/pyinstrument

Memory consumption increases when profiling inside a loop


I used the following script to compare memory consumption when code is profiled with pyinstrument against the same code run without profiling.

import sys


def local_alloc(n):
    x = []
    for _ in range(n):
        x.append(None)


def with_pyinstrument():
    print("Profiling with pyinstrument")
    from pyinstrument import Profiler
    for i in range(20_000):
        with Profiler() as p:
            local_alloc(i)


def without_pyinstrument():
    print("Not using pyinstrument")
    for i in range(20_000):
        local_alloc(i)


if __name__ == "__main__":
    if len(sys.argv) == 2:
        if sys.argv[1] == "pyinstrument":
            with_pyinstrument()
    else:
        without_pyinstrument()
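One variable worth isolating (my suggestion, not part of the original script) is garbage-collector timing: forcing a full collection after each iteration shows whether the growth is just deferred reclamation or genuine retention. A sketch of such a variant, with `local_alloc` repeated for self-containment; the function name is mine:

```python
import gc


def local_alloc(n):
    x = []
    for _ in range(n):
        x.append(None)


def with_pyinstrument_forced_gc(iterations=20_000):
    """Variant of with_pyinstrument() that forces a full collection
    after each iteration. If memory still climbs, objects are being
    retained somewhere, not merely awaiting collection."""
    from pyinstrument import Profiler  # imported lazily, as in the original
    for i in range(iterations):
        with Profiler() as p:
            local_alloc(i)
        del p          # drop the last reference to the profiler
        gc.collect()   # collect now, including any reference cycles
```

If the `mprof` curve for this variant is flat, the original growth was only the collector running behind the allocation rate; if it still climbs, something is holding on to per-run data.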

Memory consumption when the code is being profiled with pyinstrument:

$ mprof run test_pyintstruments.py pyinstrument
mprof: Sampling memory every 0.1s
running new process
running as a Python program...
Profiling with pyinstrument

$ mprof plot -o mprofile_with_pyinstrument.png /path/to/profiling/data.dat

[attached plot: mprofile_with_pyinstrument.png]

Memory consumption when the code is not being profiled:

$ mprof run test_pyintstruments.py                                        
mprof: Sampling memory every 0.1s
running new process
running as a Python program...
Not using pyinstrument

$ mprof plot -o mprofile_without_pyinstrument.png /path/to/profiling/data.dat

[attached plot: mprofile_without_pyinstrument.png]

What I don't understand is why memory consumption keeps increasing when pyinstrument is used: the Profiler object is created and destroyed in each iteration, so I would expect each run's profiling data to be freed along with it.
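One way to narrow this down (my suggestion, using only the stdlib `tracemalloc`, so whether pyinstrument is actually retaining data is left open) is to snapshot allocations before and after a batch of iterations and diff them: any allocation site that keeps data between iterations shows up with a positive size delta. A hypothetical stand-in context manager is used below so the sketch runs without pyinstrument installed; substituting `Profiler` for it would test the real case:

```python
import tracemalloc


class LeakyStandIn:
    """Hypothetical stand-in for Profiler that deliberately retains
    data across uses, mimicking the kind of retention that would
    explain the observed growth."""
    _retained = []  # class-level cache, survives every instance

    def __enter__(self):
        return self

    def __exit__(self, *exc):
        # Keep something alive on every use, as a cached session would be.
        self._retained.append([None] * 1000)
        return False


def growth_between(iterations, factory):
    """Return net bytes allocated across `iterations` uses of the
    context manager produced by `factory`."""
    tracemalloc.start()
    before = tracemalloc.take_snapshot()
    for _ in range(iterations):
        with factory():
            pass
    after = tracemalloc.take_snapshot()
    stats = after.compare_to(before, "lineno")
    tracemalloc.stop()
    return sum(s.size_diff for s in stats)


if __name__ == "__main__":
    # Positive for the stand-in, since it retains data by construction.
    print(growth_between(100, LeakyStandIn))
```

Running `growth_between(100, Profiler)` and inspecting the top entries of the snapshot diff would point at the specific lines (in pyinstrument or elsewhere) where the retained memory is allocated.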