
Commit bea3c39

📝 Add Memray as a profile tool
1 parent d315704 commit bea3c39

File tree

3 files changed: +99 −1 lines changed


docs/performance/index.rst

Lines changed: 6 additions & 1 deletion
@@ -66,7 +66,11 @@ Performance measurements
 ------------------------
 
 Once you have worked with your code, it can be useful to examine its efficiency
-more closely. :doc:`cProfile <tracing>`, :doc:`ipython-profiler`, :doc:`scalene`
+more closely. :doc:`cProfile <tracing>`, :doc:`ipython-profiler`,
+:doc:`scalene`, :doc:`tprof` or :doc:`memray` can be used for this. So far, I
+usually carry out the following steps:
+
+:doc:`cProfile <tracing>`, :doc:`ipython-profiler`, :doc:`scalene`
 or :doc:`tprof` can be used for this. So far, I usually carry out the following
 steps:
 

@@ -105,6 +109,7 @@ steps:
    ipython-profiler.ipynb
    scalene.ipynb
    tprof
+   memray
    tachyon
 
 Search for existing implementations
docs/performance/memray-flamegraph.png

Binary image file (490 KB) not shown.

docs/performance/memray.rst

Lines changed: 93 additions & 0 deletions
@@ -0,0 +1,93 @@
.. SPDX-FileCopyrightText: 2026 Veit Schiele
..
.. SPDX-License-Identifier: BSD-3-Clause

Memray
======

Memory usage is difficult to control in Python projects because the language
does not explicitly indicate where memory is allocated, module imports can
significantly increase consumption, and it is all too easy to create a data
structure that accidentally grows indefinitely. Data science projects are
particularly prone to high memory consumption, as they usually import many large
dependencies such as :doc:`/workspace/numpy/index`, even if these are only used
in a few places.
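A minimal, hypothetical sketch of such an accidentally growing structure (not
taken from any real project): a module-level cache that is only ever written
to, so every result stays alive for the lifetime of the process:

.. code-block:: python

   # Hypothetical example: the cache is never pruned, so memory grows with
   # every distinct argument that is passed in.
   _cache: dict[int, list[int]] = {}

   def squares_up_to(n: int) -> list[int]:
       if n not in _cache:
           _cache[n] = [i * i for i in range(n)]  # stays referenced forever
       return _cache[n]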
`Memray <https://bloomberg.github.io/memray/>`_ helps you understand your
program’s memory usage by tracking where memory is allocated and freed during
program execution. This data can then be displayed in various ways, including
`flame graphs <https://www.brendangregg.com/flamegraphs.html>`_, which summarise
`stack traces <https://en.wikipedia.org/wiki/Stack_trace>`_ in a diagram, with
the bar width representing the size of the memory allocation.
With ``memray run``, any Python command can be profiled. For most projects, it
is recommended to first profile the command that loads your project, in this
case ``check``. This measures the minimum amount of work required to start your
application, for example:

.. code-block:: console

   $ uv run memray run src/items/__init__.py check
   Writing profile results into src/items/memray-__init__.py.72633.bin
   [memray] Successfully generated profile results.

   You can now generate reports from the stored allocation records.
   Some example commands to generate reports:

   /Users/veit/items/.venv/bin/python3 -m memray flamegraph src/items/memray-__init__.py.72633.bin
The command outputs the message ``Successfully generated profile results.`` and
creates a :samp:`{PROCESS-ID}.bin` file. We can then create the flame graph
with:

.. code-block:: console

   $ uv run python -m memray flamegraph src/items/memray-__init__.py.72633.bin
   Wrote src/items/memray-flamegraph-__init__.py.72633.html
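The same ``.bin`` file can also be fed to Memray’s other reporters. As a sketch
(the file name is simply the one from the example above), ``memray summary``
prints a table of the heaviest allocation sites directly in the terminal:

.. code-block:: console

   $ uv run python -m memray summary src/items/memray-__init__.py.72633.bin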
.. tip::
   In many consoles, you can combine the two commands with ``&&``:

   .. code-block:: console

      $ uv run memray run src/items/__init__.py check && uv run python -m memray flamegraph src/items/memray-__init__.py.72633.bin
The result is the following HTML file:

.. figure:: memray-flamegraph.png
   :alt: memray flamegraph report

   memray flamegraph report

The header area of the page contains several controls, including

*Memory Graph*
   Display of the memory space of a process in the working memory (`resident
   set size <https://en.wikipedia.org/wiki/Resident_set_size>`_) and the
   dynamic memory (heap memory) over time
*Stats*
   Memory statistics, in this case
   .. code-block:: text

      Command line: /Users/veit/items/.venv/bin/memray run src/items/api.py check
      Start time: Sun Feb 08 2026 12:12:27 GMT+0100 (Central European Standard Time)
      End time: Sun Feb 08 2026 12:12:27 GMT+0100 (Central European Standard Time)
      Duration: 0:00:00.068000
      Total number of allocations: 11142
      Total number of frames seen: 0
      Peak memory usage: 4.6 MB
      Python allocator: pymalloc
Below that is the flame graph as an icicle chart showing memory allocations over
time, with the last call at the bottom. The graph shows the line of code
executed at a given point in time, with the width proportional to the amount of
memory allocated; if you move your mouse over a bar, you will see further
details such as file name, line number, allocated memory and number of
allocations.
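If you want to record only a specific section of code rather than a whole
program run, Memray also provides a Python API. A minimal sketch, in which
``load_everything`` is merely a placeholder and only ``memray.Tracker`` comes
from the library:

.. code-block:: python

   from memray import Tracker

   def load_everything():
       """Placeholder for the code whose allocations you want to record."""
       return [list(range(10_000)) for _ in range(100)]

   # Everything allocated inside this block is written to the given .bin file,
   # which can then be analysed with the same reporters as above.
   with Tracker("memray-load_everything.bin"):
       data = load_everything()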
.. tip::
   With :ref:`python-basics:pytest_memray`, there is also a plugin for
   :doc:`python-basics:test/pytest/index` that allows you to check whether the
   upper limits you have set for memory consumption and memory leaks are being
   adhered to.
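A sketch of how such a limit can look in a test; the test itself is made up,
but the ``limit_memory`` marker is provided by pytest-memray:

.. code-block:: python

   import pytest

   @pytest.mark.limit_memory("10 MB")
   def test_records_stay_small():
       # Fails if the allocations made while the test runs exceed 10 MB.
       records = [{"id": i, "name": f"record {i}"} for i in range(1_000)]
       assert len(records) == 1_000

The marker only takes effect when the test suite is run with the plugin
enabled, for example with ``pytest --memray``.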
