Excessive memory usage #1598
I got a report about excessive memory usage. At first I thought it was a deployment problem, because I hadn't seen this in a long time, but after upgrading to the latest code I could trigger it as well. I'm now doing a poor man's bisect to see where it broke.
Perhaps objgraph can help?
I first need to find at least approximately when it was introduced and how to trigger it. So far I was only able to trigger it once, and only after the original reporter insisted he was running the current code. Is this a new bug introduced last year (as I can't reproduce it with older code), or a continuation of the old leaks that I thought were fixed?
I did indeed use objgraph the last time I was tracing the leaks.
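For reference, a minimal sketch of how objgraph can be used to watch object counts grow between two points in time. The two-snapshot approach and the `Peer` example are my assumptions for illustration, not the exact commands used back then:

```python
import gc
import objgraph

gc.collect()
objgraph.show_growth(limit=10)   # establish a baseline snapshot

# ... let the node run and handle connections for a while ...

gc.collect()
objgraph.show_growth(limit=10)   # types whose instance counts grew since the baseline

# once a suspicious type shows up, chase who keeps its instances alive
# (writes a graphviz-rendered PNG; 'Peer' is just an example type name)
objgraph.show_backrefs(objgraph.by_type('Peer')[:3], filename='peer-refs.png')
```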
Did you check this: 6ca3460? It's the first thing that comes to mind when I think about a memory leak.
@g1itch that was a workaround for a weird bug in Python's threading: the automatic gc was occasionally triggering a `RecursionError`.
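Roughly, the shape of that kind of workaround is to disable the automatic, allocation-triggered collector and run full collections from one predictable place instead, so collection never fires in the middle of threading bookkeeping. The sketch below is my reconstruction of that idea, not the literal diff of 6ca3460, and the interval is arbitrary:

```python
import gc
import threading
import time

gc.disable()  # stop automatic, allocation-threshold-triggered collections


def periodic_collect(interval=60):
    """Run a full collection on a fixed schedule from a single thread."""
    while True:
        time.sleep(interval)
        gc.collect()


threading.Thread(target=periodic_collect, daemon=True).start()
```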
Yes, I remember. But it may still be a good starting point for a bisect.
I can't reproduce it even with more recent code, so it's unlikely to be a good starting point.
I was able to narrow it down a bit, between 7e1f1d2604 (works ok) and 03316496b7 (can be triggered). That leaves only about 4 potential sources, as most of the commits in between are code-quality changes. Continuing testing.

Narrowed it down further; it's most likely the first of these two:

- a69732f060 (Addrthread finish)
- 2a165380bb (Restrict outbound connections on network groups)

I'll narrow it down and try to write a minimal patch that stops the memory leak (even if it disables the functionality), and then I'll see how to fix the leak properly.
Pretty sure now it's a69732f060.
If there is a memleak, it's barely visible. With 10 connections (2 inbound), `memory_profiler.memory_usage()` returns 105.36 and `objgraph.show_growth(limit=10)` shows 1695 Peer objects. Not much difference even when run overnight:
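For anyone who wants to repeat the measurement, this is roughly how those numbers can be gathered with the stock memory_profiler and objgraph APIs; the exact call site inside PyBitmessage is not shown here and the figures will differ per run:

```python
import memory_profiler
import objgraph

# memory_usage() with no arguments samples the current process and
# returns a list of readings in MiB, so take the first element.
rss_mib = memory_profiler.memory_usage()[0]
print("memory usage: %.2f MiB" % rss_mib)

# top 10 types whose instance counts grew since the previous call
objgraph.show_growth(limit=10)

# how many Peer objects are currently alive
print("Peer objects:", objgraph.count('Peer'))
```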