#1380  Potential memory leak?
Closed
cyl3x opened 11 months ago

Hi there, for quite some time I have been experiencing sudden spikes in memory usage.

A month ago I created #1294, and lately my log was spammed with io.onedev.commons.utils.ExplicitException: Please login to perform this query errors. I discovered both issues because memory usage exploded in my Grafana dashboard and I went looking for the cause in the server logs.

Here are three screenshots from different days when memory usage suddenly exploded; the last one is from before I limited my server to a maximum of 6 GB:

grafik_2.png grafik_3.png grafik_4.png

I think it's caused by the server logger or the database backup at 1 a.m. (but the memory usage only lines up with these sometimes); to be honest, I have no idea.

There is no traffic on the server (my Grafana statistics for in- and outbound traffic are zero), and no scheduled build jobs or anything similar are running.

The server itself reports very low memory usage (the real usage is 5.12 GB):

grafik_5.png

My server is mostly idle, so I would like to eliminate the high memory usage. Thanks for your help :)

Robin Shen commented 11 months ago

It looks like the JVM thinks there is plenty of memory and is lazy about releasing it back to the system. Please reduce the memory available to the JVM to see if the situation improves. This can be done by changing wrapper.java.maxmemory in conf/wrapper.conf if you are running on bare metal, or by limiting the memory available to the container with the --memory option if you are running as a Docker container.
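For reference, the two options above might look like this (a sketch only; the 4096 MB / 4g figures are illustrative, and the `1dev/server` image name and volume path are assumptions about a typical OneDev Docker setup):

```shell
# Bare metal: cap the JVM heap in conf/wrapper.conf
# (value is in MB; pick a limit that fits your workload)
#   wrapper.java.maxmemory=4096

# Docker: cap the whole container instead
docker run -d --memory=4g \
  -p 6610:6610 \
  -v /path/to/onedev:/opt/onedev \
  1dev/server
```

Note that `--memory` is a hard limit on the container, while `wrapper.java.maxmemory` only bounds the JVM heap; the JVM process as a whole (metaspace, threads, native buffers) can still use somewhat more than the heap limit.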

cyl3x commented 11 months ago

I had previously set a memory limit of 6 GB in wrapper.conf. Screenshots 1 and 2 are with this memory limit; screenshot 3 is from before (and was the reason for) the limit.

Yesterday I limited the container to 4 GB with the --memory option, and so far there has been no memory spike. A final verdict will only be possible in the next few days, though.

Currently the server reports a memory usage of ~150 MB, while the actual memory usage is 1.1 GB. I have seen this difference many times in the past, but why is there a discrepancy between the server's memory usage report and the actual memory usage? Also, the actual memory usage is slowly creeping up while the reported value stays the same, which seems a bit odd.

cyl3x commented 11 months ago

Here is a short history of memory usage over the last 70 days (I do not keep older metrics). With a few exceptions, the memory graph only decreases when I restart/update the server or force a garbage collection. So the server does not seem to free any memory at all. grafik_7.png

(because of the squeezed timeline, the initial memory usage looks higher than it really is)

Robin Shen commented 11 months ago

The JVM can use up to the maximum memory allocated to it (via conf/wrapper.conf), and it may not release free memory back to the OS even under low memory pressure. This is why OneDev itself reports using only a small portion of memory while the container consumes a lot.

The JVM has options to release memory back to the OS, and I will experiment with them later. For now, you can further reduce the memory allocated to the JVM via conf/wrapper.conf, as it seems OneDev does not need that much.

cyl3x commented 11 months ago

Okay, that is good to know. I will reduce my memory limit further.

Thank you for taking the time to investigate :)

cyl3x changed fields 11 months ago
Type: Bug → Support Request
Robin Shen changed state 6 months ago
State: Open → Closed
Robin Shen commented 6 months ago

Memory usage can now be controlled via the environment variable max_memory_percent, which defaults to `50`, meaning 50% of available memory is used.
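With the Docker image, this might be set like so (a sketch; the `30` value, port, and volume path are illustrative, and `1dev/server` is assumed to be the standard OneDev image):

```shell
# Let OneDev's JVM use at most 30% of the memory available to the container
docker run -d --memory=4g \
  -e max_memory_percent=30 \
  -p 6610:6610 \
  -v /path/to/onedev:/opt/onedev \
  1dev/server
```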

Reference: onedev/server#1380