Lights cause memory leak


Dreikblack

Solved by Josh


In my game, on a level with many objects, memory usage increases pretty fast, and the FPS starts dropping at some point even if nothing is happening.

In the example with a single light, in release mode, memory usage increased for me from 150 to 300 MB over 5 minutes.

With more lights and a higher FPS (vsync off, and maybe release mode as well) this happens much faster, around 10 MB per second.

#include "UltraEngine.h"

using namespace UltraEngine;

int main(int argc, const char* argv[]) {
    auto displays = GetDisplays();
    auto window = CreateWindow("Ultra Engine", 0, 0, 1280, 720, displays[0], WINDOW_CENTER | WINDOW_TITLEBAR);
    auto framebuffer = CreateFramebuffer(window);
    auto world = CreateWorld();
    world->RecordStats(true);

    auto camera = CreateCamera(world);
    camera->SetClearColor(0.125);
    camera->SetFov(70);
    camera->Move(0, 2, -8);

    auto ground = CreateBox(world, 20, 1, 20);
    ground->SetPosition(0, -0.5, 0);
    ground->SetColor(0, 1, 0);

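    //Create a row of 20 point lights, keeping references so they are not deleted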
    vector<shared_ptr<PointLight>> lights;
    for (int i = -10; i < 10; i++) {
        auto light = CreatePointLight(world);
        lights.push_back(light);
        light->SetPosition(i, 0, 0);
    }

    //Main loop
    while (window->Closed() == false and window->KeyHit(KEY_ESCAPE) == false) {
        world->Update();
        world->Render(framebuffer, false);
        window->SetText("FPS: " + String(world->renderstats.framerate) + " RAM: " +
            WString(GetMemoryUsage() / 1024));
    }
    return 0;
}

 


I am running in debug mode and memory is extremely stable. With a memory leak, you would see a steady increase in RAM usage. When the program starts, I see occasional increases until it steadies out after a few seconds. That indicates new systems being initialized, or the number of camera visibility sets in memory reaching the maximum it will have during usage; the memory usage evens out quickly and remains stable.

I believe that GetMemoryUsage() is only accurate in debug builds, but I don't know how much it can be off by in release builds. On Windows, it is using GetProcessMemoryInfo under the hood.




GetMemoryUsage() might not actually be accurate, though, judging by some old posts. If you add malloc(4) to the main loop, memory does not increase at the rate I would expect each frame. Let me look into this more closely...
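
For reference, a minimal sketch of that leak test, using the main loop from the example above (the 4-byte allocation is intentionally leaked each frame, so the heap counter should climb by roughly 4 bytes per iteration, plus allocator overhead):

//Main loop with a deliberate per-frame leak, to sanity-check the memory counter
while (window->Closed() == false and window->KeyHit(KEY_ESCAPE) == false) {
    malloc(4); //intentionally never freed
    world->Update();
    world->Render(framebuffer, false);
    window->SetText("FPS: " + String(world->renderstats.framerate) + " RAM: " +
        WString(GetMemoryUsage() / 1024));
}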



22 minutes ago, Josh said:

GetMemoryUsage() might not actually be accurate

In Task Manager it's lower in total, but it increases at the same high speed the whole time (in debug mode as well). Keep in mind it keeps happening no matter how long I wait, without stabilizing.

NVIDIA card, btw.


The memory usage will now be accurate in the debug build. It fluctuates quickly, but should stay within the same range.



That is strange; my example shows a steady 12 MB in the window and 65 MB in Task Manager. I think the window title shows how much memory the application has requested, while Task Manager shows how much memory Windows has allocated for the process. Windows allocates big blocks of memory and divides them up as the process requests memory.

light test.zip



Your application's memory usage in the titlebar looks exactly the same as mine, but my allocated memory in Task Manager shows a steady 65-70 MB.

https://superuser.com/questions/1846625/task-manager-shows-ridiculously-wrong-memory-consumed-by-processes

There are other tools that might give a better reading:
https://learn.microsoft.com/en-us/sysinternals/downloads/process-explorer

Perhaps the problem is with the Windows 11 Task Manager?



5 hours ago, Josh said:

Perhaps the problem is with the Windows 11 Task Manager?

Nope, same thing in Process Explorer. Like I said a few times already: at some point the FPS starts dropping very noticeably in my game (even a turn-based game becomes barely playable) once memory usage reaches 6 GB or more, and the only thing that changes over time is memory usage. Later I can try waiting long enough in this example to see if the FPS starts dropping there as well.

Which card did you use for testing? Maybe it's an NVIDIA-related bug this time.


Just tried it on Windows 11 (NVIDIA mobile card) and it worked the same as on my other machine.

The only thing I can think of is to update your Nvidia drivers.



I ran the exe myself and encountered the same bug as Dreikblack. I modified the source to display the memory usage the way Process Explorer does.

#include "UltraEngine.h"

using namespace UltraEngine;

SIZE_T PrintMemoryInfo()
{
    auto myHandle = GetCurrentProcess();
    //Query the process' memory usage details
    PROCESS_MEMORY_COUNTERS pmc;
    //Return the working set size in bytes, or 0 on failure
    if (GetProcessMemoryInfo(myHandle, &pmc, sizeof(pmc)))
        return pmc.WorkingSetSize;
    else
        return 0;
}

int main(int argc, const char* argv[]) {
    auto displays = GetDisplays();
    auto window = CreateWindow("Ultra Engine", 0, 0, 1280, 720, displays[0], WINDOW_CENTER | WINDOW_TITLEBAR);
    auto framebuffer = CreateFramebuffer(window);
    auto world = CreateWorld();
    world->RecordStats(true);

    auto camera = CreateCamera(world);
    camera->SetClearColor(0.125);
    camera->SetFov(70);
    camera->Move(0, 2, -8);

    auto ground = CreateBox(world, 20, 1, 20);
    ground->SetPosition(0, -0.5, 0);
    ground->SetColor(0, 1, 0);

    vector<shared_ptr<PointLight>> lights;
    for (int i = -10; i < 10; i++) {
        auto light = CreatePointLight(world);
        lights.push_back(light);
        light->SetPosition(i, 0, 0);
    }

    //Main loop
    while (window->Closed() == false and window->KeyHit(KEY_ESCAPE) == false) {
        world->Update();
        world->Render(framebuffer, false);
        window->SetText("FPS: " + String(world->renderstats.framerate) + " RAM: " +
            WString(GetMemoryUsage() / 1024) + " MEM: " + String(PrintMemoryInfo() / 1024) + " kb");
    }
    return 0;
}

Some notes I have found so far: GetMemoryUsage displays just the heap memory, while PrintMemoryInfo shows the whole RAM occupied by the app.
In this case the heap stays the same (around 12 MB) while the total RAM increases. While the RAM grows, the FPS slows down, at least over time: at the start I get around 1000 FPS, after a few minutes it drops to 600 and keeps dropping.

 

Also: when no lights are in the scene, both the heap and the total RAM are stable.

(Windows 10 Pro 64-bit, NVIDIA GeForce 1080 Ti)

11 hours ago, Josh said:

The only thing I can think of is to update your Nvidia drivers.

I have the latest NVIDIA drivers. I checked your compiled version on another PC with Windows 10 and an RTX 2070 with relatively old drivers; no memory issue there.


This is what GetMemoryUsage does:

	uint64_t GetMemoryUsage()
	{
		uint64_t result = 0;
#ifdef _WIN32
	#ifdef _DEBUG
		//Exact memory usage, but only works in debug mode:
		_CrtMemState memstate;
		_CrtMemCheckpoint(&memstate);
		//_CrtMemDumpStatistics(&memstate);
		return memstate.lSizes[0] + memstate.lSizes[1] + memstate.lSizes[2] + memstate.lSizes[3] + memstate.lSizes[4];
	#else
		PROCESS_MEMORY_COUNTERS process_memory_counters;
		if (GetProcessMemoryInfo(GetCurrentProcess(), &process_memory_counters, sizeof(process_memory_counters))) {
			result = process_memory_counters.WorkingSetSize;
		}
	#endif
#endif
		return result;
	}

 



8 hours ago, klepto2 said:

GetMemoryUsage displays just the heap memory, while PrintMemoryInfo shows the whole RAM occupied by the app.

So you are saying the increased memory that mysteriously appears on some machines is due to stack memory?

This code should be added to the main loop, but I don't think it explains the problem:

while (PeekEvent()) WaitEvent();
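
For clarity, here is the example's main loop with that line added (a minimal sketch; PeekEvent/WaitEvent drain any queued events each frame so they do not pile up):

//Main loop with the event queue drained every frame
while (window->Closed() == false and window->KeyHit(KEY_ESCAPE) == false) {
    while (PeekEvent()) WaitEvent();
    world->Update();
    world->Render(framebuffer, false);
}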

 




24 minutes ago, Josh said:

So you are saying the increased memory that mysteriously appears on some machines is due to stack memory?

This code should be added to the main loop, but I don't think it explains the problem:

while (PeekEvent()) WaitEvent();

It seems so.


That seems impossible.

My theory right now is that Windows sometimes allocates far more RAM than is actually needed, and then subdivides it to make dynamic memory allocation faster. The idea is that if available RAM exists and is not otherwise needed, Windows will make use of it.

What happens if you just let the application run? Will it eventually consume all RAM and start using virtual memory? If another memory-intensive process is running, does this still occur?
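
One way to test the second question is to run a separate process that commits a few gigabytes while the light example is running, and watch whether the example's working set shrinks. A minimal sketch (the 4 GB total and the 10-minute hold are arbitrary values, not anything from the engine):

//Hypothetical memory-pressure tool: commits ~4 GB, holds it, then exits
#include <cstdlib>
#include <cstring>
#include <thread>
#include <chrono>
#include <vector>

int main() {
    const size_t blockSize = 512ull * 1024 * 1024; //512 MB per block
    std::vector<char*> blocks;
    for (int i = 0; i < 8; i++) {
        char* block = (char*)malloc(blockSize);
        if (block == nullptr) break; //stop if the system refuses more memory
        memset(block, 1, blockSize); //touch every page so it is actually committed
        blocks.push_back(block);
    }
    std::this_thread::sleep_for(std::chrono::minutes(10)); //hold the memory
    return 0; //everything is released by the OS on exit
}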



It's also worth checking whether this behavior still occurs when shadows are disabled on the lights, and when the ground model is not created (see the sketch below).
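
For example, two variations on the light loop from the test program (SetShadows is an assumption about the entity API here; substitute whatever the engine actually exposes for disabling shadow rendering):

//Variation 1: create the lights without shadows (SetShadows is hypothetical)
for (int i = -10; i < 10; i++) {
    auto light = CreatePointLight(world);
    light->SetShadows(false); //assumed call to disable shadow maps for this light
    lights.push_back(light);
    light->SetPosition(i, 0, 0);
}

//Variation 2: skip creating the ground, so no geometry receives the lighting
//auto ground = CreateBox(world, 20, 1, 20);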

There is also a memory monitor built into Visual Studio 2022 that runs when the app is being debugged.

I suspect this allocated memory might be in the graphics driver DLL.

