
AMD Radeon RX 640 problems


Charrua

Solved by Josh.


Same for me.

I create a Lua project, and two .exe files appear in the project folder which (I guess) are precompiled by you. From the editor, I open the start map, press F5 or F8, and the game runs OK.

If I create a C++ project, there are no .exe files in the project folder. I open the .sln, then press F5, and the .exe that gets built either fails to start (release mode) or displays a black window, with error messages in the log window.

Stop the world, I want to get off!


Does this machine have two graphics chips, like a discrete card and then maybe an integrated chip in the CPU?

This will tell you the GPU being used:

    //Main loop
    while (window->Closed() == false and window->KeyDown(KEY_ESCAPE) == false)
    {
        //Poll queued events
        while (PeekEvent())
        {
            const Event e = WaitEvent();
            if (e.id == EVENT_STARTRENDERER)
            {
                //The renderer start event's text reports the GPU being used
                Print(e.text);
            }
        }

        world->Update();
        world->Render(framebuffer);
    }

 


My job is to make tools you love, with the features you want, and performance you can't live without.


On 7/17/2024 at 4:28 PM, Charrua said:

It's my laptop: a Lenovo ThinkPad with Intel(R) UHD Graphics and an AMD Radeon (TM) RX 640.

Bingo. It must be using the Intel graphics in the C++ program.


My job is to make tools you love, with the features you want, and performance you can't live without.


There are no words to say thank you the right way!

(My lack of English limits how well I can express how I feel right now!)

The Radeon software lets us set our preferred graphics settings for each "game", but it seems it is the game itself that has to select the card... isn't it?

 

[Attached image: radeon solved.png]

Stop the world, I want to get off!


The way you force an OpenGL application to use the discrete GPU is to declare these variables in the application:

#ifdef _WIN32
#include <windows.h>

//Hack to force the NVidia discrete card when integrated graphics are present:
//http://developer.download.nvidia.com/devzone/devcenter/gamegraphics/files/OptimusRenderingPolicies.pdf -  Page 3
//The AMD export is the PowerXpress (switchable graphics) equivalent.
extern "C"
{
	_declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
	_declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}

#endif

I include these in the static library, and the Nvidia value seems to be detected correctly.

It appears that the AMD value does not get detected unless it is declared directly in the compiled application. This is why the editor and Lua executables worked on your machine, but new C++ programs did not.

The solution is to move this file into the C++ project template.
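For reference, here is a minimal sketch of how those declarations could sit at application scope in a project's main.cpp; the header name, namespace, and main() body are placeholders for whatever the real template contains, not the actual template code:

//main.cpp (sketch only; "UltraEngine.h", the namespace, and the main() body
//below are placeholders, not the actual project template contents)
#include "UltraEngine.h"

#ifdef _WIN32
#include <windows.h>

//Declared in the compiled application itself so the AMD driver detects it:
extern "C"
{
	_declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;
	_declspec(dllexport) int AmdPowerXpressRequestHighPerformance = 1;
}
#endif

using namespace UltraEngine;

int main(int argc, const char* argv[])
{
	//...create the window, framebuffer, and world, then run the main loop as usual...
	return 0;
}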


My job is to make tools you love, with the features you want, and performance you can't live without.


I've been using this laptop with Leadwerks for a while, and the only problem I had in the past was that if I chose a screen scale other than 100%, the window would not open.

 

Stop the world, I want to get off!

