Out of OpenGL3Context
I spent a lot of time last weekend making sure resources are correctly shared between rendering contexts. It's surprising how many commercial games make you restart the game to switch graphics resolutions, and I find it annoying. Leadwerks Engine 3 uses a small hidden window with an OpenGL context to create the OpenGL 3.3 contexts, and it just stays open so there is always a persistent context with which resources are shared. Textures, shaders, and vertex buffers can all be shared between OpenGL contexts, but oddly, frame buffer objects cannot. This is probably because FBOs are small objects that don't consume large amounts of memory, but it still seems like a strange design choice. I got around this problem by using the persistent background context whenever any FBO commands are called, so buffers will continue to work after you delete a context. I guess the way to describe my approach is that I take something awkward to work with and encapsulate it in something that makes more sense, to me at least.
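For anyone curious what that looks like beneath the abstraction, here is a minimal sketch of the sharing mechanism in raw WGL on Windows. This is not Leadwerks code; it assumes the hidden window's context already exists and is current (wglGetProcAddress requires one), and names like backgroundcontext are just illustrative:

#include <windows.h>
#include <GL/gl.h>

#define WGL_CONTEXT_MAJOR_VERSION_ARB 0x2091
#define WGL_CONTEXT_MINOR_VERSION_ARB 0x2092

typedef HGLRC (WINAPI *PFNWGLCREATECONTEXTATTRIBSARBPROC)(HDC, HGLRC, const int*);

//The persistent context created on the hidden window; it outlives every visible window
HGLRC backgroundcontext = NULL;

HGLRC CreateSharedContext(HDC hdc)
{
    //Look up the extension entry point (a context must be current for this to work)
    PFNWGLCREATECONTEXTATTRIBSARBPROC wglCreateContextAttribsARB = (PFNWGLCREATECONTEXTATTRIBSARBPROC)wglGetProcAddress("wglCreateContextAttribsARB");
    int attribs[] = {WGL_CONTEXT_MAJOR_VERSION_ARB, 3, WGL_CONTEXT_MINOR_VERSION_ARB, 3, 0};
    //Passing backgroundcontext as the share context shares textures, shaders,
    //and vertex buffers with it - but not FBOs, which is why FBO commands get
    //routed through the background context instead
    return wglCreateContextAttribsARB(hdc, backgroundcontext, attribs);
}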
Because the background context is created in the OpenGL3GraphicsDriver constructor, you can start calling 3D commands as soon as you create the driver object, without creating a visible 3D window! Weird and cool. No idea yet if it will work like this on the Mac, but I'll find out soon enough, since I ordered my iMac last week. I got the 27-inch model with the 3.2 GHz dual-core CPU, and upgraded the GPU to an ATI 5750. I chose the 3.2 GHz dual-core over the 2.8 GHz quad-core because I have found that in general usage my quad core rarely goes over 50% usage, and I would rather have a faster clock speed per core.
I said earlier that the window/context design was a little tricky to figure out, especially when you take into consideration the external windows people will want to use. In Leadwerks Engine 2, this was accomplished via a custom buffer, where callbacks were used to retrieve the context dimensions, and the user was responsible for setting up an OpenGL window. Initializing the pixel format and OpenGL version on a window is somewhat tricky, and if possible I would like to avoid making you deal with that. I ended up with a window design that is quite a lot more advanced than the simple Graphics() command in LE2. The window is created from a GUIDriver, which implies other parts of a cross-platform GUI might one day be included. The design is modeled after MaxGUI for BlitzMax. To create a window, we do this:
Gadget* window = guidriver->CreateWindow("My window",0,0,1024,768,NULL,WINDOW_FULLSCREEN);
Then you can create a graphics context on the window (or any other gadget):
Context* context = graphicsdriver->CreateContext(window);
We can check for events in our game loop like this:
while (PeekEvent())
{
    Event ev = WaitEvent();
    switch (ev.id)
    {
        case EVENT_WINDOWCLOSE:
            return 0;
    }
}
At this time, windows are the only supported gadget, but the framework is there for adding more gadget types in the future. The same system can be used to implement both the skinned in-game GUI and native interface elements. Before Rick says anything, yes, there will be a custom event handler function you can attach to a gadget instead of polling events.
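The callback style might look something like this. This is purely hypothetical, since that part of the API isn't final, and SetEventHandler is an invented name:

void MyWindowHandler(Event ev)
{
    if (ev.id == EVENT_WINDOWCLOSE)
    {
        //React to the event here instead of in a polling loop
    }
}

//Hypothetical - the real attachment command may be named differently:
window->SetEventHandler(MyWindowHandler);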
You can render to an external window just by supplying the HWND (on Windows) to the GUIDriver->CreateGadget(HWND hwnd) command. This creates a "Gadget" object from any valid HWND, which can then have a context created on it, as in the example below.
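For example, assuming hwnd is a window handle your own framework created:

Gadget* external = guidriver->CreateGadget(hwnd); //wrap an existing native window
Context* context = graphicsdriver->CreateContext(external);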
Simple deferred lighting is working, for now just a directional light with no shadows. On both AMD and NVIDIA cards, the engine can render deferred lighting with 16x MSAA. The gbuffer format in Leadwerks Engine 3 is only 12 bytes per pixel, since each of the three render targets below is 4 bytes wide. Per-pixel motion blur will add a couple more bytes:
color0 (RGBA8)
    diffuse.r
    diffuse.g
    diffuse.b
    specular intensity
color1 (RG11B10)
    normal.x
    normal.y
    normal.z
color2 (RGBA8)
    emission.r
    emission.g
    emission.b
    materialid
Material properties are sent to the GPU in an array, and the material ID is used to look up properties like specular reflection color and gloss value, so this is a very efficient use of texture bandwidth (see the sketch after this paragraph). My GeForce 9800 GTX can handle 1920x1080 with 8x MSAA, but 16x seems to go over a threshold and the card crawls. I don't know if you'll be using 16x MSAA in a lot of games, but if nothing else it makes for fantastic screenshots, and the lower antialiasing levels are still there. I personally don't see a big improvement past 4x multisampling in most games.
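As a rough sketch of the material-array idea (the array layout and the SetUniformVec4Array call are my own invention here, not the actual LE3 interface):

#define MAX_MATERIALS 256

//One entry per material id: specular reflection color in rgb, gloss in a
Vec4 materialproperties[MAX_MATERIALS];
materialproperties[5] = Vec4(1,1,1,0.8);

//Upload the whole table once; the light shader then indexes it with the
//materialid it reads out of the gbuffer's third color target
lightshader->SetUniformVec4Array("materialproperties", materialproperties, MAX_MATERIALS);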
Here's my current test program. It's lower-level than the code you will actually have to write; you won't have to create the gbuffer and draw the lighting yourself like I do here, but you might like seeing how things work internally:
#include "le3.h" using namespace le3; int main() { InitFileFactories(); //Create GUI driver GUIDriver* guidriver = new WindowsGUIDriver; //Create a window Gadget* window = guidriver->CreateWindow("Leadwerks",0,0,1024,768,NULL,WINDOW_TITLEBAR|WINDOW_RESIZABLE); if (!window) { Print("Failed to create window"); return 0; } //Create graphics driver GraphicsDriver* graphicsdriver = new OpenGL3GraphicsDriver; if (!graphicsdriver->IsSupported()) { Print("Graphics driver not supported."); return 0; } //Create a graphics context Context* context = CreateContext(window,0); if (!context) { Print("Failed to create context"); return 0; } //Create world World* world = new World; //Create a camera Camera* camera = CreateCamera(); camera->SetClearColor(0.5,0.5,0.5,1); //Load a model LoadModel("Models/train_sd40.mdl"); //Create gbuffer #define SAMPLES 16 Buffer* gbuffer = CreateBuffer(context->GetWidth(),context->GetHeight(),1,1,SAMPLES); Texture* normals = CreateTexture(context->GetWidth(),context->GetHeight(),TEXTURE_RGB_PACKED,0,1,SAMPLES); Texture* emission = CreateTexture(context->GetWidth(),context->GetHeight(),TEXTURE_RGBA,0,1,SAMPLES); gbuffer->SetColor(normals,1); gbuffer->SetColor(emission,2); delete normals; delete emission; //Set up light shader Shader* lightshader = LoadShader("shaders/light/directional.shd"); Mat4 lightmatrix = Mat4(1,0,0,0, 0,1,0,0, 1,0,0,0, 0,0,0,1); lightmatrix *= camera->mat; lightshader->SetUniformMat4("lightmatrix",lightmatrix); lightshader->SetUniformVec4("lightcolor",Vec4(1,0,0,1)); lightshader->SetUniformVec4("ambientlight",Vec4(0,0,0,1)); lightshader->SetUniformVec2("camerarange",camera->range); lightshader->SetUniformFloat("camerazoom",camera->zoom); //Delete and recreate the graphics context, just because we can //Resources are shared, so you can change screen resolution with no problems delete context; delete window; window = guidriver->CreateWindow("Leadwerks",200,200,1024,768,NULL,WINDOW_TITLEBAR|WINDOW_RESIZABLE); context = CreateContext(window,0); float yaw = 0; while (true) { //Print(graphicsdriver->VidMemUsage()); if (!window->Minimized()) { yaw +=0.25; //Adjust the camera camera->SetPosition(0,0,0,false); camera->SetRotation(0,yaw,0,false); camera->Move(0,2,-10,false); //Update the time step UpdateTime(); //Render to buffer gbuffer->Enable(); camera->Render(); //Switch back to the window background context->Enable(); //Enable shader and bind textures lightshader->Enable(); gbuffer->depthcomponent->Bind(0); gbuffer->colorcomponent[0]->Bind(1); gbuffer->colorcomponent[1]->Bind(2); gbuffer->colorcomponent[2]->Bind(3); //Draw image onto window graphicsdriver->DrawRect(0,0,context->GetWidth(),context->GetHeight()); //Turn the shader off lightshader->Disable(); //Swap the back buffer context->Swap(false); } //Handle events while (PeekEvent()) { Event ev = WaitEvent(); switch (ev.id) { case EVENT_WINDOWRESTORE: ResumeTime(); break; case EVENT_WINDOWMINIMIZE: PauseTime(); break; case EVENT_WINDOWSIZE: //Recreate the gbuffer delete gbuffer; gbuffer = CreateBuffer(context->GetWidth(),context->GetHeight(),1,1,SAMPLES); normals = CreateTexture(context->GetWidth(),context->GetHeight(),TEXTURE_RGB_PACKED,0,1,SAMPLES); emission = CreateTexture(context->GetWidth(),context->GetHeight(),TEXTURE_RGBA,0,1,SAMPLES); gbuffer->SetColor(normals,1); gbuffer->SetColor(emission,2); delete normals; delete emission; break; case EVENT_WINDOWCLOSE: //Print OpenGL error to make sure nothing went wrong Print(String(glGetError())); //Exit program return 0; } } } }
There's been some debate about the use of constructors, and although it would be nice to use a constructor for everything, that does not seem possible. I use a lot of abstract classes, and there is no way to use an abstract class constructor to create an object. If there were a way to turn the object into a derived class in its own constructor, that would work, but it's not supported. You certainly wouldn't want to have to call new OpenGL3Buffer, new DirectX11Buffer, or new OpenGL4Buffer depending on the graphics driver; the whole point of abstract classes is that you can call their commands without knowing or caring what the derived class is. So if anyone has any other ideas, I'm all ears, but there doesn't seem to be any way around this.
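To illustrate the problem in a reduced form (this is just a sketch, not the actual class hierarchy):

class Buffer
{
public:
    virtual void Enable()=0; //abstract, so "new Buffer" will not compile
};

class OpenGL3Buffer : public Buffer
{
public:
    virtual void Enable() {} //would bind the FBO here
};

class GraphicsDriver
{
public:
    virtual Buffer* CreateBuffer()=0; //factory method
};

class OpenGL3GraphicsDriver : public GraphicsDriver
{
public:
    //User code calls driver->CreateBuffer() and gets back a Buffer*,
    //without ever knowing or caring which derived class it really is
    virtual Buffer* CreateBuffer() { return new OpenGL3Buffer; }
};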
What's next? I need to get some text up onscreen, and the FreeType library looks pretty good. I'll be getting the Mac version set up soon. And I am eager to get the engine working together with BlitzMax, so I can work on the editor. The graphics features of LE3 are great, but I think there are two even more important aspects. The first is the art pipeline. I've designed a system that is the absolute easiest way to get assets into the engine. More details will come later, but let's just say it's heavy on the drag and drop. The other important aspect of LE3 is the interactions system. I have concluded that programming, while necessary, is a lousy way of controlling interactions between complex objects. The Lua implementation in LE2 was good because it provided a way for people to easily share programmed objects, but it's the interactions between objects that make a system interesting, and a pile of object-oriented spaghetti is not the way to handle them. Using the interactions system in LE3, with inputs and outputs for each object, is something I think people will really like working with.