I am experimenting with a system for creating a sequence of actions using Lua coroutines. This allows you to define a bunch of behavior at startup and let the game just run without having to keep track of a lot of state.
You can add coroutines to entities and they will be executed in order. The first one will complete, and then the next one will start.
A channel parameter allows you to have separate stacks of commands, so you can have multiple sequences running on the same object. For example, one sequence could control movement on one channel while a different sequence runs on another channel at the same time.
A new beta is available. In this build I cleaned up a lot of internal stuff. I removed some parts of the engine that I want to redesign in order to clean up the source.
JSON material files referenced by MDL model files are now loaded.
Added an ActiveWindow() command. If the game window is not the foreground window, this will return null.
The Steamworks class and all dependent classes are temporarily removed. There's a lot of stuff in there I don't intend to use in the future, like all the Workshop functionality.
This is a good time to write about some very broad changes I expect to come about over the next year in our community as our new engine "Turbo" arrives. Turbo Game Engine, as the name suggests, offers really fast performance using a groundbreaking Vulkan-based renderer, which is relevant to everyone but particularly beneficial for VR developers who struggle to keep their framerates up using conventional game engines. I want to help get you on board with some of the ideas that I myself am still processing.
A new update is available for beta subscribers.
What's new
Added support for strip lights. To create one, just call CreateLight(world, LIGHT_STRIP). The entity's scale on the Z axis determines the length of the line, and the outer range determines the radius in which the light shows (see the sketch after these notes).
Added new properties to the JSON material scheme. "textureScroll" is a float value that animates a texture so it moves smoothly. "textureScrollRotation" is an angle that controls which direction the texture scrolls in.
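As a rough illustration of the strip light command, here is a small sketch; the SetScale, SetRange, SetColor, and SetPosition calls are assumed to work the way they do for other entities and lights, so the exact names may differ.

// Hypothetical usage sketch of the new strip light; method names are assumed.
auto light = CreateLight(world, LIGHT_STRIP);

// The Z-axis scale sets the length of the strip...
light->SetScale(1, 1, 4);

// ...and the outer range sets the radius in which the light shows.
light->SetRange(0.05, 5);

// Color and position work like any other light.
light->SetColor(1.0, 0.9, 0.8);
light->SetPosition(0, 2, 0);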
It's always fun when I can do something completely new that people have never seen in a game engine. I've had the idea for a while to create a new light type for light strips, and I got to implement this today. The new engine has taken a tremendous amount of effort to get working over two years, but as development continues I think I will become much more responsive to your suggestions since we have a very strong foundation to build on now.
Using this test scene provided by @reepblue, you can see the new strip lights in action:
A new update is available for beta subscribers. Transparent materials are now supported. Unlike the old deferred renderer, our new clustered forward renderer supports transparency really really well! You can add these in a JSON material file with a Boolean property called "transparent" set to true:
"transparent": true
There are no separate blend modes now, since pre-multiplied alpha allows alpha and additive blending in a single pass. This is actually a really simple technique, but for some reason it isn't more widely used.
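To illustrate why a single blend equation covers both cases, here is a small sketch of the fixed blend math with pre-multiplied color (my own illustration, not engine code):

// Fixed blend equation: result = src + dst * (1 - src.a), where src.rgb has
// already been multiplied by src.a. With a > 0 this is normal alpha blending;
// with a == 0 the destination is left intact and src simply adds on top.
struct Color { float r, g, b, a; };

Color Blend(const Color& src, const Color& dst)
{
    Color out;
    out.r = src.r + dst.r * (1.0f - src.a);
    out.g = src.g + dst.g * (1.0f - src.a);
    out.b = src.b + dst.b * (1.0f - src.a);
    out.a = src.a + dst.a * (1.0f - src.a);
    return out;
}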
An update for the beta of the new engine is now available with the following changes:
GLTF loader is now working for most models. A large collection of GLTF files are available online for free from many sources, and they can be loaded right into the engine without any adjustment for materials or textures. Single-file GLB files also work.
Added support for GLTF extension KHR_materials_pbrSpecularGlossiness.
Disabled PNG loader gamma correction.
Added world->SetSkybox(texture) for setting the skybox texture.
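As a rough usage sketch (LoadModel and LoadTexture are assumed names following the style of the API, and the file paths are just examples):

// Hypothetical sketch; function names and file paths are illustrative.
auto model = LoadModel(world, "Models/DamagedHelmet.glb"); // GLTF/GLB loads directly

auto sky = LoadTexture("Textures/sky_cubemap.dds");        // a cubemap texture
world->SetSkybox(sky);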
A new beta update is available for subscribers. What's new?
Lighting
Point and spot lights are now supported in the new Vulkan renderer, with either PBR or Blinn-Phong lighting. Lighting is controlled by the shader in the material file. There are two main shaders you can use, "Shaders/PBR.spv" and "Shaders/Blinn-Phong.spv". See below for more details.
JSON Materials
Materials can now be loaded from JSON files. I am currently using the .json file extension instead of "mat".
I now have point and spot lights working (without shadows) in the Vulkan renderer. Here are the results, with both "Physically-based rendering" (PBR) and Blinn-Phong shaders: Without the IBL contribution it's not terribly impressive, but this is progress.
Vulkan is pretty wonderful because I can take all the optimal techniques I worked out in OpenGL and it just makes everything much faster. I've successfully completed the implementation of early Z-pass, which is important for our lighting system. We are using a forward clustered renderer, similar to the technique id Software's new DOOM games use. Because the fragment shader is fairly intensive, a depth pre-pass is rendered to ensure we only process each screen pixel once.
This technique eliminates overdraw, so the expensive lighting calculations never run on pixels that end up hidden behind other geometry.
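Conceptually, the frame looks something like this. This is a simplified sketch of the general technique with stand-in helper functions, not the engine's actual code:

#include <vector>

struct Batch { /* mesh + material data */ };

// Hypothetical helpers standing in for the real renderer:
void DrawDepthOnly(const Batch&) {}        // writes depth, no shading
void DrawShadedDepthEqual(const Batch&) {} // full shading, depth test set to EQUAL

void RenderFrame(const std::vector<Batch>& opaqueBatches)
{
    // Pass 1: depth pre-pass. Cheap, resolves which surface is frontmost.
    for (const auto& batch : opaqueBatches) DrawDepthOnly(batch);

    // Pass 2: shading pass. The expensive clustered-lighting fragment shader
    // runs exactly once per visible pixel, because hidden fragments fail the
    // EQUAL depth test.
    for (const auto& batch : opaqueBatches) DrawShadedDepthEqual(batch);
}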
I have basic point lights working in the Vulkan renderer now. There are no shadows or any type of reflections yet. I need to work out how to set up a depth pre-pass. In OpenGL this is very simple, but in Vulkan it requires another complicated mess of code. Once I do that, I can add in other light types (spot, box, and directional) and pull in the PBR lighting shader code. Then I will add support for a cubemap skybox and reflections, and then I will upload another update to the beta.
The beta of our new game engine has been updated with a new renderer built with the Vulkan graphics API, and all OpenGL code has been removed. Vulkan provides us with low-overhead rendering that delivers a massive increase in rendering performance. Early benchmarks indicate as much as a 10x improvement in speed over the Leadwerks 4 renderer.
The new engine features a streamlined API with modern C++ features and an improved binding library for Lua. Here's a simple C++ program in Turbo:
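A minimal sketch of what such a program might look like, assuming a Leadwerks-style API where Create* functions return shared pointers; the header name and function names here are assumptions, not the final API:

// Hypothetical sketch of a simple Turbo program; names are assumed.
#include "Turbo.h"

using namespace Turbo;

int main(int argc, const char* argv[])
{
    // Create a window, rendering context, and world.
    auto window = CreateWindow("MyGame", 0, 0, 1280, 720);
    auto context = CreateContext(window);
    auto world = CreateWorld();

    // A camera and a light so something is visible.
    auto camera = CreateCamera(world);
    camera->Move(0, 0, -3);
    auto light = CreateLight(world, LIGHT_DIRECTIONAL);
    light->SetRotation(35, 45, 0);

    // Something to look at.
    auto box = CreateBox(world);

    // Main loop: update the world and render it until the window closes.
    while (!window->Closed())
    {
        world->Update();
        world->Render(context);
    }
    return 0;
}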
The Vulkan renderer now supports new texture compression formats that can be loaded from DDS files. I've updated the DDS loader to support newer versions of the format with new features.
BC5 is a format ATI invented (originally called ATI2 or 3Dc) which is a two-channel compressed format specifically designed for storing normal maps. This gives you better quality normals than what DXT compression (even with the DXT5n swizzle hack) can provide.
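For illustration, here is the standard way the missing Z component is reconstructed from the two stored channels; this is a general-purpose sketch, not engine code:

#include <algorithm>
#include <cmath>

struct Vec3 { float x, y, z; };

// BC5 stores only X and Y of a unit-length normal; Z is rebuilt on the fly.
Vec3 DecodeBC5Normal(float r, float g) // r, g are the sampled channels in [0, 1]
{
    Vec3 n;
    n.x = r * 2.0f - 1.0f; // expand to [-1, 1]
    n.y = g * 2.0f - 1.0f;
    n.z = std::sqrt(std::max(0.0f, 1.0f - n.x * n.x - n.y * n.y)); // |n| == 1
    return n;
}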
BC7 is interesting because it uses the same amount of memory as DXT5 but provides much higher image quality.
I have it worked out now where the new engine will handle multiple shaders. The renderer groups meshes (renamed from "surfaces" in Leadwerks) by shader. A single draw call renders many batches of instances, with different materials applied. It's a very advanced and complex system, so something that used to be simple, like changing a shader, now requires a lot of code to make work! You can see here the barbed wire is using an alpha-discard shader that removes pixels while the rest of the scene uses the standard opaque shader.
I now have different materials with textures working in Vulkan. The API allows us to access every loaded texture in any shader, although some Intel chips have limitations and will require a fallback. This is interesting because some of our design decisions in Leadwerks 4 were made because we had a limit of 16 textures a shader could access. Terrain clipmaps were a good solution to this problem, but since the same limitations no longer exist it may be time to revisit this design. We could, for example, rethink how terrain textures are handled now that the limit is gone.
In Turbo (Leadwerks 5) all asset types have a list of asset loader objects for loading different file formats. There are a number of built-in loaders for different file formats, but you can add your own by deriving the AssetLoader class or creating a script-based loader. Another new feature is that any scripts in the "Scripts/Start" folder get run when your game starts. Put those together, and you can add support for a new model or texture file format just by dropping a script in your project.
Having completed a hard-coded rendering pipeline for one single shader, I am now working to create a more flexible system that can handle multiple material and shader definitions. If there's one way I can describe Vulkan, it's "take every single possible OpenGL setting, put it into a structure, and create an immutable cached object based on those settings that you can then use and reuse". This design is pretty rigid, but it's one of the reasons Vulkan is giving us an 80% performance increase over our OpenGL renderer.
I finally got a textured surface rendering in Vulkan, so we have now officially surpassed StarFox (SNES) graphics:
Although StarFox did have distance fog.
Vulkan uses a sort of "baked" graphics pipeline. Each surface you want to render uses an object you have to create in code that contains all material, texture, shader, and other settings. There is no concept of "just change this one setting" like in OpenGL. Consequently, the new renderer may be a bit more rigid than what OpenGL developers are used to.
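A sketch of what that ends up meaning in practice: every combination of settings becomes a pipeline object you create once and reuse. The caching scheme below is illustrative, not the engine's actual code:

#include <map>
#include <tuple>
#include <vulkan/vulkan.h>

// The key bundles state that would be free-floating glEnable/glBind calls in OpenGL.
using PipelineKey = std::tuple<VkShaderModule,     // vertex shader
                               VkShaderModule,     // fragment shader
                               VkCullModeFlagBits, // cull mode
                               VkBool32>;          // depth write enabled

std::map<PipelineKey, VkPipeline> pipelineCache;

VkPipeline GetPipeline(VkDevice device, const PipelineKey& key)
{
    auto it = pipelineCache.find(key);
    if (it != pipelineCache.end()) return it->second; // reuse the baked object

    VkPipeline pipeline = VK_NULL_HANDLE;
    // ...fill out VkGraphicsPipelineCreateInfo from the key and call
    // vkCreateGraphicsPipelines(device, VK_NULL_HANDLE, 1, &info, nullptr, &pipeline);
    pipelineCache[key] = pipeline;
    return pipeline;
}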
It is now possible to compile shaders into a single self-contained file that can be loaded by any Vulkan program, but it's not obvious how this is done. After poking around for a while, I found all the pieces I needed to put this together.
Compiling
First, you need to compile each shader stage from a source code file into a precompiled SPIR-V file. There are several tools available to do this, but I prefer glslangValidator because it supports the Google #include extension. Put your vertex and fragment shader source into separate files and compile each stage into SPIR-V (for example, glslangValidator -V shader.vert -o vert.spv).
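Once compiled, the .spv file can be loaded into a shader module with a few lines of standard Vulkan code. A minimal sketch:

#include <vulkan/vulkan.h>
#include <fstream>
#include <vector>

// Read a compiled SPIR-V file from disk and create a VkShaderModule from it.
VkShaderModule LoadShaderModule(VkDevice device, const char* path)
{
    std::ifstream file(path, std::ios::binary | std::ios::ate);
    if (!file.is_open()) return VK_NULL_HANDLE;

    size_t size = (size_t)file.tellg();
    std::vector<char> code(size);
    file.seekg(0);
    file.read(code.data(), size);

    VkShaderModuleCreateInfo info = {};
    info.sType = VK_STRUCTURE_TYPE_SHADER_MODULE_CREATE_INFO;
    info.codeSize = size; // size in bytes
    info.pCode = reinterpret_cast<const uint32_t*>(code.data());

    VkShaderModule shaderModule = VK_NULL_HANDLE;
    if (vkCreateShaderModule(device, &info, nullptr, &shaderModule) != VK_SUCCESS)
        return VK_NULL_HANDLE;
    return shaderModule;
}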
I was going to write about my thoughts on Vulkan, about what I like and don't like, what could be improved, and what ramifications this has for developers and the industry. But it doesn't matter what I think. This is the way things are going, and I have no say in that. I can only respond to these big industry-wide changes and make it work to my advantage. Overall, Vulkan does help us, in both a technical and business sense. That's as much as I feel like explaining.
Beta subscribers ca
Vulkan gives us explicit control over the way data is handled in system and video memory. You can map a buffer into system memory, modify it, and then unmap it (giving it back to the GPU), but it is very slow to have a buffer that both the GPU and CPU can access. Instead, you can create a staging buffer that only the CPU can access, then use that to copy data into another buffer that can only be read by the GPU. Because the GPU buffer may be in use at the time you want to copy data to it, it is best to record the copy in a command buffer so the transfer happens on the GPU's own timeline.
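Here is a minimal sketch of that staging pattern, assuming the buffers and command buffer have already been created (error checking and synchronization omitted):

#include <vulkan/vulkan.h>
#include <cstring>

// Copy data into a device-local (GPU-only) buffer via a host-visible staging buffer.
void UploadBufferData(VkDevice device, VkCommandBuffer cmd,
                      VkDeviceMemory stagingMemory, VkBuffer stagingBuffer,
                      VkBuffer gpuBuffer, const void* data, VkDeviceSize size)
{
    // 1. Map the CPU-visible staging memory and write the data into it.
    void* mapped = nullptr;
    vkMapMemory(device, stagingMemory, 0, size, 0, &mapped);
    memcpy(mapped, data, (size_t)size);
    vkUnmapMemory(device, stagingMemory);

    // 2. Record a transfer into the GPU-only buffer. The copy runs when the
    //    command buffer is submitted, so the GPU decides when the destination
    //    buffer is actually touched.
    VkBufferCopy region = {};
    region.size = size;
    vkCmdCopyBuffer(cmd, stagingBuffer, gpuBuffer, 1, &region);
}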
I've now got the Vulkan renderer drawing multiple different models in one single pass. This is done by merging all mesh geometry into one single vertex and index buffer and using indirect drawing. I implemented this originally in OpenGL and was able to translate the technique over to Vulkan. This can allow an entire scene to be drawn in just one or a few draw calls, which will make a tremendous improvement in performance in complex scenes like The Zone. In that scene in Leadwerks, the slow step is the sheer number of draw calls.
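A rough sketch of the indirect drawing part (illustrative, not the engine's code): each mesh gets a command pointing into the shared buffers, and a single call submits them all.

#include <vulkan/vulkan.h>
#include <vector>

struct MergedMesh { uint32_t indexCount, firstIndex; int32_t vertexOffset; };

void DrawScene(VkCommandBuffer cmd, VkBuffer indirectBuffer,
               VkDrawIndexedIndirectCommand* mappedIndirect,
               const std::vector<MergedMesh>& meshes)
{
    // One command per mesh, all pointing into the shared vertex/index buffers.
    for (size_t i = 0; i < meshes.size(); ++i)
    {
        mappedIndirect[i].indexCount    = meshes[i].indexCount;
        mappedIndirect[i].instanceCount = 1;
        mappedIndirect[i].firstIndex    = meshes[i].firstIndex;
        mappedIndirect[i].vertexOffset  = meshes[i].vertexOffset;
        mappedIndirect[i].firstInstance = 0;
    }

    // A single call issues every draw; the GPU reads the parameters from the buffer.
    vkCmdDrawIndexedIndirect(cmd, indirectBuffer, 0,
                             (uint32_t)meshes.size(),
                             sizeof(VkDrawIndexedIndirectCommand));
}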
Using my box test of over 100,000 boxes, I can compare performance in the new engine using OpenGL and Vulkan side by side. The results are astounding.
Our new engine uses extensive multithreading to perform culling and rendering on separate threads, bringing down the time the GPU sits around waiting for the CPU to nearly zero.
Hardware: Nvidia GeForce GTX 1070 (notebook)
OpenGL: ~380 FPS
Vulkan: 700+ FPS. FRAPS does not work with Vulkan, so the only FPS counter I have is the one built into the engine.
Following this tutorial, I have managed to add uniform buffers into my Vulkan graphics pipeline. Since each image in the swapchain has a different graphics pipeline object, and uniform buffers are tied to a pipeline, you end up uploading all the data three times every time it changes. OpenGL might be doing something like this under the hood, but I am not sure this is a good approach. There are three ways to get data to a shader in Vulkan. Push constants are synonymous with GLSL uniforms, although the amount of data they can hold is very limited.
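As a small illustration of the push constant path (general Vulkan usage, not the engine's code):

#include <vulkan/vulkan.h>

// Push a small amount of per-draw data (here a 4x4 matrix) into the shaders.
// The pipeline layout must declare a matching VkPushConstantRange.
void PushModelMatrix(VkCommandBuffer cmd, VkPipelineLayout layout, const float matrix[16])
{
    vkCmdPushConstants(cmd, layout, VK_SHADER_STAGE_VERTEX_BIT,
                       0, sizeof(float) * 16, matrix);
}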
The Vulkan graphics API is unbelievably complex. To create a render context, you must create a series of images for the front and back buffers (you can create three for triple-buffering). This is called a swap chain. Now, Vulkan operates on the principle of command buffers, which are lists of commands that get sent to the GPU. Guess what? The target image is part of the command buffer! So for each image in your swap chain, you need to maintain a separate command buffer. If anything changes in your scene, all of those command buffers have to be re-recorded.
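For illustration, allocating one command buffer per swap chain image looks something like this (plain Vulkan, not the engine's code):

#include <vulkan/vulkan.h>
#include <vector>

// One command buffer per swap chain image, since each image is baked into the
// commands that render to it.
std::vector<VkCommandBuffer> CreateCommandBuffers(VkDevice device,
                                                  VkCommandPool pool,
                                                  uint32_t swapchainImageCount)
{
    std::vector<VkCommandBuffer> buffers(swapchainImageCount);

    VkCommandBufferAllocateInfo info = {};
    info.sType = VK_STRUCTURE_TYPE_COMMAND_BUFFER_ALLOCATE_INFO;
    info.commandPool = pool;
    info.level = VK_COMMAND_BUFFER_LEVEL_PRIMARY;
    info.commandBufferCount = swapchainImageCount;

    vkAllocateCommandBuffers(device, &info, buffers.data());
    return buffers;
}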