Ultra Engine testing


Josh

First crack at environment probes. The important thing is the scene color is rendering to the texture and the probe effect is showing up in the scene. The rest is just details.

(screenshot: environment probe test scene)


My job is to make tools you love, with the features you want, and performance you can't live without.


8 minutes ago, SpiderPig said:

The KinematicJoint example crashes a few moments in and in my game, passing a model to it makes it return nullptr.  Is there still some work to do on physics?

Fixed


On 9/18/2022 at 12:25 PM, Josh said:

To load the Leadwerks material, you might need to first load the Leadwerks plugin. However, some of the material properties are up in the air right now, so it still might not work.

Just thought I'd mention that loading the LELegacy plugin didn't fix the issue. I get the same error loading this JSON one too:

{
	"material":
	{
		"color": [1,1,1,1],
		"emission": [0,0,0],
		"metallic": 1,
		"roughness": 1,
		"textures":
		[
			{
				"slot": "BASE",
				"file": "./Rock001_DIFF.dds",
				"filter": "LINEAR",
				"clamp": [false, false, false]
			},
			{
				"slot": "NORMAL",
				"file": "./Rock001_NORM.dds",
				"scale": 1
			},
			{
				"slot": "BRDF",
				"file": "Materials/BRDF/default.dds"
			}
		]
	}
}

 


Try this:

{
	"material":
	{
		"color": [1,1,1,1],
		"emission": [0,0,0],
		"metallic": 1,
		"roughness": 1,
		"texture0": "Rock001_DIFF.dds"
		"texture1": "Rock001_NORM.dds"
	}
}

 


I have the basics of environment probes worked out. The forward renderer allows much better graphics because I can evaluate all the probes at once.

Each probe is a box of any dimensions, with an adjustable padding value. This is the distance from the edge where the reflection effect starts to fade until it exits the volume.

If two probes overlap it's no problem. The smaller probe is evaluated first, and if the pixel is in the fade region, lighting from the other probe will be added. You actually want probes to overlap, maybe by 0.5 meters or so, because this gives you a smooth fade from one cubemap to the other. You can also place one probe inside another with no problem. So if you had one big probe in the center of the room, and a small enclosed area inside the big room, you could put a small probe in that area and everything would work as expected.

Probes blend with the skybox with no problem. There's no need to mark indoor/outdoor areas. Probes can work inside a structure, where they are effective at blocking the sky reflection, or they can be used outdoors, where the skybox will appear in the areas where the probe reflection doesn't render any objects.

So you can just place them wherever and they just work, but you can still control how they fade in and out.
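The fade behavior described above can be sketched as a per-axis edge-distance weight. This is an illustrative helper, not Ultra Engine API; names and the linear falloff are assumptions:

```cpp
#include <algorithm>
#include <cassert>

// Sketch of a probe fade weight: a probe is an axis-aligned box, and
// "padding" is the distance from each face over which the probe's
// contribution fades from 1 (fully inside) to 0 (at the face).
float ProbeWeight1D(float p, float boxmin, float boxmax, float padding)
{
    // Distance from the point to the nearest face along this axis
    float edgedist = std::min(p - boxmin, boxmax - p);
    if (edgedist <= 0.0f) return 0.0f;     // outside the volume
    if (edgedist >= padding) return 1.0f;  // inside the inner region
    return edgedist / padding;             // linear fade across the padding
}

float ProbeWeight(const float p[3], const float boxmin[3], const float boxmax[3], float padding)
{
    // The overall weight is limited by the closest face on any axis
    float w = 1.0f;
    for (int i = 0; i < 3; ++i)
        w = std::min(w, ProbeWeight1D(p[i], boxmin[i], boxmax[i], padding));
    return w;
}
```

With a weight like this, a pixel in the smaller probe's fade region gets `w` of that probe's lighting and `1 - w` from the enclosing probe or skybox, which is what produces the smooth cubemap-to-cubemap transition.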


Can't seem to get this to work... is there something I'm doing wrong?

#include "UltraEngine.h"

using namespace UltraEngine;

int main(int argc, const char* argv[])
{
    //Get the displays
    auto displays = GetDisplays();

    //Create a window
    auto window = CreateWindow("Ultra Engine", 0, 0, 800, 600, displays[0], WINDOW_CENTER | WINDOW_TITLEBAR);

    //Create a world
    auto world = CreateWorld();
    world->SetGravity(0, -9.8, 0);

    //Create a framebuffer
    auto framebuffer = CreateFramebuffer(window);

    //Create a camera    
    auto camera = CreateCamera(world);
    camera->SetClearColor(0.125);
    camera->SetPosition(0, 5, -5);

    auto light = CreateLight(world, LIGHT_DIRECTIONAL);
    light->SetRotation(35, 35, 0);

    //Create the ground
    auto ground = CreateBox(world, 100, 0.1, 100);


    auto player = CreateBox(world);
    player->SetPosition(0, 5, 0, true);
    player->SetMass(1.0f);

    //Create kinematic joint
    auto joint = CreateKinematicJoint(Vec3(0, 5, 0), player);
    joint->SetMaxForce(100000);
    joint->SetMaxTorque(1000);

    //Main loop
    while (window->Closed() == false and window->KeyDown(KEY_ESCAPE) == false)
    {
        camera->UpdateControls(window);

        auto pos = player->GetPosition(true);
        if (window->KeyDown(KEY_UP)) {
            auto p = pos + Vec3(0, 1, 0);
            joint->SetTargetPosition(p.x, p.y, p.z);
        }

        if (window->KeyDown(KEY_DOWN)) {
            auto p = pos + Vec3(0, -1, 0);
            joint->SetTargetPosition(p.x, p.y, p.z);
        }

        if (window->KeyDown(KEY_LEFT)) {
            auto p = pos + Vec3(-1, 0, 0);
            joint->SetTargetPosition(p.x, p.y, p.z);
        }

        if (window->KeyDown(KEY_RIGHT)) {
            auto p = pos + Vec3(1, 0, 0);
            joint->SetTargetPosition(p.x, p.y, p.z);
        }

        world->Update();
        world->Render(framebuffer);
    }
    return 0;
}

 


Okay, I might not have implemented kinematic joints in Newton. The engine has Box2D physics built into it as well, and you can see my example was initializing a world using Box2D. (Box2D is not going to be included in the first release because it's not finished, but I am just laying things out for future plans. It's a lot easier for me to account for that feature now than to go back and add it in later.)


Mipmap generation works great now :)

Thanks, Josh.

This is the main code I currently use to set up the environment pipeline:

auto light = CreateLight(world, LIGHT_DIRECTIONAL);
light->SetRotation(35, 45, 0);
light->SetColor(1, 1, 1);

auto environment = initalize_atmosphere(world);

auto envSky = CreateTexture(TEXTURE_CUBE, 1024, 1024, TEXTURE_RGBA32, {}, 6, TEXTURE_STORAGE, TEXTUREFILTER_LINEAR, 0);

auto envSkyWithoutSun = CreateTexture(TEXTURE_CUBE, 1024, 1024, TEXTURE_RGBA32, {}, 6, TEXTURE_STORAGE, TEXTUREFILTER_LINEAR, 0);

auto reflectionTexture = CreateTexture(TEXTURE_CUBE, 256, 256, TEXTURE_RGBA32, {}, 6, TEXTURE_STORAGE | TEXTURE_MIPMAPS, TEXTUREFILTER_LINEAR, 0);

auto diffTexture = CreateTexture(TEXTURE_CUBE, 64, 64, TEXTURE_RGBA32, {}, 6, TEXTURE_STORAGE, TEXTUREFILTER_LINEAR, 0);

EnvironmentSkyContants skypush;
skypush.cameraposition = Vec4(camera->GetPosition(), 0.0);
skypush.lightdir = Vec4(Vec3(-light->matrix.k.x, -light->matrix.k.y, -light->matrix.k.z).Normalize(), 0.0);

auto cshader = ComputeShader::Create("Shaders\\Environment\\simple_test.comp.spv");
cshader->SetupPushConstant(sizeof(EnvironmentSkyContants));
cshader->AddTargetImage(envSky);
cshader->AddTargetImage(envSkyWithoutSun);
cshader->AddUniformBuffer(&environment->_atmosphereData, sizeof(AtmosphereData), false);
cshader->AddSampler(environment->m_transmittance_texture);
cshader->AddSampler(environment->m_irradiance_texture);
cshader->AddSampler(environment->m_scattering_texture);
cshader->AddSampler(environment->m_optional_single_mie_scattering_texture);
cshader->AddSampler(LoadTexture("Materials/Environment/noise.png"));

cshader->BeginDispatch(world, envSky->GetSize().x / 16, envSky->GetSize().y / 16, 6, false, &skypush, sizeof(EnvironmentSkyContants));

auto irrshader = ComputeShader::Create("Shaders\\Environment\\env_irradiance_gen.comp.spv");
irrshader->AddSampler(reflectionTexture);
irrshader->AddTargetImage(diffTexture);

irrshader->BeginDispatch(world, diffTexture->GetSize().x / 16, diffTexture->GetSize().y / 16, 6, false);

vector<EnvironmentSkyReflectionContants> reflShaders;
for (int layer = 0; layer < reflectionTexture->CountMipmaps(); layer++)
{
	auto refshader = ComputeShader::Create("Shaders\\Environment\\env_reflection_gen.comp.spv");
	refshader->AddSampler(envSkyWithoutSun);
	refshader->AddTargetImage(reflectionTexture, layer);
	refshader->SetupPushConstant(sizeof(EnvironmentSkyReflectionContants));
	EnvironmentSkyReflectionContants data;
	//Float division; an integer division here would truncate to 0 for every mip
	data.reflectiondata.x = (float)layer / (float)reflectionTexture->CountMipmaps();
	refshader->BeginDispatch(world, reflectionTexture->GetSize().x / 16, reflectionTexture->GetSize().y / 16, 6, false, &data, sizeof(EnvironmentSkyReflectionContants));
	reflShaders.push_back(data);
}

world->SetEnvironmentMap(envSky, ENVIRONMENTMAP_BACKGROUND);
world->SetEnvironmentMap(reflectionTexture, ENVIRONMENTMAP_SPECULAR);
world->SetEnvironmentMap(diffTexture, ENVIRONMENTMAP_DIFFUSE);

initalize_atmosphere is the method where most of the magic happens: all required lookup textures are pre-generated there for later use and are only calculated once.

The other compute shaders run repeatedly (which can be optimized to only run when parameters have changed).
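The per-mip constant fed to the reflection shader in the loop above can be sketched as a small helper (illustrative, not engine API). Note the float cast, since an integer division by the mip count truncates to zero, and the common prefiltering convention divides by (mipcount - 1) so the last mip maps to roughness 1:

```cpp
#include <cassert>

// Map a mip level of a prefiltered reflection cubemap to a normalized
// roughness in [0, 1]: mip 0 is perfectly smooth, the last mip fully rough.
float MipToRoughness(int level, int mipcount)
{
    if (mipcount <= 1) return 0.0f;
    // Divide by (mipcount - 1), not mipcount, so the last level reaches 1.0
    return float(level) / float(mipcount - 1);
}
```

The shader then samples the reflection cubemap at the mip whose roughness matches the material, interpolating between neighboring levels.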

 

 

  • Windows 10 Pro 64-Bit-Version
  • NVIDIA Geforce 1080 TI

I use some Vulkan helper classes to make it easier to instantiate some of the VK_STRUCTURE_TYPE structs and to handle and check for errors, but everything else for the compute shader is pure Vulkan code I wrote myself (collected from tutorials, of course). Only a small part of my ComputeLibrary comes directly from Ultra Engine (VulkanInstance, Device, ShaderModule, and math).


That would be extremely difficult to do with OpenGL. I think the method of using raw Vulkan code, rather than exposing my internal rendering API, is probably best because you have a lot of external resources to lean on. Vulkan code is no doubt difficult, but it can do whatever you want and it never changes, whereas my rendering classes only perform the specific tasks they are designed for, and changes to the internal rendering code could break external hooks.


Yes, of course it will rely on some Ultra Engine internals like the CommandBuffer or some transfer data, like the VKTexture you implemented for me. But everything else should be handled by the external lib itself. I even use my own VkBuffer instances, because it would be hard to access your internals in the way that would be optimal for the library; you would also need to write things like VKTexture for everything that might possibly interact with Vulkan.

As said earlier, more dynamic access to the posteffect system would be awesome (adding custom textures; since you already have subpasses, you could add some dummy pass which does nothing but pass externally generated textures to the effect).

For the compute shader (this will work for all shaders) I am currently integrating glslc together with a file-watching and logging system, which will recompile the shader code on the fly, either at the start of the program or on a file change. This way you just need to edit the original shader and the .spv is generated at runtime when needed (when the source has changed).


You can already do that:

auto fw = CreateFileSystemWatcher(".");

while (true)
{
    while (PeekEvent())
    {
        const auto ev = WaitEvent();
        if (ev.id == EVENT_FILECREATE or ev.id == EVENT_FILECHANGE)
        {
            auto asset = FindCachedAsset(ev.text);
            if (asset) asset->Reload();
        }
    }
}

That will detect an SPV file change. You could also detect a change to the shader source files and launch a process for the GLSL compiler, but I like to use a bat file so I can easily detect errors.
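For reference, a minimal batch file along those lines might look like this (paths and filenames are illustrative; glslc ships with the Vulkan SDK and returns a nonzero exit code on error):

```shell
@echo off
rem Compile the GLSL compute shader to SPIR-V
glslc Shaders\Environment\simple_test.comp -o Shaders\Environment\simple_test.comp.spv
if errorlevel 1 (
    echo Shader compilation failed
    pause
)
```

With the file watcher above reloading the cached .spv asset, saving the shader source and running this script is enough to see changes live.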


Yes, I found the FileSystemWatcher in your VXRT sample and am already using it.

With glslc, or better libshaderc, you don't need an external process; you can use a C++ API to handle the compilation, get errors, etc. Some people have built runtime descriptor set generators with this ^^.


I don't know for sure if this is a bug, but it seems the PBR reflection is not correctly aligned:

(screenshot: sun position vs. specular reflection alignment)

Green: actual sun/light position

Red: specular reflection is correct

Yellow: actual reflection is off.

Note: all cubemaps use the same layout and are rendered the same way.


  • Josh changed the title to Ultra Engine testing
  • Josh locked this topic
This topic is now closed to further replies.