Everything posted by klepto2

  1. If you have post effects assigned to a camera, wireframe mode does not render correctly. The only thing visible in wireframe mode seems to be the post-effect quad.

#include "UltraEngine.h"
#include "ComponentSystem.h"
//#include "Steamworks/Steamworks.h"

using namespace UltraEngine;

const WString remotepath = "https://raw.githubusercontent.com/UltraEngine/Documentation/master/Assets";

int main(int argc, const char* argv[])
{
#ifdef STEAM_API_H
    if (not Steamworks::Initialize())
    {
        RuntimeError("Steamworks failed to initialize.");
        return 1;
    }
#endif

    RegisterComponents();

    auto cl = ParseCommandLine(argc, argv);

    //Load FreeImage plugin (optional)
    auto fiplugin = LoadPlugin("Plugins/FITextureLoader");

    //Get the primary display
    auto displays = GetDisplays();

    //Create a window
    auto window = CreateWindow("Ultra Engine", 0, 0, 1280, 720, displays[0], WINDOW_CENTER | WINDOW_TITLEBAR);

    //Create a rendering framebuffer
    auto framebuffer = CreateFramebuffer(window);

    //Create a world
    auto world = CreateWorld();
    world->SetAmbientLight(0);

    //Create a camera
    auto camera = CreateCamera(world);
    camera->SetPosition(0, 0, -1);
    camera->SetFov(70);
    camera->SetClearColor(0.125);
    camera->SetTessellation(4);

    //Create a light
    auto light = CreateBoxLight(world);
    light->SetRange(-10, 10);
    light->SetRotation(35, 35, 0);
    light->SetColor(4);

    //Display material
    auto model = CreateCubeSphere(world, 0.5, 8, MESH_QUADS);
    auto mtl = LoadMaterial(remotepath + "/Materials/Ground/rocks_ground_02.json");
    mtl->SetTessellation(true);
    mtl->SetDisplacement(0.075f);
    model->SetMaterial(mtl);

    bool isWireFrame = false;
    bool pfxEnabled = false;

    //Main loop
    while (window->Closed() == false and window->KeyDown(KEY_ESCAPE) == false)
    {
        if (window->KeyDown(KEY_DOWN)) camera->Move(0, 0, -0.01);
        if (window->KeyDown(KEY_UP)) camera->Move(0, 0, 0.01);

        if (window->KeyHit(KEY_F1))
        {
            isWireFrame = !isWireFrame;
            camera->SetWireframe(isWireFrame);
        }

        if (window->KeyHit(KEY_F2))
        {
            pfxEnabled = !pfxEnabled;
            if (pfxEnabled)
            {
                camera->AddPostEffect(LoadPostEffect("Shaders/SSAO.fx"));
            }
            else
            {
                camera->ClearPostEffects();
            }
        }

        world->Update();
        world->Render(framebuffer);

#ifdef STEAM_API_H
        Steamworks::Update();
#endif
    }

#ifdef STEAM_API_H
    Steamworks::Shutdown();
#endif

    return 0;
}
  2. I would suggest a shortcut (toolbar icon) for the world/scene settings, which are currently located under Edit/World Settings.
  3. Just an idea: now that the GLSL is normally compiled at runtime, we could use that fact to introduce global defines which are added to the preprocessed shaders. This could also be extended to add functionality or inject custom macros, but that might be a bit of overkill for now. I could imagine a system like this:

//Before loading any shader, a function needs to be called to register global definitions.
//As it will be an advanced feature, I would place it like this:
Render::GraphicsEngine::instance->AddShaderDefine("Editor");
Render::GraphicsEngine::instance->AddShaderDefine("saturate(x) clamp(x,0,1)");

The generated GLSL code would look like this:

#version 460
...
#define Editor
#define saturate(x) clamp(x,0,1)

void main()
{
#ifdef Editor
    ....
#endif
}

Just an idea.
  4. Maybe you should revert the changes to this for now; currently the textures are not working anymore, no matter what uniform type I use. Previously the uvec2 approach at least worked.
  5. With this:

#extension GL_ARB_gpu_shader_int64 : enable

you can use uint64_t directly in the shaders. Also, the uvec2 approach might not work on Mac systems, as it is bound to little-endian machines if I remember correctly. Also, with the latest update, SetUniform with textures isn't working anymore. I assumed that with this update it should be possible to go both ways, but neither uvec2 nor uint64_t is working. When I debug the program with NVIDIA Nsight, the handles are all 0 or (0,0) and I don't see the bound samplers in the shader resources. This code seems to be wrong; it should be more like:

void RenderShader::SetUniform(const int location, std::shared_ptr<RenderTexture> tex)
{
    if (not Finalize()) return;
    if (location < 0 or location >= uniforms.size()) return;
    auto& u = uniforms[location];
    if (u.defined and u.resource == tex) return;
    if (u.size != 1) return;
    GLuint64 handle = 0;
    switch (u.type)
    {
    case GL_UNSIGNED_INT_VEC2:
        if (tex) handle = tex->GetHandle();
        glProgramUniform2uiv(program, u.location, 1, (GLuint*)&handle);
        break;
    default:
        //Default, because the handle can be interpreted as uint64_t and all sampler*D types
        if (tex) handle = tex->GetHandle();
        glProgramUniformHandleui64ARB(program, u.location, handle);
        break;
    }
    glCheckError();
    u.defined = true;
    u.resource = nullptr;
    if (handle) u.resource = tex;
}

Also some notes about the bindless_sampler/bindless_image qualifiers: according to the ARB_bindless_texture spec, the default qualifier for samplers or images is the bound qualifier, which is the original flow with glActiveTexture etc. If you use the bindless_* qualifiers, it is possible to use either the original flow or the bindless flow. A minimal shader-side sketch follows below.
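A minimal GLSL sketch of the shader side described above (the uniform name is a placeholder, and whether the uint64_t constructor is available depends on the driver exposing the int64 interaction of ARB_bindless_texture):

#version 450
#extension GL_ARB_bindless_texture : enable
#extension GL_ARB_gpu_shader_int64 : enable

//Hypothetical uniform, set from the host with glProgramUniformHandleui64ARB (or as a plain 64-bit value)
uniform uint64_t colorHandle;

in vec2 texCoords;
layout(location = 0) out vec4 outColor;

void main()
{
    //ARB_bindless_texture allows constructing a sampler type from the 64-bit handle
    sampler2D colorMap = sampler2D(colorHandle);
    outColor = texture(colorMap, texCoords);
}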
  6. Actually this should be possible, as you cast the sampler back to uint.
  7. fog settings: added some fog to improve the realism of the atmosphere density lowering with height.
  8. It works nicely, but I have found a few things to improve:

- If a uniform is set on a faulty shader (compiled with errors), an exception is thrown and you can't see the actual compilation errors.
- With this in mind, the compile error message should also provide the filename instead of the index of the file (I have, for instance, a shader with around 30 includes in it).
- Please also provide uniform handling for matrices.
- The bindless texture handling can be made much easier: when passed to a shader, a bindless texture can be used the same way as normal uniform texture bindings if you use one of these methods:

void glUniformHandleui64ARB(GLint location, GLuint64 value);
void glUniformHandleui64vARB(GLint location, GLsizei count, const GLuint64 *value);
void glProgramUniformHandleui64ARB(GLuint program, GLint location, GLuint64 value);
void glProgramUniformHandleui64vARB(GLuint program, GLint location, GLsizei count, const GLuint64 *values);

With these methods a shader can consume bindless textures (used as uniforms) with this syntax:

layout(bindless_sampler) uniform sampler2D bindless;
layout(bindless_image) uniform image2D bindless2;

With this, the textures can be accessed without casting them in the shader itself.
  9. With the changes in 0.9.5 and the switch to OpenGL, I noticed that SetPostEffectParameter has changed, which was very logical to me: instead of using matrix offsets to provide the parameters, the logic behind the scenes now seems to be plain uniform variables. Long story short, they currently have no effect. Small snippet using the GodRays shader, which exposes the "Exposure" uniform; using this code doesn't change the exposure value:

int pidex = camera->AddPostEffect(LoadPostEffect("Shaders/GodRays.fx"));
camera->SetPostEffectParameter(pidex, "Exposure", 10.0f);
  10. I have found some small but annoying bugs with the current terrain system. As you mentioned, the texture sizes must currently be 1k*1k, but it seems that they also need to be BC7 encoded, or R8_UNORM for displacement:

- Painting leads to black results when the textures are not in these formats.
- Materials with metal/roughness textures seem not to be working correctly; the terrain becomes very shiny if those are set in the material. sample.zip
- Currently the terrain doesn't use tessellation when enabled.
- If you change materials within the editor, the changes are only shown when you restart the editor and reload the map.

Suggestions:

- Convert the provided textures to the required format and sizes before building the texture atlas.
- Add an option to convert materials and textures to the requirements of the terrain.
  11. Ok, that makes sense, but in that case a dynamic calculation of how many frames need to be generated may be needed. While 2 frames are enough for SSR (previous + current frame), it wouldn't be enough if you use the full power of reprojection and regenerate the whole view over multiple frames. In my case the editor would need to render 16 frames to update the viewport. Maybe something like the premultiplyalpha option in a posteffect file would be needed, even if it's for the editor alone. Maybe something like: EditorRedrawFrames: 16 or so. Then the editor could use the maximum of that value based on the post effects attached.
  12. Also, it should work with the environment; it might be that it interferes with your atmosphere shader. It uses a sphere if I remember correctly.
  13. You can try to change the exposure and gamma correction parameters in the fragment shader:

const float G_SCATTERING_DEFAULT = 0.7f;
const float G_SCATTERING_DIRECTIONAL = 0.1f;
const float exposure = 20.0;
const float gamma = 1.1;

Also the function writeToPixel in the fragment shader (I added a return true for the editor):

bool writeToPixel(vec2 fragCoord)
{
    //return true;
    ivec2 iFragCoord = ivec2(fragCoord);
    uint index = CurrentFrame % BAYER_LIMIT;
    return (((iFragCoord.x + BAYER_LIMIT_H * iFragCoord.y) % BAYER_LIMIT) == bayerFilter_[index]);
}

Comment it out to use the reprojection; it should be a bit faster.
  14. @Josh I need to reopen this: the top sample uses a lot of memory again (0.9.5), and after a short time period it also produces INVALID_VALUE errors. I know the rendering is async, but the instance count always says 505 instances instead of 100 (maybe +1 for the camera).
  15. local window = CreateWindow("TEST LEVEL", 0, 0, 1280, 720, displays[1], WINDOW_CENTER | WINDOW_TITLEBAR | WINDOW_FULLSCREEN)
  16. While testing the new Effect feature in the editor I encountered a small problem with my posteffect shaders. Some of my effects use a technique called reprojection; this is a very basic implementation but it improves the performance a lot. What it does is use a 4*4 Bayer filter to determine which pixel needs to be calculated, based on the fragment coordinate and the current frame. If a pixel is not meant to be calculated, it reads the result from the previous frame (reprojecting the current uv to the previous uv). This means that over a period of 16 frames the whole buffer is calculated (see the sketch below). Unfortunately this is not working in the editor. I believe it might be due to the fact (correct me @Josh if I am wrong) that the editor doesn't use async rendering and the camera is not rendered in realtime. So if we could get a flag which indicates if a shader is invoked from the editor, we could disable some features (reprojection in this case) and use the pfx correctly in the editor. I will write a very basic shader which will simulate the error and attach it later here.
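For illustration, a minimal sketch of the Bayer-filter reprojection described above (the previous-frame input and the CurrentFrame counter are placeholder assumptions, not the actual engine inputs):

#version 450

uniform sampler2D previousFrame;   //hypothetical: result of the previous frame
uniform int CurrentFrame;          //hypothetical: running frame counter

const int BAYER_LIMIT   = 16;
const int BAYER_LIMIT_H = 4;
const int bayerFilter[BAYER_LIMIT] = int[]
(
     0,  8,  2, 10,
    12,  4, 14,  6,
     3, 11,  1,  9,
    15,  7, 13,  5
);

in vec2 texCoords;
layout(location = 0) out vec4 outColor;

//Only 1/16th of the pixels are recalculated each frame
bool writeToPixel(ivec2 fragCoord)
{
    int index = CurrentFrame % BAYER_LIMIT;
    return ((fragCoord.x + BAYER_LIMIT_H * fragCoord.y) % BAYER_LIMIT) == bayerFilter[index];
}

void main()
{
    if (writeToPixel(ivec2(gl_FragCoord.xy)))
    {
        outColor = vec4(1.0); //the expensive effect would be computed here
    }
    else
    {
        //Reuse the previous frame's result; a moving camera would reproject
        //texCoords with the previous view-projection matrix instead
        outColor = texture(previousFrame, texCoords);
    }
}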
  17. Ok, now the SSR setting doesn't disable the sky and pfx anymore, but I can't see any SSR reflections with the current build (Steam and standalone). The project is a clean project to test the SSR, so everything should be up to date.
  18. You can add your own IOSystem to the importer like this:

Assimp::AndroidJNIIOSystem *ioSystem = new Assimp::AndroidJNIIOSystem(app->activity);
if (nullptr != ioSystem)
{
    importer->SetIOHandler(ioSystem);
}

https://assimp-docs.readthedocs.io/en/latest/usage/use_the_lib.html#using-custom-io-logic-with-the-c-class-interface
  19. With the latest release, when you enable SSR on a camera, the skybox is disabled and no post effects are rendered (both Steam and standalone).
  20. Download: VolumetricLighting0_9_5.zip

This is a small but powerful posteffect with "real" volumetric lighting for all light types. It shows some basic but powerful features of the Ultra Engine pipeline:

- Performance
- Accessibility (you can get nearly everything in a shader at any time)

The shader itself features volumetric lighting based on the included shadowmap lookups and performs some calculations more or less like the screenspace godrays, but it does not depend on the backbuffer colors and uses real shadow casting instead. It also calculates only 1/16th of the buffer (plus out-of-scope pixels) per frame and reprojects the result of the previous buffer if possible; a rough sketch of the general idea follows below. The shader itself is currently more or less for demonstration usage only, because there are no volumetric settings for lights available and adding these would require some kind of Lua or C++ backend.
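As a rough illustration of the general technique (not the downloadable shader itself; all uniforms and inputs here are assumptions for the sketch), volumetric lighting via shadow-map lookups boils down to marching along the view ray and accumulating in-scattered light weighted by a Henyey-Greenstein phase term:

#version 450

//Hypothetical inputs, not the actual Ultra Engine bindings
uniform sampler2DShadow shadowMap;   //shadow map of the light
uniform mat4 lightSpaceMatrix;       //world space -> light clip space
uniform vec3 cameraPosition;
uniform vec3 lightDirection;
uniform vec3 lightColor;

const int   STEP_COUNT   = 32;
const float G_SCATTERING = 0.7;
const float PI           = 3.14159265;

//Henyey-Greenstein phase function: how strongly light scatters towards the viewer
float Phase(float cosTheta, float g)
{
    float g2 = g * g;
    return (1.0 - g2) / (4.0 * PI * pow(1.0 + g2 - 2.0 * g * cosTheta, 1.5));
}

//March from the camera to the given world position and accumulate lit samples
vec3 Raymarch(vec3 worldPos)
{
    vec3 ray       = worldPos - cameraPosition;
    vec3 stepVec   = ray / float(STEP_COUNT);
    vec3 pos       = cameraPosition;
    float cosTheta = dot(normalize(ray), -lightDirection);
    vec3 fog = vec3(0.0);
    for (int i = 0; i < STEP_COUNT; ++i)
    {
        vec4 lightClip   = lightSpaceMatrix * vec4(pos, 1.0);
        vec3 shadowCoord = lightClip.xyz / lightClip.w * 0.5 + 0.5;
        float lit = texture(shadowMap, shadowCoord); //1.0 = lit, 0.0 = in shadow
        fog += lit * lightColor * Phase(cosTheta, G_SCATTERING);
        pos += stepVec;
    }
    return fog / float(STEP_COUNT);
}

in vec3 worldPosition;  //hypothetical: reconstructed world position of the fragment
layout(location = 0) out vec4 outColor;

void main()
{
    outColor = vec4(Raymarch(worldPosition), 1.0);
}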
  21. When you add post effects in the world settings dialog and reorder them, they are added multiple times to the cameras. At first, when you reorder the effects, it looks correct, but if you close the dialog and refresh the perspective viewport you can see artifacts and multiplied effects, e.g. the godrays will get brighter and brighter. When you reopen the settings you now see the effects multiple times (multiplied by the number of reorders you did previously).

Before reordering:

And after reordering multiple times:

As an additional request hidden in here: expand the posteffect API with remove and sorting capabilities. Currently you can only add post effects and clear all of them, but not remove single items.
  22. Just some small things in the world settings:

- Skybox/specular and diffuse are not loaded from previous maps.
- After assigning skybox/specular and diffuse textures and reopening the world settings, the skybox value is empty.
- The posteffect tab is completely empty and nothing can be done in the tab except closing the window.
- When you open the world settings dialog, the first tab is selected, but the content shown is the content of the last tab opened previously (open settings ⇾ select fog ⇾ close ⇾ open again ⇾ selected tab is environment, but content is the fog settings).
  23. With the latest release 0.9.5 (Build 567) it seems the lib files are damaged. Both debug and release.