TheConceptBoy (June 19, 2019)

This honestly may or may not appeal to the more seasoned Leadwerks users who've been around the block, but I find the Material Editor in Unreal and the Shader Editor in Blender to be the only things where I can not only forgive node-based programming but actively welcome it. The ease and speed of material creation in either case outpaces even veteran shader programmers. The sort of stuff I was able to create in UE with materials was simply stunning. This would open LW5 / Turbo users up to a whole new world of accessible visual effects and experimentation.
Josh (June 20, 2019)

I would only implement something like this in the new editor (which I have not started yet).
Josh (June 21, 2019)

This does tie into something I am working with right now, and I would like your feedback. I am downloading some PBR materials from around the web and there doesn't seem to be any real standard. It's much more efficient to pack metallic / roughness / occlusion into a single texture, but it seems like most people ship these as separate greyscale image files. glTF specifies that metal / roughness should go in a single texture, but even there occlusion is sometimes packed in and sometimes not. What do you usually do? Is there any packed standard or common tool to handle this? Do you really mess around dragging and connecting lines between color channels for every single material you create? Doesn't that take ages?
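(For reference, packing separate grayscale maps into one RGB image is a small scripting job. A minimal sketch with Pillow is below; the file names are placeholders, and the R = occlusion, G = roughness, B = metallic layout is the common "ORM" convention, which also lines up with glTF's metallic-roughness texture using the G and B channels.)

```python
# Minimal channel-packing sketch. File names are assumptions; all maps must
# share the same resolution.
from PIL import Image

def pack_orm(ao_path, rough_path, metal_path, out_path):
    ao = Image.open(ao_path).convert("L")        # occlusion -> R
    rough = Image.open(rough_path).convert("L")  # roughness -> G (glTF slot)
    metal = Image.open(metal_path).convert("L")  # metallic  -> B (glTF slot)
    # glTF only mandates G/B; putting occlusion in R lets one image serve
    # both the metallicRoughness and occlusion texture slots.
    Image.merge("RGB", (ao, rough, metal)).save(out_path)

pack_orm("ao.png", "roughness.png", "metallic.png", "orm_packed.png")
```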
TheConceptBoy (June 22, 2019)

Well, yes. I understand that this sort of functionality would be best put into the new engine; I'm all for that. I'm also glad this lines up with your plans and that you're at least considering it. Looking forward to Turbo. Starting to put savings away now. If you're not sure what layout to use, base it on Blender / Unreal. Both of their PBR material layouts are highly versatile and offer great flexibility.
TheConceptBoy (June 22, 2019)

When in doubt, take an example from the big guns. This is the sort of stuff that would make creating visual effects accessible to a significantly broader audience without shader-coding experience. YouTube is filled to the brim with these material tutorials covering all kinds of effects and materials.
Josh (June 22, 2019)

What is the Blender layout? Allegorithmic seems to prefer RGB occlusion / roughness / metallic, although I can't get a straight answer and no one seems to have any idea at all.
Josh (June 22, 2019)

Also, regarding height: why is it not just common practice to pack height into the alpha channel? I have never seen displacement maps with more than one component.
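(Continuing the packing sketch above: a single-channel height map can ride along as alpha. File names are again placeholders.)

```python
# Add an assumed height.png as the alpha channel of the packed ORM texture.
from PIL import Image

orm = Image.open("orm_packed.png").convert("RGB")
height = Image.open("height.png").convert("L")
orm.putalpha(height)        # RGBA = occlusion / roughness / metallic / height
orm.save("orm_height.png")
```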
reepblue (June 22, 2019)

Blender 2.8 has a material node system. It's pretty similar to what you'll find in Unreal. There is no real standard for any of this.
Josh (June 22, 2019)

So we all drag a bunch of lines between squares that represent color channels because we can't make up our minds where to store each value?
reepblue (June 22, 2019)

The glTF model actually loads each image as a separate mask, like how we were doing things in Leadwerks. To be completely honest, this is the easiest method.
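(To check what an individual glTF actually does, the material section is plain JSON and easy to dump. A minimal sketch, assuming a file called model.gltf:)

```python
# List which texture slots each material in a glTF 2.0 file fills.
import json

with open("model.gltf") as f:
    gltf = json.load(f)

for mat in gltf.get("materials", []):
    pbr = mat.get("pbrMetallicRoughness", {})
    slots = {
        "baseColor": pbr.get("baseColorTexture"),
        "metallicRoughness": pbr.get("metallicRoughnessTexture"),  # roughness in G, metallic in B
        "occlusion": mat.get("occlusionTexture"),                  # read from R per the spec
        "normal": mat.get("normalTexture"),
        "emissive": mat.get("emissiveTexture"),
    }
    used = [name for name, tex in slots.items() if tex]
    print(mat.get("name", "<unnamed>"), "->", ", ".join(used) or "no textures")
```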
TheConceptBoy (June 22, 2019)

5 minutes ago, Josh said: "Also, regarding height: why is it not just common practice to pack height into the alpha channel? I have never seen displacement maps with more than one component."

It's more a matter of flexibility. Yes, you could combine them all into one and it would make things easy, but the whole idea is to have the option and flexibility of routing the data before committing to it.
TheConceptBoy (June 22, 2019)

34 minutes ago, Josh said: "So we all drag a bunch of lines between squares that represent color channels because we can't make up our minds where to store each value?"

It's the fact that you have the option to separate them. I've used textures as drivers that way: red was driving one parameter while blue controlled another. It's very flexible.

1 hour ago, Josh said: "What is the Blender layout? Allegorithmic seems to prefer RGB occlusion / roughness / metallic, although I can't get a straight answer and no one seems to have any idea at all."

I'd recommend looking into what is generally used in software like Blender, Maya (Arnold renderer), Substance Painter and Designer, etc. Pick the most commonly seen attributes, and the rest is a cherry on top.
Josh (June 22, 2019)

We should have the same thing for normal maps. All normal maps should require a visual editor to decide which channel represents the x, y, and z attributes. It would give us more options.
Josh (June 22, 2019)

It is very restrictive to have X, Y, and Z correlate to R, G, and B. There may be a mentally impaired and color-blind person in Borneo who prefers to make the green channel represent the Z component of the vector.
Josh (June 22, 2019)

This should be required for every single normal map that anyone imports into the engine, ever.
reepblue (June 22, 2019)

Let's just do separate images for each map and call it a day.
knocks (June 25, 2019)

The mentally impaired and color-blind person in Borneo may well just be experimenting. Creativity becomes muted if everyone connects the dots in the same order. Tools like this encourage experimentation, which I think is a good thing?
Josh (June 25, 2019)

27 minutes ago, knocks said: "Tools like this encourage experimentation, which I think is a good thing?"

Ah, but we have an overarching design goal: speed. Vulkan strongly prefers static structures of settings. Therefore we will do things in a way that optimizes performance.
TheConceptBoy (June 25, 2019)

9 hours ago, Josh said: "Ah, but we have an overarching design goal: speed. Vulkan strongly prefers static structures of settings. Therefore we will do things in a way that optimizes performance."

I mean, that will put the barrier to entry for Turbo a bit higher than other engines when it comes to being able to create effects and materials. Can't you figure out a tool that will prep the materials with that sort of workflow and convert them to a static resource for Vulkan at runtime?
Josh (June 25, 2019)

This is what I am picturing. You drag the images you want onto the slots, and then it saves the results in our optimal format. What do you think?
TheConceptBoy (June 25, 2019)

Just now, Josh said: "This is what I am picturing. You drag the images you want onto the slots, and then it saves the results in our optimal format. What do you think?"

And how do you use the data from these textures afterwards?
Josh (June 25, 2019)

It saves a material file and a few DDS files, wherever you want.
TheConceptBoy (June 25, 2019)

7 minutes ago, Josh said: "It saves a material file and a few DDS files, wherever you want."

Well, the format in which Turbo will use these textures is really inconsequential. If that format is more reliable and efficient, then go for it. The main thing (unless this conversation has trailed off the original subject matter of my first post) is how the data is routed. I'd love to be able to, for example, use the red channel of the diffuse map to drive some properties. Perhaps the red channel can be used as a scalar value to drive metallic intensity, and now I suddenly have a very shiny metal that only reveals the metallic property on the red parts, like this generator here. And I didn't even have to use a dedicated metallic map. Not every object needs all five maps. Why not reuse some of the data from other textures to fill in the gaps? This is where the UE Material Editor shines: you have the flexibility to use the data from textures as you please, reuse data, etc.
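(The red-channel-as-driver idea can also be prototyped offline. A toy sketch with Pillow/NumPy, assuming an albedo.png, rather than anything engine-specific:)

```python
# Derive a metallic mask from the red channel of an assumed albedo map.
import numpy as np
from PIL import Image

albedo = np.asarray(Image.open("albedo.png").convert("RGB"), dtype=np.float32) / 255.0
driver = albedo[..., 0]                             # red channel as a scalar driver
metallic = np.clip(driver * 1.5 - 0.25, 0.0, 1.0)   # arbitrary scale/bias for the mask
Image.fromarray((metallic * 255).astype(np.uint8), mode="L").save("metallic_from_red.png")
```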
Josh (June 25, 2019)

Okay, I believe now we are talking about a different thing. The output of a system like that is going to be a custom shader. So for that we would want something that reads your settings and generates a new pixel shader. I can see where that would be useful, but I don't think we need to go through that whole process for all our materials.
TheConceptBoy (June 25, 2019)

5 minutes ago, Josh said: "Okay, I believe now we are talking about a different thing. The output of a system like that is going to be a custom shader. So for that we would want something that reads your settings and generates a new pixel shader. I can see where that would be useful, but I don't think we need to go through that whole process for all our materials."

As far as I know, there are only one or two base shaders in UE, but it's the flexibility of the routing that makes them versatile in any situation. You want alpha? Plug an alpha map into the Transparency input of the node. You want some roughness on the red bits of that fire truck? Use an RGB node to separate the three color channels and feed the red channel into the Roughness input.
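(That RGB-separate step maps onto an offline split as well; a short sketch with Pillow, with the file name again a placeholder:)

```python
# Split an assumed albedo.png into channels and route the red one to roughness.
from PIL import Image

r, g, b = Image.open("albedo.png").convert("RGB").split()
r.save("roughness_from_red.png")
```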