
Lunarovich

Members
  • Posts

    166
  • Joined

  • Last visited

Profile Information

  • Location
    Lausanne, Switzerland

Recent Profile Visitors

6,567 profile views

Lunarovich's Achievements

Newbie (1/14)

78 Reputation

  1. Hello, in case anyone is interested, here is a template for using Leadwerks C++ with CMake. Leadwerks CMake Template
  2. I was wondering: what will happen to Leadwerks 4? Will it be discontinued? Will it receive updates?
  3. Simply does not work under my (L)ubuntu distro. Crashes at the very start. When I run it from the console I get:

Initializing Lua...
Warning: Lua sandboxing disabled.
Executing file "/home/darko/.steam/steam/steamapps/common/Leadwerks Game Launcher/Scripts/Error.lua"
Executing file "/home/darko/.steam/steam/steamapps/common/Leadwerks Game Launcher/Scripts/Main.lua"
Error: Failed to read file "/home/darko/.steam/steam/steamapps/common/Leadwerks Game Launcher/Scripts/Main.lua".
Error: Failed to execute script "Scripts/Main.lua".
  4. I've just added code that uses shaders and buffers to create a custom texture... http://www.leadwerks.com/werkspace/topic/10311-drawing-dynamic-material-on-models-and-similar/#entry100219
  5. For the sake of completeness, here is how I use a custom shader to procedurally create textures:

-- NB: we have previously created a model and stored it in the model variable.
local surface = model:GetSurface(0)
local material = surface:GetMaterial()
local shader = Shader:Load("Shaders/Model/diffuse.shader")
material:SetShader(shader)

-- Save a reference to the current shader & buffer.
local context = Context:GetCurrent()
local defaultShader = context:GetShader()
local defaultBuffer = Buffer:GetCurrent()

-- Specify a TEXTURE size.
local textureSize = 128

-- Use a custom SHADER to draw a texture on the GPU.
local customShader = Shader:Load("Scripts/Custom/drawimage.shader")

-- Create a custom BUFFER to draw a texture to.
-- static Buffer* Create(const int width, const int height, const int colorcomponents=1,
--                       const int depthbuffer=1, const int multisamplemode=0);//lua
local customBuffer = Buffer:Create(textureSize, textureSize, 1, 1)
Buffer:SetCurrent(customBuffer)

-- Must be called after Buffer:SetCurrent().
context:SetShader(customShader)

-- DRAW the texture.
--context:SetBlendMode(Blend.Alpha)
local texture = customBuffer:GetColorTexture(0)
context:DrawImage(texture, 0, 0)
material:SetTexture(texture, 0)

-- Restore the previous buffer and shader.
Buffer:SetCurrent(defaultBuffer)
context:SetShader(defaultShader)
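The Scripts/Custom/drawimage.shader file itself isn't shown above. Purely as a hypothetical illustration (this is not the actual file, and the uniform names are assumptions borrowed from the post-effect shaders discussed further down this page), a minimal fragment stage for drawing an image into the buffer could simply sample the input texture at the current buffer coordinates:

#version 400

// Hypothetical sketch only; not the real Scripts/Custom/drawimage.shader.
uniform sampler2D texture1; // assumed: the image being drawn
uniform vec2 buffersize;    // assumed: size of the target buffer in pixels

out vec4 fragData0;

void main(void)
{
    // Map the fragment's window coordinates into [0-1] texture space
    // and write the sampled color straight to the output buffer.
    vec2 coords = vec2(gl_FragCoord) / buffersize;
    fragData0 = texture(texture1, coords);
}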
  6. Thanks. I know about that. However, this driver does not support my card. I bought an NVIDIA GeForce today. I just can't recommend AMD cards if you are a Linux user, or at least not cards that are not supported by the new AMDGPU driver. That said, AMDGPU does look promising.
  7. Is there any info on the new AMD driver? I am using Ubuntu 16.04 with the radeon drivers and can't run LE. I don't want to go back to 14.04 and install the fglrx drivers, since those haven't been developed for almost a year now...
  8. I agree with Roland, Josh. You should put your effort into further developing this wonderful engine. However, I do understand your desire to improve the asset side of LE games, and I do think the Workshop is the way to do it. There is also a programmatic side to the poorer content quality of LE games compared to other engines, though. For example, if there were no Shadmar, there would be next to no post-processing effects for LE. What is more, you cannot easily control post-processing effects from code, which goes against the philosophy of LE. That said, I think users would rather see PBR support in LE than a new action figure. Btw, a military action figure encourages LE users to make utterly unoriginal, deja vu FPS games, but that is beside the point in this discussion. Obviously, the Workshop should do the work. I don't know exactly how... Maybe try complete packs? But let's not forget the programmatic side of making your games look nice: more material shaders, more post-processing effects, more CSG options and the like. And let's not forget long-standing bugs like Key::RButton or the context drawing functions. In my opinion, it's ten times more important to fix these bugs than to produce yet another action figure or weapon in a sea already saturated with every kind of boyish fantasy. Btw, I haven't seen a single female LE user...
  9. Thank you! I am definitely thinking about writing a follow-up tutorial, but I do not have time at the moment.
  10. Thanks! Shadmar sure is an everlasting inspiration. And I find LE particularly well suited for post-processing because of its ease of use and transparency: it simply gets out of the way and lets you apply the general knowledge floating around the web. The only problem is the lack of LE-specific intros, which is the issue I'm trying to address with this tutorial.
  11. Thank you! @tjheldna The idea is to explain the LE specifics concerning effects and to give a basic tutorial on post-processing itself, so there are two goals. In the next tutorial, I'll explain how to use math functions and how to make time-based animated effects. Basically, I'll adapt this tutorial for LE. And the rest is up to you: "If you give a man a fish, you feed him for a day. But if you give him a fishing rod, you feed him for a lifetime." Well, that could be a good motto for this tutorial.
  12. You could say so. It is a tutorial on one kind of shader: post-processing effects. They are easier to grasp than model shaders, since you don't need to know linear algebra. Although it wouldn't hurt.
  13. Thank you! I've created a basic, "do nothing" post effect script by taking your pixelate shader and removing the code for the pixelation.
  14. Post effects are shader programs written in the OpenGL Shading Language (GLSL). We use post effects to influence the entire look and feel of a scene. We call these effects "post" effects because they are applied right after the scene has been rendered. They can produce a fish eye effect, color an entire scene in grayscale, etc. In each frame of a game, a post effect iterates through every screen pixel and manipulates its color value. You can find post effect programs in the LE project's Shaders/PostEffects folder. You apply them either via the Root of the scene in the editor, or via the Camera:AddPostEffect() function in code. Here is a screenshot of the fish eye effect I've recently made; you can get the effect via the Workshop. Hopefully, after reading this post, you'll have sufficient knowledge to understand it and make your own cool effects.

So, when you write a post effect shader, the primary, though not the final, goal is to get the color value of the current scene pixel. This is the easy part. Then comes the final goal: to do something interesting with the fetched color value.

GETTING THE PIXEL COLOR VALUE

Let's start with the simplest version of the post effect: the one that does nothing. Our idea is to output pixel color values ready to be displayed on the screen, right after the current scene frame has been processed. We'll fetch the frame's pixel color values and simply forward them further down the rendering pipeline. If everything goes well, we will not notice any difference. If we mess something up, we'll either get unpredictable results or, worse, we won't see anything at all, particularly if our shader fails to compile because of syntax or other errors. A post effect shader consists of a vertex shader and a fragment shader.

The vertex part of the shader:

#version 400

uniform mat4 projectionmatrix;
uniform mat4 drawmatrix;
uniform vec2 position[4];

in vec3 vertex_position;

void main(void)
{
    gl_Position = projectionmatrix * (drawmatrix * vec4(position[gl_VertexID], 0.0, 1.0));
}

The fragment part of the shader:

#version 400

uniform sampler2D texture1;
uniform bool isbackbuffer;
uniform vec2 buffersize;

out vec4 fragData0;

void main(void)
{
    vec2 coords = vec2(gl_FragCoord) / buffersize.xy;
    if (isbackbuffer) coords.y = 1.0 - coords.y;
    fragData0 = texture(texture1, coords);
}

Since we won't be changing the vertex shader, you can ignore it for this tutorial. Let us instead concentrate on the fragment shader.

First of all, what is a fragment? A fragment is a numerical value that defines the color of an individual screen pixel. There can be multiple fragments per pixel, so if you are using an 800 x 600 resolution, there will be at least 800 x 600 fragments. To keep it simple, we will think in terms of one fragment per pixel. Let's say that our current screen resolution is 800 x 600 and we are using one fragment per pixel. That means we will be dealing with 800 x 600 fragments. Now, in every frame of your game, the fragment shader processes each of these fragments. The goal of the fragment shader is to output the final value of the current fragment, which in turn defines the final color of the screen pixel.

The out vec4 fragData0 declaration defines the variable which will store this value. out is a variable modifier that indicates a fragment shader output value. vec4 is the variable type: in GLSL, a color is represented as a 4-dimensional vector with r, g, b and a components. uniform is another variable modifier. It denotes a sort of constant value of the fragment shader.
As you recall, the fragment shader applies in turn to each fragment in each frame. The uniform modifier says that this value stays the same (uniform) for every fragment. Other uniforms used here:

buffersize is a 2-dimensional uniform vector which stores the size of the screen in pixels. In our case, buffersize.x = 800 and buffersize.y = 600.

sampler2D texture1 is another uniform variable. It stores the actual color values of the screen pixels. We can read these values and use them to manipulate the final color values, as we shall soon see. sampler2D is a special GLSL type that stores texture data.

isbackbuffer is a boolean you can safely ignore for all common purposes.

The heart of the fragment shader is its main function. This is where the final fragment color value, a 4-dimensional vector with r, g, b and a components, gets calculated:

void main(void)
{
    vec2 coords = vec2(gl_FragCoord) / buffersize.xy;
    if (isbackbuffer) coords.y = 1.0 - coords.y;
    fragData0 = texture(texture1, coords);
}

gl_FragCoord is a built-in input vec4 that contains the screen (to be more precise, the game window) coordinates of the current fragment. In our case its x range is [0-799] and its y range is [0-599], since we are using an 800 x 600 window resolution. On the other hand, texture1 coordinates are given in so-called normalized space; their range is [0-1]. In order to map fragment coordinates to texture coordinates, we simply divide gl_FragCoord.xy by buffersize.xy. For example, if our current fragment coordinates are x=100 and y=80, then coords.x = 100/800 (0.125) and coords.y = 80/600 (0.133). vec2() is a GLSL constructor that takes the first two components, x and y, of a given vector; recall that gl_FragCoord is a vec4. buffersize.xy is an expression to get the first two components from an existing vector. Since buffersize is already a vec2, we could have simply used buffersize instead of buffersize.xy.

The third line is where we finally set the output color value of the fragment. We use the built-in GLSL texture() function and the normalized ([0-1] range) fragment coordinates stored in coords. texture() is a so-called lookup function: you give it the texture you want to query for a color value and the coordinates of the texel (texture element) you want the color value for. (A texture element is analogous to a fragment: there can be more than one texel per pixel. For the sake of simplicity, we'll just think of a texel as a pixel.) Basically, you just ask a texture to return its color value at certain xy coordinates. In our case, we want the texel with coordinates 0.125 and 0.133. Please recall that texture coordinates are given in normalized space: 0,0 is the bottom left texel and 1,1 is the top right texel.

If everything went well (in particular, if the shader compiled), you won't notice any difference. That is because we simply read every single frame buffer pixel and output its value back to be displayed on the screen.

PROCESSING THE PIXEL COLOR VALUE

Let us now do something more adventurous: let us remove the red color from the final image that will be displayed on the screen.

void main(void)
{
    vec2 coords = vec2(gl_FragCoord) / buffersize.xy;
    if (isbackbuffer) coords.y = 1.0 - coords.y;
    fragData0 = texture(texture1, coords);
    fragData0.r = 0;
}

As you recall, fragData0 is a vec4. Its first component stores the red value of the pixel. Therefore, fragData0.r = 0 simply means that we are setting the red value to zero for every pixel on the screen.
Here is what we've got. Let's now try something even more ambitious: a horizontal red value gradient.

void main(void)
{
    vec2 coords = vec2(gl_FragCoord) / buffersize;
    if (isbackbuffer) coords.y = 1.0 - coords.y;
    fragData0 = texture(texture1, coords);
    fragData0.r = coords.x;
}

As we move further to the right, the screen gets more and more red. That's the effect of the fragData0.r = coords.x; line. Recall that fragData0 and coords components have a [0-1] range. We are, basically, making our red channel dependent on coords.x. The further we get to the right, the bigger the latter gets and, consequently, the more red the red channel receives.

I hope this explains the basics and whets your appetite. Go ahead and try to modify the green and blue components of fragData0. You can also use math functions such as mod, sin, cos, etc., to process fragment color values. To get the current time, add uniform float currenttime to the fragment shader. You can then use it in the main function to make time-based animated effects, as sketched in the example below.

If you have any comments or corrections, feel free to respond. If you make a cool effect, publish it in the Workshop and leave us a note/picture here. Thank you!
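To sketch the time-based idea mentioned above, here is a minimal animated variant of the pass-through shader. It is a sketch only: it assumes the currenttime uniform is supplied in milliseconds; if LE feeds it in another unit, only the divisor changes. Everything else is exactly the fragment shader we have been working with.

#version 400

uniform sampler2D texture1;
uniform bool isbackbuffer;
uniform vec2 buffersize;
uniform float currenttime; // assumed to be in milliseconds

out vec4 fragData0;

void main(void)
{
    vec2 coords = vec2(gl_FragCoord) / buffersize;
    if (isbackbuffer) coords.y = 1.0 - coords.y;
    fragData0 = texture(texture1, coords);

    // Oscillate between 0 and 1; with milliseconds, one full pulse takes
    // roughly 6.3 seconds (2 * pi * 1000 ms).
    float pulse = 0.5 + 0.5 * sin(currenttime / 1000.0);

    // Fade the red channel in and out over time.
    fragData0.r *= pulse;
}

Swapping sin for mod, or driving coords instead of a color channel, gives you scrolling and wobble effects along the same lines.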
  15. Thanks! Works perfectly. To recap: one should use the diffuse model shader (I was using the editor's sprite shader) and put the texture image on a black background in order to use the "light" blend mode.