Ma-Shell

Everything posted by Ma-Shell

  1. For all available methods you should take a look at [LeadwerksInstallDir]/Include/Classes/Buffer.h. You can easily bring it up by pressing "Ctrl"+"," in Visual Studio and then typing "Buffer".
  2. You can look at the files I posted over there: http://www.leadwerks.com/werkspace/topic/11579-3d-screen-mode-for-3d-glasses/#entry83580 If you open StereoRender.cpp from the zip file, you can see how buffers are used. Basically, that file creates a buffer called "leftEye" using:

         Buffer* leftEye = Buffer::Create(context->GetWidth() + 1, context->GetHeight(), 1, 1, 0);

     This buffer can be cleared with leftEye->Clear(). Rendering to this buffer is done by:

         context->Disable();
         leftEye->Enable();
         world->Render();
         leftEye->Disable();
         context->Enable();

     The example then draws the leftEye image over the context's image using:

         int blendModeOld = context->GetBlendMode();
         context->SetBlendMode(Blend::Alpha);
         context->DrawImage(leftEye->GetColorTexture(), 0, 0, context->GetWidth()+1, context->GetHeight()+1);
         context->SetBlendMode(blendModeOld);
  3. First you have to create a new shader with the given code. To do that:
     Navigate to "Shaders/PostEffects" in the "Assets" tab.
     Right-click on the dark grey area and choose "New" -> "Shader".
     Give it a name, e.g. "rain".
     Double-click the newly created file; the script editor should open.
     Navigate to the vertex shader by pressing "Vertex" in the white area on the left of the script editor and paste the vertex shader code.
     Navigate to the fragment shader by pressing "Fragment", paste the fragment shader code and save.
     Close the script editor and select the root object in the "Scene" tab.
     Under "Post Effects" add the newly created shader.
     Select your camera and check the "Use Post-Effects" checkbox in the "Camera" tab.
  4. @Rick No, it doesn't. I agree that would be quite cool, but I think we would need higher powers (shadmar ) to achieve that.
  5. You might want to try this PostFX shader I derived from http://glsl.herokuapp.com/e#14949.0 You might want to play around with some of the values, but I think it's a good start.

     Vertex shader:

         #version 400
         uniform mat4 projectionmatrix;
         uniform mat4 drawmatrix;
         uniform vec2 offset;
         uniform vec2 position[4];
         in vec3 vertex_position;

         void main(void)
         {
             gl_Position = projectionmatrix * (drawmatrix * vec4(position[gl_VertexID] + offset, 0.0, 1.0));
         }

     Fragment shader:

         #version 400
         float torad(float deg)
         {
             return deg * 3.14 / 180;
         }

         uniform bool isbackbuffer;
         uniform float currenttime;
         uniform vec2 buffersize;
         uniform sampler2D texture1;
         uniform mat4 projectioncameramatrix;
         out vec4 fragData0;

         void main(void)
         {
             vec2 tcoord = vec2(gl_FragCoord.xy / buffersize);
             if (isbackbuffer) tcoord.y = 1.0 - tcoord.y;
             float aspect = buffersize.y / buffersize.x;
             tcoord.x = clamp(tcoord.x, 0.0, 1.0);
             tcoord.y = clamp(tcoord.y, 0.0, 1.0);
             vec2 position = (gl_FragCoord.xy - buffersize.xy * .5) / buffersize.x;
             position.y += projectioncameramatrix[1][3];
             position.y -= 1.0;
             // 256 angle steps
             float angle = atan(position.y, position.x) / (2. * 3.14159265359);
             angle -= floor(angle);
             float rad = length(position);
             float color = 0.0;
             for (int i = 0; i < 10; i++)
             {
                 float angleFract = fract(angle * 256.);
                 float angleRnd = floor(angle * 256.) + 1.;
                 float angleRnd1 = fract(angleRnd * fract(angleRnd * .7235) * 45.1);
                 float angleRnd2 = fract(angleRnd * fract(angleRnd * .82657) * 13.724);
                 float t = currenttime * .005 + angleRnd1 * 10.;
                 float radDist = sqrt(angleRnd2 + float(i));
                 float adist = radDist / rad * .1;
                 float dist = (t * .1 + adist);
                 dist = abs(fract(dist) - .5);
                 color += max(0., .5 - dist * 40. / adist) * (.5 - abs(angleFract - .5)) * 5. / adist / radDist;
                 angle = fract(angle + .61);
             }
             fragData0 = texture(texture1, tcoord);
             fragData0 += vec4(color) * .3;
         }
  6. Instead of the Express edition you can also use the Community edition. It's free as well and adds some nice things (e.g. support for plugins; long live the vim plugin ).
  7. If you only want to make a pretty basic waterfall, simple texture scrolling could be enough. Refer to this thread for more info on that: http://www.leadwerks.com/werkspace/topic/11874-piping-water-flux-anim-texture/
  8. I would assume that one is from shadmar's post-processing shaders in the workshop: http://steamcommunity.com/workshop/browse?appid=251810
  9. While Leadwerks itself is tied to Steam, that doesn't mean your creation has to be tied to Steam as well. So if you don't want to tie your game to Steam, that would be a reason not to use the Steam API.
  10. OK, you're right with that one. I'm not really sure how to interpret it. Do you only want to distinguish two states ("dark enough to be considered outside the light" and "bright enough to be considered inside the light")?
  11. Actually, such a collision wouldn't necessarily mean that the player is really exposed to the light source, as there might be walls or anything else between the player and the light. Also, I assume point lights should have some kind of falloff (the nearer, the brighter), so it would be hard to determine which size the sphere should be. Instead you should iterate through all the lights and:
      1. Perform a raycast to determine whether you are exposed to the light source.
      2. For lights that have a falloff (point lights and spotlights?), check the distance.
      3. For lights with an angle (directional lights and spotlights), calculate the angle.
      The only part of that you could make easier with these proposed CSG shapes is the third one, but that is also easily achievable by taking the acos of the dot product of the normalized light's viewing vector and the normalized vector between the player and the light.
  12. Sublime Text doesn't have anything to do with LE directly. It's simply an amazing text editor, which you could use to edit your scripts instead of the one from LE: http://www.sublimetext.com/ There is a fully functional and time-unlimited demo available from their website.
  13. What exactly are you trying to say with this? Did you not expect the frame rate to drop when you start another highly demanding program?
  14. Learning to program shaders is a really useful thing. For a start, you should have some fundamental knowledge of the basics of vector maths (dot product, cross product, length, vector*matrix, ...). Once you have that, you should familiarize yourself with the programmable function pipeline to learn which kinds of shaders exist and what they do. (You will mainly focus on vertex and fragment shaders, but there are also three more (control, evaluation and geometry), which you should only mess with once you are comfortable with the former two.) Then you should look at some existing shaders and try making little changes to them. You can't imagine how satisfying it is to get your first wobbling teapot by moving a vertex along its normal with the sine of the time . There are tons of good tutorials out there, but I think it's important to keep messing around. I found that this site helped me a lot: https://coolcodea.wordpress.com/category/shaders/page/5/
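For the curious, the "wobbling" trick mentioned above can be sketched as a minimal vertex shader. The uniform names follow the Leadwerks shaders shown elsewhere on this page (`currenttime`, `projectioncameramatrix`); `entitymatrix` and the scaling factors are assumptions to play with, not confirmed names:

```glsl
#version 400
uniform mat4 projectioncameramatrix;
uniform mat4 entitymatrix;   // assumed name for the model matrix
uniform float currenttime;

in vec3 vertex_position;
in vec3 vertex_normal;

void main(void)
{
    // Move the vertex along its normal with the sine of the time:
    vec3 pos = vertex_position + vertex_normal * sin(currenttime * 0.005) * 0.1;
    gl_Position = projectioncameramatrix * entitymatrix * vec4(pos, 1.0);
}
```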
  15. There is also an (unofficial?) community wiki: http://leadwerks.wikidot.com/
  16. i.e. in the vertex shader: at the beginning (where all the uniforms are declared) add:

          uniform float currenttime;

      then search for the line

          ex_texcoords0 = vertex_texcoords0;

      and add below that:

          ex_texcoords0.y -= currenttime*0.0005;

      (The uniform will automatically be filled with its correct value.)
  17. If it's scrolling only (no real animation): you could simply add the current time (multiplied by a speed factor) to the y texture coordinate in the shader. (Maybe you need a modulo calculation as well, but I don't think that is necessary.)
  18. The problem with GEMA in Germany is not that we are not allowed to listen to it, but that YouTube is not allowed to present it to us. GEMA is short for "Gesellschaft für musikalische Aufführungs- und mechanische Vervielfältigungsrechte" (Society for musical performing and mechanical reproduction rights). They demand to get paid by everyone who plays music to anyone in public and give a part of that money to the artists (cough, cough). The problem is that they can't come to terms with YouTube, so they keep suing them and requesting videos to be blocked, because YouTube doesn't have the license to show that music. By now YouTube has got quite annoyed and started to automatically block videos where they aren't sure whether they are allowed to show them. In your video it says (translated): "This video isn't available in Germany, because it might include music by SME, for the usage of which they couldn't yet come to terms with GEMA. They are sorry."
  19. No, Leadwerks is not interested in whether or not you have VS installed. You should install it though if you want to edit C++, as it is a great IDE and comes with a compiler. It doesn't have anything to do with shaders. For the shaders to work, you will have to tell ShaderTool to export a *.shader file.
  20. Because of the music it isn't available in Germany either; oh, how I love GEMA^^ YouTube now has an option to control the playback speed, under the same button where you control the quality.
  21. You are aware that your ray goes in the y-direction, which is upwards? I think you rather meant to edit z, not y (as anything with a mass won't stay above you for long ). Also, these are global coordinates, so you would probably want to move the points along the forward direction, not along a constant axis.
  22. As far as I know, ShaderTool should offer an option to export *.shader files (I don't own it, so I can't check). You don't need Visual Studio to work with shaders; you need it to work with C++ files. (Also, I wouldn't choose the Express edition of VS but the Community edition, which is free as well.)
  23. I think the problem is not about having a parent with many children but about having many objects. AFAIK it doesn't matter whether your objects are parented to anything, but if you are trying a modular approach, you will likely end up with a great many objects, for each of which you need to store things like a transformation matrix etc. So if your house consists of only one object, which holds all the geometry description, you will only need one transformation matrix. If your house consists of an entity that is parent to 4 wall objects, one roof object, 5 window objects and one door object, you will have to store 1+4+1+5+1 = 12 transformation matrices (and additional overhead for each object). EDIT: OK, it does matter if those things are parented to an entity, because e.g. you will have to generate a new global transformation matrix (by transforming the local transformation matrix of the child with the parent's one).
  24. Ma-Shell

    Vision cone

    You are right. A matrix-multiplication with the vector (0,0,1,0)^T is just the third column. Doing it your way, we can get rid of the for-loops and the matrix-transposition.
  25. Hi, I noticed Leadwerks has functions for transforming vectors and points to different spaces. However, I am not really sure what to think of the following things: I would have thought that transforming the point A into the space of the matrix M simply means multiplying the matrix (from the left) with the fourth coordinate of the point set to 1: A' = M*A. The same for vectors, but with the fourth coordinate being 0 instead of 1. I wrote the following code to do a matrix multiplication:

          ret = Vec4(0)
          for i = 0,3,1 do
              ret[i] = 0
              for j = 0,3,1 do
                  ret[i] = ret[i] + M[i][j] * A[j]
              end
          end

      (I also noticed the matrix from self.entity:GetMatrix() has to be transposed before doing this, because otherwise the last row is not 0,0,0,1, which is a MUST for transformation matrices.) However, Transform:Vector() and Transform:Point() both yield different results and I can't figure out why. Transform:Vector yields quite similar results to my function if I do not transpose the matrix beforehand, except for the fourth coordinate. The values yielded by Transform:Point() don't seem to be related to my values at all. Consider the following example code:

          local M = self.entity:GetMatrix():Transpose()
          local A = Vec4(0,0,1,0)
          System:Print(M)
          ret = Vec4(0)
          for i = 0,3,1 do
              ret[i] = 0
              for j = 0,3,1 do
                  ret[i] = ret[i] + M[i][j] * A[j]
              end
          end
          local a = Vec3(0,0,1)
          local v = Transform:Vector(a, nil, self.entity)
          System:Print(v[0] .. "," .. v[1] .. "," .. v[2] .. "," .. v[3])
          local v = Transform:Point(a, nil, self.entity)
          System:Print(v[0] .. "," .. v[1] .. "," .. v[2] .. "," .. v[3])
          System:Print(ret[0] .. "," .. ret[1] .. "," .. ret[2] .. "," .. ret[3])

      All of these yield different values, when actually Transform:Vector should be the same. If I change the last coordinate of A to 1 instead of 0, Transform:Point should be the same. Can anyone enlighten me on this?