Josh

Staff
  • Posts

    24,625
Everything posted by Josh

  1. 0.9.8 The way prefabs are loaded has changed. Prefabs will now be loaded from the original scene data, rather than copied from a cached root entity. This will probably make component load calls behave more along the lines of what users expect. Flowgraph connections might get loaded from prefabs now. I'm not sure about the details of how those will work at the moment, and it will probably take some experimentation to work it all out.
  2. The particle count is specified when the emitter is created. It's an optional parameter in the Create function.
  3. Response from Nvidia: This means that @Dreikblack identified a major driver bug and got Nvidia to fix it. Great job!
  4. That result is controlled by the velocity and particle count right now. I am saying, maybe the inverse should be true.
  5. Using a downsampled seed image works better because it gives you more of an average texture color. (Before and after comparison screenshots attached.)
  6. The idea is to build a color LUT and apply it to textures to try to normalize colors. Some success.

     #include "UltraEngine.h"

     using namespace UltraEngine;

     int main(int argc, const char* argv[])
     {
         //------------------------------------
         // Build the color palette
         //------------------------------------
         auto plug = LoadPlugin("Plugins/FITextureLoader");
         auto palette = LoadPixmap(GetPath(PATH_DESKTOP) + "/hl2_brick.png");
         int res = 32;
         float m = 256.0f / float(res);
         Vec3* colors = new Vec3[res * res * res];
         float* colordistance = new float[res * res * res];
         int r, g, b, a, pr, pg, pb;
         int x, y;
         float d;
         uint32_t rgba;
         Vec4 color4;
         Vec3 color3;
         Vec3 current;

         if (FileType(GetPath(PATH_DESKTOP) + "/lut.png") != 1)
         {
             auto samples = CreateBuffer(palette->size.x * sizeof(Vec4));
             Print("Building palette");
             for (x = 0; x < palette->size.x; ++x)
             {
                 color4 = palette->Sample(iVec2(x, 0));
                 memcpy(samples->Data() + sizeof(Vec4) * x, &color4, sizeof(Vec4));
             }
             // For every LUT cell, find the closest color in the palette image
             for (r = 0; r < res; ++r)
             {
                 Print(r);
                 for (g = 0; g < res; ++g)
                 {
                     for (b = 0; b < res; ++b)
                     {
                         current = Vec3(float(r) / 255.0f, float(g) / 255.0f, float(b) / 255.0f) * m;
                         for (x = 0; x < palette->size.x; ++x)
                         {
                             for (y = 0; y < palette->size.y; ++y)
                             {
                                 //memcpy(&color3, samples->Data() + sizeof(Vec4) * x, sizeof(Vec3));
                                 color4 = palette->Sample(iVec2(x, y));
                                 color3 = color4.xyz();
                                 d = current.DistanceToPoint(color3);
                                 if (x == 0)
                                 {
                                     colors[r * res * res + g * res + b] = color3;
                                     colordistance[r * res * res + g * res + b] = d;
                                 }
                                 else if (d < colordistance[r * res * res + g * res + b])
                                 {
                                     colors[r * res * res + g * res + b] = color3;
                                     colordistance[r * res * res + g * res + b] = d;
                                 }
                             }
                         }
                     }
                 }
             }
             // Write the LUT out as a horizontal strip of res x res slices
             auto out = CreatePixmap(res * res, res);
             for (r = 0; r < res; ++r)
             {
                 for (g = 0; g < res; ++g)
                 {
                     for (b = 0; b < res; ++b)
                     {
                         //out->WritePixel(r + b * res, g, Rgba(r * m, g * m, b * m, 255));
                         out->WritePixel(r + b * res, g, Vec4(colors[r * res * res + g * res + b], 1.0f));
                     }
                 }
             }
             out->Save(GetPath(PATH_DESKTOP) + "/lut.png");
         }

         Print("Processing image");

         //------------------------------------
         // Apply color palette to an image
         //------------------------------------
         auto out = LoadPixmap(GetPath(PATH_DESKTOP) + "/lut.png");
         auto pixmap = LoadPixmap(GetPath(PATH_DESKTOP) + "/pexels-chetanvlad-3373620.jpg");
         std::vector<shared_ptr<Pixmap> > frames(out->size.y);
         for (int y = 0; y < out->size.y; ++y)
         {
             frames[y] = CreatePixmap(out->size.y, out->size.y);
             out->CopyRect(y * out->size.y, 0, out->size.y, out->size.y, frames[y], 0, 0);
         }
         for (int x = 0; x < pixmap->size.x; ++x)
         {
             for (int y = 0; y < pixmap->size.y; ++y)
             {
                 rgba = pixmap->ReadPixel(x, y);
                 r = Red(rgba);
                 g = Green(rgba);
                 b = Blue(rgba);
                 a = Alpha(rgba);
                 float unitsize = float(out->size.y) / float(out->size.x);
                 // Red and green become UV coordinates, blue selects the LUT slice
                 float u = float(r) / 255.0f;
                 float v = float(g) / 255.0f;
                 int frame = Floor(float(b) / 255.0f * float(out->size.y));
                 frame = Clamp(frame, 0, int(frames.size()) - 1);
                 //Print(u);
                 //Print(v);
                 color4 = frames[frame]->Sample(u, v);
                 color4.a = float(a) / 255.0f;
                 pixmap->WritePixel(x, y, color4);
             }
         }
         pixmap->Save(StripExt(pixmap->path) + "_out.png");
         return 0;
     }
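The indexing math in the image-processing half can be isolated from the engine types: the LUT is a horizontal strip of square slices, blue picks the slice, and red/green become the UV inside it. A minimal standalone sketch in plain C++ (my own illustration, no engine calls; `res` = 32 as in the code above):

```cpp
#include <algorithm>
#include <cmath>

// Coordinates into a LUT laid out as a horizontal strip of 'res' square slices.
struct LutCoord
{
    int frame;   // which res x res slice along the strip (selected by blue)
    float u, v;  // position inside the slice (selected by red and green)
};

// Map an 8-bit RGB pixel to LUT strip coordinates.
LutCoord LutLookup(int r, int g, int b, int res)
{
    LutCoord c;
    c.u = float(r) / 255.0f;
    c.v = float(g) / 255.0f;
    int frame = int(std::floor(float(b) / 255.0f * float(res)));
    // b == 255 would index one slice past the end without this clamp
    c.frame = std::clamp(frame, 0, res - 1);
    return c;
}
```

This is why the `Clamp` call in the full program matters: only a pure-blue pixel hits the out-of-range case, so the bug would be easy to miss in testing.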
  7. Hi Aaron, I did not anticipate adoption of these features outside of my own software, but if you are interested I will put together some examples.
  8. The render layers are a bitwise flag. This allows you to set multiple values in a single integer variable. Each flag is a power-of-two number: 1, 2, 4, 8, 16, etc. You can add these numbers together, and it is possible to tell which numbers are packed into the value. For example, if I see the value 3, I know it contains 1 and 2, so the first two flags are set. If I see 17, I know that means 16 + 1. By default every entity has its render layers set to 1. (Except maybe probes; I think I might have set those to have all possible values active.) There is also a simple math operation to tell if two flag sets have any flags in common:

     a = 17
     b = 3
     if ((a & b) != 0)
     {
         // At least one value is common to the two flag sets; in this example it would be 1.
     }

     This is how render layers tell which objects can see which other objects. In the future, I think the editor interface will get a little simpler, but I need to add checkboxes to combobox items in order to do that. For programming, you may find it easier to declare render layer constants like this:

     enum RenderLayers
     {
         RENDERLAYER_1 = 1,
         RENDERLAYER_2 = 2,
         RENDERLAYER_3 = 4,
         RENDERLAYER_4 = 8,
         RENDERLAYER_5 = 16,
         RENDERLAYER_6 = 32,
         RENDERLAYER_7 = 64,
         RENDERLAYER_8 = 128
     };

     An unsigned 32-bit integer supports up to 32 flags like this (because it has 32 bits). Then you can set render layers like this:

     entity->SetRenderLayers(RENDERLAYER_1 | RENDERLAYER_3 | RENDERLAYER_6);

     This may be easier than remembering a lot of power-of-two numbers, and it is equivalent to this:

     entity->SetRenderLayers(37);

     When two entities have a flag in common, they can see each other: cameras can see objects, and objects can cast shadows from a light. A similar system is used for controlling which decals can appear on which objects, although it is stored in a different variable.
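The flag arithmetic described above can be checked in plain C++ without any engine calls (the enum names follow the post; `SetRenderLayers` is the engine method and is not used here):

```cpp
// Power-of-two flags: each bit represents one render layer.
enum RenderLayers
{
    RENDERLAYER_1 = 1,
    RENDERLAYER_2 = 2,
    RENDERLAYER_3 = 4,
    RENDERLAYER_4 = 8,
    RENDERLAYER_5 = 16,
    RENDERLAYER_6 = 32,
    RENDERLAYER_7 = 64,
    RENDERLAYER_8 = 128
};

// Two flag sets interact if their masks share at least one bit.
bool SharesLayer(unsigned int a, unsigned int b)
{
    return (a & b) != 0;
}
```

For example, `RENDERLAYER_1 | RENDERLAYER_3 | RENDERLAYER_6` is 1 + 4 + 32 = 37, and `SharesLayer(17, 3)` is true because 17 (16 + 1) and 3 (2 + 1) both contain the bit for 1.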
  9. If the light color is black, no directional light will be used or saved in the scene. You can open the .ultra file to make sure no directional light is stored in the JSON data.
  10. Here is the artwork I created for the bundle. LeadwerksGamesBundleSteamStorePage.zip
  11. Josh

    CS2 Textures

    We are going to use CSG for the basic layout of the scene and then populate it with mesh details. This will provide the most reusability for people to modify the scene or make something new from it.
  12. We did not get to it in the meeting, but my basic idea is this: Currently, LoadPrefab first loads a copy of the scene into the NULL world and stores it, then returns an instance of the root entity. Any time the same prefab is loaded again, a new instance of the root entity is returned. This is why the extra call to Load is happening, and why the Load method must be able to handle the situation where the world is NULL. (It technically should anyway, but I can understand if people skip that.) I could probably disable this behavior without much trouble. The other issue is that Entity::Instantiate copies flowgraph connections to the original entity. There is no way for Instantiate to know that a flowgraph connection target is also being copied. This is why prefabs currently cannot store their own internal flowgraph. My idea is to make LoadPrefab load the JSON and binary data, and then run the scene load routine each time the same prefab is loaded. This would solve both problems easily.
  13. Josh

    CS2 Textures

    @Andy90 @reepblue I took some screenshots from CS2 for reference. I am not sure how these are made. One thing I do notice is that all the textures are pretty desaturated, which probably helps prevent odd colors from standing out. Another interesting thing is they use a much higher resolution for small metal details, like a diamond plate floor or chain link, than they use for a concrete wall. I am guessing those textures are probably just one small piece that repeats at a high frequency. I don't really have any conclusions to draw from this, it's just some info you might find interesting.
  14. I have updated the beta branch with a fix, but I do not know for sure if this fixes the problem, since I did not see it firsthand. It might have happened with the map you indicated, but I already had a new build up before I tried it.
  15. This episode includes progress on our first-person shooter mechanics, a detailed level design discussion, and a deep look at environment textures. 9-7-24.zip
  16. You can edit the value here. Make sure you edit the configuration you are trying to compile (in the top-left corner of this window).
  17. Normally all you need to do is drag the .lib file into the project like a .cpp file, include the header for that library from the main.cpp file, and sometimes you have to add include search directories in the project settings. I do not know how SDL works, but if it can return an HWND you can probably create a Leadwerks graphics context on that.
  18. If you are rendering to a texture, as you are when any post-processing or MSAA is in use, the color format is RGBA16F. So as you are performing rendering, all the intermediate steps will use high-precision color, and the results don't get clamped to 0-1 until the final output. I don't have an HDR monitor, so I don't currently have any way of testing HDR color output. I am sending some code for the framebuffer setup you can experiment with if you would like.
  19. We will discuss this in tomorrow's meeting. I don't want to make any major changes to this until after my talk with Khronos on Monday.
  20. Good reads here:

     https://web.archive.org/web/20080211132953/http://www.lunaran.com/page.php?id=187
     https://web.archive.org/web/20080211132937/http://www.lunaran.com/page.php?id=188
     https://web.archive.org/web/20080211132942/http://www.lunaran.com/page.php?id=189
     https://web.archive.org/web/20080211132932/http://www.lunaran.com/page.php?id=190
     https://web.archive.org/web/20080211132947/http://www.lunaran.com/page.php?id=191
     https://book.leveldesignbook.com/process/layout
  21. It's not that difficult, but I am just clarifying why it would be a new feature that has nothing to do with the terrain foliage system.
  22. The foliage system does not store the position of any object instances in memory. Each instance is placed according to a procedural equation that is duplicated in the main thread, the physics thread, and on the GPU in the shaders. One aspect of this, as mentioned, is that the layout is basically 2D with a vertical offset. It could be possible to make a separate "model painting" tool that would distribute instances of a model across the scene, but it would be creating normal entities, and it would be a totally different tool.
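The "procedural equation duplicated in every thread" idea can be illustrated with a hash-based sketch in plain C++ (my own illustration, not the engine's actual formula): each grid cell derives its instance offset purely from the cell coordinates, so any thread, or a shader running the same math, recomputes an identical placement without any instance data being stored in memory.

```cpp
#include <cstdint>

// Deterministic per-cell hash: the same inputs always produce the same bits,
// so the main thread, physics thread, and GPU can all recompute identical
// placements independently. Constants are arbitrary odd mixing multipliers.
uint32_t CellHash(int32_t x, int32_t y)
{
    uint32_t h = uint32_t(x) * 0x85ebca6bu ^ uint32_t(y) * 0xc2b2ae35u;
    h ^= h >> 16;
    h *= 0x7feb352du;
    h ^= h >> 15;
    return h;
}

// Pseudo-random instance offset inside a cell, in [0, 1),
// derived only from the cell coordinates.
float CellOffsetX(int32_t x, int32_t y) { return float(CellHash(x, y) & 0xffffu) / 65536.0f; }
float CellOffsetY(int32_t x, int32_t y) { return float(CellHash(x, y) >> 16) / 65536.0f; }
```

Because the placement is a pure function of the cell coordinates, visibility and collision can evaluate it lazily for only the cells they need, which is why storing instances would be redundant.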
  23. How is this different from a regular material?