
HLSL Help!


    oh nice to hear this, I thought there was absolutely no way to get access to lower-level stuff. could you post info with your findings once/if you get around to it?


      @rebb: That's a distinct possibility, but the way I figure it, the standard nodes can obviously access the buffers at a low level: that's how the depth nodes and camera vector nodes operate, so a custom node accessing the same function should work. The only bar to success would be if they arbitrarily changed the call name. I don't even really see how it's possible for them to lock out normal buffer access when existing nodes access it routinely. But we'll see
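      If it does still work, a first sanity check could be as simple as a Custom node that calls the same depth function the built-in node presumably compiles down to. This is just a sketch under my assumptions — CalcSceneDepth is the name from older builds, and if they renamed it the material simply won't compile:

      ```hlsl
      // Custom node body, one float2 input named UV.
      // Assumption: the SceneDepth node still maps to CalcSceneDepth() as in
      // older builds; if the call was renamed, this fails at compile time.
      float depth = CalcSceneDepth(UV);
      return float3(depth, depth, depth); // visualize depth to confirm access
      ```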

      @Chosker: I will certainly post my findings and even a how-to if it works!

      Thanks for all the help so far. Maybe if you guys have time too, we can all try independently and see how it goes? Check in soon (hopefully before next week is over!)


        I'd wondered why Epic reduced access as well. I imagine the admins are not allowed to say, but I really wish there was more powerful shader support, or more support for advanced HLSL (such as vertex shader stuff). I remember at least one really talented shader creator who almost wanted to give up on using UE3 because he had spent so much time doing advanced techniques with the shader source etc., and then discovered the access/source was removed in the newer versions.

        Good luck though, and maybe if enough people whine Epic will have mercy in the future.


          LOL! Let's organize an HLSL whine-in. Yeah, it's pretty frustrating, but in some ways understandable. The vertex stuff, from what I've heard, seems to be locked out mainly for compatibility reasons. There are ways around such issues in programming terms, but they want the UDK as wide-access as possible, and too much vertex-tweaking can shut out some players. DX11 handles things slightly differently again, so I will assume these moves are benign until proven otherwise.

          As I mentioned before, there are still nodes that utilize the buffers I want, and their functionality hasn't changed, so I'm hoping it works. Unfortunately I'm on my backup computer until I finish rebuilding my main box, but I'll keep on it. The only thing I need for the shaders I'm writing is the **** gNormal buffer.... So we'll see


            Now THIS is interesting....

            After looking through the SSAO code in the 04-2010 build, I found... no reference to the gNormal buffer at all. The code appears to have a function to derive normals per pixel rather than read the whole normal buffer.

            This is both cool and a little frustrating, since my SSAO and SSDO code both require access to the g-buffer.

            However, their solution is pretty bloody elegant, and may actually be replicable in the Material Editor using the CameraVector node! For now I'm going to spend a bit of my evening error-checking my logic here, but it looks like their solution actually helps with a few details that bug SSAO, specifically occlusion from out-of-field objects suddenly appearing and disappearing. Essentially, the function caches a history for screen-visible pixels, then simply sets history to 0 for new pixels on a frame-by-frame basis, thus ensuring a smooth "lead-in" and no harsh pop-out of occlusion.
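            My reading of that history trick, sketched as shader-style pseudocode (the helper names here are mine, not Epic's):

            ```hlsl
            // Blend this frame's occlusion with reprojected history; pixels
            // that were off-screen last frame start from 0, so occlusion
            // fades in smoothly instead of popping.
            float2 prevUV = ReprojectToLastFrame(UV);            // hypothetical helper
            bool wasVisible = all(prevUV >= 0) && all(prevUV <= 1);
            float history   = wasVisible ? tex2D(HistoryBuffer, prevUV).r : 0;
            float occlusion = lerp(history, ComputeOcclusion(UV), BlendWeight);
            return occlusion; // also written back to HistoryBuffer for next frame
            ```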

            Much experiment ahead!


              Oops! Nvm... SSAO in the 04-2010 build is based solely on depth.

              "Pixel shader that calculates an occlusion factor per-pixel using a heuristic dependent only on scene depth."
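              A depth-only heuristic along those lines might look something like this — my sketch of the general idea, not Epic's code, with Kernel, Radius, Bias and NUM_SAMPLES as assumed parameters:

              ```hlsl
              // Count how many nearby samples sit closer to the camera than
              // the centre pixel by more than a small bias; that fraction
              // approximates how occluded the pixel is.
              float centreDepth = CalcSceneDepth(UV);
              float occluded = 0;
              for (int i = 0; i < NUM_SAMPLES; i++)
              {
                  float sampleDepth = CalcSceneDepth(UV + Kernel[i] * Radius / centreDepth);
                  occluded += (centreDepth - sampleDepth > Bias) ? 1.0 : 0.0;
              }
              return 1.0 - occluded / NUM_SAMPLES; // 1 = unoccluded, 0 = fully occluded
              ```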
              Hmmm... I'll have to pick through other shaders to find something that works on screen normals...

              Unless my crazy idea to use the CameraVector node actually works.


                contrary to what it might seem, UDK uses forward rendering on DX9, and only uses deferred rendering on DX11. AFAIK a normal buffer is only used in deferred rendering (along with others), which might explain why they only allow support for the Lit Scene on the SceneTexture node

                if they made SSAO without a normal buffer, they probably reconstructed the normals from depth and geometry.... or something
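                the usual trick for that is to rebuild a view-space position from depth and cross the screen-space derivatives — something like this (ViewSpacePos is an assumed helper, not a real UDK function):

                ```hlsl
                // reconstruct a face normal from depth only: rebuild the
                // view-space position, then cross the screen-space derivatives
                float3 p = ViewSpacePos(UV, CalcSceneDepth(UV)); // assumed helper
                float3 n = normalize(cross(ddx(p), ddy(p)));
                // gives flat per-triangle normals - smooth vertex normals are
                // lost, which is fine for SSAO but shakier for SSDO
                ```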


                  Yup, you're absolutely right... I had to step through the SSAO code and the "Common.usf" file to figure out how it worked. Now for SSAO, that's not a huge issue, but I'm not sure it'll work so well for SSDO, which can't be nearly as accurate with reconstructed normals. So I don't think that approach will carry over to the later versions of UDK, which is annoying.

                  Hmmm.... I'll have to rethink this for now. Updates if I can figure it out.


                    OK, new tack.

                    I'm abandoning the idea of porting from an earlier version and starting from the new version using existing nodes to see if I can get them to work for me, together with some clever CustomNode work.

                    I played with CameraVector a little last night; no results to post yet, but it appears that I might be able to use that node as a worldNormal stand-in in my SSDO loop. The only problem is, because it's not giving me direct access to the buffer, it'll be slooooowwww.
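                    Roughly what I'm picturing — all the names here are mine and very much unverified:

                    ```hlsl
                    // Use the material's CameraVector input where the SSDO loop
                    // wants a normal. Slow, because every "normal" lookup is a
                    // full material evaluation rather than a single buffer fetch.
                    float3 N = normalize(CameraVectorInput); // stand-in for a gNormal sample
                    float3 indirect = 0;
                    for (int i = 0; i < NUM_DIRS; i++)
                    {
                        if (dot(SampleDirs[i], N) > 0)  // hemisphere above the surface
                            indirect += SampleIndirectLight(UV, SampleDirs[i]); // assumed helper
                    }
                    return indirect / NUM_DIRS;
                    ```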

                    I may have to wait a few weeks to fully implement this, until I can upgrade my computer to something a little beefier. Actually, I'm hoping to upgrade to something with similar specs to the "Samaritan" system (a bunch of GTX 580s). That way even if it takes 6 ms to process the effect, it'll be part of a parallel chain and still allow me to run at decent speeds.

                    I'm not sure if this is going to be a usable game implementation, but we'll see. Of course, it's possible that Epic could release SSDO in the meantime, before I get that far (which would be nice), but dammit, I still want to give it a shot. Bear with me, everyone!