Shader Distortion, Real-Time Translucency Rendering & Post-Process Effects


  • replied
Even in multiplayer? But really, it's not only that; he's adding highlights to specific actors in post-process. This is best seen at 39 sec in this video: http://www.youtube.com/watch?v=juB-PHKRG9s&hd=1&t=0m34s Doesn't look like a regular old material swap, but I could be wrong.



  • replied
    Nothing stopping you from switching out materials at the same time as swapping post process chains.



  • replied
    Originally posted by ADayInForever View Post
Thanks for replying. I'll tell you specifically everything I want to know. [...]
It looks like he is somehow targeting specific actors. I haven't a clue how he's managing this unless he's altering the material of the objects at the same time as he applies the post process. Anyone else have some input on this subject? It's been bothering me for a while.



  • replied
Sorry, I didn't mean SceneRenderTarget, my bad. You're on the right track with using SceneTexture, I think.



  • replied
    #1
    It sounds like you have a better idea of what parallax occlusion is now. If you look into a crack in the sidewalk you can see parallax occlusion as you go side to side. Things seem to move at different speeds at different distances from you. Occlusion just means that part of the texture can cover up another part of the texture. Like when the upper lip of the crack covers up part of the deeper crack.
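The "Linear + Binary Raymarching" part is just how the parallax offset gets found. Very roughly, in the kind of HLSL you'd put in a Custom node (HeightTex, HeightScale, NumSteps and EyeTS are names I'm making up; you'd wire in your own heightmap, scale, step count and tangent-space eye vector):

```hlsl
// Linear search: march the eye ray down through the heightfield
// until it first dips below the surface.
float2 StepUV = EyeTS.xy * HeightScale / NumSteps;  // UV shift per step
float  StepH  = 1.0 / NumSteps;                     // height drop per step
float2 CurUV  = UV;
float  RayH   = 1.0;                                // ray starts at the top
for (int i = 0; i < NumSteps; i++)
{
    if (tex2D(HeightTex, CurUV).r >= RayH) break;   // ray is under the surface
    CurUV -= StepUV;
    RayH  -= StepH;
}
// Binary search: halve the step a few times to refine the hit point.
for (int j = 0; j < 5; j++)
{
    StepUV *= 0.5;
    StepH  *= 0.5;
    if (tex2D(HeightTex, CurUV).r >= RayH) { CurUV += StepUV; RayH += StepH; }
    else                                   { CurUV -= StepUV; RayH -= StepH; }
}
return CurUV;  // sample diffuse/normal/specular with these offset UVs
```

The occlusion falls out of this for free: once the ray hits the near lip of a crack, the UVs stop there, so the deeper part of the crack behind it never gets sampled.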

    #2
    The "depth-based distortion" was SceneTexture UV distortion that was based on depth. What you should take away from that is the UV distortion. You can use any type of distortion that you like, including what you see in his video. It's just a simple matter of making a texture and animating it using math.
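To make that concrete, the whole "texture animated with math" idea fits in a few lines of Custom-node HLSL (DistortTex, Strength and the Time input are placeholders for whatever you actually wire in):

```hlsl
// Scroll a noise texture over time and use its RG channels to
// push the SceneTexture lookup around.
float2 Scroll = float2(Time * 0.10, Time * 0.07);    // two unsynced speeds
float2 Push   = tex2D(DistortTex, UV + Scroll).rg;   // 0..1 noise values
Push = (Push - 0.5) * 2.0 * Strength;                // remap to -Strength..Strength
return tex2D(SceneTex, UV + Push).rgb;               // distorted scene colour
```

Swap the scrolling noise for anything else (a sine of the UVs, a normal map, a radial gradient) and you get the other distortion looks.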



  • replied
    http://udn.epicgames.com/Three/MaterialsCompendium.html
    This is a fantastic resource when you're trying something new in the material editor.

    Check SceneTexture for the double vision. Also take a look at the Gem on Sobel Edge Detection to see how you can take the scene texture and mess with it a bit. I think it will do the trick.

    I'm still working on some of the other things for you. You hurled a lot at me, and I can't answer it all, but I'll do what I can.



  • replied
    Originally posted by SethNemo View Post
What if the Disoriented effect is created with several SceneRenderTarget Nodes that are made to shift around somehow, and the magic actually happens in the way they are combined/the way they interpolate? [...]
    Checked out PO... I've seen most of the gems, but I can't believe I missed that!

As for distortion: SceneRenderTarget? Not familiar with that node... not finding it in the material editor or the actor classes list. If you're talking about a SceneCaptureReflectActor or something, then hmmm... maybe. I have seen a SceneCaptureReflectActor in action on a RenderTexture2D surface, but when I've tried it, there's always been an incredible amount of aliasing in the reflection. Just doesn't seem likely (unless there's an option to use AA in scene capture actors that I don't know about?).

Other than that, I'm looking at these SceneTexture and SceneDepth nodes I found in the material editor, and doing a bit of background research on them on UDN. It looks like this might answer a few of my questions...



  • replied
What if the Disoriented effect is created with several SceneRenderTarget Nodes that are made to shift around somehow, and the magic actually happens in the way they are combined/the way they interpolate? I have no idea how to swirl the images around except for standard panning and rotation. Maybe it is possible to link the Panning Node to some Params that define the Node's directional values? If so, then you could link this to Kismet and make it go in circles. If it works, it would work for size and stuff like that too. Might be that this is just a workaround.

I'd never heard of Raymarching, but I know what Parallax Occlusion is. UDN has a very helpful tutorial in its "Engine Gems". PO materials are quite nasty for performance, but you can make those stone walls from the vid.



  • replied
    Originally posted by micahpharoh View Post
What exactly do you want to know? [...]
Thanks for replying. I'll tell you specifically everything I want to know.

1) Parallax Occlusion Shader. I imagine he's using a bumpoffset node with his normal map, right? Is he using a worldpositionoffset to make the normal map + bumpoffset appear more prominent based on the cameraworldposition? But I'm sure there has to be more to it than that. I'd like to know some shader network details about how he might set something like this up. Like, if I wanted to set a custom attribute in the material instance editor to determine how much bump shows up in a normal map, how might something like that work?

Also, I need a vocabulary explanation of what this guy's talking about here. What does "Parallax Occlusion" mean? Occlusion suggests something is being "hidden" from view. In this particular case, parallax... the background moving slower than the foreground... and this is... somehow being occluded from view? And this technique of "Linear + Binary Raymarching": I have no idea what that is, and a Google search isn't turning up much of anything of substance either. Who invented the concept of raymarching? What sort of mathematical concepts does this sort of raymarching employ?

    Also, what does the "inline" part of "Inline HLSL" mean? That this guy created a custom node full of HLSL code that works alongside the regular mat nodes that UDK supplies by default?

    2) Electrical Field Distortion:
    How is he offsetting the objects world space and still maintaining complete translucency? What would be a shader network for something like that? You showed something very briefly in your 2nd post-processing chain tutorial... around 15:20 or so. You called it a "depth-based distortion." Was that a pre-made UDK material, or was that a custom material that you used to begin with? What I'd like to know is what was in the shader network that you used to get the 'depth-based distortion' in that video. I think from there I can deduce how to do something like an Electrical Field Distortion. Note: I'm not wanting to copy these effects, I'm just curious how to use these so that I can brainstorm and come up with my own stuff that involves effects like this for my own projects.

    And then there's the issue of "Framebuffer Distortion" -- what does "Framebuffer" mean? Is that some kind of a queue of frames that have not yet been rendered? I'm confused about what this term means, as well as the context in which it's used. (Not to mention any relevant math applied here.)

    3) Surface Shader: River
I know how the Fresnel effect has relevance based on distortion of the core intensity of the reflective surface, but I'm a little confused about what this person means by "absorption" -- is he referring to a depthbiasedalpha node? What really has me confused is how he's able to make a translucent material work hand-in-hand with a normal map, given that the two often don't get along with each other very well. Unless it's an opaque material, and he's using destcolor to give the illusion of a color to the river instead of an actual diffuse map?
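If "absorption" just means tinting by how much water the eye ray travels through, I'd guess it reduces to something like this (sketching it as HLSL to think out loud; AbsorbColor is a made-up per-channel density parameter):

```hlsl
// Water thickness along the view ray: depth of the scene behind the
// surface minus the depth of the water surface itself.
float Thickness = DestDepth - PixelDepth;
// Beer-Lambert-style falloff: the deeper the water, the more each
// colour channel gets eaten (red dies first in real water).
float3 Transmit = exp(-Thickness * AbsorbColor);
return DestColor * Transmit;  // what you see through the water
```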

    4) Highlight Effect
    Seems rather straightforward, you could probably adjust the shadow, mid-tone and highlight settings to get the values he's used... but then... that wouldn't explain how he blows out the shapes of specific objects with an emissive white light or how he sets the gradient effect. (I know this can be done with a red dominantdirectional light and a fog emitter... but can this be done with just material and post-process chains?). Does the material work such that it's given a white emissive if the object exists beyond a clamped value range? Or is there more to it than that? Or if I'm not thinking about this correctly, how might this material network be set up and then applied to a PP chain?

    5) Electrical Field Vision
    Based on your tutorial, he's using some kind of posterization effect... but only on specific objects below a certain light value. Really dark objects become more posterized, while lighter objects become less so. And then there's the issue of halos over extremely bright objects. Now if I were writing a shader network, I'd be inclined to put in an "if" node that checks whether the light value is above a certain range, and if so, multiply that value by an arbitrary value X to get the halo. But then he's also got a vertical grain applied over the objects above the certain light value as well, which sort of complicates my understanding of the effect. Especially when the vertical striping is always relatively vertical to the viewer despite the camera translation and rotation applied. How would he get a vertical striping only on the objects?
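If I wrote that "if" node idea out longhand, I imagine it would look something like this (Levels, Threshold and HaloBoost being knobs I'd have to expose myself):

```hlsl
// Posterize by quantizing luminance into a handful of bands, then
// blow out anything over the threshold into a halo-ish hot spot.
float  Lum    = dot(SceneColor, float3(0.30, 0.59, 0.11));  // luminance
float  Bands  = floor(Lum * Levels) / Levels;               // quantized value
float3 Result = (Lum > Threshold) ? SceneColor * HaloBoost  // bright: halo
                                  : Bands.xxx;              // dark: banded grey
return Result;
```

The vertical grain could then just be a screen-space stripe texture masked by that same (Lum > Threshold) test, which might explain why the striping stays vertical to the viewer no matter how the camera moves.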

    6) Thermal Vision
    Again the posterization effect is applied here. But the highlight values are magnified and the posterization is preserved even at the higher values... although not nearly as much. And then there's the purple halo effect surrounding each of the figures that generates a heat signature. How does the shader tell the difference when called in post-processing? And how is this posterization effect created?

    7) PostProcess Effect: Disoriented
    This is the kicker. How is he creating the illusion of 3 people at once? I know he's probably using a sine wave to alternate between values of "more visually coherent" and "less visually coherent". But I'm confused as to how he's able to make multiple instances of the same object appear in the scene. How would this "blur" effect be created? And if worldpositionoffset is the relevant plug, then what value (or values) are changed and with what nodes to create the illusion of multiple object instances?
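My best guess at the multiple-instances part: it might not be worldpositionoffset at all, just the scene texture sampled three times with a sine-driven offset (Amount is an arbitrary strength value):

```hlsl
// Three taps of the same frame; two of them sway sideways with a
// sine wave, so the ghosts drift between coherent and incoherent.
float  Sway = sin(Time * 2.0) * Amount;
float3 A = tex2D(SceneTex, UV).rgb;                     // the "real" view
float3 B = tex2D(SceneTex, UV + float2( Sway, 0)).rgb;  // right ghost
float3 C = tex2D(SceneTex, UV + float2(-Sway, 0)).rgb;  // left ghost
return (A + B + C) / 3.0;                               // average them back
```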

    That's what I'd like to know...

    Think you can help with that? Or do you need more information? Or...?



  • replied
    Originally posted by ADayInForever View Post
You did. And that actually makes a lot of sense. [...]
No problem. I'm surprised it helped. Let us know if you create something similar.



  • replied
    Originally posted by SethNemo View Post
Hi,

I don't know if I look at this from the wrong direction (Beginner Alert) but, as said before... World Position might play a big role in this. [...]
    You did. And that actually makes a lot of sense. It never occurred to me to think of using worldpositionoffset as a sort of coordinate rotation or translation function.

    I'll have to play around with that. Thanks a lot for the explanation!



  • replied
    Originally posted by ADayInForever View Post
    (Although if you are lurking on these forums and seeing this thread, micahpharoh, could you shed some light and maybe answer or provide some tutorials on how stuff like this might be accomplished? The tutorials I've seen thus far are pretty awesome.)
    What exactly do you want to know?

    The distortion shown at the 0:11 or so mark on the UDK Game Shaders video would actually be really easy to make by the look of it, but past that it gets a lot more complicated. A lot of those effects could be done in any number of ways. It could be a pure post process setup, or it could be a combination of any number of other things.



  • replied
    Hi,

I don't know if I look at this from the wrong direction (Beginner Alert) but, as said before... World Position might play a big role in this. There is a good video tutorial on YouTube and udk-scriptures that shows you how to build a snow material that always aligns with the world's z-axis (meaning: snow always stays on top of the rock). Now I imagine it basically could work like this:

    Each Holodeck wall is linked through Kismet to the material. Ball hits wall 0,1,0,0 and the node setup creates an effect that interpolates along the y-axis. The rest is good use of effects.

I just don't know how to make it work so that the wall "detects" how far the ball has moved into it.

Don't know if I made sense here.



  • replied
3dimentia -- I'd like that very much. What pointers do you recommend? I'm familiar with most stuff in the material editor, but worldpositionoffset, cameraoffset, depthbiasblend, destdepth and destcolor are among the few things I don't know that well.

In the holographic room demo, I want to know how he might start deforming the geometry SPECIFICALLY when it's close to touching a wall. I know part of the equation is triggers and Kismet. But then... what's he doing? Is he using a lerp to switch worldpositionoffset on and off between states? And then, what specifically might he be doing to deform the mesh in this very specific way... at THAT particular side of the sphere only... and then blend two different materials between each other at very context-specific values, i.e. depending on how close the viewer gets to the force field? How can he get Kismet to determine how far the ball is from the forcefield, and then set the constant variable on the lerp node(s) accordingly? Or is he doing something else? Is he using depthbiasedalpha by any chance to make the transitions very specific?
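Thinking out loud, the distance-driven part might not even need Kismet; if the wall plane were fed into the material as parameters (WallPoint, WallNormal, FadeRange and SquashAmount are all names I'm inventing), something like:

```hlsl
// Signed distance from this vertex to the wall plane drives both the
// material blend and how hard the mesh squashes against the wall.
float  Dist   = dot(WorldPos - WallPoint, WallNormal);
float  Blend  = saturate(1.0 - Dist / FadeRange);  // 1 at the wall, 0 far away
// Feed this into WorldPositionOffset: push vertices back along the
// wall normal as they approach it, so only that side deforms.
float3 Offset = -WallNormal * Blend * SquashAmount;
return Offset;  // Blend could also drive the Lerp between the two materials
```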

    And then there's the question about where the ball first spawns when everything is activated. The ball goes from opaque to transparent (probably using a black and white mask filter) with a panning node. But the thing is, it wouldn't make sense to use a straightforward vertical panning node. I know if I try something like a panning node across your average sphere diffuse/normal/specular map (either vertical or horizontal), the deformations will appear on random spots on the sphere because of how the UVs are set up. So the next question becomes how to pan uniformly across the mesh like that.

    kg777: If you still have your modified network... could I please see a snapshot of it, by any chance? I'd like to see how you changed it to do the odd deformations. Thank you kindly...

    Anyways, thanks to the both of you for the assistance and the suggestions!



  • replied
    managed to "stretch" the ball mesh
Vertex shaders can do this; I think, like the above post says, via offsets.

The UDK comes with a lava-style vertex shader/material; while playing with this I got it to do some crazy things, like expand. I assigned it to the player pawn for fun, and as I moved about, depending on the amount of light I was in, it changed colour, animated, expanded and contracted. Quite insane, tbh.

