Ideas for new/improved features in the Unreal Engine.

Collapse
X
 
  • Filter
  • Time
  • Show
Clear All
new posts

    The Unreal Engine has been, and still is, one of the best and easiest-to-use engines out there. Epic has gone out of its way to provide us independent developers with the 'free' version: the Unreal Development Kit. The UDK already has a host of awesome features, many giving us 'indies' the industry standard, others giving us better than standard. However, the sky is the limit: Epic Games is a company known for respecting and listening to its community. Not only is this an opportunity for us to let our voices be heard, it's our responsibility to do so. We owe it to Epic to provide feedback and ideas to help improve the already awesome toolset within UDK, and thus the Unreal Engine.

    I don't want anyone to think even for one second that I'm complaining here; on the contrary, I love the opportunity Epic has given us. We all know that one idea leads to many ideas, so the crazy amount of tools given to us in the UDK only makes me imagine what other features and tools I could use in theory. This thread is devoted to a non-complaining list of ideas about expanding the use of current tools, and of possible new features that could help everyone. I do not expect Epic to act on any of these ideas; to expect that would be ridiculous. I only want my voice to be heard by the big E.
    Please note the use of the word "possible". No wacky ideas that are out of reach or simply implausible are on this list, only ones that seem within reason. I will be adding to this list as more ideas come, but mind you, I'm primarily an artist. Most if not all of these suggestions will have to do with art and graphics.


    Material Editor 'Prefab' Sets

    I don't know about other material artists, but this idea hits close to home for me. I quite often use copy and paste to create 'stacking' effects, or to build two similar materials. I may set up a group of nodes that I plan to use many times, and not just in one material.


    Instead of copying and pasting this group of nodes, I could highlight them and create a node 'set' or 'prefab'. The editor would analyze the selected nodes, determine which of them are connected to un-highlighted nodes via either an input or an output, and from that save a single node with multiple inputs and outputs. Later, I could right-click, select this 'prefab' from a list of saved prefabs, and place it when and where I desire.


    As an added bonus, if I decide to change anything about this prefab, I'll only have to edit one instance of it to update every matching prefab. This would save time in many of my endeavors: a material artist could edit one node to distribute congruent changes across a material that uses a stacking process for a certain effect. If the possibility exists, allow us to name/label each input and output for better organization. This idea comes from Unreal's integration of prefabs in the main editor, which store actors in a similar way; it's a feature I have used quite often.
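The analysis step described above boils down to a small graph problem. This is a hedged sketch of it in Python, with an entirely made-up graph representation (the real material editor stores nodes and pins very differently): connections crossing the selection boundary become the prefab's exposed input and output pins.

```python
# Sketch of the proposed 'prefab' boundary analysis (all names hypothetical):
# given a material graph and a selected subset of nodes, every connection
# that crosses the selection boundary becomes an exposed pin on the prefab.

def collapse_to_prefab(edges, selected):
    """edges: list of (src_node, dst_node) connections.
    selected: set of node names to collapse into one prefab.
    Returns (inputs, outputs): edges entering / leaving the selection."""
    inputs = [(s, d) for (s, d) in edges if s not in selected and d in selected]
    outputs = [(s, d) for (s, d) in edges if s in selected and d not in selected]
    return inputs, outputs

# Example: Texture -> Multiply -> Add -> Output, with a Constant feeding Multiply.
edges = [("Texture", "Multiply"), ("Constant", "Multiply"),
         ("Multiply", "Add"), ("Add", "Output")]
inputs, outputs = collapse_to_prefab(edges, {"Multiply", "Add"})
# inputs: the prefab needs two input pins (fed by Texture and Constant)
# outputs: one output pin (feeding Output)
```

Collapsing {Multiply, Add} here yields two input pins and one output pin, which is exactly the single multi-pin node the idea describes.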

    ______________________________________________

    Remove the Linearity with Post Process Chains

    Unreal Engine 3 handles post-process effects rather simply. Not that this is a bad thing: for most users it's a blessing. However, I can see so much potential for improving the post-process chain editor with the simple introduction of non-linearity. At the moment, post-process nodes can only be stacked on top of one another, and once you alter the rendered scene into a 'processed' state, you can't re-introduce the 'virgin' render for more advanced blending.


    I don't know whether this is a hardware limitation or an engine code issue. What I do know is that it is extremely difficult to create new post-process effects outside the library of nodes included in the Post Process Editor. Material Effects can be quite useful, but they are linear and total: once you alter the scene using a hardware material effect, the change is irreversible.



    I can think of a solution that involves adding a material node to the material editor that essentially references a second scene texture. This new scene texture would be 'un-processed': the rendered frame before any post-processing has been done to the image. This way, the artist could blend the current post-processed image with the virgin render in any way they see fit. I suspect this would require twice the graphics RAM of a single frame, as the rendered frame would need to be stored twice before being processed. However, the feature may not have a substantial effect on the frame rate, as uncompressed frames are only about 2-5 MB. This solution is a material editor option, and may not be ideal for designers or some artists; I suggest it merely because it is something I'm somewhat familiar with. Other methods are welcome.
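The blend itself is just a per-pixel lerp between the two buffers. This is a minimal sketch of the idea in Python with toy grayscale "frames" (the real thing would be a shader operating on full render targets; the function names and values are illustrative only):

```python
# Hedged sketch of the 'un-processed SceneTexture' idea: keep a copy of the
# frame from before the post-process chain runs, then let the material blend
# the processed result back against the untouched ('virgin') render.

def lerp(a, b, t):
    """Linear interpolation: t=0 gives a, t=1 gives b."""
    return a * (1.0 - t) + b * t

def blend_with_virgin(processed, virgin, amount):
    """Per-pixel blend: amount=0 keeps the processed frame,
    amount=1 fully restores the original render."""
    return [lerp(p, v, amount) for p, v in zip(processed, virgin)]

# Tiny 4-"pixel" example (grayscale values in 0..1):
virgin = [0.2, 0.4, 0.6, 0.8]      # frame before any post-processing
processed = [0.0, 0.1, 0.9, 1.0]   # frame after a contrast-boosting chain
half = blend_with_virgin(processed, virgin, 0.5)   # 50% of the effect undone
```

The `amount` input could itself come from a mask or another expression, which is what makes the non-linear blending possible.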

    ______________________________________________

    Allow for Direct Referencing of Local Player via Kismet

    Some of you may know of a map I made back in UT3 called "Arahan Crown". In the release-candidate version of this map, I had a nifty Kismet sequence that changed the music depending on the volume the player was in. The problem was that the music changed whenever any player touched the various volumes: bots and other players changed the music for everyone in the server, every time a volume was entered. Despite days of my C++-expert friend and I losing sleep over the problem, we could not find a solution in the UDN, the Unreal Wiki, or our own wacky test maps.

    This is just one example of Kismet's limitations involving local sequences in a non-local world. For basic sequences, the 'target' actor flow usually does the job; however, more complex multiplayer Kismet sequences are impossible if they are meant to be seen/heard/experienced only by the player activating the trigger. I'd like to see this problem addressed. If there is a solution, make it more accessible: an experienced programmer and a (relatively) experienced artist could not find a way to do this. If the answer is more obvious than I make it out to be, then have the UDN updated to include information on the subject. Something is wrong here, one way or another.
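Conceptually, what the music sequence needed was a gate in front of the side effect. This hypothetical Python sketch shows the shape of that check (the function, the string IDs, and the callback are all invented for illustration; a real fix would live in Kismet/UnrealScript on each client):

```python
# Hypothetical sketch of a 'local player only' Kismet option: the volume
# still fires a Touch event for every actor, but the side effect (music,
# in the Arahan Crown case) only runs when the instigator is the player
# this client is rendering for.

def on_volume_touched(instigator, local_player, change_music):
    """Run the effect only for the local player; ignore bots and
    remote players on this client."""
    if instigator == local_player:
        change_music()
        return True
    return False

played = []
on_volume_touched("Bot_03", "Player_01", lambda: played.append("cave_theme"))
on_volume_touched("Player_01", "Player_01", lambda: played.append("cave_theme"))
# only the second touch (by the local player) changes the music
```

Exposing something like this as a checkbox on trigger events would cover the common case without any scripting.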

    I'm going to add more ideas in the future as soon as I can concentrate my thoughts into words; in the meantime, I would like to know your opinions and potential ideas as well.

    #2
    I agree with a lot of what's above, especially the 'prefab', or 'loop', node. I've been after that one for a good long time. Thankfully custom nodes can help with that, but it should be there already!

    I'd also like to see more nodes for vertex shader manipulation.



    #3
    One thing I think would be quite cool would be being able to set the mesh used for a fluid surface actor...

    Imagine a hemispherical bubble shield that ripples when you shoot it, but that you can't get through. It'd be nice for the eye-candy effect...



    #4
    Hey, great ideas and a great thread. It's similar to the wish-list thread, though, so I'll say it here too.
    I know a lot of you don't give much thought to sound implementation; simply playing a sound sample is more than enough for most cases. But (being musically oriented) I do, and I would love to see improvements in how UDK handles sound.
    VST support would give us the same kind of 'no limits' approach, but with sound. A VST doesn't have to be a drum machine or a synth; it's simply a way to have control over every aspect of sound. You can make anything at all with VST, and it's the same C++ language that drives UDK, so I can't see it being beyond the realm of possibility to include this feature.
    I don't expect UDK to ever have VST support, but maybe somebody could write a third-party plugin or something; you never know. Having a VST Kismet node would be awesome.



    #5
    I'd love to see these implemented.
    I'd contribute more, but it's 3am :P



    #6
    Sub-quad detailing on terrain actors: it would affect only quads with detailing and be visible only at close range. Good for ditches and other natural, small, non-repeating, irregular constructs which can't really depend on static meshes when needed for huge terrain-dependent levels.

    It could be done manually, yes, but that's laborious for levels spanning multiple kilometers. Just something to think about.
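One way the "visible only at close range" part could work is a per-quad level-of-detail rule. This is a rough Python sketch with entirely made-up thresholds (a real terrain system would tie this into its tessellation and streaming):

```python
# Sketch of distance-gated sub-quad detail (thresholds are illustrative):
# quads near the camera get extra subdivision for ditches and other small
# irregularities; quads past the far threshold keep the base resolution
# and cost nothing extra.

def subdivision_level(camera_distance, base=1, max_level=4,
                      near=500.0, far=4000.0):
    """Return the subdivision level to use for a quad at this distance."""
    if camera_distance >= far:
        return base
    if camera_distance <= near:
        return max_level
    t = (far - camera_distance) / (far - near)   # 0 at far, 1 at near
    return base + round(t * (max_level - base))
```

Only detailed quads would ever call this; plain quads stay at the base level regardless of distance, which keeps the cost proportional to how much detailing the artist actually painted.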



    #7
    Thanks for the feedback; I should have some new ideas soon.



    #8
    Exposing gravity as a vector could be useful, and it's a popular topic amongst a fair few developers.



    #9
    Exposing the angular velocity cap would be handy for people wanting to make games that make heavy use of rolling objects.



    #10
    I still think a noise expression in the material editor would speed up some materials by more than double. It's a good post, and I hope Epic do look through these posts and create a to-do list from them, but my bet is that they already have a list of assets they're planning to put in for the foreseeable future; everything else is on the back burner.
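For readers unfamiliar with the request: a noise expression replaces stacks of tiled noise textures with a procedural value computed per pixel. This is a hedged sketch of the simplest variant, 1D value noise, in Python (the hash constants are arbitrary folklore values, and a real shader node would work in 2D or 3D):

```python
import math

# Sketch of what a 'noise' material expression could compute: 1D value
# noise built from a hash of the integer lattice plus smooth interpolation.

def hash01(n):
    """Deterministic pseudo-random value in [0, 1) for an integer input."""
    x = math.sin(n * 127.1) * 43758.5453
    return x - math.floor(x)

def value_noise(x):
    """Continuous, repeatable noise: same input always gives same output."""
    i = math.floor(x)
    f = x - i
    f = f * f * (3.0 - 2.0 * f)   # smoothstep fade between lattice points
    return hash01(i) * (1.0 - f) + hash01(i + 1) * f
```

Because the value is computed rather than sampled, it never tiles and costs no texture memory, which is where the claimed material speed-up would come from.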



    #11
    As someone mentioned in a thread a few spots down, the ability to call keyboard events through Kismet would be extremely handy.



    #12
    I may write up a few topics based on these ideas:

    A 'velocity' node for the material editor.

    A 'history' node for the material editor.

    A 'render to static texture' feature for creating and saving new textures to a file during runtime.

    The ability to reference the local player for Kismet events.



    #13
    Originally posted by tegleg View Post
    Hey, great ideas and a great thread. It's similar to the wish-list thread, though, so I'll say it here too. I know a lot of you don't give much thought to sound implementation; simply playing a sound sample is more than enough for most cases. But (being musically oriented) I do, and I would love to see improvements in how UDK handles sound. VST support would give us the same kind of 'no limits' approach, but with sound. A VST doesn't have to be a drum machine or a synth; it's simply a way to have control over every aspect of sound. You can make anything at all with VST, and it's the same C++ language that drives UDK, so I can't see it being beyond the realm of possibility to include this feature. I don't expect UDK to ever have VST support, but maybe somebody could write a third-party plugin or something; you never know. Having a VST Kismet node would be awesome.
    That's a bit too much, don't you think? It's like asking for a 3D modeling tool built inside UDK.
    I'm a sound designer in real life and, **** me, I'd love to see a game engine doing that, but at what cost? Think about all the problems a VST can cause in your DAW: how many times have you seen Cubase, or whatever you use, crash because of a VST? Imagine what would happen with a more complex program such as UDK.
    My suggestion: use your favourite DAW to synthesize/edit/mix your sounds and export those as WAV for UDK. This is the way most pros do their job, simply because a game engine is a game engine and a DAW is a DAW. The more (and more different) stuff a program can do, the more frequently it'll crash; that's a general rule.



    #14
    I've wanted to give some feedback for a while, so here it is (I mainly program).

    ABCs for the engine base classes:
    Classes such as PlayerController and Pawn have a great part of the UT logic in them. They are also native, so the framework becomes inflexible and the namespace cluttered. Things could be moved upwards into GameFramework, or given an ABC layer underneath, like UDKBase; this would make things more flexible. As it is, I find it very constricting: everything is native until UTGame. If this is within the scope of the de-nativisation, then Go! Go~~! *Cheers*

    Some wants/suggestions:

    Actor 'latent movement': some type of PHYS_MODE in Actor that allows the use of velocity and acceleration as if in a void. I did not manage to get phys modes to work correctly with an Actor in the March beta, only with a Pawn.

    Vertex texture fetches: I understand not all cards support them, but being able to use them would make World Position Offset more useful.

    Shader semantics: being able to somehow define semantics for the custom node and pass them to the pixel shader would be ace.



    #15
    Originally posted by ThePriest909 View Post
    That's a bit too much, don't you think? It's like asking for a 3D modeling tool built inside UDK. I'm a sound designer in real life and, **** me, I'd love to see a game engine doing that, but at what cost? Think about all the problems a VST can cause in your DAW: how many times have you seen Cubase, or whatever you use, crash because of a VST? Imagine what would happen with a more complex program such as UDK. My suggestion: use your favourite DAW to synthesize/edit/mix your sounds and export those as WAV for UDK. This is the way most pros do their job, simply because a game engine is a game engine and a DAW is a DAW. The more (and more different) stuff a program can do, the more frequently it'll crash; that's a general rule.
    I kind of agree with both of you.

    Maybe not VST, but there are DirectX (DX) plugins that should work.

    Maybe not the synthesizer plugins, but I think effect plugins like reverb, flanger, and some other DSP effects would be great.

    Even without plugin support, I really think it'd be cool to just have DSP effects: kind of like post-processing effects, but for sound instead of graphics. Sounds in the ambient class have reverb, but if I'm in a really lush environment, I'm going to be really disappointed and the illusion will be broken if all my player sounds are flat and dry. Sure, I could put the reverb on myself, but what about a time when I'm walking in and out of a big cave? I could have the sounds be replaced, but that just takes up more space in the game.

    So, in short, DSP audio effects (or audio post-processing effects) would be great to have in UDK for ALL the sound classes, not just ambient.
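The cave example above is essentially a wet/dry crossfade. This is a hedged Python sketch of the concept with toy sample values (the buffers and numbers are invented; a real engine would compute the reverberant copy with an actual DSP and drive `wetness` from the player's position):

```python
# Sketch of the wet/dry idea: keep one reverberant copy of the signal and
# crossfade it against the dry samples as the player moves, instead of
# swapping sound assets outright at the cave mouth.

def wet_dry_mix(dry, wet, wetness):
    """Blend dry and reverberant sample buffers; wetness in [0, 1]."""
    return [d * (1.0 - wetness) + w * wetness for d, w in zip(dry, wet)]

dry = [0.5, -0.5, 0.25]        # the original samples
wet = [0.3, -0.2, 0.10]        # the same sound after a reverb pass
outside = wet_dry_mix(dry, wet, 0.0)   # far from the cave: fully dry
inside = wet_dry_mix(dry, wet, 1.0)    # deep inside: fully reverberant
```

Sweeping `wetness` over distance gives a smooth transition in and out of the cave, with no duplicated sound assets.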

    If not that, then at least make it so that the reverb can be used in all sound classes.

    Another thing that would be cool is being able to build dynamic music cues, though we can kind of do this in Kismet already.

    ANOTHER suggestion is about materials and shaders.

    Now, maybe no one else has a problem with this, but I'm coming from 3D for film, and this material editor doesn't work ANYTHING like the ones over there. Sure, it's better than writing the code by hand, I'll admit, but I wish it was a little more intuitive for beginners. Maybe there could be a simple material setup where you could easily modify channels like diffuse, specular, bump, and normal by loading images and using sliders, and if you want to get more complex, you can go and mess with the shader nodes. This is kind of how Unity works.

    Just a few thoughts.

