The Unreal Engine has been, and still is, one of the best and easiest-to-use engines out there. Epic has gone out of its way to provide us independent developers with the 'free' version: the Unreal Development Kit. The UDK already has a host of awesome features, many giving us 'indies' the industry standard, others giving us better than standard. However, the sky is the limit: Epic Games is a company known for respecting and listening to its community. Not only is this an opportunity for us to let our voices be heard, it's our responsibility to do so. We owe it to Epic to provide feedback and ideas to help improve the already awesome toolset within UDK, and thus the Unreal Engine.
I don't want anyone to think even for one second that I'm complaining here; on the contrary, I love the opportunity Epic has given us. We all know that one idea leads to many more: the crazy number of tools in the UDK only makes me imagine what other features and tools I could use in theory. So this thread is devoted to a non-complaining list of ideas for expanding the use of current tools, plus possible new features that could help everyone. I don't expect Epic to act on any of these ideas; to expect that would be ridiculous. I only want my voice to be heard by the big E.
Please note the use of the word "possible". No wacky ideas that would be out of reach, or simply implausible, are on this list: only ones that seem within reason. I will add to this list as more ideas come, but mind you, I'm primarily an artist, so most if not all of these suggestions have to do with art and graphics.
Material Editor 'Prefab' Sets
I don't know about other material artists, but this idea hits close to home for me. I quite often use copy and paste to create 'stacking' effects, or to create two similar materials. I may set up a group of nodes that I plan to use many times, and not just in one material either.

Instead of copying and pasting such a group of nodes, I could highlight them and create a node 'set' or 'prefab'. The editor would analyze the selected nodes, determine which of them connect to un-highlighted nodes via either an input or an output, and from that save a single node with multiple inputs and outputs. Later, I could right-click, pick this 'prefab' from a list of saved prefabs, and place it whenever and wherever I desire.

As an added bonus, if I decide to change anything about this prefab, I would only have to edit one instance of it to update every matching prefab. This would save time in many of my endeavors: a material artist could edit a single node to distribute identical changes across a material that uses a stacking process for a certain effect. If the possibility exists, allow us to name/label each input and output for better organization. This idea comes from the main editor's integration of Prefabs, which store actors in a similar way, a feature I have used quite often.
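The boundary analysis described above is simple to sketch. This is illustrative pseudocode in Python, not any Unreal API: the `Link` type and `prefab_pins` function are made-up names showing how connections that cross the selection boundary would become the prefab's exposed inputs and outputs.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Link:
    src: str   # node producing the value
    dst: str   # node consuming the value

def prefab_pins(links, selected):
    """Return (inputs, outputs): the links crossing the selection boundary."""
    inputs, outputs = [], []
    for link in links:
        if link.dst in selected and link.src not in selected:
            inputs.append(link)    # an outside node feeds a selected node
        elif link.src in selected and link.dst not in selected:
            outputs.append(link)   # a selected node feeds an outside node
    return inputs, outputs

# Example: a tiny graph where "Multiply" and "Add" are the highlighted nodes.
links = [
    Link("Texture", "Multiply"),
    Link("Multiply", "Add"),
    Link("Constant", "Add"),
    Link("Add", "MaterialOutput"),
]
ins, outs = prefab_pins(links, {"Multiply", "Add"})
# ins:  links from Texture and Constant -> two exposed inputs
# outs: link to MaterialOutput         -> one exposed output
```

Internal links (Multiply to Add) stay hidden inside the prefab; only the crossing links need to become named pins.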
______________________________________________
Remove the Linearity with Post Process Chains
Unreal Engine 3 handles post-process effects rather simply. That isn't a bad thing; for most users it's a blessing. However, I can see so much potential for improving the post-process chain editor with a simple bit of non-linearity. At the moment, post-process nodes can only be stacked on top of one another, and once you alter the rendered scene into a 'processed' state, you can't re-introduce the 'virgin' render for more advanced blending.

I don't know whether this is a hardware limitation or an engine-code issue. What I do know is that it is extremely difficult to create new post-process effects outside the library of nodes included in the Post Process Editor. Material Effects can be quite useful, but their effects are linear and total: once you alter the scene using a hardware material effect, the change is irreversible.

One solution I can think of involves adding a node to the material editor that references a second scene texture. This new scene texture would be 'un-processed': the rendered frame before any post-processing has been applied. The artist could then blend the current post-processed image with the virgin render in any way they see fit. I suspect this would require one extra frame's worth of graphics RAM, as the rendered frame would need to be stored twice before processing. However, the feature may not have a substantial effect on the frame rate, as an uncompressed frame is only about 2-5 MB. This solution lives in the material editor and may not be ideal for designers or some artists; I suggest it merely because it's something I'm somewhat familiar with. Other methods are welcome.
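To make the idea concrete, here is an illustrative sketch (plain Python, not shader or engine code) of what keeping the virgin frame around buys you: any blend reduces to a per-pixel lerp between the two buffers, and the memory cost of the second buffer is easy to estimate.

```python
def lerp(a, b, t):
    """Linear interpolation between a and b by factor t in [0, 1]."""
    return a + (b - a) * t

def blend_pixel(virgin, processed, amount):
    """Re-introduce the virgin render into the processed image.
    amount = 0.0 gives the pure virgin frame, 1.0 the fully processed one."""
    return tuple(lerp(v, p, amount) for v, p in zip(virgin, processed))

# Rough cost of the extra buffer at 1280x720, 4 bytes per pixel (RGBA8):
extra_bytes = 1280 * 720 * 4
print(extra_bytes / (1024 * 1024))  # ~3.5 MB, consistent with the 2-5 MB estimate
```

A half-strength blend, `blend_pixel(virgin_px, processed_px, 0.5)`, is exactly the kind of partial roll-back of a post-process effect that the current linear chain can't express.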
______________________________________________
Allow for Direct Referencing of Local Player via Kismet
Some of you may know of a map I made back in UT3 called "arahan Crown". In the release-candidate version of that map, I had a nifty Kismet sequence that changed the music depending on the volume the player was in. The problem was that the music changed whenever any player touched the various volumes: bots and other players changed the music for everyone on the server every time a volume was entered. Despite days of lost sleep, my C++-expert friend and I could not find a solution in the UDN, the Unreal Wiki, or our own wacky test maps.
This is just one example of Kismet's limitations with local sequences in a non-local world. For basic sequences, the 'target' actor flow usually does the job; more complex multiplayer Kismet sequences, however, are impossible if they are meant to be seen/heard/experienced only by the player activating the trigger. I'd like to see this problem addressed. If a solution already exists, make it more accessible: an experienced programmer and a (relatively) experienced artist could not find a way to do this. If the answer is more obvious than I make it out to be, then update the UDN to include information on the subject. Something is wrong here, one way or another.
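The behavior I'm after can be stated in a few lines. This is pseudocode in Python, not UnrealScript or Kismet: `on_volume_touched`, `local_player`, and `play_music` are hypothetical names describing what a "local player only" filter on a touch event would do on each client.

```python
def on_volume_touched(toucher, local_player, play_music):
    """Hypothetical touch handler running on one client.
    Today every touch fires the sequence for everyone; the desired
    behavior is to fire only when the toucher IS the player this
    client controls."""
    if toucher is local_player:
        play_music()

# On my client, only my own touches should change my music:
me, bot = object(), object()
songs_started = []
on_volume_touched(me, me, lambda: songs_started.append("theme"))   # fires
on_volume_touched(bot, me, lambda: songs_started.append("theme"))  # ignored
```

One exposed checkbox or condition node doing this comparison would have solved the "arahan Crown" music problem outright.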
I'm going to add more ideas in the future, as soon as I can concentrate my thoughts into words. In the meantime, I'd like to know your opinions and potential ideas as well.