Live Video in UDK?

Collapse
X
 
  • Filter
  • Time
  • Show
Clear All
new posts


    I've been dreaming up a broadcast graphics/data visualization system for a while, and I was wondering whether UDK could be used for the rendering side of it. UDK would also need to handle the video mixing, unless a secondary view could somehow be output and used as an alpha channel or key externally. I don't think UDK can take live video input from standard capture devices, though. I thought perhaps Scaleform could handle that, since Flash can, but I imagine the performance would be poor and you'd probably only get a single input, and I would need at least four. At that point it also raises the question of why not just do the whole thing in Flash.

    Any thoughts on this? I've also considered doing it with Blender's game engine, which can do live video input, and being so tightly integrated would obviously make designing animations for data visualization, broadcast graphics, titles, etc. far simpler.

    Thanks! Sorry for such a weird question.

    #2
    Sorry, this cannot practically be achieved in UDK.



      #3
      I've read that it's possible to load a custom DLL into UDK, could that possibly be something to look into?
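For context, UDK's DLLBind feature does let UnrealScript call into a native Windows DLL, so a capture bridge along these lines is at least conceivable. Below is a minimal, hypothetical sketch of the native side only; the function name, buffer layout, and placeholder frame are all illustrative assumptions, and on the UnrealScript side you would still need a class declared with `DLLBind` plus a matching `dllimport final function` (parameter marshaling is simplified here).

```c
#include <string.h>

#ifdef _WIN32
#define DLL_EXPORT __declspec(dllexport)
#else
#define DLL_EXPORT
#endif

/* Hypothetical DLL-side entry point for a UDK DLLBind capture bridge.
 * Fills the caller's buffer with the latest frame's pixels (BGRA here)
 * and returns the number of bytes written, or 0 if nothing was copied.
 * A real implementation would read from a capture device; this one
 * writes a 4x4 mid-grey placeholder frame. */
DLL_EXPORT int GetFrame(unsigned char *buffer, int capacity)
{
    static unsigned char frame[4 * 4 * 4]; /* 4x4 pixels, 4 bytes each */
    int size = (int)sizeof frame;

    if (capacity < size)
        return 0; /* caller's buffer is too small */

    memset(frame, 0x80, sizeof frame); /* placeholder pixel data */
    memcpy(buffer, frame, (size_t)size);
    return size;
}
```

Whether UDK could then push those bytes into a texture fast enough every frame is a separate question that this sketch doesn't answer.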



        #4
        "unless somehow a secondary view could be output that could be used as an alpha channel or key externally."

        This seems doable. You'd need two clients running: attach one pawn/camera to the other when the level loads, and have the second client use a post-process chain or some other shader wizardry to generate the appropriate mask (which would in theory be similar to fog-of-war or an occluded-character transparency overlay).

        All good in theory, but I'm just starting to dig into fog-of-war myself...



          #5
          I'm not entirely sure what you're getting at. If you're suggesting you can grab the view from one client in the other, you're likely mistaken (and that doesn't help with live video input anyway).



            #6
            You can take the output of two synced clients and combine them externally with capture devices and processing software, outside of UDK: UDK graphics plus a UDK alpha channel/key, overlaid on top of a live video source externally. I do worry about sync issues with two UDK clients rendering a matte/key and graphics, so I don't think this would really be practical. That said, if you didn't need to key things or put video textures on objects, it could work quite well for some kinds of things.
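The external overlay step described here is standard linear keying: the compositor multiplies the fill (the UDK graphics) by the key (the second client's matte) and adds the background video scaled by the key's complement. A minimal per-pixel sketch of that math, with illustrative names:

```c
/* Linear key per channel: out = fill * key + background * (1 - key),
 * where key is the matte value in [0, 255] rendered by the second
 * UDK client. Integer blend with rounding; key == 255 means the fill
 * fully covers the background video. */
static unsigned char key_pixel(unsigned char fill,
                               unsigned char background,
                               unsigned char key)
{
    return (unsigned char)((fill * key + background * (255 - key) + 127) / 255);
}
```

Any frame-level desync between the two clients shows up directly here as a mismatch between fill and key, which is exactly the sync worry raised above.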

            It sounds like a big mess, though. I wonder if you could use a custom DLL to transcode a capture device's frames into something UDK could map onto a texture in real time. Actually, with all the Kinect work these days, UDK seems to be lacking in this department. Granted, Kinect input is often processed before it reaches the game engine at all, but there are games I've seen where the video is displayed in-game. I think UDK ought to get a new video texture system capable of things like this.
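If a custom DLL could hand frames to the engine, the transcode step imagined above would largely be pixel repacking, e.g. converting a capture device's tightly packed 24-bit RGB frame into the 32-bit BGRA layout a texture update would typically expect. A sketch under those assumptions (the function name and layouts are illustrative, not any UDK API):

```c
#include <stddef.h>

/* Repack a tightly packed RGB24 frame into BGRA32 with alpha forced
 * opaque: the kind of conversion a capture-to-texture bridge would do
 * before handing pixels to the engine for upload. src holds 3 bytes
 * per pixel (R, G, B); dst receives 4 bytes per pixel (B, G, R, A). */
static void rgb24_to_bgra32(const unsigned char *src, unsigned char *dst,
                            size_t pixel_count)
{
    for (size_t i = 0; i < pixel_count; i++) {
        dst[4 * i + 0] = src[3 * i + 2]; /* B */
        dst[4 * i + 1] = src[3 * i + 1]; /* G */
        dst[4 * i + 2] = src[3 * i + 0]; /* R */
        dst[4 * i + 3] = 0xFF;           /* A (opaque) */
    }
}
```

For four simultaneous inputs at broadcast resolutions this copy-and-swizzle would need to run every frame per input, so the real bottleneck is likely the texture upload path rather than the conversion itself.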
