Hi all, just a few questions about interacting with the physical hardware.
- Is it possible to interact with the camera on the handheld device, and if so, how? That is, can I take a photo and then use the image in Scaleform?
- Can UDK still reference external C++ libraries when deploying to an iOS device?
- Can you detect the presence of accelerometers (to read the pitch, roll, and yaw of the device itself) and use that data instead of needing another on-screen joystick control? A rough sketch of what I have in mind is below this list.
- Can webpage data be read into Scaleform via calls to web pages, or does that count as "streamed data" and is therefore disabled?
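For the accelerometer question, here is roughly what I'm picturing in UnrealScript. This is only a minimal sketch, assuming MobilePlayerInput exposes a motion delegate along these lines; the OnMobileMotion name and its signature are from memory of the UDN mobile input docs, so check MobilePlayerInput.uc in your UDK build before relying on them:

[CODE]
class TiltTestPC extends UDKPlayerController;

event InitInputSystem()
{
    super.InitInputSystem();

    // On a mobile device, PlayerInput should be a MobilePlayerInput.
    // OnMobileMotion is assumed here; verify the delegate actually
    // exists in MobilePlayerInput.uc for your UDK release.
    if (MobilePlayerInput(PlayerInput) != none)
    {
        MobilePlayerInput(PlayerInput).OnMobileMotion = HandleMotion;
    }
}

function HandleMotion(MobilePlayerInput MPI, vector Attitude,
    vector RotationRate, vector Gravity, vector Accel)
{
    // Attitude is assumed to carry the device orientation
    // (pitch/yaw/roll). This just logs it, but the same values
    // could drive view rotation instead of a virtual joystick.
    `log("Tilt - Pitch:" @ Attitude.X @ "Yaw:" @ Attitude.Y @ "Roll:" @ Attitude.Z);
}
[/CODE]

Is something along these lines actually supported on iOS builds, or is tilt input exposed some other way?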
Are there any tutorials or threads showing any of the above tech in action (assuming it can be done)? If some of these options aren't currently available, are there plans to include them in a future release of UDK?
Thanks all.