
Does The Unreal Engine 3 Have Rounding Errors For Low In-Game Mouse Sensitivity?



    I apologize if this isn't the appropriate forum for this question; I didn't know where else to post.

    I'm a Tribes: Ascend player. I was recently told, by a fellow T:A player, that "the Unreal engine has s*** rounding errors if you use below 5 sensitivity."

    I attempted to do some research on this and only found one obscure post by some random guy on some random forum and he said,

    "Many games do not do completely accurate math when translating in-game sensitivities, and the rounding errors and other problems are exacerbated by lower in-game settings. Unreal Engine 3 is one that is notorious for this; you can see for yourself by going into any game with this engine, setting the sensitivity to 1, and moving your mouse very slowly. Sometimes the view will not move at all, because the tiny fractions get rounded down to zero over and over. So your in-game sensitivity needs to be above 4"

    So, my question to you guys is, is this true?

    Also, does this happen in all games that use the Unreal Engine 3?

    And let's clarify, does your sensitivity need to be above 4 or above 5?

    Thanks in advance.

    download the udk and run UT, figure it out... im not at home or I would test for you.. chances are that it might be correct... but ummmmmm who really uses a sensitivity below four?


      Originally posted by Trixer View Post
      download the udk and run UT, figure it out... im not at home or I would test for you.. chances are that it might be correct... but ummmmmm who really uses a sensitivity below four?
      I don't have the mouse testing software that can measure the minor discrepancies, that's why I'm asking here.

      I would just rely on field tests, but my PC isn't that good and my ISP isn't that stable, and latency + low frames makes it impossible for me to tell what's causing what.

      So, again, I come to you guys for help, or at least a push in the right direction.

      Thanks again.


        If I set the mouse sensitivity in UDK to 1.0f it works fine for me, albeit with very slow movement.
        Mouse sensitivity is a floating point value, so I fail to see what "rounding errors" would be occurring.


          Did some more research and came up with this, if anyone is interested:

          "I have always wondered why mouse input feels way off in higher zoom modes. Usually if I decide to play Sentinel for a bit I have to switch from 1800 dpi to 600 and set my sensitivities to different levels. This somewhat works, but it is tedious and annoying to switch mouse dpi along with game sensitivity and zoom macros whenever I switch classes. After a bit of research into how other games do mouse translation and how UDK does it (this isn't Hi-Rez's fault!), I will present my findings, different ways for the user to alleviate this problem, and how Hi-Rez should deal with it if it comes to that.

          The mouse provides 2-dimensional input in the form of dots, or points. A 400 dpi mouse sends 400 points for 1 inch of travel in a single direction, while an 1800 dpi mouse sends 1800. FPS games convert these points into angles that rotate your player.

          For these examples I will just examine the formula for movement on one axis.
          I never had mouse input problems with Quake-based games, and as far as I am concerned they are the gold standard for how mouse input should feel.

          Quake's method is as follows (assuming no mouse acceleration):

          θ = MouseInputX * Sensitivity * M_Yaw
          M_Yaw is by default 0.022 degrees. So if in game sensitivity is 1, 1 inch of travel on an 1800dpi mouse to the right will rotate the player by 39.6 degrees.

          UDK's Method is similar:

          θ = MouseInputX * Sensitivity * 0.00549316540360483
          So if we want the same feel as in the Quake example we can set the sensitivity to around 4, giving us 39.6 degrees of movement on 1800 dpi mouse over 1 inch of horizontal travel.

          So where is the problem then?
          UDK stores most rotations in a "rotator" struct

          struct immutable Rotator
          {
              var() int Pitch, Yaw, Roll;
          };

          where rotation is stored in "rotator units" as Euler angles across three integers.

          Taking this into account, UDK's actual mouse translation looks like this,

          int UnrRot = (int)(MouseInputX * Sensitivity);

          So for every "UnrRot" of movement there is an amount of rotation, in degrees, of (UnrRot * UnrRotToDeg), where UnrRotToDeg = 0.00549316540360483.

          You cannot move 0.5 UnrRots! It is an integer; you can only move by whole numbers.
          Every fractional mouse movement is converted (rounded) to an integer, and a C-style cast truncates toward zero. If I move 1 dot to the right with a sensitivity of 0.9, there is no rotation whatsoever.

          Suppose two players want 1 inch of mouse movement to be 15 degrees of rotation. Player one has a 400dpi mouse and Player two has an 1800 dpi mouse. Player one sets his sensitivity to 6.82, and Player two sets his to 1.51.

          If both players move their mice slowly over an inch, so that each frame the mouse moves only one dot (MouseInputX = 1), Player one rotates 13.18 degrees and Player two 9.88 degrees. Relative to the rotation actually achieved, those are errors of 13.8% and 52%.
          Now both players move their mice an inch more quickly, at 2 dots per frame (MouseInputX = 2): Player one rotates 14.28 degrees and Player two 14.83. Their errors are suddenly very different: 5% and 1.1%.

          But if Player two wants to play at zoom level 2 at low sensitivity, his lowest safe choice of sensitivity is 1; any lower and the error skyrockets. The only way to safely lower the amount of rotation per unit of mouse travel is to lower the mouse dpi.

          Why does this affect high-dpi players more? It doesn't; it affects everyone with a fractional sensitivity, and low-sensitivity players are hit hardest because the relative rounding error is greatest. It's just that high-dpi players tend to play at lower sensitivities.

          How to alleviate this problem:

          Set your sensitivity to whole numbers or slightly above whole numbers (ex: 10.0 or 10.01)
          Either set "bEnableFOVScaling=False" in your input.ini or understand how fov affects sensitivity.*

          *If FOV scaling is on, your sensitivity is multiplied by (fov * 0.01111), i.e. fov / 90.
          I'm guessing that at zoom level one your fov is (fov - (fov - 40)/2), i.e. halfway between normal and 40, and at zoom level two your fov is 40.
          If you turn FOV scaling off, you should look for a sensitivity macro. Most games set the sensitivity to the ratio between the two FOVs (90->45 = 0.5 sensitivity). If you want it to feel like default CS 1.6, multiply that ratio by 1.2 (90->45 = 1.2 * 0.5 sensitivity).

          How Hi-Rez should handle it:

          Level 0: Do nothing
          Level 1: Add 0.5 to the mouse movement so it rounds to the nearest integer when cast to an int
          Level 2: In addition to the above, save the rounding error and apply it on the next frame
          Level 3: Virtual DPI cvar
          Level 4: Use quaternions instead of rotators."

          Original post: