Is there any reason not to turn "disable first rate frame lag" off?


    #16
    Originally posted by gargorias View Post
    With or without vsync you are going to need framerate smoothing on in this game.
    This works fine for me (vsync off):

    bSmoothFrameRate=true
    MinSmoothedFrameRate=0.000000
    MaxSmoothedFrameRate=65.000000

    MinDesiredFrameRate=65.000000

    What is the difference between MinDesiredFrameRate and MinSmoothedFrameRate?
    What values do you guys have?

    Comment


      #17
      Here is an exact breakdown of how vsync introduces input lag, from this thread on AnandTech:

      How do buffers work? Well, buffering is when extra images are rendered and stored, depending on the specific setting...

      Single buffer:
      Frame 1 is displayed.
      Frame 2 is rendering.
      Frame 3 will not render until frame 2 is displayed. (or it will and discard, depending on vsync)

      Double buffer:
      Frame 1 is displayed.
      Frame 2 is rendering.
      Frame 3 will render ASAP.
      Frame 4 will not begin rendering until frame 2 is displayed. (or it will and discard, depending on vsync)

      Triple buffer:
      Frame 1 is displayed.
      Frame 2 is rendering.
      Frame 3 will render ASAP
      Frame 4 will render ASAP.
      Frame 5 will not begin rendering until frame 2 is displayed. (or it will and discard, depending on vsync)

      Without vsync each frame is rendered ASAP, so the faster the video card, the less time passes between frames. With vsync, each frame is matched to be 1/60th of a second apart.

      Time meaningfully progresses every 1/60th of a second, when a new frame is sent to the monitor. If your video card is fast enough to render 240fps for game X, then it will render and DISCARD about 3 or so frames before the next monitor refresh, meaning that regardless of buffering the next frame WILL display any user input received in the meantime. If your video card is vsynced, it would have instead rendered 3 "future" frames that are each 1/60th of a second apart and EACH is going to be displayed, resulting in X/60ths of a second of input lag depending on how many frames were rendered before your input was received.

      vsync off (250fps, frames 1/250th of a second apart):
      time=0 : Frame 1 is displayed.
      time=1/250s: Frame 2 created.
      time=1.5/250s: input from user
      time=2/250s: Frame 3 created.
      time=3/250s: Frame 4 created.
      time=1/60s: Frame 4 begins sending to monitor
      time=4/250s: Frame 5 enters buffer while frame 4 is being sent. Resulting in tearing as the top half of frame 4 and bottom half of frame 5 are displayed. user input is included in both.

      vsync on (250fps CAPABLE card working at 60fps):
      time=0 : Frame 1 is displayed.
      time=1/250s: Frame 2 created
      time=1.5/250s: input from user
      time=2/250s: Frame 3 created.
      time=3/250s: Frame 4 created.
      time=1/60s: Frame 2 (which is missing the last input) begins sending to monitor, when it finishes, frame 5 will begin rendering.

      Basically, in a very high FPS situation, input lag will be introduced by triple and double buffering, but the tearing is eliminated. With low FPS the input lag is lessened because it is less likely that frames are rendered ahead (since the video card is just not fast enough), though it might still occur during high FPS spikes. Either way, tearing is completely gone.

      If you think vsync reduces input lag then you are just confusing input lag with lag in general. Or your CPU is choking, and reducing the framerate by capping it allows quicker calculations.
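
      To put some rough numbers on that (back-of-the-envelope figures based on the simplified timelines above, not measurements):

      One 60Hz refresh = 1/60 s ≈ 16.7 ms.
      Vsync off on a 250fps card: the frame sent at the next refresh is at most 1/250 s ≈ 4 ms old, so an input reaches the screen within roughly one refresh plus one frame, i.e. about 21 ms at worst.
      Vsync on with frames queued ahead: each already-rendered frame gets its own 1/60 s on screen first, so two or three queued frames mean roughly 2 x 16.7 ≈ 33 ms to 3 x 16.7 ≈ 50 ms before your input can appear.
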
      Originally posted by Fliperaci View Post
      This works fine for me (vsync off):

      bSmoothFrameRate=true
      MinSmoothedFrameRate=0.000000
      MaxSmoothedFrameRate=65.000000

      MinDesiredFrameRate=65.000000

      What is the difference between MinDesiredFrameRate and MinSmoothedFrameRate?
      What values do you guys have?
      The game will start stripping visual effects from the screen when your framerate drops below MinDesiredFrameRate. And although in practice MaxSmoothedFrameRate behaves as you would expect (as a cap), BeyondUnreal's UE3 wiki suggests MinSmoothedFrameRate is something different:

      MinSmoothedFrameRate: Minimum framerate at which smoothing will kick in.
      MaxSmoothedFrameRate: Maximum framerate to smooth to. The code will try not to go over it by waiting.
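
      For reference, here is a minimal sketch of how those lines sit together in UTEngine.ini; the section name and values below are just an example from memory, so check your own file rather than copying them blindly:

      [Engine.GameEngine]
      bSmoothFrameRate=TRUE
      MinSmoothedFrameRate=22.000000
      MaxSmoothedFrameRate=62.000000

      If I remember right, MinDesiredFrameRate sits in a different section, so search the ini for it; as described above, it controls when the engine starts dropping effects rather than acting as a cap.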

      Comment


        #18
        Originally posted by shombowhore View Post
        This is wrong. Input lag with vsync occurs with LCDs and CRTs, and can become worse at higher fps. I used to get 160+ fps in UT2004 on a CRT, and turning vsync on gave me HORRIBLE input lag.

        My framerates in UT3 were consistently above my refresh rate until I capped them using framerate smoothing to remove the mouselag brought on by vsync.

        edit: And I do not have a poor quality LCD.
        Just because you experienced input lag on your CRTs or even LCDs with vsync on doesn't mean everyone did/does, and vice versa too.
        I for one never experienced any input lag on my CRTs with or without vsync on. Back in those days input lag seemed to go hand in hand with a bad mouse, from what I remember.

        As for UT3, I have found that the only time I have problems with my aim is when I have both vsync and framerate smoothing off; the choppiness and screen tearing virtually kill the game.
        I have also found using the in-game vsync option to be far inferior to forcing it through the Nvidia CP - not sure what's going on there.
        I'm sure, though, that others will find it different, so it's just a matter of finding what works for you and what doesn't.
        Cheers

        Comment


          #19
          Originally posted by gargorias View Post
          Just because you experienced input lag on your CRTs or even LCDs with vsync on doesn't mean everyone did/does, and vice versa too.
          I for one never experienced any input lag on my CRTs with or without vsync on. Back in those days input lag seemed to go hand in hand with a bad mouse, from what I remember.

          As for UT3, I have found that the only time I have problems with my aim is when I have both vsync and framerate smoothing off; the choppiness and screen tearing virtually kill the game.
          I have also found using the in-game vsync option to be far inferior to forcing it through the Nvidia CP - not sure what's going on there.
          I'm sure, though, that others will find it different, so it's just a matter of finding what works for you and what doesn't.
          Cheers
          So you have about the fastest single-GPU card available atm. Tell me, when you play everything maxed out with the ``one frame thread lag`` box ticked, are you getting frame drops? If not, then tell me, why are you playing with the box unticked?

          I'm also curious about your UT3 performance with the i7 920 at stock speed. Does that make any difference? I play on a 920 at stock, with 6GB RAM and an HD4850 1GB. The game runs smooth, vsynced and maxed out at 1680x1050 with the ``one frame thread lag`` option disabled; it only dips to a minimum of 50 fps in crowded, heavy combat scenes in certain maps.

          I'm trying to find out which part of my PC is the bottleneck, but I'm not into overclocking. Actually, I'm quite happy; it's good enough and very playable.

          Comment


            #20
            Originally posted by gargorias View Post
            Just because you experienced input lag on your CRTs or even LCDs with vsync on doesn't mean everyone did/does, and vice versa too.

            I for one never experienced any input lag on my CRTs with or without vsync on.
            It can be very minimal, so you may not even notice it. I played with it for several days in UT2004, after enabling vsync, before I realized what was screwing up my aim so badly. If you've always played with vsync, I'm not surprised you don't notice it.

            Also, if your framerate is close to or below your refresh rate for the majority of the time (do you already have framerate smoothing on?) you will not experience input lag (unless for some reason your framerate happens to spike upwards.)

            Whether or not it occurs as a result of vsync is not debatable; vsync causes input lag, and that is fact. Whether or not it is common or consistent for a given individual is another matter completely.

            Comment


              #21
              Originally posted by shombowhore View Post
              VSync synchronizes your monitor's refresh rate with your video card to eliminate the tearing that results from having your card work faster than your monitor. However, the card is still working faster - vsync just prevents the frames from getting to your screen before it's ready for them. The result? The frames are held in a buffer for several ms until they are ready to be shown on screen. That amount of milliseconds is your input lag.

              Leaving 'disable one frame thread lag' ticked apparently puts a greater load on your card, lowering your framerate and therefore reducing this effect. Thankfully, I found a workaround recently.

              You can eliminate mouselag entirely while using vsync by turning on fps smoothing and setting the minimum threshold (in the INI) to several fps below your monitor's refresh rate. If you're using a 75hz monitor, set the minimum fps to 73. Then you can leave 'disable one frame thread lag' turned off without any mouse input lag. If your monitor is 60hz, you'll have to set your minimum to an unfortunate 57 or 58, but for the most part your fps will be consistent and any hitching you might experience will be significantly less noticeable than mouse lag or tearing.
              Hey, I'm running with vsync off (tearing doesn't bug me), but I still experience a small amount of input lag when I untick said box. It's not bad, but enough to mess me up when I'm using the ASMD or the sniper. However, I'd like to get the frame boost. Do you think it would yield higher frames if I turn on vsync and do all that jazz I just quoted? Thanks.

              Comment


                #22
                Originally posted by DeathRabbit View Post
                Hey, I'm running with vsync off (tearing doesn't bug me), but I still experience a small amount of input lag when I untick said box. It's not bad, but enough to mess me up when I'm using the ASMD or the sniper. However, I'd like to get the frame boost. Do you think it would yield higher frames if I turn on vsync and do all that jazz I just quoted? Thanks.
                Not necessarily. Vsync can make your framerate worse. And if tearing is minimal, that means your fps is probably similar to your refresh rate... unfortunately I can't predict exactly how much enabling vsync will affect it. The only way to know is to give it a shot.

                Turn vsync on, and if your fps is playable, cap your fps to prevent input lag. But if you're already getting input lag with vsync off, it's being caused by something else entirely. And if ticking that option helps the situation, that may be your best bet.

                Compare fps between vsync off and ticked box versus vsync on and unticked box, then make your own decision.
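
                If you do try the capped route, the lines involved are the same smoothing settings quoted earlier in the thread. A rough example for a 60Hz screen (values are only illustrative, adjust them to taste):

                [Engine.GameEngine]
                bSmoothFrameRate=TRUE
                MaxSmoothedFrameRate=58.000000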

                Comment


                  #23
                  Originally posted by shombowhore View Post
                  It can be very minimal, so you may not even notice it. I played with it for several days in UT2004, after enabling vsync, before I realized what was screwing up my aim so badly. If you've always played with vsync, I'm not surprised you don't notice it.

                  Also, if your framerate is close to or below your refresh rate for the majority of the time (do you already have framerate smoothing on?) you will not experience input lag (unless for some reason your framerate happens to spike upwards.)

                  Whether or not it occurs as a result of vsync is not debatable; vsync causes input lag, and that is fact. Whether or not it is common or consistent for a given individual is another matter completely.
                  Sorry to say but it is very much debatable.
                  If vsync caused input lag then I, and indeed everyone that uses vsync, would notice it, but that just doesn't appear to be the case.
                  As for the AnandTech doc, this statement -
                  Conclusions:
                  Vsync eliminates tearing, and reduces input lag (who shot first kind) by decreasing CPU usage, but it also enables buffering to cause stutter.

                  Buffering might or might not make sense without vsync; with vsync it increases the measured FPS, but causes stutter (input lag of the "I shot and it suddenly went from no animation to 50% done with the animation" kind). Get rid of it.

                  AFR rendering (quad / dual GPU) makes no sense. It increases FPS but introduces significant jitter, at least when coupled with vsync (but tearing and input lag are bad, so it is needed)...

                  So for best quality gaming you want the fastest single core GPU rendering with vsync on and the lowest buffering possible.


                  backs up what I have been stating: that vsync does not cause input lag!
                  As for buffering, that will only cause stutter for someone whose FPS drops below the vsync rate, AFAIK!

                  [EDIT] Oh, and that jitter he talks about with AFR has all but been eliminated since the early dual GPUs and SLI, so that's old news![/EDIT]

                  Comment


                    #24
                    Okay, read this if you are experiencing input lag on your LCD.

                    My tips for LCD users:
                    Make sure you purchase a good quality LCD monitor.
                    Make sure you run your LCD at native resolution.
                    Use vsync, as LCDs require synchronization to prevent exacerbated latency problems.
                    Make sure your game FPS never drops below the vsync (refresh) rate.

                    My tip for CRT users:
                    Make sure your screen refresh rate is at least 85 Hz.

                    As for the OP's question (sorry for the off-topic discussion): it doesn't appear to have any effect on my system.
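
                    For anyone who would rather not force vsync through the driver: if I remember right, it can also be toggled directly in UTEngine.ini, roughly like this (treat the section and key names as from memory and double-check your own file):

                    [SystemSettings]
                    UseVsync=True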

                    Originally posted by Jov4s View Post
                    So you have about the fastest single-GPU card available atm. Tell me, when you play everything maxed out with the ``one frame thread lag`` box ticked, are you getting frame drops? If not, then tell me, why are you playing with the box unticked?

                    I'm also curious about your UT3 performance with the i7 920 at stock speed. Does that make any difference? I play on a 920 at stock, with 6GB RAM and an HD4850 1GB. The game runs smooth, vsynced and maxed out at 1680x1050 with the ``one frame thread lag`` option disabled; it only dips to a minimum of 50 fps in crowded, heavy combat scenes in certain maps.

                    I'm trying to find out which part of my PC is the bottleneck, but I'm not into overclocking. Actually, I'm quite happy; it's good enough and very playable.
                    I answered your first question above; I honestly see no difference.
                    I have never tested this game with my CPU at stock speed, but if you want me to I can do it and post some FPS comparisons!

                    Comment


                      #25
                      Gargorias, you obviously did not finish reading that thread. There were incorrect statements in his original post that were corrected by other users. I probably should have made that clear when I quoted it. However, that does not change that vsync causes input lag, and that is fact.

                      Here is the main correction:

                      Vsync does not reduce input lag, it increases it because it stalls the GPU until a refresh cycle is available. In theory triple buffering reduces the lag but I’ve found the opposite in certain cases, so YMMV.
                      Please note that I never said vsync always causes constant input lag on every system. I simply said that it causes it. If you want to be specific, vsync causes variable amounts of sometimes intermittent input lag on a high percentage of systems. And the tweak that I suggested does get rid of it.

                      Yes, the amount of input lag is also tied to the quality of your monitor. But that last suggestion to keep your FPS above your refresh rate in order to reduce input lag is just plain wrong. Before you try to argue this again, please see this thread where we went over the very same discussion!

                      Here's how it ended:

                      Originally posted by TAW_GuitarSlim View Post
                      I didn't think it was possible to turn on vsync without getting mouse lag, but the FPS cap is the key! I did exactly what shombowhore said in post #27 and it worked!
                      Originally posted by General-Gouda View Post
                      That Vsync + FPS cap trick works in other games as well so long as there are ways of limiting the maximum FPS to at or just below your Refresh Rate. It works well in Valve's Source games when you use the "max_fps" modifier in the console.
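
                      Side note: in the Source builds I have seen, the console variable is spelled fps_max rather than max_fps, so the equivalent there would be something like:

                      fps_max 59

                      for a 60Hz display. Same idea either way.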

                      Comment


                        #26
                        I have a feeling that certain people in this thread cannot aim well enough to notice slight amounts of delay.

                        Even the input lag I got by unticking said box was barely noticeable - I even attributed it to poor netcode changes in Patch2 at first. It's there, but to the majority of players it should be all but unnoticeable.

                        Also, if you play with vsync on, you sure as hell aren't playing at a very high level of skill.

                        Comment


                          #27
                          Originally posted by shombowhore View Post
                          Gargorias, you obviously did not finish reading that thread. There were incorrect statements in his original post that were corrected by other users. I probably should have made that clear when I quoted it. However, that does not change that vsync causes input lag, and that is fact.

                          Here is the main correction:



                          Please note that I never said vsync always causes constant input lag on every system. I simply said that it causes it. If you want to be specific, vsync causes variable amounts of sometimes intermittent input lag on a high percentage of systems. And the tweak that I suggested does get rid of it.

                          Yes, the amount of input lag is also tied to the quality of your monitor. But that last suggestion to keep your FPS above your refresh rate in order to reduce input lag is just plain wrong. Before you try to argue this again, please see this thread where we went over the very same discussion!

                          Here's how it ended:
                          This is a bit hard to explain and I may very well be wrong, but this is what I know:

                          The GPU doesn't stop processing frames just because vsync is on.
                          Frames get dropped at the front-buffer-to-screen level, not at the rendering level (not sure how that would even be possible), and believe it or not this happens anyway, with or without vsync.
                          All vsync does is stop that one extra frame from being sent to the screen while the current frame buffer is being drawn on the screen - read this; you may go back to the guy at AnandTech and ask him if he is a D3D programmer.

                          With vsync off the GPU will send a frame to the screen while it is still in the vertical blanking interval or wait state, but no more than one - more than that is not possible!

                          You may think you're getting 125 FPS, but in fact if you have a monitor refresh rate of, say, 60, then that is all you are going to see anyway: 60 frames per second, with or without vsync.

                          Barring that one extra frame, which just causes screen tearing on an LCD, all other frames above your monitor's refresh rate are not and cannot be sent to the screen, as your monitor won't accept them - they get discarded.

                          Perhaps vsync will have an undesirable effect if your FPS drops below your monitor's refresh rate.
                          It would most likely stick at that slower FPS while it renders both the front and back buffer through to the screen at the slower speed, even if the second frame was rendered into the back buffer at a higher speed.

                          Other than that, I'm not quite sure how vsync could affect user input; 60 FPS is sufficient for smooth gaming and processing of user input, although that would of course depend on the game and how it is programmed.

                          Cheers.

                          Comment


                            #28
                            Okay, so it would appear I was wrong; my apologies.
                            I was getting the D3DPRESENT types confused. So it would seem that the GPU does stop rendering to the pipeline while it is waiting for the vertical retrace.
                            I'm not sure how this would cause input lag, because if a scene render was skipped after the user input was processed, I would assume the screen would lag or jerk and not the mouse.
                            Oh well, I'm not going to comment on this any more; suffice to say it is all quite confusing at the programming level.
                            I just need to say that I don't suffer from any input lag that I notice, but I do notice screen lag if my FPS is too low!

                            Comment


                              #29
                              Well... gargorias, can you measure your input lag with and without vsync via the "stat engine" command?
                              I tried it yesterday, but strangely, every time I used this command my FPS dropped and it became unplayable. After disabling stat engine my FPS went back to normal. I can't explain that...

                              But I was reading this thread and asked myself: is mouse lag the same as the input latency listed in "stat engine"? If so, why don't you guys just measure it with and without vsync on and compare?
                              I don't think that vsync causes input lag when the input latency value is lower than the frame time [for 60 fps that's 16.6 ms].
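
                              For reference, the frame times such a comparison would be measured against (plain arithmetic, not readings from any particular system):

                              1000 ms / 60 fps ≈ 16.7 ms per frame
                              1000 ms / 75 fps ≈ 13.3 ms per frame
                              1000 ms / 250 fps = 4.0 ms per frame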

                              Comment


                                #30
                                Originally posted by gargorias View Post
                                I answered your first question above; I honestly see no difference.
                                I have never tested this game with my CPU at stock speed, but if you want me to I can do it and post some FPS comparisons!
                                Yes please, that would be cool. Then I'll know whether my CPU or GPU is bottlenecking my system.

                                Comment
