Is there any reason not to turn "disable first rate frame lag" off?


  • replied
    I found that it does increase mouse sensitivity, but it also added a slight delay in firing, so I had to change it back.



  • replied
    Still, I'd like to see what happens if you run a map like Confrontation or Serenity with 32 bots. No frame drops there?



  • replied
    Originally posted by gargorias View Post
    Okay, this took some time, but here you go.
    Using the UT3 Benchmark tool with texture and level detail at 5, sound on, and the DX10 box ticked.
    10 runs of DM-Shangrila with 12 bots.
    Screen resolution - 2560x1600

    Overclocked at 3.6 GHz, the average FPS was 68.46.
    At the default clock of 2.66 GHz, the average FPS was 66.95.

    I think the results are close enough to justify the statement that UT3 doesn't trouble the i7 920 even at default clocks.

    And for those interested, the NVCP settings I use for UT3:
    AA and AF at 16x
    Texture filtering quality on High
    Negative LOD bias on Clamp
    AA gamma correction on
    AA transparency on Multisampling
    Maximum pre-rendered frames on 3
    Triple buffering on

    NB: Yes, I know it says 3.8 in my sig; that's a mistake I haven't corrected yet.
    Thanks, that's what I thought.



  • replied
    Originally posted by Jov4s View Post
    I actually don't really care anymore. I'm getting a constant 60 fps maxed out, vsync on. Occasionally it drops to 45, but very rarely, during heavy combat. And yes, I get better framerates and gameplay with the "OFTL" box unticked.

    After three months I'm starting to get used to my new rig and UT3; I actually end up on top of the list more often, lol.

    What's interesting for me is to know whether my stock i7 920 or my stock HD4850 is bottlenecking my system and causing the frame drops.

    Okay, this took some time, but here you go.
    Using the UT3 Benchmark tool with texture and level detail at 5, sound on, and the DX10 box ticked.
    10 runs of DM-Shangrila with 12 bots.
    Screen resolution - 2560x1600

    Overclocked at 3.6 GHz, the average FPS was 68.46.
    At the default clock of 2.66 GHz, the average FPS was 66.95.

    I think the results are close enough to justify the statement that UT3 doesn't trouble the i7 920 even at default clocks.

    And for those interested, the NVCP settings I use for UT3:
    AA and AF at 16x
    Texture filtering quality on High
    Negative LOD bias on Clamp
    AA gamma correction on
    AA transparency on Multisampling
    Maximum pre-rendered frames on 3
    Triple buffering on

    NB: Yes, I know it says 3.8 in my sig; that's a mistake I haven't corrected yet.
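
    For scale, that gap is only about two percent; a throwaway check of the arithmetic, using just the two averages from the runs above:

    Code:
    // Relative difference between the two benchmark averages quoted above.
    #include <cstdio>
    
    int main() {
        double oc_fps    = 68.46;  // average FPS at 3.6 GHz
        double stock_fps = 66.95;  // average FPS at the default 2.66 GHz
    
        double gain_pct = (oc_fps - stock_fps) / stock_fps * 100.0;
        std::printf("overclocking gained %.1f%% average FPS\n", gain_pct);  // ~2.3%
        return 0;
    }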



  • replied
    I actually don't really care anymore. I'm getting a constant 60 fps maxed out, vsync on. Occasionally it drops to 45, but very rarely, during heavy combat. And yes, I get better framerates and gameplay with the "OFTL" box unticked.

    After three months I'm starting to get used to my new rig and UT3; I actually end up on top of the list more often, lol.

    What's interesting for me is to know whether my stock i7 920 or my stock HD4850 is bottlenecking my system and causing the frame drops.



  • replied
    This is what happens without vsync:
    Code:
    Your lcd 
    
    |~~~~~~~~~~~~~~~~~~~~~~~~~~~|
    |                           |
    |                           |  <----      Frame 1
    |                           |
    |___________________________|
    |                           |
    |                           |  <----      Frame 2
    |                           |
    |___________________________|
    The screen draws from top to bottom. If frame 2 is finished being computed before the screen is finished drawing frame 1, you will see half of frame 1 and half of frame 2 like the picture shows above. This is called tearing.

    Because frame 2 is most likely very similar graphically to frame 1, the tearing is usually not noticeable. A quick change of scene, such as an explosion, can make it much more noticeable. If an explosion happens on the screen, the top half of the screen might still be the old frame while the bottom half is the explosion frame, making the line between them very obvious; but since it is corrected in the next frame, it's not as noticeable as you'd think.

    Say you are in a fast-paced battle. You are looking forward and frame 1 renders. During that rendering you move your mouse. The next engine tick has you looking 90 degrees to the right and shooting. With tearing, the second frame will render immediately. If there is an enemy to the right, for an instant you will only see his feet, with the top of the screen still displaying the first frame. As soon as the bottom of frame 2 is drawn, the display wraps back to the top and keeps showing frame 2 until frame 3 is rendered, which again results in more tearing.

    No matter how you slice it, there is going to be a line between frame 1 and frame 2. With v-sync, the line is moved to the bottom/top of the screen in between frames so you don't see it. The only way for the engine to move it is to either render faster (which it's already rendering as fast as it can) or to wait. That wait equals lag, which means v-sync causes lag.
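
    To put a rough number on that wait, here's a minimal C++ sketch - nothing UT3-specific, just an assumed 60 Hz display and a made-up frame completion time:

    Code:
    // Illustration only: how long a finished frame is held back waiting for
    // the next vblank when vsync is on. Assumes a 60 Hz display; the frame
    // completion time is made up.
    #include <cmath>
    #include <cstdio>
    
    int main() {
        const double vblank_period_ms = 1000.0 / 60.0;  // ~16.67 ms between vblanks
    
        // Suppose the frame finishes 4 ms after the last vblank.
        double finished_after_ms = 4.0;
    
        // Without vsync it would be shown immediately (tearing mid-scan).
        // With vsync it has to wait for the next vblank:
        double wait_ms = vblank_period_ms
                       - std::fmod(finished_after_ms, vblank_period_ms);
    
        std::printf("frame held back an extra %.2f ms\n", wait_ms);  // ~12.67 ms
        return 0;
    }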



    Now let's see what happens with your CRT:

    Code:
    CRT
    
     No vsync
    |~~~~~~~~~~~~~~~~~~~~~~~~~~~|
    |                           |
    |                           |  <----      Frame 1
    |                           |
    |___________________________|
    |                           |
    |                           |  <----      Frame 2
    |                           |
    |___________________________|
    |___________________________| <- vertical blank period
    
    
     with vsync
    |~~~~~~~~~~~~~~~~~~~~~~~~~~~|
    |                           |
    |                           |  <----      Frame 1
    |                           |
    |                           |
    |                           |
    |                           |
    |___________________________| <- end of frame 1
    |___________________________| <- vertical blank period
    
    
         buffer
        |~~~~~~~~~~~~~~~~~~~~~~~~~~~|
        |                           |
        |                           |  <----      Frame 2 waiting in
        |                           |                a buffer to be flipped
        |                           |
        |                           |
        |                           |
        |___________________________|
    V-sync lag is usually not noticeable on a CRT monitor. This is because CRT monitors already have a gap between frames called the vertical blank period (this is what you are supposed to sync against). Since that delay already exists, adding the v-sync wait just lines the two up so the image doesn't distort. In this case it doesn't really add any lag (the delay is already there in the vblank); it simply adjusts the timing of everything to match the lag your monitor already has.

    With v-sync, the monitor and the game engine's tick() work in harmony, and depending on the engine this can result in a better gameplay experience and actually less lag, because of the better synchronization. That's only true of CRT monitors, though. There is no vertical blank period in LCDs. For LCDs, enabling vsync may remove tearing at the cost of a slight bit of lag.
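
    If it helps, here's a toy C++ model of that buffer flip. The numbers are invented (60 Hz display, 9 ms to render each frame), and it deliberately ignores pre-rendered frames, so treat it as a sketch of the idea rather than how the engine actually schedules work:

    Code:
    // Toy model: with vsync the finished back buffer waits until the next
    // vblank before it is flipped to the screen. Numbers are invented and the
    // loop simply stalls until the flip (no pre-rendered frames).
    #include <cstdio>
    
    int main() {
        const double vblank_ms = 1000.0 / 60.0;  // 60 Hz display
        const double render_ms = 9.0;            // pretend each frame takes 9 ms
        const bool   vsync     = true;
    
        double t = 0.0;
        for (int frame = 0; frame < 5; ++frame) {
            double finished = t + render_ms;     // back buffer is ready here
            double shown    = finished;          // without vsync: flip right away
            if (vsync) {
                // hold the buffer until the next vblank, then flip
                int next_vblank = static_cast<int>(finished / vblank_ms) + 1;
                shown = next_vblank * vblank_ms;
            }
            std::printf("frame %d: ready at %6.2f ms, on screen at %6.2f ms\n",
                        frame, finished, shown);
            t = vsync ? shown : finished;        // next frame starts after the flip
        }
        return 0;
    }

    With vsync set to true it settles at one frame per refresh (60 fps) even though each frame only takes 9 ms to render; set it to false and every frame goes out the moment it's ready, which is exactly where the tearing comes from.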



  • replied
    Originally posted by demoniac
    It works nothing like debugging. There's no real-time watch on addresses, no symbol information being passed back and forth. The ov0rhead should be nowhere near that of a debugg0r.
    Sigh, what do you know about debugging if you can't do that even to your own posts?



  • replied
    Originally posted by DrFish View Post
    I can..

    If you turn that on, it works like the trace output of an application you're debugging. It gives info about many things on each cycle, so the performance might drop because, say, it might calculate some statistic on every pixel relocation.
    Thanks for your replies!
    So these calculations are related to my CPU? Maybe playing with 32 bots was too much and caused the fps drops. Killing all the bots gave me my regular fps back (with stat engine on). It seems that my E8400 @ 3.5 GHz is too weak to do these calculations with bots :-/
    It would be nice if you or someone else could do it.

    Originally posted by demoniac View Post
    It works nothing like debugging. There's no real-time watch on addresses, no symbol information being passed back and forth. The ov0rhead should be nowhere near that of a debugg0r.
    Well, but what was it instead? I didn't change anything... maybe I'm just not getting your point :O



  • replied
    Originally posted by DrFish View Post
    I can..

    If you turn that on, it works like the trace output of an application you're debugging. It gives info about many things on each cycle, so the performance might drop because, say, it might calculate some statistic on every pixel relocation.
    It works nothing like debugging. There's no real-time watch on addresses, no symbol information being passed back and forth. The ov0rhead should be nowhere near that of a debugg0r.



  • replied
    Originally posted by Fliperaci View Post
    Well... gargorias, can you measure your input lag with and without vsync via the "stat engine" command?
    I tried it yesterday, but strangely, every time I used this command my fps dropped and it became unplayable. After disabling stat engine my fps went back to normal. I can't explain that...
    I can..

    If you turn that on, it works like the trace output of an application you're debugging. It gives info about many things on each cycle, so the performance might drop because, say, it might calculate some statistic on every pixel relocation.



  • replied
    Originally posted by gargorias View Post
    I answered your first question above; I honestly see no difference.
    I have never tested this game with my CPU at stock speed, but if you want me to, I can do it and post some FPS comparisons!
    Yes please, that would be cool. Then I'll know whether my CPU or GPU is bottlenecking my system.



  • replied
    Well... gargorias, can you measure your input lag with and without vsync via the "stat engine" command?
    I tried it yesterday, but strangely, every time I used this command my fps dropped and it became unplayable. After disabling stat engine my fps went back to normal. I can't explain that...

    But I was reading this thread and asked myself: is mouse lag the same as the input latency listed in "stat engine"? If it is, why don't you guys just measure it with and without vsync and compare?
    I don't think vsync causes input lag when the input latency value is lower than one frame time [for 60 fps that's 1000/60 ≈ 16.6 ms].
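
    For what it's worth, the arithmetic behind that 16.6 ms figure is just 1000 / fps. A tiny sketch, with a made-up latency reading rather than an actual stat engine value:

    Code:
    // Convert a frame rate into a per-frame time budget and compare it with a
    // measured input latency. The latency value here is a made-up example.
    #include <cstdio>
    
    int main() {
        double fps = 60.0;
        double frame_time_ms = 1000.0 / fps;   // 1000 / 60 = ~16.67 ms per frame
        double input_latency_ms = 12.0;        // hypothetical reading
    
        std::printf("frame time at %.0f fps: %.2f ms\n", fps, frame_time_ms);
        std::printf("measured latency %s within one frame\n",
                    input_latency_ms < frame_time_ms ? "fits" : "does not fit");
        return 0;
    }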



  • replied
    Okay, so it would appear I was wrong; my apologies.
    I was getting the D3DPRESENT types confused. So it would seem that the GPU does stop rendering to the pipeline while it is waiting for the vertical retrace.
    I'm not sure how this would cause input lag, because if a scene render were skipped after the user input was processed, I would assume that the screen would lag or jerk, not the mouse.
    Oh well, I'm not going to comment on this anymore; suffice to say it is all quite confusing at the programming level.
    I just need to say that I don't suffer from any input lag that I notice, but I do notice screen lag if my FPS is too low!
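
    For anyone curious what those D3DPRESENT types actually control: in Direct3D 9 the vsync choice comes down to the presentation interval. This is only a fragment to show the flag (the MakePresentParams helper is made up, window and device creation are omitted, and it is not UT3's code):

    Code:
    // Fragment only: in D3D9 the vsync switch is the presentation interval
    // set in D3DPRESENT_PARAMETERS before creating or resetting the device.
    #include <windows.h>
    #include <d3d9.h>
    
    D3DPRESENT_PARAMETERS MakePresentParams(HWND hwnd, bool vsync) {
        D3DPRESENT_PARAMETERS pp = {};
        pp.Windowed         = TRUE;
        pp.hDeviceWindow    = hwnd;
        pp.SwapEffect       = D3DSWAPEFFECT_DISCARD;
        pp.BackBufferFormat = D3DFMT_UNKNOWN;   // use the current display format
        // Wait for the vertical retrace, or present immediately (tearing allowed):
        pp.PresentationInterval = vsync ? D3DPRESENT_INTERVAL_ONE
                                        : D3DPRESENT_INTERVAL_IMMEDIATE;
        return pp;
    }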



  • replied
    Originally posted by shombowhore View Post
    Gargorias, you obviously did not finish reading that thread. There were incorrect statements in his original post that were corrected by other users. I probably should have made that clear when I quoted it. However, that does not change the fact that vsync causes input lag.

    Here is the main correction:



    Please note that I never said vsync always causes constant input lag on every system. I simply said that it causes it. If you want to be specific, vsync causes variable amounts of sometimes intermittent input lag on a high percentage of systems. And the tweak that I suggested does get rid of it.

    Yes, the amount of input lag is also tied to the quality of your monitor. But that last suggestion to keep your FPS above your refresh rate in order to reduce input lag is just plain wrong. Before you try to argue this again, please see this thread where we went over the very same discussion!

    Here's how it ended:
    This is a bit hard to explain and I may very well be wrong, but this is what I know:

    The GPU doesn't stop processing frames just because vsync is on.
    Frames get dropped at the front-buffer-to-screen level, not at the rendering level (not sure how that would even be possible), and believe it or not this happens anyway, with or without vsync.
    All vsync does is stop that one extra frame from being sent to the screen while the current frame buffer is being drawn on the screen - read this; you may go back to the guy at Anandtech and ask him if he is a D3D programmer.

    With vsync off the GPU will send a frame to the screen while it is still in the vertical blanking interval or wait state, but no more than one - more than that is not possible!

    You may think you're getting 125 FPS, but if your monitor's refresh rate is, say, 60, then that is all you are going to see anyway: 60 frames per second, with or without vsync.

    Barring that one extra frame, which just causes screen tearing on an LCD, all other frames above your monitor's refresh rate are not and cannot be sent to the screen, as your monitor won't accept them - they get discarded.

    Perhaps vsync will have an undesirable effect if your FPS drops below your monitor's refresh rate.
    It would most likely stick at that slower FPS while it renders both the front and back buffer through to the screen at the slower speed, even if the second frame was rendered into the back buffer at a higher speed.

    Other than that, I'm not quite sure how vsync could affect user input. 60 FPS is plenty for smooth gaming and processing of user input, although that would of course depend on the game and how it is programmed.

    Cheers.
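
    That 125-vs-60 point is easy to sanity-check with a toy model (C++, invented numbers; it just counts which rendered frames a 60 Hz display could ever pick up in one second):

    Code:
    // Toy model: no matter how many frames per second are rendered, the display
    // refreshes 60 times a second and each refresh can only pick up the newest
    // frame (or, without vsync, a torn mix). Numbers are invented.
    #include <cmath>
    #include <cstdio>
    
    int main() {
        const double refresh_hz = 60.0;    // monitor refresh rate
        const double render_fps = 125.0;   // what the in-game fps counter claims
    
        int displayed = 0, overwritten = 0;
        for (int r = 1; r <= static_cast<int>(refresh_hz); ++r) {
            double start = (r - 1) / refresh_hz;   // this refresh interval, in seconds
            double end   = r / refresh_hz;
            // frames that finished rendering during this interval
            int finished = static_cast<int>(std::floor(end * render_fps))
                         - static_cast<int>(std::floor(start * render_fps));
            if (finished > 0) {
                displayed   += 1;               // only the newest one reaches the screen
                overwritten += finished - 1;    // the rest never make it to a refresh
            }
        }
        std::printf("rendered %d frames, displayed %d, overwritten before display %d\n",
                    static_cast<int>(render_fps), displayed, overwritten);
        return 0;
    }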



  • replied
    I have a feeling that certain people in this thread cannot aim well enough to notice slight amounts of delay.

    Even the input lag I got by unticking said box was barely noticeable - I even attributed it to the poor netcode changes in Patch 2 at first. It's there, but to the majority of players it should be all but unnoticeable.

    Also, if you play with vsync on, you sure as hell aren't playing at a very high level of skill.

