
fps and you; a story how 60fps changed my ut life lol



    so the game was running at 40 fps.. i sucked.. wasn't able to get in the top 3 ever!!! i sucked..

    But now the game runs smooth at 60fps on good settings and a good res since i tweaked a bunch of things on my pc.. and d-d-daaamn im back in the game baby!!!! i got first place in deathmatch a few times already and i move like a fly

    so get good fps ftw!



      wow.. ok, a lot of people for sure don't know this, but read it, it helps.... instead of *****ing, read and learn

      V-SYNC (Taken from )

      Vertical Synchronization, also called Vertical Sync, or simply VSync for short, is primarily required because of the physical limitations of CRT monitors as discussed in the Refresh Rate section. A CRT monitor has to constantly light up the phosphors on the screen many times per second to maintain an image, and can only do this a certain number of times per second based on how fast the electron gun in the monitor can move. Each time it has to redraw the entire screen again, it moves the electron gun inside the monitor from the bottom of the screen to point to the top left of the screen, ready to 'repaint' all the lines on the screen from top left to bottom right, and back again for the next refresh. The period during which the electron gun moves to the top of the screen for a new refresh is called the Vertical Blanking Interval (VBI).

      Enabling VSync tells your graphics card to synchronize its actions with your monitor. That means the graphics card is only allowed to swap its frame buffer and send a new frame to the monitor when the monitor says it is ready to repaint a new screen - i.e. during the VBI. Your graphics card and monitor do not have to be in sync; they can still operate properly when VSync is disabled, however when VSync is disabled, you can experience a phenomenon called Tearing in periods when your graphics card and monitor go out of sync, precisely because the graphics card and monitor are acting without regard for each other's limitations.


      It is an unfortunate fact of computer graphics that if you disable VSync, your graphics card and monitor will go out of sync. Whenever your FPS exceeds the refresh rate (e.g. 120 FPS on a 60Hz screen), or in general at any point during which your graphics card is working faster than your monitor, the graphics card produces more frames in the frame buffer than the monitor can actually display at any one time, so the end result is that when the monitor goes to get a new frame from the primary buffer of the graphics card during VBI, the frame may be made up of two or more different frames overlapping each other. This results in the onscreen image appearing to be slightly out of alignment or 'torn' in parts whenever there is any movement - and thus it is referred to as Tearing. An example of this is provided in the simulated screenshot below. Look closely at the urinals and the sink - portions of them are out of alignment due to tearing:
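      The timing relationship above can be sketched with a toy model (nothing here touches a real graphics API; the function name and the numbers are purely illustrative). Treat each monitor refresh as a time window, and call a refresh "torn" whenever the graphics card swaps its front buffer somewhere strictly inside that window:

```python
# Toy timing model of tearing with VSync disabled: the monitor scans out
# one frame per refresh window, and any buffer swap that lands strictly
# inside a window splits that refresh across two rendered frames.

def count_torn_refreshes(fps, refresh_hz, phase=0.0, seconds=1.0):
    """Count refresh windows that contain at least one buffer swap."""
    swap_times = [i / fps + phase for i in range(int(seconds * fps))]
    torn = 0
    for k in range(int(seconds * refresh_hz)):
        start, end = k / refresh_hz, (k + 1) / refresh_hz
        if any(start < t < end for t in swap_times):
            torn += 1
    return torn

# 120 FPS on a 60Hz screen: every refresh catches a mid-scan swap.
print(count_torn_refreshes(fps=120, refresh_hz=60))   # 60 of 60 torn
# Even at exactly 60 FPS, a slight phase drift still tears every refresh.
print(count_torn_refreshes(fps=60, refresh_hz=60, phase=0.001))   # also 60
```

      The second call is the point made later in the thread: matching FPS to the refresh rate doesn't prevent tearing by itself, because without VSync the two clocks have no fixed phase relationship.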

      [Simulated screenshot showing tearing]

      The precise visual impact of tearing differs depending on just how much your graphics card and monitor go out of sync, but usually the higher your FPS and/or the faster your movements are in a game, such as rapidly turning around, the more noticeable it becomes. This is because the contents of the overlapping portions of new and old frames are more noticeably different from each other in such cases.

      Tearing does absolutely no damage to your graphics card or monitor. It just highlights the physical limitation of your monitor in keeping up with the graphics card when the two aren't synchronized. In the example of 120FPS on a 60Hz monitor, at most only 60 whole frames can actually be refreshed during any one second by your monitor, so the other 60 frames your graphics card is producing are pretty much being wasted and are coming out as lots of partially overlapping frames and hence only contribute to tearing. So even if you don't want to enable VSync, it makes sense for you to raise your in-game graphics settings to reduce your FPS such that it stays closer to your refresh rate. This will help you get more whole frames and thus reduce tearing. It may seem cool to have a very high framerate, but as you can see it is wasteful and only causes graphical glitches when VSync is disabled.

      Note that tearing is found equally on CRT or LCD monitors, since both work on the same basis for compatibility purposes - see the Response Time section for details of why an LCD monitor would behave the same as a CRT monitor in this respect.

      FPS & VSync

      When VSync is disabled, your FPS and refresh rate have no relationship to each other as such. This lets your graphics card work as fast as it wants, sending frames to the monitor as fast as it can draw them. Whether the monitor can actually show all these frames properly or not is another matter, which we've already discussed above. Clearly if disabling VSync can cause graphical glitches, however minor they may be, wouldn't it make sense to always enable VSync so that your graphics card doesn't wind up wasting its efforts only to generate more tearing? Well once again, things are not as simple as that.

      When VSync is enabled, what happens is that your graphics card is told to wait for your monitor to signal when it's ready for a new frame before supplying a single whole frame, each and every time. It can't race ahead, it can't just pump out lots of partially completed frames over old ones whenever it's ready - it has to provide a single whole frame to the monitor whenever the monitor says it's ready to refresh itself during VBI. The first noticeable impact is that your FPS becomes capped at a maximum equal to your current refresh rate. So if your refresh rate is 60Hz for example, your framerate can now only reach a maximum of 60FPS. By itself this isn't really a problem, since every monitor can do at least a 60Hz refresh rate at any resolution, and as we've discussed under the Frames Per Second section, if your system can produce 60FPS consistently in a game this should be more than enough FPS to provide smooth natural motion for virtually any type of game.

      There is however a more fundamental problem with enabling VSync, and that is it can significantly reduce your overall framerate, often dropping your FPS to exactly 50% of the refresh rate. This is a difficult concept to explain, but it just has to do with timing. As we know, when VSync is enabled, your graphics card pretty much becomes a slave to your monitor. If at any time your FPS falls just below your refresh rate, each frame starts taking your graphics card longer to draw than the time it takes for your monitor to refresh itself. So every 2nd refresh, your graphics card just misses completing a new whole frame in time. This means that both its primary and secondary frame buffers are filled, it has nowhere to put any new information, so it has to sit idle and wait for the next refresh to come around before it can unload its recently completed frame, and start work on a new one in the newly cleared secondary buffer. This results in exactly half the framerate of the refresh rate whenever your FPS falls below the refresh rate.

      As long as your graphics card can always render a frame faster than your monitor can refresh itself, enabling VSync will not reduce your average framerate. All that will happen is that your FPS will be capped to a maximum equivalent to the refresh rate. But since most monitors refresh at 60Hz or above, and in most recent games it is difficult to achieve 60FPS consistently at your desired resolution and settings, enabling VSync usually ends up reducing your FPS. Fortunately, because this problem is pretty much caused by the frame buffers becoming filled up, there is a solution and that's to enable a third frame buffer to allow more headroom. However this is not a straightforward solution, and to read more about this see the Triple Buffering section.
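      The half-rate behaviour (and the way a third buffer fixes it) can be sketched with a similar toy model; again, the function and its numbers are illustrative, not real driver behaviour. A frame that finishes rendering can only reach the screen at the next vertical blank; with two buffers the card then stalls until that blank, while with three it starts the next frame immediately:

```python
import math

def effective_fps(frame_ms, refresh_hz, buffers=2, seconds=10.0):
    """Toy swap-chain model: frames only reach the screen at a vertical
    blank; buffers=2 stalls the GPU until the swap, buffers=3 does not."""
    interval = 1000.0 / refresh_hz            # ms between vertical blanks
    t, last_vbi, shown = 0.0, -1.0, 0
    while t + frame_ms < seconds * 1000.0:
        done = t + frame_ms                   # render of this frame finishes
        vbi = math.ceil(done / interval) * interval   # next vertical blank
        if vbi > last_vbi:                    # a new frame hits the screen
            shown += 1
            last_vbi = vbi
        # Double buffering: the back buffer is only freed at the swap, so
        # the GPU idles until then. Triple buffering: render straight into
        # the spare buffer instead of waiting.
        t = vbi if buffers == 2 else done
    return shown / seconds

print(effective_fps(frame_ms=15.0, refresh_hz=60))              # ~60: capped at refresh
print(effective_fps(frame_ms=17.0, refresh_hz=60))              # ~30: just misses each blank
print(effective_fps(frame_ms=17.0, refresh_hz=60, buffers=3))   # ~58: third buffer recovers it
```

      Note how small the difference is between the first two cases: a 15ms frame makes every blank and runs at the full 60FPS, while a 17ms frame misses every second blank and drops straight to 30FPS - there is nothing in between with only two buffers.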

      So Which is Best, VSync On or Off?

      VSync poses a real dilemma for many people: with VSync off, tearing can occur whenever your graphics card and monitor go out of sync, and this can be very annoying for some people, especially in fast motion games. However with VSync on, your FPS can often fall by up to 50%. This can be resolved on many systems using Triple Buffering, but that also brings with it a range of possible problems. So which choice is right for you?

      Well clearly I can't give you a one-size-fits-all answer, but I can provide some suggestions. To start with, I strongly recommend setting VSync to 'Application Preference' (or similar) in your graphics card's control panel. This is because ideally you should set your VSync preference on a game-by-game basis, preferably using the in-game settings, as the choice will differ depending on the type of game you are playing. Newer games with complex graphics for example will be different to older games which your system can run much more easily. Remember, in games where your FPS is consistently above your refresh rate, enabling VSync is perfectly fine and results in no drop in FPS.

      In general, I recommend starting off with VSync disabled in any game as this is the most trouble-free method of gaining the fastest possible performance. This is the simplest solution, and on monitors which have lower refresh rates, or for games in which your framerate is not very high, this appears to be the best solution. You may notice some tearing, but this will generally be minimal if your FPS remains below your refresh rate anyway. Remember though that even if your FPS matches your refresh rate exactly, or is even below it, whenever VSync is disabled the graphics card and monitor are not strictly in sync, and tearing (however minor) can occur at any time.

      In any game if you find tearing annoying, you should enable VSync. If you find your FPS has halved, you should then specifically try enabling Triple Buffering, as this can help fix the FPS drops related to enabling VSync, but it introduces the possibility of hitching on graphics cards with less VRAM, and possible control lag on some systems. See the Triple Buffering section for details.

      There is no clear choice for everyone when it comes to VSync, and this is why the option to enable or disable VSync exists both in the graphics card control panel and in games. As long as you understand what it does however, you can make an educated choice to suit your hardware and tastes.


        Indeed. Fps is a critical factor up until a certain point. You reach the point of diminishing returns once you get over about 60 fps. In other words, you will see progressively smaller gains in advantage the higher your fps goes.

        When you bring down the console and type "stat fps", the game will display a color-coded fps number. Green numbers essentially mean you are good to play the game. Yellow indicates you are at a disadvantage, and at red you might as well not play, because not only does the game look and play terrible at that rate, it is near impossible to have any kind of consistency.

        And a little tidbit I'll throw in for free... if your ping goes over, say, 60, it is time to start leaning towards your splash damage weapons.


          My 22'' 1680x1050 LCD screen is 60Hz; anything more than that just adds tearing. VSync and the smooth-frames option are both nice imo. I understand that CRT gamers don't want 'em though.


            Can someone explain about TFT screens? Do they have that same refresh rate of 60Hz?
            My TFT monitor has "an update of 8ms"... how does that correspond to the Hz of a CRT screen?


              It doesn't.


                well, what does vsync do then with a TFT? :s


                  It locks the FPS of the game to the refresh rate of your screen (60Hz = 60 FPS); the ms rating of the screen has nothing to do with it.


                    hey ozz, awesome post. i don't know if you typed that all yourself or not, but very informative. thanks


                      Have to agree with exil3 and :rolleyes. If you sucked at 40fps you will suck the same at 60 - fact. Now if you were playing at 20fps or less I would agree; you'd undoubtedly be playing much better at 60fps. I play with my frames locked at 30 (for the time being), and that rate has nothing to do with my suckage.


                        noooo it's not the same.. don't ask me why.. just that at 40 the mouse lag wasn't the same as at a constant 60.. the mouse response is better


                          I can only play the game smooth with Vsync enabled. I got an old 1280x1024 monitor (u know, a huge box ) and an 8800GTS video card; runs PERFECT


                            PeterS_01: is that a 1280x1024 or 1280x960 monitor?


                              i think he means the 1024 big boxy ones.