
What graphics card nets 60 frames on max settings?


    One last Bench at max settings (equivalent of a 6 in settings) with custom @ 1680*1050
    5867 frames collected over 59.14 seconds, disregarding 0.01 seconds for a 99.21 FPS average, 99.23 percent of time spent > 30 FPS
    Average GPU frame time: 0.00 ms
    on CTF-OmicronDawn with 16 active bots
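    The average in the line above is just total frames divided by elapsed seconds; a quick sketch of that arithmetic, using the numbers from this bench run:

```python
# Average FPS for a benchmark run: total frames / elapsed seconds.
frames = 5867
seconds = 59.14

avg_fps = frames / seconds
print(f"{avg_fps:.2f} FPS average")  # matches the reported 99.21
```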

    Just hit 102.56 FPS average, 99.60 percent of time above 30 FPS, after defragmenting. I really need a better hard drive. Any suggestions in the 500GB range?


      Originally posted by cq842000
      If you are the same Anuban that provided maps cooked for PS3, THANK YOU very much for your time and effort.
      Yep, I am him, and it's true what I said regarding DX10. The tick box does nothing atm ... it's there for the future. Epic has said that while they support DX10/Vista, in the sense that UT3 runs on that platform with no issues, there is no native DX10 code present in UT3 at this time. They may add some in a future patch and have left that open for now. With that code, though, FSAA support would be native, as it is in Gears of War if you run it on Vista with an appropriate GPU.


        On my new rig with a GTX 280 and Core 2 Quad Q9450 I'm getting 60fps+ at 1920x1200 with everything maxed.



          UE3 fully supports DirectX 10. Read the first sentence here.

          DX10 doesn't add anything fundamental over DX9, so you are not going to magically get a 50% higher framerate; a bigger version number doesn't mean way faster, and enabling DX10 won't instantly make your display photo-realistic. If that is what you think it does, you need a reality check.
          You also need a DX10 video card to use the new features, so much of the end result depends on the hardware.

          IMHO MS should have called it 9.5, but that is their marketing choice.

          UE3 is used for all of Epic's current games; they are not going to create two different render paths, and they aren't going to make a separate renderer just for UT3. The number of features utilized may vary between titles, but that doesn't detract from the fact that the engine supports DX10.

          FiringSquad has a pretty good overview on DX10 in gaming.


          Stating that UT3 only uses 600MB is silly. The amount of RAM required depends on a number of factors, including the currently loaded content.

          Any system running a newer-generation game like UT3 should have more than 2GB of RAM; 2GB is the absolute minimum these days. An average base XP install with minimal background apps running will eat 0.5GB by itself. Most people also have their paging settings wrong.

          Anyone with 2GB or less should upgrade their RAM. You also want 800 speed minimum, 1066 or higher if you have the cash or like to overclock.

          The amount of Video Adapter memory also changes depending on the number of assets used on the map. StaticMeshes, Textures, Lightmaps, Shaders, all require GPU resources. A heavy map can have 70MB of Lightmaps alone, which isn't streamed.
          Anything less than a 512MB card may have issues and the engine may have to start dropping things back.
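          A back-of-the-envelope way to think about that budget (the asset sizes here are made-up illustrations, not measured UT3 numbers):

```python
# Rough per-map video memory tally; all sizes in MB are hypothetical.
assets_mb = {
    "textures": 280,
    "static_meshes": 90,
    "lightmaps": 70,    # a heavy map's lightmaps, which aren't streamed
    "shaders": 40,
}

card_mb = 256
used_mb = sum(assets_mb.values())  # 480 MB total
if used_mb > card_mb:
    print(f"{used_mb} MB of assets won't fit in {card_mb} MB; the engine has to drop things back")
else:
    print(f"{used_mb} MB of assets fit in {card_mb} MB")
```

With these illustrative numbers a 256MB card overflows while a 512MB card would not, which is the point about sub-512MB cards having issues.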

          A Tim Sweeney interview regarding UE3/UT3 and DX10 and also Memory Use.

          NV vs ATI:

          The latest cards from ATI are nice and beat most NV setups.
          Even the older HD3000 series from ATI still has better image quality than any NV card.

          My system with HD3870 looks way better than the system with the NV8800GTS 512MB. The 3870 is also more stable. It has never failed under GoW or UT3, whereas the 8800 will randomly crash every now and then on UT3 in-game. The 8800 also has a foliage instancing bug (I don't think this is fixed yet).

          Both systems are close to the same hardware (on purpose). The main difference is the Dual versus Quad core CPUs. Same mobo, same RAM brand/type, same case, same monitor setup, etc.

          I plan on getting a 4870 soon to replace the 3870.


          FYI: the "fps" shown by "stat fps" is not the number of frames per second your monitor is displaying. It is an easier-to-understand conversion of the engine's frame time. So it doesn't mean exactly what you might think it does.

          Any decent current hardware should get 60+ "fps", but it depends on the game conditions and the chosen screen resolution, neither of which was specified in the first post's question. For all stock retail maps with just the base player count, at resolutions up to 1280x1024, most any dual-core CPU with an NV 8800-series or ATI HD3800-series card should get 60 "fps".

          My systems which are listed in my sig get constant 95% 60fps+ on "stat fpschart" and average between 80fps and 140fps on "stat fps", and some maps even higher. I'm running 1680x1050 on both, UT3 settings at maximum.

          As soon as you throw a bunch of vehicles and players into the mix though, your "fps" is going to drop. This is because the Frame Time has increased.
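          The conversion between engine frame time and the displayed "fps" is simple reciprocal math; a quick sketch:

```python
# "stat fps" derives frames per second from the engine frame time (ms):
#   fps = 1000 / frame_time_ms
def fps_from_frame_time(frame_time_ms: float) -> float:
    return 1000.0 / frame_time_ms

def frame_time_from_fps(fps: float) -> float:
    return 1000.0 / fps

print(round(fps_from_frame_time(16.0), 1))   # 62.5 fps at a 16 ms frame
print(round(frame_time_from_fps(120.0), 2))  # 8.33 ms per frame at 120 fps
```

So when vehicles and players push the frame time up, the displayed "fps" drops by exactly this relationship.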


            UT3 Benchmarks on ATI 4870 vs NVidia GTX280 plus others

            Here it is: UT3 benchmarks. ATI 4870, 4850, and 3870; NVidia GTX 260 and 280; and the 9800 and 8800 cards as well.



              Originally posted by DGUnreal
              DX10 doesn't add anything fundamental over DX9, so you are not going to magically get a 50% higher framerate; a bigger version number doesn't mean way faster, and enabling DX10 won't instantly make your display photo-realistic. If that is what you think it does, you need a reality check.
              DGUnreal, you crack me up, because that was exactly what M$ and their $heep were hyping.
              FiringSquad has a pretty good overview on DX10 in gaming.
              It really has
              First shot: DX9 is 3 times faster. Second: an 8800GTS-320 gives you 2.6 fps.


                If MS didn't say it was bigger and better, then who would buy into it?

                Personally I never read anything from MS that stated DX10 was supposed to be a quantum leap in performance over DX9. I may have just missed that though. And I take any other media hype as exactly that -- hype.

                However, on the "insides" in DX10 they added a bunch of API improvements and optimized some ways to do things. So specific functionality and effects are going to be faster than previous versions. For example, the cost of DrawCalls was reduced in DX10, if the game engine was able to utilize this well, it gave a performance boost. Same with State management, etc.
                IMHO what we should have seen from DX10 was better graphics and effects at a slight performance increase.
                Part of the problem, though, is that existing engines and some current engines do not fully or properly utilize all of the beneficial features (for various reasons). A lot of DX10 got really confusing in implementation (for me anyway), and if things are not done properly when switching from DX9 to DX10, you'll tank your DX10 engine performance. The same issue applies to the current DX10 video cards and their respective drivers. So what we end up seeing is mostly slower or same-speed games with slightly glitzier screenshots.

                I have no doubt that it will just be a bump in the road to better things, as occurs often when updating technologies. Same with Vista (which I actually happen to like, even though they messed up Windows Explorer badly).


                  Well, people can say what they want ... btw, that article with Sweeney is pretty old and I found it very funny ... what he said when asked when UT3 would ship was priceless: "When it's ready." Yeah, okay ... anyway, yes, it is true that UE3 is built to take advantage of DX10, but only if the game itself is going to use it. And currently there is NO DX10 code in UT3 that performs any DX10-only functions. Epic already showed in one application what having DX10 code can do ... Gears and its AA are supported only on a DX10 platform, so we can say that Gears takes direct benefit (tangible ones the user can see) ... UT3 will work fine on a DX10 platform, but it doesn't take advantage of any of the features of DX10. This can be seen in the fact that the game looks and functions identically on both DX9 and DX10 platforms. If you google for a list of games which have DX10 functionality, and not just "support", you won't find UT3 anywhere.

                  Edit: Also, I think the reason that functional DX10 code has been left out is that DX10 can't be implemented on the Sony PS3, AFAIK, and they probably wanted a more uniform code base so all three versions were pretty much identical, so they didn't make the PC version all that it could have been. I have other games with true DX10 functionality and you can definitely see the difference versus just DX9 code (WiC, Crysis, Lost Planet, even BioShock to an extent all show noticeable visual improvements when running on DX10).


                    The rig I have UT3 installed on is running XP Pro with 2 gigs of ram and UT3 runs great. That system uses about 260mb of ram when idle... depends how much **** you have running in the background obviously.


                      Anu, quote your sources please.

                      Trust me, I have Licensee access and full source: there is DX10 code for DX10 functions and SM4. I would expect additional implementations in future builds that improve and extend the DX10 supported feature list.
                      I agree, there are games/engines that have more DX10 support (and even other UE3-based games by other studios), but to say there is zero DX10 code is not true.

                      Gears/UT3 and AA is another topic. I could post a long blurb, but that won't help most people; suffice it to say that it has to do with access to the full-res and z-buffers. GoW used some tricks to get partial environment AA; it wasn't true full FSAA/MSAA.

                      The engine also includes a fully supported set of target-platform code for PC SM2/3/4, plus XB360 and PS3, so it already branches to various calls depending on the platform.


                        Ahh, it's not that big of a deal to me and I am not going to get into an argument with a friend over something this insignificant. If that's what you believe (or know, as you say) then fine ... leave it at that. When I see some real proof in-game then I will be convinced; otherwise, unless some Epic programmer responds on this thread with info, I am going to believe my eyes. So let's just move on ... no need to beat a dead horse. No one here is seeing any benefit whatsoever (unlike Gears) to running this on a DX10 platform vs DX9 (also unlike other games), so the issue imo is moot. Turning that option on (the DX10 one in the ini) does nothing for this game atm. Anyone who has some direct proof otherwise (not just technical articles), post it.

                        Edit: I think I did say that I know there is DX10 support in the UE3 engine, but the point is whether it is really being used in UT3 ... I don't think so, and all I have ever heard is that UT3 "supports" DX10, which is a lot different from "we actually utilize DX10 functionality to make this or that better" (again, as in other games with DX10 implementations).


                          GTX 280 gets 60 fps at 2560x1600


                            Anu, no argument here on not seeing a change between DX9 and DX10. Simply that there is DX10/SM4 code in the engine. There would still have to be more implementation in UE3 to match the other DX10 games.


                              Originally posted by DGUnreal
                              Anu, no argument here on not seeing a change between DX9 and DX10. Simply that there is DX10/SM4 code in the engine. There would still have to be more implementation in UE3 to match the other DX10 games.
                              I totally get what you are saying and you are correct ... I was mistaken in saying there is no DX10 code at all in UT3 ... it's there, since it's built into the core engine, but how well the application makes use of all that core functionality is what I think we are all talking about, and it appears that in UT3 it is minimal at best and non-existent at worst. Oh yeah, thanks for clarifying the issue of the code branching, which explains how this works on the PS3 without the need for the PS3 to explicitly support DX10 ... it all makes sense now, so thanks.


                                Originally posted by Phopojijo
                                You 'asked for performance help' {assuming 1984 and 84 are the same person}... gloated about your PC... then when someone had a better one you attacked him. Yes your PC is good... gloating a little is okay (that's why we have sigs) -- just don't attack people as a result of it.
                                lol ... Chris wasn't bragging. That other guy clearly posted it to brag about his system and it was really sad. +1 to his comments.