what graphics card nets 60 frames on max settings?
Originally posted by chrisr1984:
What graphics cards net a perfect 60 fps on max settings?
4333 frames collected over 59.55 seconds, disregarding 0.00 seconds for a 72.76 FPS average, 94.73 percent of time spent > 30 FPS
Average GPU frame time: 0.00 ms
Memory capacity is especially important for any machine running Windows Vista with SP1
-
Originally posted by AnubanUT2:
Is the 4870 now ATI's best GPU, and is it really better than Nvidia for UT3? Also, even though the 9800GX2 has 1GB, isn't only 512MB of it actually usable, like on the 7950GX2? And finally, I have two 8800GT 512 cards in SLI; I run the game on max settings at 1680x1050 and get consistent framerates of 62 fps on the DM and CTF maps and in the 50s on all the Warfare and VCTF levels. I am also thinking of upgrading my cards soon, but there are just so many these days that I don't have a solid idea. I would like the new GTX280, but I hear that the 9800GX2 is actually more powerful for UT3 and other UE3-powered games. What's the real deal, GPU experts? Is ATI back on top after all these years?
another random UT3 benchmark...
All settings maxed, DX10, 16XAF, 1680*1050, 10 active bots
On VCTF-War
Dumping FPS chart at 2008.07.17-15.26.49 using build 3543 built from changelist 215808
4891 frames collected over 58.57 seconds, disregarding 0.00 seconds for a 83.51 FPS average, 97.98 percent of time spent > 30 FPS
Average GPU frame time: 0.00 ms
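For anyone curious how that average is derived, it's just frames divided by (capture time minus the disregarded time). A minimal sketch that parses a dump line like the one above and recomputes the figure (the regex assumes the exact wording the tool prints, as shown in this thread):

```python
# Recompute the average FPS from a UT3 FPS-chart dump line to sanity-check a run.
import re

line = ("4891 frames collected over 58.57 seconds, disregarding 0.00 seconds "
        "for a 83.51 FPS average, 97.98 percent of time spent > 30 FPS")

m = re.match(
    r"(?P<frames>\d+) frames collected over (?P<secs>[\d.]+) seconds, "
    r"disregarding (?P<skip>[\d.]+) seconds", line)
frames = int(m.group("frames"))
effective_secs = float(m.group("secs")) - float(m.group("skip"))
avg_fps = frames / effective_secs
print(f"{avg_fps:.2f}")  # matches the 83.51 the tool reports
```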
-
There is no DX10 code in UT3, so it doesn't matter if you have Vista and a DX10 GPU; UT3 cannot take advantage of it. Unlike Gears of War PC, there is no AA with DX10 for UT3, since there is no DX10 code. Thanks for letting me know ATI is back, but I will probably still stick with Nvidia when I upgrade. I have had fewer problems with their drivers.
-
Originally posted by AnubanUT2:
There is no DX10 code in UT3, so it doesn't matter if you have Vista and a DX10 GPU; UT3 cannot take advantage of it. Unlike Gears of War PC, there is no AA with DX10 for UT3, since there is no DX10 code. Thanks for letting me know ATI is back, but I will probably still stick with Nvidia when I upgrade. I have had fewer problems with their drivers.
I completely understand, and I commend you for going with what has worked for you. However, I am using the UT3 benchmarking tool from Guru3D, and in FACT there is a tick box for DirectX 10. So hmmm, I wonder why they would include such a specific option in a tool for a specific program if it really had no use and was a waste of even simple dev coding.

Also, I was always a hardcore Nvidia fan, and in many ways still am, but I cannot deny facts when they stare me in the face: this new ATI card is definitely the cream of the crop, for less than I paid for my EVGA 8800GT OC at its launch. As for driver problems, I must admit I haven't had any so far, and that goes for Nvidia as well. My lowest score in Vista Ultimate is a 5.7, for my HDD. If you would like a screen capture of the benchmarking tool with the DX10 options, just PM me, or go download it and run it for yourself.

If Nvidia were to give me the same deal as ATI has with its HD4850, I would say the same about that card. This thing has so far run smoking circles around my previous 8800GT OC, so from this card forward I will NOT count ATI out automatically, regardless of my driver experience. I have been an Unreal player for longer than you've been a member of this forum, so I respect your opinion as well as my own on UT3, because we are both qualified. But if you were to jump on the $189.00 HD4850 wagon and found yourself impressed just like I did, you would trust my opinion next time I give hardware advice. I do not waste my words, so I never appreciate it when others take no value in them.

Please, please Google "DirectX 10 usage in UT3" before you throw unproven claims at me. DX10 is supported in UT3, but not a required runtime, because 9.0c was the most proven at the time of development.
-
Originally posted by cq842000:
Before my response comes off as standoffish or offensive to you, please grant me some understanding on your part, and understand that I duly respect you as an adult as well as a gamer, in a time when consoles (yes, I have a PS3) are robbing my favorite platform of great future experiences and perceived quality. I respect those that hold true to gaming's real roots (for me it was a 486DX with Windows 3.0 and 256 mb ram), lol. So please read my post, and hopefully you take something from a fellow gamer.
[...]

I agree about the drivers, but if ATI can step up their driver quality, then I'll be rooting for them this round. I can't imagine the beating Nvidia will take if ATI gets raytracing working on their cards--something even the highest-end Nvidia cards do not have the raw power to do efficiently.
-
Originally posted by D-Hunter:
I agree about the drivers, but if ATI can step up their driver quality, then I'll be rooting for them this round. I can't imagine the beating Nvidia will take if ATI gets raytracing working on their cards--something even the highest-end Nvidia cards do not have the raw power to do efficiently.
Maybe you should do some more research. He was completely correct: UT3 does not utilize DX10. All that tick box does is change the value of bAllowD3D10, which currently does nothing. The tool is not a masterpiece of coding by any means; all it does is launch the game with certain settings. I doubt it took much 'effort' to add the tick box, as they probably assumed the option would become functional with a future patch.
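In other words, all the benchmark tool's DX10 tick box can do is flip that one value in the engine's ini before launch. A sketch of what that looks like on disk (the section name below is an assumption; search your own UTEngine.ini for the key itself, since only the key name bAllowD3D10 is confirmed in this thread):

```ini
; UTEngine.ini -- section name is an assumption, the key is what the tool toggles
[SystemSettings]
bAllowD3D10=True   ; flipped by the DX10 tick box; currently a no-op in UT3
```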
-
Well, if the DX10 codepath is slower than DX9, maybe Epic just *decided* not to use it.
[shot]http://enthusiast.hardocp.com/images/articles/1193997119cPiHdMtE6i_4_2.gif[/shot]
DirectX isn't direct at all; it's a **** poor, bloated API that eats a lot of CPU. I remember that when I played UT2004 on an Athlon XP + Radeon 9700, I used OpenGL and the old-skool Catalyst drivers. You bet it humiliated DX.
-
Originally posted by i_hax:
ROFL. An 8800GT can get far more than 60fps @ 1280x1024. No problem, and no 'memory goes into pagefile' (not a clue what you're talking about there...)
Another one with no clue what he's talking about...
If you monitor your GPU RAM usage in UT3, you will find it uses far less than most games. At 1440x900 on the lowest settings it uses a mere 70MB; at higher settings it uses a few hundred at most. Only 1920x1200 would come close to utilizing 1GB of GDDR.
However, an 8800GT with 1GB is pointless because its memory bandwidth is too low to make use of it: a 256-bit bus with GDDR3 is useless there. This is why the new 2__ series uses a wider bus to increase memory bandwidth, while ATI's cards use a 256-bit bus with GDDR5 to compensate (a cost-friendly solution).
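A rough back-of-the-envelope for the bus-width vs. memory-type trade-off: peak bandwidth is bus width (in bytes) times the effective data rate, and GDDR5 moves four transfers per memory clock where GDDR3 moves two. The clock figures below are illustrative assumptions, not the exact specs of any particular card:

```python
# Peak memory bandwidth sketch: bytes/s = (bus width in bytes) * effective data rate.
# Clock numbers are illustrative assumptions, not exact specs for any card.

def bandwidth_gb_s(bus_bits: int, mem_clock_mhz: float, transfers_per_clock: int) -> float:
    """Peak memory bandwidth in GB/s (1 GB = 1e9 bytes)."""
    return (bus_bits / 8) * (mem_clock_mhz * 1e6) * transfers_per_clock / 1e9

gddr3_256 = bandwidth_gb_s(256, 1000, 2)  # 256-bit GDDR3, double data rate
gddr5_256 = bandwidth_gb_s(256, 900, 4)   # 256-bit GDDR5, quad data rate
gddr3_512 = bandwidth_gb_s(512, 1100, 2)  # wider 512-bit bus with GDDR3

print(f"256-bit GDDR3: {gddr3_256:.1f} GB/s")  # 64.0 GB/s
print(f"256-bit GDDR5: {gddr5_256:.1f} GB/s")  # 115.2 GB/s
print(f"512-bit GDDR3: {gddr3_512:.1f} GB/s")  # 140.8 GB/s
```

On these assumed clocks, GDDR5 on the same 256-bit bus lands within striking distance of a 512-bit GDDR3 bus, which is the cost argument made above.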