Since the last version of this thread got closed because of flaming in a slightly off-topic discussion (let's be nice), I thought I'd give it another shot, because there is more evidence that might support the 8800GT 256MB as the better buy in terms of dollar/fps for UT3.
There is a review at Expreview.com of the 8800GT 256MB which shows test data for UT3 and Bioshock. Note that the 8800GT 512MB shows little--if any--discernible gain over the 8800GT 256MB in terms of FPS in UT3 at 1280x1024, 1680x1050, and 1920x1200. Only in the Bioshock test at 1920x1200 does the 256MB version of the card start to fall behind because of its smaller memory.
Furthermore, while not a test of the 8800GT, [H]Enthusiast found that additional memory in the different 8800GTS versions and the HD 3850 made only a slight difference at 1920x1200 when AA was not enabled in UT3.
Then there is another Bioshock test in which the EVGA 8800GT 256MB beats out the reference-board EVGA version of the 8800GT 512MB, probably due to the higher core clock of the 256MB version (650MHz).
So it sounds like the additional memory is not providing much help with texture processing until very high resolutions. Given that UT3 appears to be CPU-bound in some spots on some maps (see this lengthy thread), if you're building a new system and your monitor resolution is 1680x1050 or less, the better buy in terms of performance is likely the 8800GT 256MB, with the extra money put into the CPU to raise low-end FPS, rather than the 512MB version paired with a slower CPU. And certainly, the dollar/fps value is not there in the 512MB version. That is, unless one is looking for bragging rights: "My GPU is bigger than your GPU."
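For anyone who wants to run the dollar/fps comparison on their own numbers, it boils down to a one-line division. The prices and frame rates below are made-up placeholders to show the idea, not figures from any of the reviews above:

```python
# Sketch of the dollar/fps comparison. All numbers here are
# hypothetical placeholders, NOT taken from the linked reviews.
def fps_per_dollar(avg_fps, price_usd):
    """Frames per second delivered per dollar spent on the card."""
    return avg_fps / price_usd

cards = {
    "8800GT 256MB": fps_per_dollar(avg_fps=75.0, price_usd=200.0),
    "8800GT 512MB": fps_per_dollar(avg_fps=78.0, price_usd=260.0),
}
# Print the best value first.
for name, value in sorted(cards.items(), key=lambda kv: -kv[1]):
    print(f"{name}: {value:.3f} fps/$")
```

With numbers anywhere near these, a small FPS gain at a big price premium drags the 512MB card's fps/$ well below the 256MB card's.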
Does this sound right?
Sidenote: this data would also suggest that Gamespot did not make a typo in their hardware review of UT3 regarding the 8600GTS. 512MB over 256MB is likely not offering the performance gains that people are assuming except at very, very high res. It's just a good marketing strategy on ATI and Nvidia's part for selling a more expensive card.