new Nvidia 9600GT slaughters the competition


    #31
    Originally posted by Benfica View Post
    With Portuguese prices, I would need to be on something strong to go for a 9600GT or an 8800gt

    HD3850-256: 100€
    8800gs-384: 113€
    HD3850-512: 121€
    HD3870-512: 145€
    9600gt-512: 153€
    8800gt-512: 172€
    HD3850-256 Crossfire: 200€

    For value, the best is the 8800gs, even more so because they can OC 25% to 30%. For performance, I would go for the HD3850-Crossfire
    High overclocks do not equal vastly higher performance... only slightly higher.

    Clocks are only one part of a card's performance... and a relatively small part, particularly with newer cards, where stream processors matter most.

    Comment


      #32
      Why should I get one when my 8800 GTS 640 mb laughs at UT3?

      Comment


        #33
        Originally posted by Bret Hart View Post
        Why should I get one when my 8800 GTS 640 mb laughs at UT3?
        My 8800GTS 640MB doesn't laugh. How do you make it do that?

        I wouldn't mind a 9800GTX if I was flush with cash, but not much point as I can play UT3 fine with what I have.

        Comment


          #34
          Originally posted by Harmatia View Post
          High overclocks do not equal vastly higher performance... only slightly higher.

          Clocks are only one part of a card's performance... and a relatively small part, particularly with newer cards, where stream processors matter most.
          Well, what usually matters is clock × number of units

          Specs:
          ATI: Shaders=(256simple+64*Complex)
          HD3850: Core=670, Memory=828, 53GB/s
          HD3870: Core=777, Memory=1126, 72GB/s


          8800gs-384: Core=550, Shaders=96x1375, Memory=800, 38.6GB/s, 12 Texture Units
          9600gt-512: Core=650, Shaders=64x1625, Memory=950, 60GB/s, 16 Texture Units
          8800gt-512: Core=600, Shaders=112x1500, Memory=900, 57.6GB/s, 16 Texture Units

          8800gs-384 OC,Core=680, Shaders=96x1728, Memory=1000, 48GB/s
          9600gt OC,Core=720, Shaders=64x1728, Memory=1000, 64GB/s
          8800gt OC, Core=680,Shaders=112x1728, Memory=1000, 64GB/s

          My point is that the 9600gt is already pushed to the limit, while the HD3850 and the 8800gs are not.
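          Benfica's "clock × number" reasoning can be checked with quick arithmetic. A minimal Python sketch, using shader count × shader clock as a rough throughput index (an approximation that ignores architectural differences between the cards):

```python
# Rough shader-throughput index: shader count x shader clock (MHz),
# stock and OC figures as quoted in the post above.
cards = {
    "8800gs": {"stock": (96, 1375), "oc": (96, 1728)},
    "9600gt": {"stock": (64, 1625), "oc": (64, 1728)},
    "8800gt": {"stock": (112, 1500), "oc": (112, 1728)},
}

for name, cfg in cards.items():
    stock = cfg["stock"][0] * cfg["stock"][1]
    oc = cfg["oc"][0] * cfg["oc"][1]
    gain = 100.0 * (oc - stock) / stock
    print(f"{name}: stock index={stock}, OC index={oc}, gain={gain:.1f}%")
```

          By this crude index the 9600gt gains only about 6% from its overclock while the 8800gs gains roughly 26%, which is the "already pushed to the limit" point in numbers.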

          Comment


            #35
            Originally posted by Benfica View Post
            Well, what usually matters is clock x number

            Specs:
            ATI: Shaders=(256simple+64*Complex)
            HD3850: Core=670, Memory=828, 53GB/s
            HD3870: Core=777, Memory=1126, 72GB/s


            8800gs-384: Core=550, Shaders=96x1375, Memory=800, 38.6GB/s, 12 Texture Units
            9600gt-512: Core=650, Shaders=64x1625, Memory=950, 60GB/s, 16 Texture Units
            8800gt-512: Core=600, Shaders=112x1500, Memory=900, 57.6GB/s, 16 Texture Units

            8800gs-384 OC,Core=680, Shaders=96x1728, Memory=1000, 48GB/s
            9600gt OC,Core=720, Shaders=64x1728, Memory=1000, 64GB/s
            8800gt OC, Core=680, Shaders=112x1728, Memory=1000, 64GB/s

            My point is that the 9600gt is already pushed to the limit, while the HD3850 and the 8800gs are not.

            The thing about the 8800gs compared to the 9600gt is that the 88GS has a 192-bit memory interface, while the 96gt has a 256-bit interface.

            The 8800gs is within the 3850 range, while the 9600gt is within the 3870 range.

            Also, take a look at this:
            http://www.xtremesystems.org/forums/...d.php?t=182272

            He got his 9600GT to a stable overclock of 821 MHz core, 2105 MHz shader, 1050 MHz memory (2100 MHz effective)
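            The 192-bit vs 256-bit point lines up with the bandwidth figures quoted earlier in the thread. A quick sketch of the standard formula (assuming GDDR3, i.e. two transfers per clock):

```python
# Peak memory bandwidth in GB/s:
#   (bus width in bits / 8 bytes) x memory clock (MHz) x 2 (DDR) / 1000
def bandwidth_gbs(bus_bits, mem_clock_mhz):
    return bus_bits / 8 * mem_clock_mhz * 2 / 1000

print(bandwidth_gbs(192, 800))   # 8800gs-384: 38.4 GB/s
print(bandwidth_gbs(256, 950))   # 9600gt-512: 60.8 GB/s
print(bandwidth_gbs(256, 900))   # 8800gt-512: 57.6 GB/s
```

            This reproduces the 8800gt's 57.6 GB/s exactly, and suggests the "38.6 GB/s" quoted for the 8800gs above is probably a typo for 38.4.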

            Comment


              #36
              Maybe in those OLD benchmarks the 9600gt barely beat my 3870, but now there are new patches for those games, and new drivers for the 3870 :P. My point is that my extremely overclocked 3870 + new drivers completely pounds the 9600gt into the ground. And not only that, but the 3870 is only $170 right now!

              Comment


                #37
                The 8800GS is slower than a 9600GT, and soon it will be renamed the 9600GSO, hopefully to drive up sales.
                http://en.expreview.com/2008/04/04/n...force-9600gso/

                Comment


                  #38
                  Originally posted by ]NIN[ View Post
                  My 8800GTS 640MB doesn't laugh. How do you make it do that?
                  Just tickle it.

                  Comment


                    #39
                    Originally posted by Bret Hart View Post
                    Just tickle it.
                    Meh. With OCs I have to run at 5/3 to get a reliable 60 FPS all the time. I can run 5/5 easily on some maps, 5/4 easily on most... others, 5/2 would probably be a little better. More often I'm hurt by my CPU tho.

                    Comment


                      #40
                      Originally posted by Vidiot View Post
                      Maybe in those OLD benchmarks the 9600gt barely beat my 3870, but now there are new patches for those games, and new drivers for the 3870 :P. My point is that my extremely overclocked 3870 + new drivers completely pounds the 9600gt into the ground. And not only that, but the 3870 is only $170 right now!
                      Exactly: your EXTREMELY OVERCLOCKED 3870. It doesn't even "completely pound" it, maybe just 10 or so FPS more at higher resolutions.

                      For the most part, the two cards are very equal. In some games and resolutions, the 9600 is superior, and in other games at other resolutions, the 3870 is superior.

                      Like you said, the 3870 is $170. Well, you can get a 9600GT for $150, and even around $130 (I bought mine for $129). The obvious choice would be the 9600GT as it's cheaper, and the two cards don't differ that much. If you're willing to spend $170 on a 3870.. well... you'd be better off just spending a tiny bit more and getting an 8800GT for $184
                      http://www.ncix.com/products/index.p...y%20Technology

                      Also, I'm not too familiar with ATI, but I hear their driver support isn't the greatest
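                      The value argument above is just arithmetic on the quoted prices. A quick Python sketch (prices in CAD as quoted in the post; the two cards' performance is assumed roughly equal, which is an assumption, not a benchmark):

```python
# Price comparison only, at the prices quoted in the post (CAD).
price_3870 = 170    # as quoted
price_9600gt = 130  # the poster's own purchase price
price_8800gt = 184  # the linked NCIX listing

premium = 100.0 * (price_3870 - price_9600gt) / price_9600gt
step_up = price_8800gt - price_3870
print(f"3870 costs {premium:.0f}% more than a 9600GT")
print(f"8800GT is only ${step_up} more than the 3870")
```

                      So at these prices the 3870 carries about a 31% premium over a 9600GT, and an 8800GT is only $14 beyond the 3870, which is the "spend a tiny bit more" point.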

                      Comment


                        #41
                        All IMHO, of course. Ok, I like decent kit, but I really don't care about gfx cards. You can prove me wrong on some of the points, it has typos, tons of "?", etc... I'm kinda tired and I'm not a specialist. I see a lot of BS on cards and gaming, so I had to rant


                        - I don't care about rocks, sucks, who beats who, or having a faster or slower card. What's the point of all this if my slower card gives an excellent 80 fps?


                        - If an OC'ed card that's considered crappy has 90 to 95% of the performance of another known sexy kickass card that owns any game thrown at it, how come the former sucks?


                        - The higher the res, the higher the AA some people use! Wtf? It's exactly the opposite, right? Sometimes a game is unplayable at 1680x1050 with 4xAA. What's the point of hogging the card with 4xAA, when sometimes it's impossible to see the jaggies with 2xAA and very hard even without it?


                        - Why feel bad that a game lags on settings that the guy doesn't use or care about? And what the hell do I care how a card runs game A or B, if I don't give a rat's about them and play UT3, where it is sick fast?


                        - People max out, whatever max is. I know 3 games where even the grass has shadows! On 1 of them it's "unplayable", so the res must come down to 1024x768 no-AA. That is not maxing out, it's pure ****. And they whine that card A and card B are not good enough for gaming. Aren't the devs allowed to create whatever improvements and a kickass engine? Should they be limited and careful not to make the highest settings a tad slow, just because a lot of guys who can only run on med-high, or highest except 2 settings, whine that the game is poorly optimized and sucks? Can't the players go to advanced settings and disable the most demanding feature? Are people like that?


                        - The reviewers don't have a clue that AF usually eats only 3% to 5% of the fps, so they test (no-AA, no-AF) or both together, neglecting (no-AA, 16xAF). Others compare ATI and NVidia cards by configuring the drivers to highest quality. What do they know about how demanding the lack of optimization is for different cards, and if they can tell the difference, why don't they post IQ screenshots? If most people use defaults, what's the use of benchmarks at different settings?



                        - The 9600gt, 8800gs, HD3850-512, HD3870 and the older 8800gts-320 are within 20% performance of each other. So:
                        * If people are OC'ers, why don't they take into account that overclocking changes the gaps between them?
                        * The 8800gts-320 is still a kickass card that a lot of people bought. Since it fits well in the previous group, does any of the others suck by magic?
                        * The other way around: how the hell is it "outdated" when it has the performance of others that are new and considered good?
                        * I hate NVidia with a passion because of this: they released the 8600gts costing only 40€ less than the 8800gts, because there wasn't much competition on DX10 IIRC. That had the psychological effect of making people consider the 8800 a sick deal and ruuuush for it. A card with only 320MB RAM?? And 100€ for an extra 320MB
                        * And the sexy names, who cares if the name 9600gt is sexier than the others?



                        - What's up with the 8800gt madness anyway? I'm tired of seeing "with a bit more" go for the 8800gt. Why the hell stop there and not go for the 8800gts-512, or HD3850-Crossfire? And how come 90ºC is acceptable for a 65nm circuit, while people are scared of 60ºC on a CPU?



                        - E-*****: Is it rational to brag about a very high end card? That's debatable, but I don't care. But someone feeling bad because someone else has a 20% faster device? Either a game is playable on both or it's not. For instance, can the guy with the faster card that cost 500€ stand the fact that min fps go below 40? He bought it to play without compromise at all. Does the other guy need to feel bad because he only gets 95% of the image quality if he wants to reach the other guy's fps?



                        - Hard-disk failure is horrible, and so are memory problems, which quite often lead to data corruption in the cache, crashes, Blue Screens etc... A temporary CPU failure may lead to a Blue Screen, etc... But gfx card data corruption sometimes leads only to a garbled display. Why don't some people make an effort to be sure that the other components are really stable and good? And btw, why insist on top-performance cards and then cheap out on other components, saving 20€ here and 30€ there?



                        - What's the point of buying high end cards at inflated prices, riding the max-out bandwagon, spreading FUD that PC gaming is too expensive, leading others to buy a console and contributing to hurting PC gaming?

                        Comment


                          #42
                          Originally posted by Benfica View Post
                          All IMHO, of course. Ok, I like decent kit, but I really don't care about gfx cards. You can prove me wrong on some of the points, it has typos, tons of "?", etc... I'm kinda tired and I'm not a specialist. I see a lot of BS on cards and gaming, so I had to rant


                          - I don't care about rocks, sucks, who beats who, or having a faster or slower card. If an OC'ed card that's considered crappy has 90 to 95% of the performance of another known sexy kickass card that owns any game thrown at it, how come the former sucks?


                          - The higher the res, the higher the AA some people use! Wtf? It's exactly the opposite, right? Sometimes a game is unplayable at 1680x1050 with 4xAA. What's the point of hogging the card with 4xAA, when sometimes it's impossible to see the jaggies with 2xAA and very hard even without it?


                          - Why feel bad that a game lags on settings that the guy doesn't use or care about? And what the hell do I care how a card runs game A or B, if I don't give a rat's about them and play UT3, where it is sick fast?


                          - People max out, whatever max is. I know 3 games where even the grass has shadows! On 1 of them it's "unplayable", so the res must come down to 1024x768 no-AA. That is not maxing out, it's pure ****. And they whine that card A and card B are not good enough for gaming. Aren't the devs allowed to create whatever improvements and a kickass engine? Should they be limited and careful not to make the highest settings a tad slow, just because a lot of guys who can only run on med-high whine that the game is poorly optimized and sucks? Can't the players go to advanced settings and disable the most demanding feature? Are people like that?


                          - The reviewers don't have a clue that AF usually eats only 3% to 5% of the fps, so they test (no-AA, no-AF) or both together, neglecting (no-AA, 16xAF). Others compare ATI and NVidia cards by configuring the drivers to highest quality. What do they know about how demanding the lack of optimization is for different cards, and if they can tell the difference, why don't they post IQ screenshots? If most people use defaults, what's the use of benchmarks at different settings?



                          - The 9600gt, 8800gs, HD3850-512, HD3870 and the older 8800gts-320 are within 20% performance of each other. So:
                          * If people are OC'ers, why don't they take into account that overclocking changes the gaps between them?
                          * The 8800gts-320 is still a kickass card that a lot of people bought. Since it fits well in the previous group, does any of the others suck by magic?
                          * The other way around: how the hell is it "outdated" when it has the performance of others that are new and considered good?
                          * I hate NVidia with a passion because of this: they released the 8600gts costing only 40€ less than the 8800gts, because there wasn't much competition on DX10 IIRC. That had the psychological effect of making people consider the 8800 a sick deal and ruuuush for it. A card with only 320MB RAM?? And 100€ for an extra 320MB
                          * And the sexy names, who cares if the name 9600gt is sexier than the others?



                          - What's up with the 8800gt madness anyway? I'm tired of seeing "with a bit more" go for the 8800gt. Why the hell stop there and not go for the 8800gts-512, or HD3850-Crossfire? And how come 90ºC is acceptable for a 65nm circuit, while people are scared of 60ºC on a CPU?



                          - E-*****: Is it rational to brag about a very high end card? That's debatable, but I don't care. But someone feeling bad because someone else has a 20% faster device? Either a game is playable on both or it's not. For instance, can the guy with the faster card that cost 500€ stand the fact that min fps go below 40? He bought it to play without compromise at all. Does the other guy need to feel bad because he only gets 95% of the image quality if he wants to reach the other guy's fps?



                          - Hard-disk failure is horrible, and so are memory problems, which quite often lead to data corruption in the cache, crashes, Blue Screens etc... A temporary CPU failure may lead to a Blue Screen, etc... But gfx card data corruption sometimes leads only to a garbled display. Why don't some people make an effort to make sure that the other components are really stable and good? And btw, why insist on top-performance cards and then cheap out on other components, saving 20€ here and 30€ there?



                          - What's the point of buying high end cards at inflated prices, riding the max-out bandwagon, saying that gaming is too expensive, then buying a console and contributing to hurting PC gaming?
                          I read your entire post and there are some things I agree with, some things I don't agree with, and some things I couldn't fully make out.

                          I will comment on a few things, though.

                          I agree with you about high resolution and anti-aliasing. I've noticed from experience that if you're playing at your native resolution, the difference between 4xAA, 8xAA and 16xAA is VERY small. The easiest way to get rid of jaggies is to just move further away from the screen. I'm not sure about the PS3 and 360, but the original Xbox didn't have anti-aliasing AND it played at a low resolution. But because we all sat 10 or so feet away from the TV, you could barely notice the jaggies.

                          You said:
                          "The reviewers don't have a clue that AF usually eats only 3% to 5% of the fps, so they test (no-AA, no-AF) or both together, neglecting (no-AA, 16xAF). Others compare ATI and NVidia cards by configuring the drivers to highest quality. What do they know about how demanding the lack of optimization is for different cards, and if they can tell the difference, why don't they post IQ screenshots? If most people use defaults, what's the use of benchmarks at different settings?"

                          Actually, with today's high-end cards, AF does almost nothing to your FPS. If anything, it'll take off maybe 1 fps at the most. This is how AA will hopefully be in the future. Most people don't use default settings. They usually put them either on all high or all low and adjust until they get a nice balance between quality and performance. The reviewers test the cards at the highest graphics settings so that the reader can see exactly how well the card performs under the most intense settings.

                          You said: "What's up with the 8800gt madness anyway? I'm tired of seeing "with a bit more" go for the 8800gt. Why the hell stop there and not go for the 8800gts-512, or HD3850-Crossfire? And how come 90ºC is acceptable for a 65nm circuit, while people are scared of 60ºC on a CPU?"

                          The 8800GT is the best price for performance card out at the moment if you can buy it for $200 or less.

                          Graphics cards and processors are made very differently. Graphics cards can take temperatures up to 90C. Processors, on the other hand, take A LOT less heat. That's why you can overclock a graphics card without watercooling and still get desirable results. If you want to overclock a high-end processor higher than, let's say, 3.8ghz or so, you'll need some type of extreme cooling because it just can't handle the higher temperatures.

                          You said: "What's the point of buying high end cards at inflated prices, riding the max-out bandwagon, saying that gaming is too expensive, then buying a console and contributing to hurting PC gaming?"

                          Some people who have high-end graphics cards like dual 8800 Ultras, or the 9800GX2, have the money to spend. They most likely also have other top-of-the-line hardware: processor, RAM, cooling, etc. They will also upgrade their already high-end PC whenever they have the chance to. They also usually have the big screen HDTVs, and the consoles to go with it. Of course, there will also be quite a few who just bought it as a one-time thing, saved up forever to buy a high-end PC, and won't be buying a new one for a few years.

                          Comment


                            #43
                            Actually, with today's high-end cards, AF does nothing to your FPS. If anything, it'll take off maybe 1fps at the most
                            Then it's even better than I thought

                            Most people don't use default settings. They usually put them either on all high or all low and adjust until they get a nice balance between quality and performance.
                            The reviewers test the cards at the highest graphic settings so that the reader can see exactly how well the card performance under the most intense settings.
                            Sorry it wasn't clear, I meant on the drivers. I believe that most people don't tweak the drivers, just download newer or faster ones.

                            The 8800GT is the best price for performance card out at the moment if you can buy it for $200 or less.
                            Hmm... if 200 is important, then yes. My point is, why not 250 or 150? 150 is for people on a budget, 250 too expensive and 200 "right"? Well ok.

                            Graphics cards and processors are made very differently. Graphics cards can take temperatures up to 90C. Processors on the other hand, take A LOT less heat.
                            For me, the same silicon on the same fab has the same electromigration at the same clock, voltage and temperature. Intel fabs and silicon are usually better than TSMC or whatever.

                            That's why you can overclock a graphics card without watercooling and still get disirable results. If you want to overclock a high-end processor higher than lets say, 3.8ghz or so, you'll need some type of extreme cooling because it just can't handle the higher temperatures.
                            That depends on what you consider desirable. Overclock it to 3.5 and you can do it on air. Also, some Pentium-M chips could run at 1GHz for a few minutes without a fan and w/o a heatsink! All this is debatable as hell. The GPU doesn't tolerate higher temperatures just because NVidia says so. No wonder cards have a higher failure rate.


                            Some people who have high-end graphics cards like dual 8800 ultra, or the 9800GX2 have the money to spend. They most likely also have other top of the line hardware like processor, RAM, cooling, etc. They will also upgrade their already high-end PC whenever the have the chance to. They also usually have the big screen HDTVs, and the consoles to go with it. Of course, there will also be quite a few that just bought it as a one-time thing and saved up forever to buy a high-end PC, and won't be buying a new one for a few years.
                            Exactly, people will! PC gaming is not expensive, but it gives that impression. The specs of the PS3:
                            - Cell processor. A lower Athlon X2 or the new Pentium dual-core costs 60€. They can be overclocked, and x86 is much easier to program. And the PC can be used for anything.
                            - Memory: 256MB. 1 GB for 20€
                            - 40GB disk. Hitachi 160GB = 40€
                            - Video is similar to a 7800gt, for 1280x720, no-AA. But an ATI 1950pro or 8600gt costs 70€, HD3650 = 60€. Note: that's with the same image quality as the console, not maxed out; this part is what people don't get.
                            - The PS3 has a Blu-ray player, but the PC can have a DVD burner, 30€
                            - A good motherboard without tons of stuff costs 80€. With SATA ports, dual network cards, expansion slots
                            - It's possible to buy a low end case for 50€.
                            - 50€ for floppy, keyboard and mouse. None on the PS3, just the controller
                            - Windows Media Center: 90€

                            Total: 490€, even though I can get the above for 450€ if I cut some corners.
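                            Benfica's running total can be reproduced by summing the itemized list above (component labels shortened, prices in EUR as quoted):

```python
# Itemized build cost from the post above (EUR).
parts = {
    "CPU (Athlon X2 / Pentium dual-core)": 60,
    "1 GB RAM": 20,
    "Hitachi 160GB disk": 40,
    "GPU (1950pro / 8600gt class)": 70,
    "DVD burner": 30,
    "motherboard": 80,
    "low-end case": 50,
    "floppy, keyboard, mouse": 50,
    "Windows Media Center": 90,
}
total = sum(parts.values())
print(f"Total: {total} EUR")  # Total: 490 EUR
```

            So the 490€ figure checks out as a straight sum of the listed parts.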

                            Comment


                              #44
                              Originally posted by Benfica View Post
                              Hmm... if 200 is important, then yes. My point is, why not 250 or 150? 150 is for people on a budget, 250 too expensive and 200 "right"? Well ok.
                              If someone's budget is $150 and no higher, then I would suggest the 9600GT. If their budget was $200 and no higher, I would suggest the 8800GT (if you can snag one for that price). If their budget was $300 and no higher, I would recommend the 8800GTS 512mb (G92). If their budget was unlimited... well, I think we know where this is going.

                              It all depends on what your budget is. For most people who are building a gaming PC, their budget is usually between $150 - $250 for a graphics card. And for a while, the 8800GT was the cheapest "high-end" card you could get.

                              (I'd also like to say that when I mention prices, I mean them in Canadian dollars, as that's where I'm from)


                              For me, the same silicon on the same fab has the same electromigration at the same clock, voltage and temperature. Intel fabs and silicon are usually better than TSMC or whatever.


                              That depends on what you consider desireable. Overclock it to 3.5 and you can do it on air. Also some Pentium-M could run at 1GHz for a few minutes without fan and w/o heatsink! All this is debatable has hell. The GPU doesn't take higher temperature because NVidia tells so. No wonder cards have higher failure rate.

                              I'm sorry, but I don't know enough technical information about graphics card chips and processor chips to give you a full answer on why one can tolerate higher temperatures than the other. I suggest you go to a dedicated tech forum (not a tech sub-forum on a game forum) to get a definite answer.

                              Comment


                                #45
                                So do the 9 series of Nvidia cards have dedicated physics processors built into the card, or what?

                                Comment
