
new Nvidia 9600GT slaughters the competition


  • replied
    Originally posted by gargorias View Post
    That, my friend, has been happening since PCs hit the desktop market way back when, and it is because of that that we are where we are today with the technology, thank heavens. PC games have had ups and downs in the market for as long as I can remember, but we don't buy top-notch PCs these days just to play games, do we? As for consoles, well, if I were that way inclined (which I haven't been since the demise of the SNES), I would just plug one into my beautiful 30" Dell monitor, go "YUK, how disgusting", and that would be the end of that - throw the console in the bin. The reason for the console's rise at the moment, IMHO, is that there are so many more bloody games coming out for it than for the PC, and I know why.
    Ok, the PC is also losing for other reasons: warez, friends being able to play together on a console, etc...
    My main point is that a lot of people believe that gaming must be too expensive, because when someone says "rig A is not fast enough for gaming, you should upgrade", people are always talking about the max settings, "the higher your res, the more AA you need", etc., never about the image quality of a console.

    Still, most of this BS comes from review sites where the cards are shown on the left and the ads to buy one appear on the right.



  • replied
    Originally posted by Benfica View Post
    - What's the point of buying high end cards at inflated prices, riding the max-out bandwagon, spreading FUD that PC gaming is too expensive, and leading others to buy a console, contributing to hurting PC gaming?
    That, my friend, has been happening since PCs hit the desktop market way back when, and it is because of that that we are where we are today with the technology, thank heavens. PC games have had ups and downs in the market for as long as I can remember, but we don't buy top-notch PCs these days just to play games, do we? As for consoles, well, if I were that way inclined (which I haven't been since the demise of the SNES), I would just plug one into my beautiful 30" Dell monitor, go "YUK, how disgusting", and that would be the end of that - throw the console in the bin. The reason for the console's rise at the moment, IMHO, is that there are so many more bloody games coming out for it than for the PC, and I know why.



  • replied
    http://www.tweaktown.com/reviews/129...sts/index.html
    Temps and sound are too high. Looks like nVidia stuffed up with the G94; they're all hotter than the G92s, which run faster. Will have to wait for custom cooling and then see.

    Originally posted by Benfica View Post
    IMO 8800gt owners have 3 options in the summer:
    - Degrade the card
    - Install a better cooler, which defeats the point of not going for an 8800gts-512 instead
    - Underclock it by 10% or so. Congrats, you now have the performance of an ATI HD3870
    That's only if you bought a version 1.0 board with the single-slot cooler. My dual-slot V2.0 board runs around 56°C under load; I have no problem in summer.
    Anyone who has a single-slot cooler just needs to replace it, and they'll find the card much better - no need to degrade it, lower the clocks or get a GTS, geez man.
    Plus, if you get the dual power connector version, simply swap a jumper pin and you'll have really good overclocking with the right cooler.



  • replied
    Originally posted by Vidiot View Post
    ATI has great driver support, otherwise I wouldn't be mentioning that I get really great FPS with the 8.3 drivers. Even my extremely overclocked 3870 only gains about 2fps in games like Crysis, so a stock 3870 with the 8.3 drivers still beats the 9600gt pretty bad. And when you are talking 10fps more out of only like 40fps, that's a 25% increase!
    Well, that's OK then. Especially since I only paid $129 for my 9600GT, unlike you, who paid $200+ for the 3870. An extra 10 FPS in Crysis certainly isn't worth an extra $70 or more.

    And 10 FPS doesn't even qualify as "beating it pretty bad".



  • replied
    [shot]http://img.ncix.com/images/26783_l.jpg[/shot]
    Hmm, the fat cooler on this 8800gt...

    Originally posted by Benfica
    The card in the review is overclocked, against a stock-speed HD3870, so whoever already bought the 3870 shouldn't worry, even more so with dual-slot cooling that allows a nice OC. So it ends up being a matter of personal preference, I guess
    ...
    I see a lot of BS on cards and gaming, so I had to rant
    ...
    And how come 90°C is acceptable for a 65nm circuit while people are scared of 60°C on a CPU?
    http://www.theinquirer.net/gb/inquir...ermal-analysis
    IT SOUNDS LIKE Nvidia's G92, the next-gen high-end part, is having heat problems. Several people told us that a few weeks ago, they got an urgent letter from NV to send them computers that the new G92 would go in for 'thermal analysis'. Hmmm, makes you wonder, doesn't it?

    More interestingly, the OEMs: several told the same story, said they were given about a week to comply, slap it in a box and FedEx that sucker, ASAP. Other than 'thermal analysis' and 'do it now', no explanation was given. That really made us wonder.
    http://www.theinquirer.net/gb/inquir...ermal-analysis
    A FEW MONTHS ago, Nvidia mass mailed OEMs asking for their chassis to be sent in for 'thermal analysis'. What that meant, or why it did it remained a mystery until a little bird gave me the last remaining piece of the puzzle.

    Let's backpedal a bit. If you have been following the G92 vs RV670 soap opera, you will know that the initial leaked numbers showed that ATI was going to pip Nvidia by a few points in the benchmark 3DMark06 benchmark. While I should say benchmark again, I won't, I will just say that early odds had ATI leading by about 10 per cent.

    When the cards came out, Nvidia was in the lead by about 10 per cent, more or less, depending on applications, but by about as much as it was tipped to lose by in the early days. How did it get the 20 per cent? Remember our friend the 'thermal analysis'?

    It seems that Nvidia took a much lower clocked G92 and cranked it to the edge of living with its single slot cooling "solution". It had to validate it against every chassis it could to crank it to the point of errors, then dial it back by the barest minimum.

    So, it looks to have worked on the outside. A quick poll of OEMs shows that in their testing, there is little to no thermal margin for the 8800GT in a one slot configuration. Remember boys and girls, this vents heat inside the chassis, not directly outside like a two slot card.
    IMO 8800gt owners have 3 options in the summer:
    - Degrade the card
    - Install a better cooler, which defeats the point of not going for an 8800gts-512 instead
    - Underclock it by 10% or so. Congrats, you now have the performance of an ATI HD3870



  • replied
    Originally posted by Tomeis View Post
    Exactly - your EXTREMELY OVERCLOCKED 3870. It doesn't even "completely pound" it. Maybe just 10 or so FPS more at higher resolutions.

    For the most part, the two cards are very evenly matched. In some games and resolutions the 9600 is superior, and in other games at other resolutions the 3870 is superior.

    Like you said, the 3870 is $170. Well, you can get a 9600GT for $150, and even around $130 (I bought mine for $129). The obvious choice would be the 9600GT, as it's cheaper and the two cards don't differ that much. If you're willing to spend $170 on a 3870... well... you'd be better off just spending a tiny bit more and getting an 8800GT for $184:
    http://www.ncix.com/products/index.p...y%20Technology

    Also, I'm not too familiar with ATI, but I hear their driver support isn't the greatest
    ATI has great driver support, otherwise I wouldn't be mentioning that I get really great FPS with the 8.3 drivers. Even my extremely overclocked 3870 only gains about 2fps in games like Crysis, so a stock 3870 with the 8.3 drivers still beats the 9600gt pretty bad. And when you are talking 10fps more out of only like 40fps, that's a 25% increase!
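    The percentage arithmetic above does check out; a quick sanity check (the fps figures are the rough numbers quoted in the post, not measured):

```python
# Relative gain from 10 extra fps on a ~40 fps baseline,
# using the rough figures quoted in the post above.
baseline_fps = 40
extra_fps = 10
increase_pct = 100 * extra_fps / baseline_fps
print(f"{increase_pct:.0f}% increase")  # 25% increase
```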



  • replied
    Originally posted by NeoKnight View Post
    So does the 9 series of nvidia cards have dedicated physics processors built into the card or what?
    Not that I'm aware of.



  • replied
    So does the 9 series of nvidia cards have dedicated physics processors built into the card or what?



  • replied
    Originally posted by Benfica View Post
    Hmm... if $200 is the limit, then yes. My point is: why not $250 or $150? $150 is for people on a budget, $250 is too expensive and $200 is "right"? Well, ok.
    If someone's budget is $150 and no higher, then I would suggest the 9600GT. If their budget were $200 and no higher, I would suggest the 8800GT (if you can snag one for that price). If their budget were $300 and no higher, I would recommend the 8800GTS 512MB (G92). If their budget were unlimited... well, I think we know where this is going.

    It all depends on what your budget is. For most people who are building a gaming PC, their budget is usually between $150 - $250 for a graphics card. And for a while, the 8800GT was the cheapest "high-end" card you could get.

    (I'd also like to say that when I mention prices, I mean Canadian dollars, as that's where I'm from)


    For me, the same silicon on the same fab has the same electromigration at the same clock, voltage and temperature. Intel fabs and silicon are usually better than TSMC or whatever.


    That depends on what you consider desirable. Overclock it to 3.5 and you can do it on air. Also, some Pentium-Ms could run at 1GHz for a few minutes without a fan and without a heatsink! All this is debatable as hell. The GPU doesn't tolerate higher temperatures just because NVidia says so. No wonder cards have a higher failure rate.

    I'm sorry, but I don't know enough technical information about graphics card chips and processor chips to give you a full answer on why one can take higher temperatures than the other. I suggest you go to a dedicated tech forum (not a tech sub-forum on a game forum) to get a definite answer.



  • replied
    Actually, with today's high-end cards, AF does nothing to your FPS. If anything, it'll take off maybe 1fps at the most
    Then it's even better than I thought

    Most people don't use default settings. They usually put them either on all high or all low and adjust until they get a nice balance between quality and performance.
    The reviewers test the cards at the highest graphics settings so that the reader can see exactly how well the card performs under the most intense settings.
    Sorry, it wasn't clear - I meant the drivers. I believe that most people don't tweak the drivers, they just download newer or faster ones.

    The 8800GT is the best price for performance card out at the moment if you can buy it for $200 or less.
    Hmm... if $200 is the limit, then yes. My point is: why not $250 or $150? $150 is for people on a budget, $250 is too expensive and $200 is "right"? Well, ok.

    Graphics cards and processors are made very differently. Graphics cards can take temperatures up to 90C. Processors on the other hand, take A LOT less heat.
    For me, the same silicon on the same fab has the same electromigration at the same clock, voltage and temperature. Intel fabs and silicon are usually better than TSMC or whatever.

    That's why you can overclock a graphics card without watercooling and still get desirable results. If you want to overclock a high-end processor higher than, let's say, 3.8GHz or so, you'll need some type of extreme cooling, because it just can't handle the higher temperatures.
    That depends on what you consider desirable. Overclock it to 3.5 and you can do it on air. Also, some Pentium-Ms could run at 1GHz for a few minutes without a fan and without a heatsink! All this is debatable as hell. The GPU doesn't tolerate higher temperatures just because NVidia says so. No wonder cards have a higher failure rate.


    Some people who have high-end graphics cards like dual 8800 Ultras or the 9800GX2 have the money to spend. They most likely also have other top-of-the-line hardware: processor, RAM, cooling, etc. They will also upgrade their already high-end PC whenever they have the chance to. They also usually have the big-screen HDTVs, and the consoles to go with them. Of course, there will also be quite a few who just bought it as a one-time thing, saved up forever to buy a high-end PC, and won't be buying a new one for a few years.
    Exactly, people will! PC gaming is not expensive, but it gives that impression. The specs of the PS3:
    - Cell processor. A lower-end Athlon X2 or the new Pentium dual-core costs 60€. They can be overclocked, x86 is much easier to program, and the PC can be used for anything.
    - Memory: 256MB. 1GB for 20€
    - 40GB disk. Hitachi 160GB = 40€
    - Video is similar to a 7800gt, for 1280x720 with no AA. But an ATI 1950pro or 8600gt costs 70€, an HD3650 = 60€. Note that this is at the same image quality as the console - not maxed out; this part is what people don't get.
    - The PS3 has a Blu-ray player, but the PC can have a DVD burner, 30€
    - A good motherboard without tons of stuff costs 80€, with SATA ports, dual network cards and expansion slots
    - It's possible to buy a low-end case for 50€.
    - 50€ for floppy, keyboard and mouse. None of that on the PS3, just the controller
    - Windows Media Center: 90€

    Total: 490€, though I can get the above for 450€ if I cut some corners.
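    The itemized list above does add up to the stated total; a quick tally (prices in euros, taken straight from the post, not independently verified):

```python
# Itemized PC build from the post above, in euros (the poster's 2008 figures).
parts = {
    "CPU (Athlon X2 / Pentium dual-core)": 60,
    "1GB RAM": 20,
    "Hitachi 160GB disk": 40,
    "Graphics card (1950pro / 8600gt)": 70,
    "DVD burner": 30,
    "Motherboard": 80,
    "Low-end case": 50,
    "Floppy, keyboard, mouse": 50,
    "Windows Media Center": 90,
}
total = sum(parts.values())
print(f"Total: {total} EUR")  # Total: 490 EUR
```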



  • replied
    Originally posted by Benfica View Post
    All IMHO, of course. Ok, I like decent kit, but I really don't care about gfx cards. You can prove me wrong on some of the points; it has typos, tons of "?", etc... I'm kinda tired and I'm not a specialist. I see a lot of BS on cards and gaming, so I had to rant


    - I don't care about "rocks", "sucks", who beats who, or having a faster or slower card. If an OC'ed card that is considered crappy has 90 to 95% of the performance of another known sexy kickass card that owns any game thrown at it, how come the former sucks?


    - The higher the res, the higher the AA some people use! Wtf? It's exactly the opposite, right? Sometimes a game is unplayable at 1680x1050 with 4xAA. What's the point of hogging the card with 4xAA, when sometimes it's impossible to see the jaggies with 2xAA and very hard even without any?


    - Why feel bad that a game lags on settings that the guy doesn't use or care about? And what the hell do I care how a card runs game A or B, if I don't give a rat's about them and play UT3, where it is sick fast?


    - People max out, whatever max is. I know 3 games where even the grass has shadows! In 1 of them it's "unplayable", so the res must come down to 1024x768 with no AA. That is not maxed out, it's pure ****. And they whine that card A and card B are not good enough for gaming. Aren't the devs allowed to create whatever improvements and a kickass engine they want? Should they be limited, and careful not to make the highest setting a tad slow, because a lot of guys whine that they can't run on highest, only on med-high, and say that the game is poorly optimized and sucks? Can't the players go to the advanced settings and disable the most demanding feature? Are people like that?


    - The reviewers don't have a clue that AF usually eats 3% to 5% of the fps, and test with (no-AA, no-AF) or with both maxed, neglecting (no-AA, 16xAF). Others compare ATI and NVidia cards by configuring the drivers to highest quality. What do they know about how demanding the lack of optimization is for different cards, and if they can tell the difference, why don't they post IQ screenshots? If most people use defaults, what's the use of benchmarks at different settings?



    - The 9600gt, 8800gs, HD3850-512, HD3870 and the older 8800gts-320 are all within 20% of each other in performance. So:
    * If people are OC'ers, why don't they take into account that overclocking now changes the gaps between them?
    * The 8800gts-320 is still a kickass card that a lot of people bought. Since it fits well in the group above, do any of the others suck by magic?
    * The other way around: how the hell is it "outdated" when it has the performance of others that are new and considered good?
    * I hate NVidia with a passion because of this: they released the 8600gts costing only 40€ less than the 8800gts, because there wasn't much competition on DX10, IIRC. That had the psychological effect of making people consider the 8800 a sick deal and ruuuush for it. A card with only 320MB of RAM?? And 100€ for an extra 320MB
    * And the sexy names - who cares if the name 9600gt is sexier than the others?



    - What's up with the 8800gt madness anyway? I'm tired of seeing "with a bit more, go for the 8800gt". Why the hell stop there and not go for the 8800gts-512 or HD3850 Crossfire? And how come 90°C is acceptable for a 65nm circuit while people are scared of 60°C on a CPU?



    - E-*****: Is it rational to brag about a very high end card? That's debatable, but I don't care. But someone feeling bad because someone else has a 20% faster device? Either a game is playable on both or it's not. For instance, can the guy with the faster card that cost 500€ stand the fact that min fps go below 40? He bought it to play without compromise at all. Does the other guy need to feel bad because he only gets 95% of the image quality when matching the other guy's fps?



    - Hard-disk failure is horrible, and so are memory problems, which quite often lead to data corruption in the cache, crashes, Blue Screens, etc... A temporary CPU failure may lead to a Blue Screen, etc.... But gfx card data corruption sometimes leads only to a garbled display. Why don't some people make an effort to make sure that the other components are really stable and good? And btw, why insist on top-performance cards and then cheap out on the other components, saving 20€ here and 30€ there?



    - What's the point of buying high end cards at inflated prices, riding the max-out bandwagon, saying that gaming is too expensive, then buying a console and contributing to hurting PC gaming?
    I read your entire post and there are some things I agree with, some things I don't agree with, and some things I couldn't fully make out.

    I will comment on a few things, though.

    I agree with you about the high resolution and anti-aliasing. I've noticed from experience that if you're playing at your native resolution, the difference between 4xAA, 8xAA and 16xAA is VERY small. The easiest way to get rid of jaggies is just to move further away from the screen. I'm not sure about the PS3 and 360, but the original Xbox didn't have anti-aliasing AND it played at a low resolution. But because we all sat 10 or so feet away from the TV, you could barely notice them.

    You said:
    "The reviewers don't have a clue that AF usually eats 3% to 5% of the fps, and test with (no-AA, no-AF) or with both maxed, neglecting (no-AA, 16xAF). Others compare ATI and NVidia cards by configuring the drivers to highest quality. What do they know about how demanding the lack of optimization is for different cards, and if they can tell the difference, why don't they post IQ screenshots? If most people use defaults, what's the use of benchmarks at different settings?"

    Actually, with today's high-end cards, AF does nothing to your FPS. If anything, it'll take off maybe 1fps at the most. This is how AA will hopefully be in the future. Most people don't use default settings. They usually put them either on all high or all low and adjust until they get a nice balance between quality and performance. The reviewers test the cards at the highest graphics settings so that the reader can see exactly how well the card performs under the most intense settings.

    You said: "What's up with the 8800gt madness anyway? I'm tired of seeing "with a bit more, go for the 8800gt". Why the hell stop there and not go for the 8800gts-512 or HD3850 Crossfire? And how come 90°C is acceptable for a 65nm circuit while people are scared of 60°C on a CPU?"

    The 8800GT is the best price for performance card out at the moment if you can buy it for $200 or less.

    Graphics cards and processors are made very differently. Graphics cards can take temperatures up to 90°C. Processors, on the other hand, take A LOT less heat. That's why you can overclock a graphics card without watercooling and still get desirable results. If you want to overclock a high-end processor higher than, let's say, 3.8GHz or so, you'll need some type of extreme cooling, because it just can't handle the higher temperatures.

    You said: "What's the point of buying high end cards at inflated prices, riding the max-out bandwagon, saying that gaming is too expensive, then buying a console and contributing to hurting PC gaming?"

    Some people who have high-end graphics cards like dual 8800 Ultras or the 9800GX2 have the money to spend. They most likely also have other top-of-the-line hardware: processor, RAM, cooling, etc. They will also upgrade their already high-end PC whenever they have the chance to. They also usually have the big-screen HDTVs, and the consoles to go with them. Of course, there will also be quite a few who just bought it as a one-time thing, saved up forever to buy a high-end PC, and won't be buying a new one for a few years.



  • replied
    All IMHO, of course. Ok, I like decent kit, but I really don't care about gfx cards. You can prove me wrong on some of the points; it has typos, tons of "?", etc... I'm kinda tired and I'm not a specialist. I see a lot of BS on cards and gaming, so I had to rant


    - I don't care about "rocks", "sucks", who beats who, or having a faster or slower card. Wtf is the point of all this if my slower card gives an excellent 80 fps?


    - If an OC'ed card that is considered crappy has 90 to 95% of the performance of another known sexy kickass card that owns any game thrown at it, how come the former sucks?


    - The higher the res, the higher the AA some people use! Wtf? It's exactly the opposite, right? Sometimes a game is unplayable at 1680x1050 with 4xAA. What's the point of hogging the card with 4xAA, when sometimes it's impossible to see the jaggies with 2xAA and very hard even without any?


    - Why feel bad that a game lags on settings that the guy doesn't use or care about? And what the hell do I care how a card runs game A or B, if I don't give a rat's about them and play UT3, where it is sick fast?


    - People max out, whatever max is. I know 3 games where even the grass has shadows! In 1 of them it's "unplayable", so the res must come down to 1024x768 with no AA. That is not maxed out, it's pure ****. And they whine that card A and card B are not good enough for gaming. Aren't the devs allowed to create whatever improvements and a kickass engine they want? Should they be limited, and careful not to make the highest setting a tad slow, because a lot of guys whine that they can't run on highest, only on med-high or on highest except 2 settings, and say that the game is poorly optimized and sucks? Can't the players go to the advanced settings and disable the most demanding feature? Are people like that?


    - The reviewers don't have a clue that AF usually eats 3% to 5% of the fps, and test with (no-AA, no-AF) or with both maxed, neglecting (no-AA, 16xAF). Others compare ATI and NVidia cards by configuring the drivers to highest quality. What do they know about how demanding the lack of optimization is for different cards, and if they can tell the difference, why don't they post IQ screenshots? If most people use defaults, what's the use of benchmarks at different settings?



    - The 9600gt, 8800gs, HD3850-512, HD3870 and the older 8800gts-320 are all within 20% of each other in performance. So:
    * If people are OC'ers, why don't they take into account that overclocking now changes the gaps between them?
    * The 8800gts-320 is still a kickass card that a lot of people bought. Since it fits well in the group above, do any of the others suck by magic?
    * The other way around: how the hell is it "outdated" when it has the performance of others that are new and considered good?
    * I hate NVidia with a passion because of this: they released the 8600gts costing only 40€ less than the 8800gts, because there wasn't much competition on DX10, IIRC. That had the psychological effect of making people consider the 8800 a sick deal and ruuuush for it. A card with only 320MB of RAM?? And 100€ for an extra 320MB
    * And the sexy names - who cares if the name 9600gt is sexier than the others?



    - What's up with the 8800gt madness anyway? I'm tired of seeing "with a bit more, go for the 8800gt". Why the hell stop there and not go for the 8800gts-512 or HD3850 Crossfire? And how come 90°C is acceptable for a 65nm circuit while people are scared of 60°C on a CPU?



    - E-*****: Is it rational to brag about a very high end card? That's debatable, but I don't care. But someone feeling bad because someone else has a 20% faster device? Either a game is playable on both or it's not. For instance, can the guy with the faster card that cost 500€ stand the fact that min fps go below 40? He bought it to play without compromise at all. Does the other guy need to feel bad because he only gets 95% of the image quality when matching the other guy's fps?



    - Hard-disk failure is horrible, and so are memory problems, which quite often lead to data corruption in the cache, crashes, Blue Screens, etc... A temporary CPU failure may lead to a Blue Screen, etc.... But gfx card data corruption sometimes leads only to a garbled display. Why don't some people make an effort to be sure that the other components are really stable and good? And btw, why insist on top-performance cards and then cheap out on the other components, saving 20€ here and 30€ there?



    - What's the point of buying high end cards at inflated prices, riding the max-out bandwagon, spreading FUD that PC gaming is too expensive, and leading others to buy a console, contributing to hurting PC gaming?



  • replied
    Originally posted by Vidiot View Post
    Maybe in those OLD benchmarks the 9600gt barely beat my 3870, but now there are new patches for those games, and new drivers for the 3870 :P. My point is that my extremely overclocked 3870 + new drivers completely pounds the 9600gt into the ground. And not only that, but the 3870 is only $170 right now!
    Exactly - your EXTREMELY OVERCLOCKED 3870. It doesn't even "completely pound" it. Maybe just 10 or so FPS more at higher resolutions.

    For the most part, the two cards are very evenly matched. In some games and resolutions the 9600 is superior, and in other games at other resolutions the 3870 is superior.

    Like you said, the 3870 is $170. Well, you can get a 9600GT for $150, and even around $130 (I bought mine for $129). The obvious choice would be the 9600GT, as it's cheaper and the two cards don't differ that much. If you're willing to spend $170 on a 3870... well... you'd be better off just spending a tiny bit more and getting an 8800GT for $184:
    http://www.ncix.com/products/index.p...y%20Technology

    Also, I'm not too familiar with ATI, but I hear their driver support isn't the greatest



  • replied
    Originally posted by Bret Hart View Post
    Just tickle it.
    Meh. With OCs I have to run at 5/3 to get a reliable 60 FPS all the time. I can run 5/5 easily on some maps, 5/4 easily on most... on others, 5/2 would probably be a little better. More often I'm hurt by my CPU, tho.



  • replied
    Originally posted by ]NIN[ View Post
    My 8800GTS 640MB doesn't laugh - how do you make it do that?
    Just tickle it.

