New Tim Sweeney interview, mostly about tech specs


    #46
    Originally posted by Moloko View Post
    If everyone thought like a couple of posters here, we'd still all be playing 2D games on green monitors... so negative about progress. Is Vista perfect? Hell no, it needs a lot of trimming and polishing. Its excessive memory use is probably tied up with all the unfortunate DRM stuff and security/file integrity checks.
    People still are not understanding my point. I don't CARE if Vista uses more resources. Of course it does! It's a newer OS released in a generation of faster PCs! I am NOT against moving forward (why would I want DX10 on XP then?). I do not have ANY problem with shelling out megabucks for new hardware to play new games every 6 months (even if I don't do that, if I had the money I would).

    I DO have a problem with Microsoft trying to act as "big brother" with Vista. All the DRM BULL. Paying money for the RIGHT to listen to your music (compressed MP3s at that) instead of actually OWNING it is NOT moving forward. It's not quite that bad with standard A/V yet, but it is annoying and it is getting there. It IS that bad with the HD stuff already with Vista. I do not feel like typing out 3 pages of information here just to explain my personal reasoning to people on an online game forum. If you want to know for yourself, search Google. If you don't agree with my opinion, fine. I don't care. But please don't accuse me of things like being stubborn, not wanting to move forward or spend money, etc... before you even know the details behind my reasoning. It has NOTHING to do with using more resources or having to spend money (although those are a little annoying).

    Comment


      #47
      Originally posted by Moloko View Post
      If everyone thought like a couple of posters here, we'd still all be playing 2D games on green monitors...
      Wait... you guys are playing this game in color? WTF?

      Looks like I've gotta upgrade to that CGA I've been hearing so much about! Four colors at once, here I come!

      Comment


        #48
        Originally posted by martinblank View Post
        I DO have a problem with Microsoft trying to act as "big brother" with Vista.
        Me too! All those toe-touchies before breakfast are really getting to me! But it looks like people provide their own Two-Minute Hate without much prompting, which is good for the war effort, I guess.

        Freedom is Slavery!

        Comment


          #49
          Originally posted by Benfica View Post
          I'm so tired of this BS. Maybe I'm getting old and grumpy.

          d00d, because the 6800 has the same programming capabilities as 2x 7950GX2 SLI. Any shader code that runs on one runs on the other.
          AND some code or features are optional. Examples:
          - Shadows, soft shadows
          - Blur
          - HDR
          - Post processing
          - Texture size
          - AF

          Disabling all of the above, you don't get *incorrect* or distorted rendering in a sense that ruins gameplay. Also, you can play at any res you like, from 640x480 up to 2048x1536.

          [edit] Ah, and when a game is demanding (a.k.a. slow as f&/""), there will be a lot of friends who go play something else. This is what some guys don't understand.

          To the M$ bashers, the ones that do it without a solid reason, just because...: www.gatesfoundation.org

          To the ones that put everything on max, want holy $hit settings and then say that the game is demanding, you wouldn't have fun with:
          - A ZX Spectrum in 1983
          - A Commodore Amiga in 1986
          - A Celeron 300A PC in 1998
          - A PS1
          You wouldn't even consider buying a fun as hell Nintendo Wii because it's sooo last generation.

          Game developers aren't programming games to run on old hardware, especially the 6800 Ultra, which physically does not compare to what a card of today, even the 7950, can do outside of SM, Vertex Shaders, HDR, etc. Obviously the code in PC-ported games is gutted down to accommodate the old hardware in consoles, which is why current games can only do so much playing on a 6800 even with graphics settings all the way down. IF developers were really going to cater to old hardware then Rainbow Six Vegas, Oblivion, and Doom 3 would all run today at a min FPS of 60 progressive at even the lowest settings in the most demanding areas. Of course this is not true, since the code in the game just isn't there to provide this. The hell if they're going to make a separate version or alter the code to achieve this, or the company would never produce. Just look at MS Vista for example.

          It's a hell of a lot easier to do this on a console, where the hardware is the same for years in every machine, without having to gut the code tremendously and while only needing to maintain 30fps, since 60fps interlaced is really only 30 full frames per second.

          If at any time a PC game gets too demanding for one's hardware, then obviously there is no choice but to play something else or save up. It's not that they really want to play something else. Why do you think consoles have done so well? Because there just isn't any need to continuously upgrade in order to play a current hit title and have it run just the same as your friend's down the road, on top of the user-friendly ease of simply popping the game in and pressing play.

          http://www.steampowered.com/status/survey.html

          Now as far as that survey site goes... of course this site is only catering to the Source engine, am I right? If so, then that means we are looking mainly at HL2, CS: Source, and Day of Defeat. Now exactly how is this any sort of "accurate" survey when we are only looking at one engine out of the whole spectrum? I mean really, are you going to take one game, which happens to not be the most up to date in eye candy or in popularity compared to a game like WoW with its 8 million subscribers worldwide, and consider it the true source for what hardware the "majority" of gamers use? You can't be serious now, are you? Let's also take into account that the most popular online FPS is still the original CS at 68k players, to CS:S at 28k and BF2 at 13k. There's a difference there, but nothing at all compared to WoW.

          Even on that survey... 45% have 1GB of RAM while 24% have 2GB. The norm now is about 2GB at just about every popular forum, and now with Vista it's slowly going to 4GB. Dual-core CPUs have been out for consumers in personal computers for two years now, and according to the survey 77% are still using single core compared to 22% using dual. If that doesn't tell you exactly what type of people are mainly playing the Source engine, then I don't know what does.

          You want to see popularity? Here I'll show you...
          http://www.blizzard.com/press/070123.shtml
          http://www.blizzard.com/press/070111.shtml

          I know one thing... if I was budget savvy, then I would stay with CS: Source and UT2k4 rather than BF2, since BF2 is a HELL of a lot more demanding than either of those, especially as BF2 was one of the first games to really show a performance improvement using 2GB of RAM instead of 1GB.

          I'm able to get a 45 FPS minimum in HL2 at 1024x768 with full graphics settings on my X800XT PE / P4 3.0GHz. The HELL if I'm going to get the same in BF2. I don't consider anything below 50fps to be smooth on a progressive display, for my preference. Obviously if 25-40 is "smooth" for you, then that's simply your opinion.

          http://www.legitreviews.com/article/505/3/
          http://www.legitreviews.com/article/504/3/

          Not to say that these reviews are the end-all result to expect for DX10 performance in UT3 or Crysis, but as it is... it's still too slow for my needs unless I go with SLI/Crossfire OR wait till the hardware catches up. I'm personally glad it's like this; this way competition rushes advancements along. Not some Communist ****.

          Comment


            #50
            What's the big fuss about? UT3 is gonna be tunable and you know that. You can easily change settings that you're not likely to notice, for example (not in my case, just an example): if you are a serious or medium gamer, then while playing you ain't gonna be looking at soft/normal shadows. Now if you don't want texture quality reduction, reduce the resolution and put it in window mode. Did you know all games that I play have to have window mode lol.

            Secondly, you can also turn down distortion effects if you are not actually gonna be viewing the distortion created by heat. You can also put in fewer players, i.e. a medium number of players instead of filling a map full of dudes throwing rockets and shrapnel here and there, thus reducing CPU and graphics load. So problem solved.
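
            To illustrate the kind of settings scaling being described, here is a minimal C++ sketch; the GfxSettings struct, its fields and tuneForWeakerHardware are made-up stand-ins for this post, not anything from UT3's actual configuration:

            Code:
            #include <cstdio>

            // Hypothetical knobs of the kind listed above:
            // resolution, windowed mode, soft shadows, distortion, player count.
            struct GfxSettings {
                int  resX = 1280, resY = 1024;
                bool fullscreen  = true;
                bool softShadows = true;   // expensive, rarely noticed mid-firefight
                bool distortion  = true;   // heat-haze style post effect
                int  maxPlayers  = 16;     // fewer players = less CPU and GPU load
            };

            // Drop the most expensive effects first, then resolution, then player count.
            GfxSettings tuneForWeakerHardware(GfxSettings s) {
                s.softShadows = false;
                s.distortion  = false;
                s.resX = 1024; s.resY = 768;
                s.fullscreen  = false;     // run in a window, as suggested above
                s.maxPlayers  = 10;
                return s;
            }

            int main() {
                GfxSettings t = tuneForWeakerHardware(GfxSettings{});
                std::printf("%dx%d windowed=%d softShadows=%d distortion=%d players=%d\n",
                            t.resX, t.resY, t.fullscreen ? 0 : 1,
                            t.softShadows ? 1 : 0, t.distortion ? 1 : 0, t.maxPlayers);
                return 0;
            }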

            Just relax. UT3 will run on all PCs.

            Comment


              #51
              Originally posted by OblivionLord View Post
              Game developers aren't programming games to run on old hardware, especially the 6800 Ultra, which physically does not compare to what a card of today, even the 7950, can do outside of SM, Vertex Shaders, HDR, etc.
              The 6800 Ultra has similar speed to the 7600GT, the X1650XT, and sometimes even an 8600GT, because that card is weird. These cards are for sale TODAY and can cost up to 150€ or more. And some guys have 6800 Ultra SLI.

              Obviously the code in PC-ported games is gutted down to accommodate the old hardware in consoles, which is why current games can only do so much playing on a 6800 even with graphics settings all the way down.
              Bull. Go on, disable shadows and AA. Point me to a downloadable game demo that is not playable at 1024x768 with good texture quality and HDR.

              IF developers were really going to cater to old hardware then Rainbow Six Vegas, Oblivion, and Doom 3 would all run today at a min FPS of 60 progressive at even the lowest settings in the most demanding areas.
              We are not talking about old hardware like a 5200 or 9200. We are talking about cards that have the same performance as current mid-range cards (or were perfectly mid-range 3 months ago). And please, there aren't just "lowest" and "highest" settings. You don't need to have low texture quality or play without AF on a card that has a 256-bit bus. It still has 16 pixel shaders at 450MHz or whatever it is. And there is no sense talking about a 60 fps minimum in the most demanding areas. I gave an example of an equivalent card (7600GT) that will run a game at 640x480, lowest settings, 30fps MAX.

              btw, I'm really surprised about a few things:
              - how a lot of people just whine that UT2, UT2003, UC1 and UC2 sucked
              - how a lot of people bash the newbies
              - how they defend the fact that you *should* pay (upgrade) if you want to play
              It looks like they would like to alienate a lot of players.

              Of course this is not true, since the code in the game just isn't there to provide this. The hell if they're going to make a separate version or alter the code to achieve this, or the company would never produce. Just look at MS Vista for example.
              WTF? What code change??? They developed the engine using exactly that: 6800 Ultras in SLI.

              It's a hell of a lot easier to do this on a console, where the hardware is the same for years in every machine, without having to gut the code tremendously and while only needing to maintain 30fps, since 60fps interlaced is really only 30 full frames per second.
              Hmm... even with vsync on, there is no need to render a frame for every monitor cycle. It could be 42.5 fps on an 85Hz CRT: one frame per every two cycles. Why this isn't done, I dunno.
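
              To make the arithmetic concrete: 85Hz / 2 = 42.5 presented frames per second. A minimal, purely illustrative C++ sketch of that one-frame-per-two-refresh-cycles pacing follows; it only simulates the timing (the refresh rate and frame count are example numbers, and nothing here is engine code):

              Code:
              #include <chrono>
              #include <cstdio>
              #include <thread>

              int main() {
                  using steady = std::chrono::steady_clock;
                  const double refreshHz = 85.0;  // CRT refresh rate
                  // Two refresh cycles per rendered frame -> 85 / 2 = 42.5 fps.
                  const std::chrono::duration<double> frameBudget(2.0 / refreshHz);

                  auto next = steady::now();
                  for (int frame = 0; frame < 10; ++frame) {
                      // A real renderFrame() call would go here; this loop only paces.
                      next += std::chrono::duration_cast<steady::duration>(frameBudget);
                      std::this_thread::sleep_until(next);  // present once every second refresh
                      std::printf("frame %d presented (~%.1f fps)\n", frame, refreshHz / 2.0);
                  }
                  return 0;
              }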

              If at any time a PC game gets too demanding for one's hardware, then obviously there is no choice but to play something else or save up.
              Now, this is exactly one of my points. That day you will have more people playing retarded shooters instead of UT3.

              It's not that they really want to play something else. Why do you think consoles have done so well? Because there just isn't any need to continuously upgrade in order to play a current hit title and have it run just the same as your friend's down the road, on top of the user-friendly ease of simply popping the game in and pressing play.
              There is no need to continuously upgrade. Can you play anything with a recent X1950XT or 7950GT with similar eye candy to a console? Yes you can. But guess what, they are outdated already.

              http://www.steampowered.com/status/survey.html

              Now as far as that survey site goes... of course this site is only catering to the Source engine, am I right? If so, then that means we are looking mainly at HL2, CS: Source, and Day of Defeat. Now exactly how is this any sort of "accurate" survey when we are only looking at one engine out of the whole spectrum? I mean really, are you going to take one game, which happens to not be the most popular and not as up to date in eye candy either, and consider it the true source for what hardware the "majority" of gamers use? You can't be serious now, are you?
              It wasn't my argument, but I agree with you 100% here. There are 5200s and MX440s there. WTF, those are ancient cards. There's a limit to everything. The problem is where you draw the line on what does and doesn't run a game. But we are not talking about such poor hardware. It's more like this:
              - How come the card an engine was developed on is way too slow or old to run a much more optimized version of it?
              - How come a decision gets made to develop something that doesn't even load on an X850XT PE?

              Even on that survey... 45% have 1GB of RAM while 24% have 2GB. The norm now is about 2GB at just about every popular forum, and now with Vista it's slowly going to 4GB.
              RAM is inexpensive and it has fantastic value. You are talking about quantity and not speed of the device here. That means 110€ to 150€ for 2GB. Affordable.

              Dual-core CPUs have been out for consumers in personal computers for two years now, and according to the survey 77% are still using single core compared to 22% using dual. If that doesn't tell you exactly what type of people are mainly playing the Source engine, then I don't know what does.
              What? Would any of them have exchanged, for instance, an A64 single core for a lame P4 dual core a year ago?

              I know one thing... if I was budget savvy, then I would stay with CS: Source and UT2k4 rather than BF2, since BF2 is a HELL of a lot more demanding than either of those, especially as BF2 was one of the first games to really show a performance improvement using 2GB of RAM instead of 1GB.
              It's not budget savvy, just a matter of choice. If I can't play at max settings, I'll try to have fun at 90%. It seems hard for some gamers today.
              And the problem is not upgrading. It's doing it constantly. Ask 7950GX2 or X1950XTX owners what they think about Vista.
              The point: you can't upgrade just in time for a single game. Some people use their PCs for something else and had upgrade cycles recently.

              I'm able to get a 45 FPS minimum in HL2 at 1024x768 with full graphics settings on my X800XT PE / P4 3.0GHz. The HELL if I'm going to get the same in BF2. I don't consider anything below 50fps to be smooth on a progressive display, for my preference. Obviously if 25-40 is "smooth" for you, then that's simply your opinion.
              For me, it's a 40-45 minimum.
              The problem is actually the P4. It is slower than an Athlon XP @ 2.2GHz. But unfortunately you can't scale down CPU settings that much, or at least not like you can with graphics cards.

              http://www.legitreviews.com/article/505/3/
              http://www.legitreviews.com/article/504/3/

              Not to say that these reviews are the end-all result to expect for DX10 performance in UT3 or Crysis, but as it is... it's still too slow for my needs unless I go with SLI/Crossfire OR wait till the hardware catches up. I'm personally glad it's like this; this way competition rushes advancements along. Not some Communist ****.
              Default settings. A tech demo. An unoptimized engine. With shadows on high and 1024x1024 shadow maps. Whatever, I give up.

              Note, I don't care about any game besides UT2004, the Unreal series and Epic stuff, btw. Maybe that's why I rant so much about a lot of people being kept out of UT and playing **** Strike instead.

              Comment


                #52
                Really, really long post, man. Hard to read lol. Read my last post. Graphics are tunable.

                Comment


                  #53
                  Originally posted by Benfica View Post
                  Man, so are most of the gamers. That's what a lot of people don't have a clue about. That may mean, say, 60% of US, 70% of European, and 80% of Latin American and Asian gamers.

                  And then there are guys with laptops only (I'm not talking about the pathetic Intel video here), the 6 million Nintendo Wii owners, Dells and other PCs with lower end GFX cards, the new Macs, etc...
                  I hate to put it like this, but then buy the game for the 360 or the PS3. PC gaming has always been an expensive habit, and it always will be. It's more of an adult's hobby, since a large part of it involves upgrading and playing about with a computer.

                  If you think it's expensive now, that's a tad funny. It's always been quite pricey. Anybody who remembers shelling out for the first OpenGL cards back in the '90s, or buying one of the first Pentiums, remembers the pain. Developers are currently pretty nice to people about what their stuff will and won't play on.

                  Also remember that premium parts aren't targeted at most people. Unless you plan on spending a fair amount of cash, playing at low details for a while is just part of the hobby.

                  I DO have a problem with Microsoft trying to act as "big brother" with Vista. All the DRM BULL. Paying money for the RIGHT to listen to your music (compressed MP3s at that) instead of actually OWNING it is NOT moving forward. It's not quite that bad with standard A/V yet, but it is annoying and it is getting there. It IS that bad with the HD stuff already with Vista.
                  Ahh, a sidetrack derail into DRM fiascos.

                  If you don't like it, then don't buy it. Nobody is forcing you to buy HD DVD, Blu-ray, music or any of that. Just don't; why bother, it's all **** anyways.

                  And before you fire off a "well, that god **** MS, they are causing it", think again. Blame the MPAA, blame the artists, blame the people who are buying it, and then blame the likes of Apple and MS who have to play ball with a system that other people created. It's not like DRM doesn't already exist outside of Vista on a draconian level.

                  It's not like you can't get around all of that anyways.

                  I'm personally glad it's like this; this way competition rushes advancements along. Not some Communist ****.
                  And this is the key issue here: the faster things move forward, the cheaper things get.

                  Case in point is what happened with the dual-core Athlon 64 right after Core 2 Duo. It's not like AMD's tech is bad; it's that things moved forward so fast they had to drop their prices. With UT3 still out over the horizon, I'm sure there will be better stuff for cheaper by then.

                  You can't scale down CPU settings that much, or at least not like you can with graphics cards
                  No, but you can overclock a CPU and your FSB far more easily than you can your graphics card, and with far better results.
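
                  For what it's worth, the arithmetic behind that claim is simple: CPU clock = FSB x multiplier, so raising the bus raises the core clock proportionally. A tiny illustrative C++ sketch with made-up example numbers (200MHz bus, 9x locked multiplier), not a recommendation for any specific chip:

                  Code:
                  #include <cstdio>

                  int main() {
                      const double stockFsbMHz = 200.0;  // hypothetical stock front-side bus
                      const double multiplier  = 9.0;    // hypothetical locked CPU multiplier
                      const double ocFsbMHz    = 240.0;  // hypothetical overclocked bus

                      const double stockClock = stockFsbMHz * multiplier;  // 1800 MHz
                      const double ocClock    = ocFsbMHz * multiplier;     // 2160 MHz
                      const double gainPct    = (ocClock / stockClock - 1.0) * 100.0;

                      std::printf("stock: %.0f MHz, overclocked: %.0f MHz (+%.0f%%)\n",
                                  stockClock, ocClock, gainPct);
                      return 0;
                  }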

                  Comment


                    #54
                    Personally, I'm glad UT3 won't be limited by DX9. If there are some features that DX10 can do that DX9 can't, then obviously we can't complain about that, because we know DX9 has some limitations. Complaining that UT3 shouldn't look better in DX10 is just asking UT3 to basically be limited by DX9's features. I don't want that.

                    I'm on DX9 right now, but next February I'll go DX10 all the way, and I definitely want to take advantage of that by then.

                    Comment


                      #55
                      Can someone tell me just one thing, please: will UT3 run fine on XP?

                      I can't tell if people here are just complaining about having to use Vista for DX10, or just about having to use Vista altogether.

                      Comment


                        #56
                        Originally posted by durtytarget View Post
                        What's the big fuss about? UT3 is gonna be tunable and you know that. You can easily change settings that you're not likely to notice, for example (not in my case, just an example): if you are a serious or medium gamer, then while playing you ain't gonna be looking at soft/normal shadows. Now if you don't want texture quality reduction, reduce the resolution and put it in window mode. Did you know all games that I play have to have window mode lol.

                        Secondly, you can also turn down distortion effects if you are not actually gonna be viewing the distortion created by heat. You can also put in fewer players, i.e. a medium number of players instead of filling a map full of dudes throwing rockets and shrapnel here and there, thus reducing CPU and graphics load. So problem solved.

                        Just relax. UT3 will run on all PCs.
                        Hey, I'm relaxed. I will play at reasonably high settings; I have a PC fast enough for it: a C2D OC'ed to 3.2GHz, 2GB RAM and an 8600GT with components OC'ed from 30% to 50%. It becomes faster than an 8600GTS or X1950 Pro.
                        And I can always sell the card and buy an 8900GS by Christmas or so, because I'm lucky and personally can. I'm just thinking about the other guys.

                        Comment


                          #57
                          Originally posted by roadrash View Post
                          I hate to put it like this, but then buy the game for the 360 or the PS3. PC gaming has always been an expensive habit, and it always will be. It's more of an adult's hobby, since a large part of it involves upgrading and playing about with a computer.
                          That was never my point. My 2 main points are:
                          - Some gamers aren't capable of having fun without setting everything to highest, regardless of how crappy the details are, even if it ruins gameplay. Sometimes you don't really want tons of coronas, vegetation and weather effects.
                          - Some people can't afford very expensive kit. They are out when a game doesn't scale. If someone can buy holy-**** hardware, they play at HS settings. If they don't want to or can't afford it, there is no particular reason why they shouldn't play at lower settings with a 6800, 7600, X1650, etc... I believed they would be able to, until I saw that the existing UE3 engine game runs at 640x480, 30fps, low settings on a 7600GT.

                          If you think it's expensive now, that's a tad funny. It's always been quite pricey. Anybody who remembers shelling out for the first OpenGL cards back in the '90s, or buying one of the first Pentiums, remembers the pain.
                          Oh, I have lots of similar pains. Like 2 laptops for 2500€ each. Or 600€ for a color inkjet printer.

                          Developers are currently pretty nice to people about what their stuff will and won't play on.
                          Let's see if that happens with the only game that will matter to me in the future.

                          No, but you can overclock a CPU and your FSB far more easily than you can your graphics card, and with far better results.
                          The P4 3GHz OC'ed 15%, the Athlon XP 3200+ didn't OC at all. A64 dual cores OC 20 to 30%. Low-frequency Core 2 Duos OC like hell.
                          The CPUs where it would really matter most are the ones that OC the least.

                          Comment


                            #58
                            Originally posted by TheNub View Post
                            Can someone tell me just one thing, please: will UT3 run fine on XP?

                            I can't tell if people here are just complaining about having to use Vista for DX10, or just about having to use Vista altogether.
                            You're asking a question none of us can hope to answer with authority, but I'm certain it'll run fine on XP with DX9. There will just be some DX10 features that DX9 can't do.

                            Comment


                              #59
                              Can some1 tell me what's the difference between polygons and triangles, please? I know I sound stupid, but I am a boy willing to learn. I am asking this because I saw the number of triangles in a complex scene and wanted to compare. Btw, can some1 tell me whether the ATI X1950 is a high-end graphics card or mid-range?

                              Comment


                                #60
                                "The 6800Ultra has similar speed as the 7600GT, X1650XT, and sometimes even a 8600GT because this card is wierd. These cards are for sale TODAY and can cost up to 150€ or more. And some guys have 6800Ultra SLI."

                                1024x768 = 40-45fps average on the 8600GT and GTS in CoH. Screw that.
                                1024x768 = 35-40fps average in Supreme Commander. Come on.
                                http://www.anandtech.com/video/showdoc.aspx?i=2975&p=5
                                http://www.anandtech.com/video/showdoc.aspx?i=2975&p=2

                                Obviously it's been proven that the 8600GT and GTS aren't really designed for gamers; they are really directed towards HTPC owners. They have higher clock speeds than the 8800 series mainly to decode video formats. You honestly think that these cards will last in UT3 or Crysis?


                                "Bull. Go on, disable shadows and AA. Point me to a downloadable game demo that is not playable at 1024x768, good texture quality and HDR."

                                Forget demos... here is a full game...
                                Far Cry with HDR... a cheap example here, but usable on the 6800 Ultra:
                                http://www.firingsquad.com/hardware/..._1.3/page6.asp
                                http://www.bit-tech.net/gaming/2004/...patch13_eval/5
                                http://www.elitebastards.com/page.ph...d=1&comments=1
                                http://www.techreport.com/reviews/20.../index.x?pg=10

                                Let's look at a game with an HDR "patch" and the 6800. Two sites here with the GT and three I see with the Ultra. These are average frame rates and are definitely **** playability at 1024x768 with no AA and HDR. Some show the min FPS, which is still ****.

                                30FPS max is utter ****. If anyone is satisfied with that, then that's their preference, and obviously they aren't a serious gamer.


                                "WTF? What code change??? They developed the engine using exactly 6800Ultras in SLI."

                                Again, 6800 Ultras in SLI are what? Bragging rights, when it really doesn't do squat compared to a modern single video card that's a year old.

                                Same code, but obviously, as with the benches I posted... the performance sucks unless you alter the code, just like with consoles, to allow such old hardware to play at higher FPS on a progressive monitor. When I say alter the code, I mean take out this and that just in order for it to play fast. Doom 3 on Xbox is a prime example.

                                "Hmm... even if you had vsync on, there is no need to create a frame for each monitor cycle. It could be 42.5 for 85Hz CRT, 1 frame per each 2 cycles. Why this isn't done, I dunno."

                                To me 60fps is adequate. To you 42fps is decent. I can tell the difference between 60fps at 85Hz and 42fps at 85Hz.

                                "There is no need to continuosly upgrade. Can you play anything with a recent x1950XT or 7950gt with similar eyecandy to a console? Yes you can. But guess what, they are outdated already."

                                The X1950XT and 7950GT may play anything of today, but they are only a year old. Hardly outdated compared to something that's 3 years old like the 6800 Ultra. Besides that... both of these cards cost $200-250 on Newegg outside of rebates. That also shows how modern they still are compared to the price/performance of the 6800 Ultra, which you'll only find on eBay, if anywhere.

                                "RAM is inexpensive and it has fantastic value. You are talking about quantity and not speed of the device here. That means 110€ to 150€ for 2GB. Affordable."

                                Yes, quantity, which obviously HL2 doesn't require, nor CS:S or CS. Those games can obviously run efficiently on 1GB of RAM. BF2 shows a huge performance gain with 2GB. On that survey... the majority are using 1GB. Of course, if the same type of survey were done for BF2, then I'm positive the majority would not be using 1GB but 2GB instead, mainly because, as you said, it is affordable. So, given that DDR or DDR2 at stock speeds is at the very least quite inexpensive... why are the people on Steam mainly using 1GB? To answer that question... obviously they have no need to upgrade for the more demanding requirements of a more modern game, since they are happy with what they are currently playing. That's the whole reason why I myself haven't upgraded, till something I really want to play comes out.

                                "What? Would any of them exchange for instance an A64 SC for a lame P4 DC 1 year ago?"

                                A dual core today can go for as little as $80 new retail. However, you have to upgrade everything else to go along with it, which makes it more expensive, considering that these people aren't willing to do it in the first place. To answer your question... I'm sure if it was given to them free, then yes, since these people are running high-end single-core CPUs to begin with. Even as the survey shows, the majority of A64 users aren't in the 3.0GHz area, which would obviously mean overclocking, therefore proving they aren't spending money to cool the CPU or case or anything beyond air cooling, since a stock single-core A64 3000+ is adequate for CS:S. The hell if any of those people "NEED" a San Diego at 2.6 or 2.8GHz to play CS: Source. What a joke.


                                "It's not budget savvy, just a matter of choice. If I can't play at max settings, I'll try to have fun at 90%. It's seems hard for some gamers today.
                                And the problem is not upgrading. It's doing it constantly. Ask 7950GX2 or X1950XTX owners what they think about Vista.
                                The point: you can't upgrade just in time for a single game. Some people use their PCs for something else and had upgrade cycles recently."


                                It is called budget savvy... it's called only being able to deal with what you've got. Obviously the people who went out and paid top dollar for an X1950XTX or 7950GX2 either have the money to spend or didn't do their research to know that DX10 was just around the corner, so they could settle for something less for now and then upgrade. The 7950GX2 is in all senses a crazy card to own. When it came out it was about the price of two 7900GTXs. It also had issues. The X1900XTX is just as efficient as the X1950XTX, and there's not really a price difference. Still foolish to only look at that. Any person on a budget wouldn't even have spent top dollar for these cards last year, since they weren't in the $200 range of course. Obviously, today someone who buys either of those cards would be foolish, since the price/performance ratio of the 8800GTS 320MB easilyyyyy tops both of those cards and it's even DX10 compatible. How much price difference? Well, you can get an 8800GTS 320MB for $250 including rebates. That's merely $50 more than the price of an X1900/1950 XTX today.

                                "For me, it's a 40-45 minimum. The problem is actually the P4. It is slower than an Athlon XP @ 2.2GHz. But unfortunately you can't scale down CPU settings that much, or at least not like you can with graphics cards."

                                Please tell me you're not referring to a Barton Athlon XP at 2.2GHz, because a P4 3.0GHz is just about on par with that. An A64 at 2.2GHz I'll agree with.

                                "Default settings. A tech demo. Unoptimized engine. With shadow high and shadow map 1024x1024. Whatever, I give up"

                                Come on... I said that it was not the end-all result. Plus Rainbow Six Vegas is just the same, even though it's a terrible console port. Also, if you want to get technical, then Supreme Commander, given that it is optimized, shows 40fps performance even on the 8800 series, since it's so demanding CPU- and GPU-wise. The newer "big" patch is said to really change this; however, we won't really know till it's out, but I greatly doubt it.

                                Comment
