New Tim Sweeney interview, mostly about tech specs


    #91
    Originally posted by OblivionLord View Post
    The X1950XT and 7950GT may play anything out today, but they're only 1 year old. They can hardly be considered outdated compared to something that's 3 years old like the 6800 Ultra. Besides that, both of these cards cost $200-250 on Newegg outside of rebates. That also shows how modern they are compared to the price/performance of the 6800 Ultra, which you'll pretty much only find on eBay, if anywhere...
    I just got a 7950GT for $150 after rebates (built a whole new rig for video editing: $600 w/ A4600 X2 / 2 GB RAM).

    Why?

    Because the 7950 is cheaper than an 8600 and smokes it in performance.

    Maybe it doesn't support DX10, but let me tell ya, I just 86'd Vista and went back to MCE2005, because Vista really needs a service pack or two before it runs as well as MCE2005 (IMHO) for my needs.

    As far as UE3 goes, the LP demo runs at an average of 40 fps at 1280x720 w/ details on high.

    Even without DX10 goodness I imagine I will be very happy with how UT3 will run.

    Comment


      #92
      Originally posted by MrLemur2U View Post
      I just got a 7950GT for $150 after rebates (built a whole new rig for video editing: $600 w/ A4600 X2 / 2 GB RAM).

      Why?

      Because the 7950 is cheaper than an 8600 and smokes it in performance.

      Maybe it doesn't support DX10, but let me tell ya, I just 86'd Vista and went back to MCE2005, because Vista really needs a service pack or two before it runs as well as MCE2005 (IMHO) for my needs.

      As far as UE3 goes, the LP demo runs at an average of 40 fps at 1280x720 w/ details on high.

      Even without DX10 goodness I imagine I will be very happy with how UT3 will run.
      People need to realize that the 8 series is not about better frame rates; below the 8800 cards it's all about pushing HD video, and it kicks *** at that.

      And for the most part, high-end parts still smoke mid-range parts of the next gen. Nvidia was generous with the 6600 and 7600 parts, partly because they had to be.

      Comment


        #93
        Originally posted by roadrash View Post
        People need to realize that the 8 series is not about better frame rates; below the 8800 cards it's all about pushing HD video, and it kicks *** at that.

        And for the most part, high-end parts still smoke mid-range parts of the next gen. Nvidia was generous with the 6600 and 7600 parts, partly because they had to be.
        Which is why the 8600s are only good for HTPC use, not gaming, and not worth the $200 price tag at this time. Even Tom's Hardware admits that.

        Comment


          #94
          Originally posted by roadrash View Post
          Wrong, the game is a pile of ****. Pong would be a better benchmark.

          It's a hashed-together console game and a pile of ****. Again, Pong would be a better benchmark.
          I can agree with that, but I'd really like to see more *technical* details, not subjective arguments. Is the x86 code poorly compiled, not using SSE2? Does the DX code path have more layers? Is the game's load balancing only optimized for a triple-core setup? Does it make heavy use of the Xenos tessellator? What is it?


          You "might", but in other tests the 8800GTS (let alone the GTX) thrashes the X2900XT. The X2900XT also uses more power. The 8800GTS is cheaper (now) to boot.

          So, other than the hope based off a ****ty console port that shouldn't be played at all, let alone used for benchmarks (and even then not all benchmarks report ATi winning)... you should buy a more expensive part that draws more power and in some cases is proven slower.
          Who said again that this is about *ME* or *BUYING*? Do people only see themselves here? Is there no chance to give someone the advice to *just wait*, to try to help other people stay away from buy, buy, buy?

          You'd be a complete idiot to purchase a product based on that. All that said, I have great hopes for the 2900 parts; with better drivers they should really show some force, but currently they don't.
          Don't worry, I'm a complete idiot, just on a few other issues not related to PC hardware.
          Look, for a card that costs 300€, 400€, 500€ or more, it is really wise to wait. So please stop insisting that someone shouldn't buy an X2900 but that it's OK to buy an 8800 now regardless of the price. It isn't OK, it's the same ****.

          Comment


            #95
            Originally posted by MrLemur2U View Post
            Because the 7950 is cheaper than an 8600 and smokes it in performance.
            **** you guys, a 7950GT here costs almost TWICE the price of an 8600GT.

            Comment


              #96
              Which is why the 8600s are only good for HTPC use, not gaming, and not worth the $200 price tag at this time. Even Tom's Hardware admits that.
              I honestly can't help but wonder if the 8600 series is only meant to be the third GPU people purchase. If you look at the big picture, that makes sense. The 680i ships with 3x PCIe x16 slots; the middle one only works at x8 electrical and is meant for an 8-series card to run physics. SLI cannot display over multiple monitors, so you must be using a second GPU to run multiple displays without toggling. And since you already have a beefy 3D system, the fact that the 8600 series kicks *** at HD video yet is slow in 3D is a bonus.

              If you look at it from that standpoint it makes sense.

              I can agree with that, but I'd really like to see more *technical* details, not subjective arguments. Is the x86 code poorly compiled, not using SSE2? Does the DX code path have more layers? Is the game's load balancing only optimized for a triple-core setup? Does it make heavy use of the Xenos tessellator? What is it?
              Who knows; I don't think anybody has really looked at it. It's rather like the debacle the Halo port was: ran great on inferior console parts, but ran like *** on PC no matter what you did.

              At this point I'm not even sure if SLI works in it; I remember it didn't for a while. With all of Epic's claims that SLI shows performance gains, and the fact that they were using SLI Ultras for a while, the fact that R6 Vegas doesn't work with it should right there raise a red flag about using it to guesstimate UT3 performance.

              Comment


                #97
                Originally posted by roadrash View Post
                I honestly can't help but wonder if the 8600 series is only meant to be the third GPU people purchase. If you look at the big picture, that makes sense. The 680i ships with 3x PCIe x16 slots; the middle one only works at x8 electrical and is meant for an 8-series card to run physics. SLI cannot display over multiple monitors, so you must be using a second GPU to run multiple displays without toggling. And since you already have a beefy 3D system, the fact that the 8600 series kicks *** at HD video yet is slow in 3D is a bonus.

                If you look at it from that standpoint it makes sense.



                Who knows; I don't think anybody has really looked at it. It's rather like the debacle the Halo port was: ran great on inferior console parts, but ran like *** on PC no matter what you did.

                At this point I'm not even sure if SLI works in it; I remember it didn't for a while. With all of Epic's claims that SLI shows performance gains, and the fact that they were using SLI Ultras for a while, the fact that R6 Vegas doesn't work with it should right there raise a red flag about using it to guesstimate UT3 performance.
                Please, NEVER compare Ubisoft to Epic. I still regret buying Splinter Cell: DA, a horrible port, and King Kong: Gamer's Edition, which runs horribly no matter what, not to mention Vegas and the way the spots where the 360 would load data bring PCs to a crawl.

                Rainbow Six: Vegas should not be considered an example of how UE3 will run; otherwise Gears of War wouldn't run so well on the 360.

                Comment


                  #98
                  I really doubt people are going to buy a DX10 card just to get a DX10 experience even though the actual performance of the card sucks. IMO one would rather buy an 8800 (save up) or stay with a great DX9 card like the X1950 or 7950 series.

                  Comment


                    #99
                    Originally posted by durtytarget View Post
                    I really doubt people are going to buy a DX10 card just to get a DX10 experience even though the actual performance of the card sucks. IMO one would rather buy an 8800 (save up) or stay with a great DX9 card like the X1950 or 7950 series.
                    I'm getting a DX10 card along with DX10 on Vista just so I can enable AA + 64-bit HDR.

                    Comment


                      Originally posted by durtytarget View Post
                      I really doubt people are going to buy a DX10 card just to get a DX10 experience even though the actual performance of the card sucks. IMO one would rather buy an 8800 (save up) or stay with a great DX9 card like the X1950 or 7950 series.
                      An 8600GT OC'd is surprisingly good for the price if you don't use AA or a very high resolution.

                      Comment


                        Indeed. They're quite a step up from Intel Extreme Graphics.

                        Comment


                          Originally posted by DX-GAME View Post
                          Releases?
                          Can you say DEMO?

                          I'm tired of upgrading. I'm tired of new games coming out that the mainstream can't hope to play. I've got a thousand other choices.

                          If it doesn't run well with an Intel P.D 2.8, GF6800 PCIe, and 2 gigs of RAM, then I'll not be getting it any time soon. The funny thing is, neither will the majority of online FPS players, who have specs lower than mine according to quite a few surveys hosted by top gaming companies.

                          I don't give a rat's butt about Vista, DX10, or 2006 hardware. If that's the target, and the target is mainstream, Epic missed the shot and killed the family cat.

                          Like you've said, I'm not upgrading my PC again; I just went out and bought a new car, so there's no way I can afford another £2,000 upgrade.

                          This bad boy had better run on my GF7800 or I'm gonna be mighty ******.

                          Comment


                            Interview Skills

                            Originally posted by Doc Shock View Post
                            The website of the German magazine "PC Games Hardware" features an article about Unreal Engine 3, containing an interesting interview with Tim Sweeney.
                            Unfortunately there's no English version available, so I had to write a rough translation (I hope you don't mind my mediocre interview skills). It's quite an interesting read, pointing out lots of the technical features UT3 will utilize.
                            Enjoy!

                            --------------------------------------------------------------------------

                            Overview of the most important rendering features:

                            - multi-threaded renderer (4+ threads; see the sketch after this list)
                            - 16-bit-per-component HDR pipeline
                            - runs on 64-bit operating systems
                            - (confirmed) performance advantages through SLI (most likely Crossfire as well)
                            - post-processing effects (some examples): motion blur, depth-of-field blur, bloom
                            - deferred shading
                            - physics: Ageia PhysX engine
                            - 300 - 1,000 visible objects per scene
                            - huge scenes typically consist of 500,000 to 1,500,000 triangles
                            - normal maps and texture maps usually have a resolution of 2,048 x 2,048
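
                            As a rough illustration of what the "multi-threaded renderer (4+ threads)" and "300 - 1,000 visible objects per scene" items could mean in practice, here is a minimal C++ sketch that splits per-object render work across worker threads. This is an invented toy example, not Epic's code; every name in it (VisibleObject, BuildCommands, workerCount) is made up for the illustration.

                            [CODE]
                            // Toy sketch only: hand each worker thread a slice of the
                            // frame's visible objects so draw commands can be built in
                            // parallel, roughly what a "4+ thread" renderer implies.
                            #include <cstdio>
                            #include <cstddef>
                            #include <thread>
                            #include <vector>

                            // One entry per object that survived visibility culling.
                            struct VisibleObject { int id; };

                            // A real renderer would record command buffers here; this
                            // stand-in just prints which objects a worker handled.
                            static void BuildCommands(const std::vector<VisibleObject>& objects,
                                                      std::size_t begin, std::size_t end)
                            {
                                for (std::size_t i = begin; i < end; ++i)
                                    std::printf("recording draw call for object %d\n", objects[i].id);
                            }

                            int main()
                            {
                                // A mid-sized scene by the numbers quoted above.
                                std::vector<VisibleObject> visible(600);
                                for (std::size_t i = 0; i < visible.size(); ++i)
                                    visible[i].id = static_cast<int>(i);

                                const unsigned    workerCount = 4;  // the "4+ threads"
                                const std::size_t slice = visible.size() / workerCount;

                                // Give each worker a contiguous slice; the last worker
                                // also takes any remainder.
                                std::vector<std::thread> workers;
                                for (unsigned w = 0; w < workerCount; ++w)
                                {
                                    const std::size_t begin = w * slice;
                                    const std::size_t end =
                                        (w + 1 == workerCount) ? visible.size() : begin + slice;
                                    workers.emplace_back([&visible, begin, end] {
                                        BuildCommands(visible, begin, end);
                                    });
                                }
                                for (std::thread& t : workers)
                                    t.join();
                                return 0;
                            }
                            [/CODE]

                            (Any C++11 compiler will build it, e.g. g++ -std=c++11 -pthread. A real engine would reuse a pool of job threads rather than spawning new ones each frame; the slice-per-thread split is just the simplest way to show the idea.)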

                            --- Edit: The translation has been edited out. Though I credited PCGH, I didn't ask them for permission; my apologies for that. But my post here prompted them to release the original interview, so my work wasn't totally in vain. So, here's the link to the German article and here's the one to the new original English interview.


                            --------------------------------------------------------------------------

                            That's it, I hope you guys had a good read.

                            Greetings,
                            Doc Shock
                            thank you !!!

                            Comment
