A question for Epic's game engine programmers

    A question for Epic's game engine programmers

    Not that any will answer, but what the hell.

    The Unreal engine is CPU bound. Big time. It was true with UT and is still true with UT2003/2004. Any hardware enthusiast site doing a massive vid card review will show that all cards... regardless of capability or design, will have virtually the same score. It's because the CPU is the limiting factor. You would have to go way back in time... several generations at least, to find a vid card that becomes a bottleneck. As an example, I upgraded from a Ti4400 to a 6800 GT and got zero performance improvement. Yes, I can now run 4X AA and 8X AF and still pull the same framerates, but I certainly didn't get a boost.

    But what is puzzling is that in your "read-me", you guys still act like the vid card makes a difference. You say that one of the biggest things you can do to improve performance is to reduce the resolution? Hello? I would bet that 98% of the people reading these forums wouldn't see a difference in FPS between 640 x 480 and 1280 x 1024. What's up with that?
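
    A rough way to sanity-check this claim yourself: run any repeatable benchmark at two resolutions and compare the averages. If the frame rate barely moves when the pixel count changes a lot, the CPU is the limiter. A minimal sketch, with made-up fps numbers standing in for real measurements:

        # Rule-of-thumb check: CPU-bound vs. GPU-bound.
        # The fps figures below are invented placeholders; plug in numbers
        # from your own benchmark runs (e.g. a UT2004 botmatch demo).

        def bottleneck(fps_low_res, fps_high_res, tolerance=0.10):
            """If fps barely drops when the pixel count goes way up,
            the video card is not the limiting factor -- the CPU is."""
            drop = (fps_low_res - fps_high_res) / fps_low_res
            return "CPU-bound" if drop < tolerance else "GPU-bound (or mixed)"

        print(bottleneck(fps_low_res=72.0, fps_high_res=70.5))  # -> CPU-bound
        print(bottleneck(fps_low_res=72.0, fps_high_res=41.0))  # -> GPU-bound (or mixed)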

    Also, I was guessing in another thread that the reason the Unreal engine is so CPU bound is because you guys decided to do a lot of the geometry processing with the CPU instead of the graphics adapter. I also figured the reason for that was to reduce compatibility problems in the field and to make for easier coding (fewer codepaths due to differences in GPUs). Close? Dead wrong?

    And lastly, I'm positive it's way too late to see any offloading to the GPU in the current engine (so all our hot vid cards are going to waste... LOL), but how about the next generation? Will it be more of the same, or will our vid cards actually matter?

    Just curious. Any light shed would be a shock.

    #2
    You fixed it, sorry about the big text

    Comment


      #3
      LOL. Epic was in my brain but Atari came from my fingers.

      < note to self.... proofread the subject line too >

      Fixed.

      Comment


        #4
        Re: A question for Epic's game engine programmers

        Originally posted by Folk
        Not that any will answer, but what the hell.

        The Unreal engine is CPU bound. Big time. It was true with UT and is still true with UT2003/2004. Any hardware enthusiast site doing a massive vid card review will show that all cards... regardless of capability or design, will have virtually the same score. It's because the CPU is the limiting factor. You would have to go way back in time... several generations at least, to find a vid card that becomes a bottleneck. As an example, I upgraded from a Ti4400 to a 6800 GT and got zero performance improvement. Yes, I can now run 4X AA and 8X AF and still pull the same framerates, but I certainly didn't get a boost.

        But what is puzzling is that in your "read-me", you guys still act like the vid card makes a difference. You say that one of the biggest things you can do to improve performance is to reduce the resolution? Hello? I would bet that 98% of the people reading these forums wouldn't see a difference in FPS between 640 x 480 and 1280 x 1024. What's up with that?

        Also, I was guessing in another thread that the reason the Unreal engine is so CPU bound is because you guys decided to do a lot of the geometry processing with the CPU instead of the graphics adapter. I also figured the reason for that was to reduce compatibility problems in the field and to make for easier coding (fewer codepaths due to differences in GPUs). Close? Dead wrong?

        And lastly, I'm positive it's way too late to see any offloading to the GPU in the current engine (so all our hot vid cards are going to waste... LOL), but how about the next generation? Will it be more of the same, or will our vid cards actually matter?

        Just curious. Any light shed would be a shock.
        Partly incorrect. It all depends on your memory for starters, then your graphics card, then your CPU.

        I have two AMD Opterons, and the first one is dedicated to gaming; I use 5% of its total processing capability during the game and 20% while loading the level.

        Of my 2 GB of memory, about 900 MB is in use.

        I have an ATI Radeon X800 XT Platinum, and I do notice a considerable difference compared to my ATI Radeon 9800 XT. I'm running a resolution of 1600x1200 on the best possible in-game settings. My AA is at 8x and AF at 16x, with Catalyst 4.9 drivers.

        Either you're running a CPU that can't handle the needed clock speed, or there is something seriously wrong with your graphics card, because when the graphics card can't do its job, the fallback is your CPU.

        EDIT: BEFORE posting a topic that makes assumptions, take a look at the Unreal Tournament 2004 scores (benchmarks) in any graphics card review conducted in 2004 that compares the ATI Radeon X800 series and Nvidia GeForce 6800 series against the ATI Radeon 9800 series and the Nvidia GeForce 5950 series.

        You may have serious problems with your computer, because when your graphics card fails to keep up with the game, it relies on your CPU to do the processing the graphics card was supposed to do. In reality, your CPU should only be handling the direction of projectiles, the positions of players, vehicles, nodes, cores, weapon pickups, and objects in general. YOUR GRAPHICS CARD TAKES CARE OF THE REST.

        The difference between 800x600 and 1280x1024 is that 800x600 processes 480,000 pixels on your screen, whereas 1280x1024 processes 1,310,720 pixels; even if your monitor doesn't display that many, that's how many pixels get processed. In the end you get a crisper image and a less noticeable anti-aliasing effect, and there is not a great loss in FPS between the settings.
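
        The pixel arithmetic does check out; here is the same math spelled out, with 640x480 and 1600x1200 added since they come up elsewhere in the thread. The point is that fill-rate work on the video card scales roughly with this pixel count, while the CPU-side work (AI, physics, game logic) does not:

            # Pixel counts for the resolutions mentioned in this thread.
            # Rendering load on the video card scales roughly with these numbers;
            # CPU-side work (AI, physics, game logic) stays the same regardless.
            resolutions = [(640, 480), (800, 600), (1280, 1024), (1600, 1200)]

            for w, h in resolutions:
                print(f"{w}x{h}: {w * h:,} pixels")

            # 1280x1024 pushes ~2.7x the pixels of 800x600:
            print(round((1280 * 1024) / (800 * 600), 2))  # 2.73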

        Comment


          #5
          To be honest, UT2004 is only "CPU bound" when there are bots.... I get a constant 60-85 fps online, but bots take up all the CPU processing, so that's why it drops massively. I think that's what you mean?

          Comment


            #6
            Well, the good AI of UT2004 probably chews up some CPU cycles.

            Comment


              #7
              Re: Re: A question for Epic's game engine programmers

              Originally posted by SonicBlaster200
              BEFORE posting a topic that makes assumptions, take a look at the Unreal Tournament 2004 scores (benchmarks) in any graphics card review conducted in 2004 that compares the ATI Radeon X800 series and Nvidia GeForce 6800 series against the ATI Radeon 9800 series and the Nvidia GeForce 5950 series.
              http://www.extremetech.com/article2/...1648912,00.asp

              All cards tested run even here, but it's likely due to the fact that the Painkiller benchmark, even with demanding resolution/quality settings, is CPU-bound. Painkiller runs much faster on Athlon 64 CPUs, and we suspect the difference has to do with the game engine using X87 floating-point code, a performance area AMD currently dominates. However, when we've tested cards in Athlon 64-based systems, we find that all 3D cards tend to get roughly the same frame rate, indicating that the game is CPU-bound here as well, just at a higher level.
              http://www.planetduke.com/duke4/faq/appendix2.shtml

              The Unreal engine is heavily CPU bound, so our focus on optimization is more on releasing that CPU tension. For example: the Unreal engine uses span buffering for occlusion. This approach has a lot of benefits, but is CPU intensive. We have saved a lot of framerate by reducing the complexity of the span calculations and reducing the number of required calculations. This kind of optimization is unrelated to the render API layer.
              http://www.linuxelectrons.com/articl...40419200456201

              Gee, a lousy score of 32! With Quake III I rack in 155fps. It appears that UT2004 is CPU bound.
              http://graphics.tomshardware.com/gra...e_6600-04.html

              With FSAA and AF disabled, the fastest cards prove to be CPU bound.
              http://www.firingsquad.com/print_art...rticle_id=1478

              We’re CPU-bound with the X800 PRO at 800x600 and 1024x768 on the 3400+, and CPU-bound at all resolutions with the same graphics card and an AMD Athlon XP processor.
              I could go on.....

              Comment


                #8
                Re: Re: A question for Epic's game engine programmers

                Originally posted by SonicBlaster200
                I have an ATI Radeon X800 XT Platinum, and I do notice a considerable difference compared to my ATI Radeon 9800 XT. I'm running a resolution of 1600x1200 on the best possible in-game settings. My AA is at 8x and AF at 16x, with Catalyst 4.9 drivers.

                Either you're running a CPU that can't handle the needed clock speed, or there is something seriously wrong with your graphics card, because when the graphics card can't do its job, the fallback is your CPU.

                EDIT: BEFORE posting a topic that makes assumptions, take a look at the Unreal Tournament 2004 scores (benchmarks) in any graphics card review conducted in 2004 that compares the ATI Radeon X800 series and Nvidia GeForce 6800 series against the ATI Radeon 9800 series and the Nvidia GeForce 5950 series.
                I think you missed the point of Folk's post. There's no question that UT is CPU-bound, and you'll see this point demonstrated perfectly if you consult benchmarks for UT2K4 on an X800 vs. a 9800 Pro; the question is why the game is so CPU-bound and whether it will be able to scale differently in future incarnations. That's what Folk is asking.

                Saying a game is "CPU-bound" doesn't mean your X800 won't run it at 1600x1200 with all the bells 'n' whistles. Basically, it means that your X800 will run it at 1600x1200 with all the bells 'n' whistles at essentially the same fps as your 9800 Pro will run it (without the bells 'n' whistles, since AA and AF are a hit on the graphics card rather than the CPU).
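
                One way to picture it is a toy model where a frame can't finish until both the CPU and the video card are done, so the slower of the two sets the frame rate. The millisecond figures below are invented purely for illustration, not measured from the engine:

                    # Toy model: frame time is set by whichever side is slower.
                    def fps(cpu_ms, gpu_ms):
                        """Assume CPU and GPU work overlap, so a frame takes as
                        long as the slower of the two."""
                        return 1000.0 / max(cpu_ms, gpu_ms)

                    cpu_ms = 18.0                        # game logic, AI, occlusion, etc.
                    print(fps(cpu_ms, gpu_ms=12.0))      # older card:  ~55.6 fps
                    print(fps(cpu_ms, gpu_ms=5.0))       # faster card: ~55.6 fps -> no gain
                    print(fps(cpu_ms=9.0, gpu_ms=5.0))   # faster CPU:  ~111 fps  -> the real win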

                Comment


                  #9
                  Re: Re: Re: A question for Epic's game engine programmers

                  Originally posted by suibhne
                  I think you missed the point of Folk's post.
                  <Folk bows to suibhne>

                  Comment


                    #10
                    Perhaps having an awesome video card won't improve framerates in all maps, but it certainly will in high-poly maps such as ONS-Tricky that choke on anything less than a 9800.

                    Also, while a fast vid card may not improve FPS, a mediocre card like the FX5200 will do a wonderful job of killing framerates, especially in ONS where I can barely pull 30 FPS.

                    Comment


                      #11
                      CPU bound = Unreal's virtual machine.

                      Comment


                        #12
                        Re: A question for Epic's game engine programmers

                        Originally posted by Folk
                        The Unreal engine is CPU bound. Big time.
                        Nope. If all the results are the same, it just shows that performance is limited by something. With today's cards it's indeed often the CPU, but there is nothing to point to the engine being at fault. BTW, the link you posted shows the 6600 GT being 2.5 times faster than the FX5750... which makes perfect sense. What doesn't make sense is to run the tests on a P3-500 and then, logically, see no difference between an R9800 and an X800. If the results are the same even on an Athlon 64 or whatever, it only means the X800 is still too fast for it.

                        EPICPLZFIX your game my CPU is too slow :bulb:

                        Comment


                          #13
                          Gaal = Idiot

                          Comment


                            #14
                            MonsterMadness on a trolling spree again. Pretty low; your friend-in-love Sasuke at least had style.

                            Comment


                              #15
                              Luckily, new CPUs are cheaper than new graphics cards.

                              Comment
