Is the "forced antialiasing" done by the CPU?

  • replied
    The native resolution is 1680x1050, but 1440x900 with 4xAA honestly looks better to me. It would be unacceptable for desktop use, as text is a bit too blurry, but in-game it's fine.

  • replied
    Originally posted by rebel7254 View Post
    Well, if I hide my weapon and play in 1440x900, I can use AA and get my desired framerate the vast majority of the time.
    What's your monitor's native resolution? Remember, there's not much point in antialiasing if you're running an LCD monitor at an awkward, non-native resolution, since the scaler's blur tends to swamp it. (Unless it just happens to fluke out and not look too bad... like my 1680x1050 monitor, which for whatever reason seems to handle 1280x720 fine.)

  • replied
    Well, if I hide my weapon and play in 1440x900, I can use AA and get my desired framerate the vast majority of the time.

  • replied
    Originally posted by kray28 View Post
    That's only supersampling AA; there are other AA algorithms as well, some more efficient and some producing better image quality (multisampling, coverage sampling).
    Technically it's the same for all methods of antialiasing (except Matrox's method of drawing imaginary curves and lines... which is completely and utterly useless when you run any shader program whatsoever)... in all methods you're finding sub-pixels.
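    Just to make the "finding sub-pixels" point concrete, here's a rough little Python sketch (purely my own illustration -- the edge and the 4-sample pattern are made up, it's not anything a real driver does):

        # Every sample-based AA scheme boils down to evaluating coverage (or color)
        # at sub-pixel sample positions and averaging them into one final pixel.

        def inside_edge(x, y):
            # Hypothetical geometry: everything below the line y = 0.3*x + 10
            # counts as covered by the polygon whose edge we're antialiasing.
            return y < 0.3 * x + 10

        def pixel_coverage(px, py, sample_offsets):
            # Test each sub-pixel sample position and average the hits.
            hits = sum(inside_edge(px + dx, py + dy) for dx, dy in sample_offsets)
            return hits / len(sample_offsets)

        # A 4-sample rotated-grid-style pattern (offsets inside the pixel).
        offsets_4x = [(0.125, 0.375), (0.375, 0.875), (0.625, 0.125), (0.875, 0.625)]

        # Pixels straddling the edge come out with fractional coverage instead of a
        # hard 0-or-1 step, which is what smooths the jaggies.
        for px in range(30, 36):
            print(px, pixel_coverage(px, 19.5, offsets_4x))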

  • replied
    Originally posted by _Lynx View Post
    Antialiasing is done solely by the GPU, because in fact it renders the picture at a higher resolution and then scales it down. That, however, requires the GPU to process the data at a higher speed, and the CPU just can't keep up and cannot supply the data at the required speed. That's what "CPU limited" means.
    That's only supersampling AA; there are other AA algorithms as well, some more efficient and some producing better image quality (multisampling, coverage sampling).

  • replied
    The CPU definitely DOES NOT do the anti-aliasing.

  • replied
    Yeah I can't o/c my core any more than that for sure. It's not even stable at those speeds, but it will play for 5-10 minutes.

  • replied
    Originally posted by Entil'Zha View Post
    And when you say...
    You don't typically notice the loss of quality, of course, because it only shows up where two objects of very different intensity get antialiased together.
    ...it actually means the nVidia hack is good enough for a game like UT3 since you won't be spending time looking for the imperfections.

    And hence players on WinXP DX9 with nVidia can use this hack with UT3 and enjoy at least some anti-aliasing.

    There's not that much of a downside to it, as I understand.
    Oh definitely. If you have spare frames to kill -- do it.

    However:

    1) Epic didn't program it, so if it's glitchy then it may just be the hack causing instability. (I had issues with Bioshock and UT3 crashing as a result of trying this hack. If you don't, bonus... but if problems appear at some point... that would be the first thing to check instead of assuming it's an actual game bug.)

    2) If you have Vista and a DX10 card... use the REAL antialiasing provided through the game itself.

    But yeah, pretty much everything you'll see will be at roughly a constant intensity except for things like cave entrances... the quality loss is almost entirely hidden except in specific situations... but it's not eliminated entirely and never will be until you run in DX10 mode.

    Originally posted by rebel7254 View Post
    CPU limitation can't be it, unless AA is stressing the CPU and not the GPU.

    If I turn AA off, overclocking gains me about 6 or 7 more fps at 620/1475/1000. But when I turn AA on, the framerate drops to the same number (around 50 fps in the middle of ShangriLa), regardless of clockspeeds.

    It doesn't make sense that overclocking increases my framerate except when I enable AA. If AA is stressing the GPU, then overclocking should make it faster.
    Depends: you did some SERIOUS overclocking to your memory, but relatively small overclocks to your GPU core and shaders. It's possible that antialiasing bottlenecks your core and shaders (which won't see a huge improvement from memory overclocking)... but without antialiasing you're bottlenecked on video memory bandwidth (which will make the 100 MHz memory overclock more visible).

    Now I'm not saying you should push your GTX core and shaders further... that would be insanity... just saying you may not have overclocked the thing that actually needs overclocking.
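    Quick back-of-the-envelope numbers on those clocks (just the relative bumps from the figures you posted, nothing more):

        # Stock 600/1400/900 vs overclocked 620/1475/1000 on the 8800 GTX.
        stock = {"core": 600, "shader": 1400, "memory": 900}
        oc    = {"core": 620, "shader": 1475, "memory": 1000}

        for part in stock:
            gain = 100.0 * (oc[part] - stock[part]) / stock[part]
            print(f"{part}: +{gain:.1f}%")

        # core: +3.3%, shader: +5.4%, memory: +11.1% -- the memory got two to three
        # times the relative bump the core and shaders did, which is the whole point.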

  • replied
    Antialiasing is done solely by the GPU, because in fact it renders the picture at a higher resolution and then scales it down. That, however, requires the GPU to process the data at a higher speed, and the CPU just can't keep up and cannot supply the data at the required speed. That's what "CPU limited" means.
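    In rough code terms, the render-big-then-scale-down step looks something like this (a minimal numpy sketch of a 2x2 box-filter resolve -- just an illustration, not how UE3 or the drivers actually implement it):

        import numpy as np

        def supersample_resolve(hi_res_frame, factor=2):
            # hi_res_frame: (H*factor, W*factor, 3) float image rendered at the
            # higher internal resolution. Average each factor x factor block down
            # to one output pixel (a simple box filter).
            h, w, c = hi_res_frame.shape
            return hi_res_frame.reshape(h // factor, factor,
                                        w // factor, factor, c).mean(axis=(1, 3))

        # 2x2 supersampling means the GPU shades and fills four times the pixels,
        # which is where the steep framerate cost comes from.
        frame_2x = np.random.rand(1050 * 2, 1680 * 2, 3).astype(np.float32)
        final = supersample_resolve(frame_2x, factor=2)
        print(final.shape)  # (1050, 1680, 3)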

  • replied
    CPU limitation can't be it, unless AA is stressing the CPU and not the GPU.

    If I turn AA off, overclocking gains me about 6 or 7 more fps at 620/1475/1000. But when I turn AA on, the framerate drops to the same number (around 50 fps in the middle of ShangriLa), regardless of clockspeeds.

    It doesn't make sense that overclocking increases my framerate except when I enable AA. If AA is stressing the GPU, then overclocking should make it faster.

  • replied
    Okay, you seem to know a thing or two about the subject.

    And when you say...
    You don't typically notice the loss of quality, of course, because it only shows up where two objects of very different intensity get antialiased together.
    ...it actually means the nVidia hack is good enough for a game like UT3 since you won't be spending time looking for the imperfections.

    And hence players on WinXP DX9 with nVidia can use this hack with UT3 and enjoy at least some anti-aliasing.

    There's not that much of a downside to it, as I understand.

  • replied
    1) Overclocks don't actually yield that much performance increase.

    2) You may have been CPU limited.

    3) You might have any one of like -- 4 frame limiters on (drivers, vsync, game, FRAPs), making your framerate look a LOT lower than it really is... since it's essentially wasting time.

    4) Forcing antialiasing in the nVidia Control Panel does NOT make the game antialias. It tells the *drivers* to antialias. The drivers should know to offload the work to the video card... but again:

    http://img.photobucket.com/albums/v3...bioshockaa.jpg

    This antialiasing is not perfect, as you can see from Bioshock running a similar driver hack.

    I'm not saying turn it off... I'm just saying it's not an Epic-coded feature... it's a hack nVidia built into their drivers that basically antialiases the framebuffer at the wrong time. You don't typically notice the loss of quality, of course, because it only shows up where two objects of very different intensity get antialiased together.
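    If you want to see why the damage concentrates at high-contrast edges, here's a toy Python example (the sample values and the gamma curve are made up by me purely to illustrate the resolve-order idea -- it's not a claim about what the driver actually does internally):

        def to_display(linear):
            # Stand-in display/gamma transform; assumed for illustration only.
            return linear ** (1 / 2.2)

        def resolve(samples):
            return sum(samples) / len(samples)

        for a, b in [(0.45, 0.55), (0.02, 0.98)]:   # low- vs high-contrast edge
            early = to_display(resolve([a, a, b, b]))              # averaged too early
            late = resolve([to_display(s) for s in [a, a, b, b]])  # averaged at the right point
            print(f"edge {a}/{b}: early={early:.3f}  late={late:.3f}")

        # The two orders agree almost exactly on the low-contrast edge but diverge
        # noticeably on the high-contrast one, which is where you'd spot the hack.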

    So... in closing...

    Overclocking -- especially by that small an amount -- probably won't show above the typical framerate noise you get from the game as is. Frankly, it's pretty pointless in general. (Mind you, the E6400 @ 3GHz does have some noticeable benefits... provided you cool it well enough that you don't start noticing the disadvantages.)

    Probably #1 or #4 is your problem... either your driver hack just has some glitches from being just that -- a hack and nothing more... or your GPU overclock just isn't big enough to push an extra frame out. And frankly... most GPUs come clocked close enough to their limit from the factory that adding more is practically suicide.

  • replied
    Well, I've never heard of software AA, but AA does cause an FPS drop, though not so much anymore on these high-end cards. Either way, it'll be interesting to see how this pans out. Good question.

  • started a topic Is the "forced antialiasing" done by the CPU?

    Is the "forced antialiasing" done by the CPU?

    This pertains to UT3 running on Windows XP with DirectX 9.

    I'm beginning to wonder whether forcing AA in the Nvidia Forceware control panel does nothing more than force the CPU to do the anti-aliasing, i.e. "software antialiasing". The reason I'm thinking this is that I get the exact same framerate drop whether I'm running my 8800GTX at 600/1400/900 or at 620/1475/1000. If enabling AA was really stressing my GPU, then wouldn't overclocking it increase the framerate? This holds true at all resolutions, by the way.

    Other specs:

    Intel Core 2 Duo E6400 @ ~3GHz
    2GB DDR2 RAM