Results 1 to 34 of 34
  1. #1
    MSgt. Shooter Person
    Join Date
    Sep 2006
    Location
    In a smooth antialiased world.
    Posts
    233
    Gamer IDs

    Gamertag: ConsolesSUCK

    Default Antialiasing/Anisotropic Filtering

    I know that there is a way to make UT3 use antialiasing, by forcing it through the control panel after renaming the game BIOSHOCK.exe, but I wonder if the developers are going to put an end to that for the retail version. After all, I wouldn't want people renaming my game to something else. I hope they actually fix the issue. Obviously the game can use antialiasing, so I am hopeful it will be implemented.

    Developers these days don't care about how a game looks. Look at that horrible bloom lighting being overused. Antialiasing and Anisotropic Filtering are BASIC features, and should be available for EVERY game that is made, including this game, but they are absent without the hack.

    Can anyone at Epic confirm that the AA/AF issue will be resolved and working for all versions of the game when the game goes to retail?
    Last edited by gutter; 10-18-2007 at 03:06 PM.
    Best movie ever: Hot Rod (SEE IT)
    Antec Nine Hundred Case - Thermaltake ToughPower 850 - Asus Crosshair Motherboard - Athlon X2 5200+ @2.8Ghz - Corsair 2GB XMS2 4/4/4/12 - eVGA 8800Ultra - (2) 150GB WD Raptors Raid 0 - Soundblaster Audigy 4 7.1 - Windows XP Sp2 - DVD-RW (x2) - Vigor Gaming Monsoon II TEC HS/F

  2. #2
    MSgt. Shooter Person
    Join Date
    Oct 2007
    Posts
    63

    Default

    The Bioshock trick worked for me, but it had such a negative impact on my performance that I decided to keep it off.

  3. #3
    MSgt. Shooter Person
    Join Date
    Sep 2006
    Location
    In a smooth antialiased world.
    Posts
    233
    Gamer IDs

    Gamertag: ConsolesSUCK

    Default

    To me, it looks like crap without AA/AF. I would uninstall the game if I couldn't use AA/AF.

  4. #4
    MSgt. Shooter Person
    Join Date
    Oct 2007
    Posts
    336

    Default

    It uses 16x AF; it's anti-aliasing that's the issue. DX9 users are being left in the dark because of the DX10 toys.
    Intel Core 2 Duo E6750-2.66Ghz
    EVGA nForce 680i SLI
    4GB Kingston RAM
    NVIDIA 9600GT
    Windows 7 RC1(7100)

  5. #5
    MSgt. Shooter Person
    Join Date
    Sep 2006
    Location
    In a smooth antialiased world.
    Posts
    233
    Gamer IDs

    Gamertag: ConsolesSUCK

    Default

    Quote Originally Posted by WrATH View Post
    It uses 16x AF; it's anti-aliasing that's the issue. DX9 users are being left in the dark because of the DX10 toys.
    Here's the thing:

    This game engine was not supposed to be able to use antialiasing in Dx9, but it can! Using that "Bioshock.exe" trick, we can enable any amount of AA that we like.

    The problem is, that trick may not work in the retail version if Epic wants to be jerks about it.

  6. #6
    MSgt. Shooter Person
    Join Date
    Oct 2007
    Posts
    336

    Default

    Quote Originally Posted by gutter View Post
    Here's the thing:

    This game engine was not supposed to be able to use antialiasing in Dx9, but it can! Using that "Bioshock.exe" trick, we can enable any amount of AA that we like.

    The problem is, that trick may not work in the retail version if Epic wants to be jerks about it.
    Maybe for UT3 but COD4 works with AA fine, they have the same engine.

  7. #7
    Redeemer
    Join Date
    Sep 2007
    Location
    Finland
    Posts
    1,290

    Default

    Quote Originally Posted by WrATH View Post
    Maybe for UT3 but COD4 works with AA fine, they have the same engine.
    No they don't.

  8. #8
    MSgt. Shooter Person
    Join Date
    Oct 2007
    Posts
    336

    Default

    Quote Originally Posted by Xendance View Post
    No they don't.
    My mistake, I thought COD4 used UE3.

  9. #9
    Banned
    Join Date
    Jan 2007
    Posts
    278

    Default

    I'm curious about something. UE3 was out before DX10. So why wouldn't it support AA?

  10. #10
    MSgt. Shooter Person
    Join Date
    Sep 2006
    Location
    In a smooth antialiased world.
    Posts
    233
    Gamer IDs

    Gamertag: ConsolesSUCK

    Default

    Quote Originally Posted by Flybye View Post
    I'm curious about something. UE3 was out before DX10. So why wouldn't it support AA?
    Development is going backwards. Look at Bloom Lighting...it is UGLY. Look at GRAW 2. Ugliest game ever made besides R6Vegas.

    In my opinion, HDR sucks too. I hate what it does to the games. Look at Test Drive Unlimited...that game had potential, and even though you could disable HDR, the visuals were HORRIBLE with HDR on. Granted, Source handles HDR better, albeit without AA. (which is nvidia's fault, not the developer)

    Antialiasing, anisotropic filtering, vertical sync, etc...should all be OPTIONS for the user to enable, and they should be available in EVERY game, whether they can be forced through video drivers or in-game menus.

    PS: See how EPIC devs are ignoring this thread? They are going to screw us out of AA/AF in the retail release.

  11. #11
    MSgt. Shooter Person
    Join Date
    Sep 2007
    Posts
    200

    Default

    are you guys forgetting this is a "beta"? i am sure that they left certain things and options out. a beta is not there to give you a feel for what the retail game is going to be like; that is a demo's job

  12. #12
    MSgt. Shooter Person
    Join Date
    Oct 2007
    Posts
    69

    Default

    Ok, first point:
    PS: See how EPIC dev's are ignoring this thread? They are going to screw us out AA/AF in the retail release.
    How many other threads have they replied to exactly? This is not all about you. If we want them to implement all the changes we're crying out for in a reasonable amount of time, they've got a ****load of work to do, so they might not have huge amounts of time to reply to, frankly, ill-informed rants.

    In your opinion, HDR sucks. Fine, but that doesn't mean everyone else thinks so. Nor does it mean that graphical development is somehow going 'backwards'. There are ways of turning these effects off you know. Just because you don't like a new feature doesn't mean that 'developers don't care about how games look anymore', it just means you disagree with their opinions.

    I said this was ill-informed; this is why - AA will definitely be supported under Vista since DX10 has the capability to run both HDR and AA simultaneously. On the other hand, it is likely that the AA will be external (i.e. from the graphics driver setup dialog) rather than UT native since UT has never had native AA support in the past. I'm just speculating though, they may well have native support. Added to that, someone's already posted to correct you about anisotropic filtering, but you weren't really listening. So stop having a go at epic for 'ignoring' you or not caring what you think. Give them a chance, for ****s sake.

    Flybye, none of this is aimed at you, and in answer to your question, as far as I know DX9 does not support AA and HDR simultaneously (and I may be wrong on that if gutter is correct about the bioshock trick working on DX9). DX10 does, so it's actually more likely that a game made AFTER DX10 came out would support AA.

    I'd like to point out that the reason I'm making this reply is not because of what you are saying, gutter, but how you are saying it. If you were asking about these things reasonably rather than assuming you know what's going on and having a rant, I'd be happy to discuss it with you.
    Last edited by Six Ways; 10-18-2007 at 05:33 PM.

  13. #13

    Default

    Quote Originally Posted by gutter View Post
    Development is going backwards. Look at Bloom Lighting...it is UGLY. Look at GRAW 2. Ugliest game ever made besides R6Vegas.

    In my opinion, HDR sucks too. I hate what it does to the games. Look at Test Drive Unlimited...that game had potential, and even though you could disable HDR, the visuals were HORRIBLE with HDR on. Granted, Source handles HDR better, albeit without AA. (which is nvidia's fault, not the developer)

    Antialiasing, anisotropic filtering, vertical sync, etc...should all be OPTIONS for the user to enable, and they should be available in EVERY game, whether they can be forced through video drivers or in-game menus.

    PS: See how EPIC devs are ignoring this thread? They are going to screw us out of AA/AF in the retail release.
    Please educate yourself on what HDR and Antialiasing actually are and how they are done before giving developers commands.

    HDR Lighting is not something you should see -- you should only see the side effects of it. Its practical purpose is to provide a full range of light intensities so an engine can accurately portray light and color in any environment.

    People equate HDR lighting to Bloom. This is absolutely not true at all. Bloom is the side-effect of HDR lighting.

    With HDR lighting -- it's next to impossible to accurately portray proper anti-aliasing in DirectX9 mode.

    http://img.photobucket.com/albums/v3...bioshockaa.jpg

    Check out the image.

    Even with the Bioshock 'fix' -- sure, some parts of the image are blended properly (blue arrows) -- some are not antialiased properly (red arrows).

    The only way to get true Antialiasing + HDR is to be in Vista under DX10 mode.

    HDR is a developer toy -- if you can notice its effects, either the developer is attempting to flaunt their technology by amplifying the effects -- or it's part of the art style they defined for their game. Its *actual* purpose shows when -- for example -- the player needs to constantly move between indoor and outdoor settings, or when the engine is designed to be used for multiple art styles.
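    The "full range of light intensities" point can be shown with a toy Python sketch (illustrative only, not engine code; the Reinhard-style operator here is just one common choice of tone mapper):

    ```python
    # Toy sketch of HDR lighting: the scene is lit in unbounded "real"
    # intensities, and only the final display step crops them to 8 bits.
    # (Reinhard-style operator chosen for illustration -- not UE3's code.)

    def reinhard(luminance):
        """Map an unbounded HDR luminance into [0, 1)."""
        return luminance / (1.0 + luminance)

    def to_display(luminance):
        """Tone-map, then quantize to an 8-bit display value."""
        return round(reinhard(luminance) * 255)

    # A shadowed wall, a sheet of paper, and the sun differ by orders of
    # magnitude in HDR, yet all fit on screen after tone mapping.
    hdr_scene = {"shadow": 0.05, "paper": 1.0, "sun": 5000.0}
    ldr_scene = {name: to_display(l) for name, l in hdr_scene.items()}
    ```

    The side effects you do see (bloom, adaptation) are computed from the unbounded values before this final crop.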
    Last edited by Phopojijo; 10-18-2007 at 05:42 PM.
    Core i7 920, Foxconn Renaissance, 6GB OCZ DDR3, GeForce GTX460 + GTX 260
    X-Fi Platinum, Bose QC15 Headphones, Windows 7 x86-64
    Rosewill RK-9000 (PS/2 Adapt. NKRO), Adesso Cybertablet 12000, LG Wide 20" LCD + Samsung 23" 1080p LED + LG Wide 22" Glossy LCD

  14. #14
    Banned
    Join Date
    Jan 2007
    Posts
    278

    Default

    Even if they do, there is a fan base large enough to do something about AA. And you're right. I remember playing GRAW2 for a little bit, and just remembered that it had the same weird lighting effect as UT3.

    I was about to say why we don't see this in BioShock, but it's an all-indoor game.

    I really don't like this weird "foggy" type looking effect. Whatever it's called. I had to play with contrast, gamma, and brightness a bit to get it to look acceptable, but on suspense, you really can't get away from it.
    Last edited by Flybye; 10-18-2007 at 07:16 PM.

  15. #15

    Default

    Quote Originally Posted by gutter View Post
    I know that there is a way to make UT3 use antialiasing, by forcing it through the control panel after renaming the game BIOSHOCK.exe, but I wonder if the developers are going to put an end to that for the retail version. After all, I wouldn't want people renaming my game to something else. I hope they actually fix the issue. Obviously the game can use antialiasing, so I am hopeful it will be implemented.

    Developers these days don't care about how a game looks. Look at that horrible bloom lighting being overused. Antialiasing and Anisotropic Filtering are BASIC features, and should be available for EVERY game that is made, including this game, but they are absent without the hack.

    Can anyone at Epic confirm that the AA/AF issue will be resolved and working for all versions of the game when the game goes to retail?
    Only a small percentage of users could use AA/AF with this game, because it would require an 8800 Ultra.
    Athlon X2 6000+ | 4GB DDR2 667Mhz | MSI GeForce 9800GTX 512MB | ASRock ALive NF6G-VSTA | WinVista SP1 Business 64bit

  16. #16
    MSgt. Shooter Person
    Join Date
    Dec 2006
    Location
    Buffalo, NY
    Posts
    197
    Gamer IDs

    Gamertag: Cyclic AMP

    Default

    Gutter, you posted at 5:12 PM. Generally, most white-collar employees such as game developers leave the workplace at 5 PM in America.

    Keep in mind that their perceived lack of attention on the forums could also be a result of them being too busy with both Gears of War PC and UT3 to spend any time on the forums.

  17. #17
    MSgt. Shooter Person
    Join Date
    Sep 2006
    Location
    West Coast, Southern Sunny California
    Posts
    36

    Exclamation Try your Video Control Panel!

    Quote Originally Posted by Flybye View Post
    I really don't like this weird "foggy" type looking effect. Whatever it's called. I had to play with contrast, gamma, and brightness a bit to get it to look acceptable, but on suspense, you really can't get away from it.
    I would think that your video card control panels would be able to fix the AA/AF, if you set them to be system or non/app controlled - heck, I can run mine in sepia if I wanted to - for that antique look in a demorec!
    Most new cards have a plethora of settings for you to choose, from game profile settings to some of the new HDR stuff. If the settings you need aren't there, you may need to update your drivers or email your card company and ask if they're planning to release an update that will address this issue!
    PS. You should probably show some respect for the people that worked their butts off trying to satisfy you 'the consumer' and every other avid UT gamer! Granted, it needs a few updates - you waited this long to get it, so a little longer to get it your way isn't gonna hurt! :P
    Last edited by LiKeMiKeS; 10-18-2007 at 08:17 PM.

  18. #18
    MSgt. Shooter Person
    Join Date
    Sep 2006
    Location
    In a smooth antialiased world.
    Posts
    233
    Gamer IDs

    Gamertag: ConsolesSUCK

    Default

    Quote Originally Posted by Phopojijo View Post
    Please educate yourself on what HDR and Antialiasing actually are and how they are done before giving developers commands.

    HDR Lighting is not something you should see -- you should only see the side effects of it. Its practical purpose is to provide a full range of light intensities so an engine can accurately portray light and color in any environment.

    People equate HDR lighting to Bloom. This is absolutely not true at all. Bloom is the side-effect of HDR lighting.

    With HDR lighting -- it's next to impossible to accurately portray proper anti-aliasing in DirectX9 mode.

    http://img.photobucket.com/albums/v3...bioshockaa.jpg

    Check out the image.

    Even with the Bioshock 'fix' -- sure, some parts of the image are blended properly (blue arrows) -- some are not antialiased properly (red arrows).

    The only way to get true Antialiasing + HDR is to be in Vista under DX10 mode.

    HDR is a developer toy -- if you can notice its effects, either the developer is attempting to flaunt their technology by amplifying the effects -- or it's part of the art style they defined for their game. Its *actual* purpose shows when -- for example -- the player needs to constantly move between indoor and outdoor settings, or when the engine is designed to be used for multiple art styles.
    So essentially you start off being an a$$, and then proceed to state NOTHING contradictory. Thank you for the definitions, even if they were not the focus of my statements.

    You acted like I was making judgements without prior knowledge, but really you were just chiming in to sound like you were smart. Never once did I say what HDR was, nor did I say what bloom was, except for saying they are ugly effects in new games. Yes, I can see the diff between HDR and no HDR and I think it ruins the game visuals. I never said that HDR and bloom were the same thing.

    You understand what I'm saying? You had nothing constructive to say besides providing the "meaning" or "description" of the effects I was talking about, and yet somehow found it important to start the post by insulting me.

    I consider this to be far more "trollish" than any other "troll-ified" post by any troll who has ever trolled a forum.

    EDIT: HERE IS THE PM THIS GUY SENT ME!! LOL!!! He is desperate to pretend he is smart or a techie.

    There are way too many misconceptions about antialiasing... they're actually more prominent than the truth itself by multiples -- heck, even orders of magnitude.

    I don't blame you for not knowing the difference, since the lie-perpetuators are very believable (since they believe it themselves).


    Quote Originally Posted by LiKeMiKeS View Post
    I would think that your video card contol panels would be able to fix the AA/AF, if you set them to be system or non/app controlled - heck, I can run mine in sepia if I wanted to - for that antique look in a demorec!
    Most new cards have a plethora of settings for you to choose, from game profile settings to some of the new HDR stuff. If the settings you need aren't there, you may need to update your drivers or email your card company and ask if they're planning to release an update that will address this issue!
    If a game is using HDR or bloom lighting in Windows XP (DirectX 9), you use an NVIDIA video card, and the game in question does not support AA, then you cannot force AA through the drivers (nTune). I am not too proud to admit that this MIGHT be incorrect, but this is what I've learned from playing Ghost Recon Advanced Warfighter 2.
    Last edited by gutter; 10-18-2007 at 08:23 PM.

  19. #19
    MSgt. Shooter Person
    Join Date
    Sep 2006
    Location
    West Coast, Southern Sunny California
    Posts
    36

    Default

    You understood my meaning perfectly, but since I'm not even using a card the game was designed for - that might have something to do with it.
    I'm using an ATI x800PRO 256MB and have a choice between OpenGL & DX9/10 drivers, I use DX9 for editing & DX10/OpenGL for play/testing - granted the OGL has limited settings & takes up more memory, but the speed & visual quality are awesome.
    As far as the game not supporting AA/AF, it wouldn't matter if you instead use your system video drivers to do the same thing. But if the (demo) texture resolutions are low, no amount of AA/AF is going to change the way the 'foggy' effect, as you call it, looks ingame! Unless of course you have the ability to patch/update the textures with higher-res versions yourself!

  20. #20
    MSgt. Shooter Person
    Join Date
    Oct 2007
    Posts
    87

    Default

    I agree, UE3 post processing effects really do look awful. Most UE3 games really destroy my eyeballs. UT3 is the best so far but it still makes me cringe.
    Bearded ruminant mammal with hollow horns and coarse hair belonging to the genus Capra.

  21. #21
    MSgt. Shooter Person
    Join Date
    Sep 2006
    Location
    In a smooth antialiased world.
    Posts
    233
    Gamer IDs

    Gamertag: ConsolesSUCK

    Default

    Quote Originally Posted by NotAgOat View Post
    I agree, UE3 post processing effects really do look awful. Most UE3 games really destroy my eyeballs. UT3 is the best so far but it still makes me cringe.
    Yes, they are ugly. I wonder which of the developers used it and said "wow, that looks great"...LOL. It's horrible.

    Back on topic.

    Can any dev from Epic confirm that the "bioshock.exe" fix for antialiasing will work with the retail version?

  22. #22

    Default

    Quote Originally Posted by gutter View Post
    Yes, they are ugly. I wonder which of the developers used it and said "wow, that looks great"...LOL. It's horrible.

    Back on topic.

    Can any dev from Epic confirm that the "bioshock.exe" fix for antialiasing will work with the retail version?
    My prior post that you quoted here claims that the Antialiasing "fix" for Bioshock.exe isn't actually a fix. (Again, the screenshot proving this is here)

    Using the bioshock.exe hack will not give you true antialiasing in your game.

    You will still have jagged edges in places where you'd need DX10's level of control to antialias.

    So to summarize:

    The option to turn off HDR in UnrealEngine3 is not present since UnrealEngine3 can only render in HDR.

    The option to turn on Antialiasing in UnrealEngine3 in DX9 mode is not present because of fundamental flaws in DX9.

    There are just some things that some game designers cannot do. DirectX9 API just does not have the functions to allow it. It just *will not work*.
    Last edited by Phopojijo; 10-19-2007 at 01:07 AM.

  23. #23
    MSgt. Shooter Person
    Join Date
    Sep 2006
    Location
    In a smooth antialiased world.
    Posts
    233
    Gamer IDs

    Gamertag: ConsolesSUCK

    Default

    Quote Originally Posted by Phopojijo View Post
    My prior post that you quoted here claims that the Antialiasing "fix" for Bioshock.exe isn't actually a fix. (Again, the screenshot proving this is here)

    Using the bioshock.exe hack will not give you true antialiasing in your game.

    You will still have jagged edges in places where you'd need DX10's level of control to antialias.

    So to summarize:

    The option to turn off HDR in UnrealEngine3 is not present since UnrealEngine3 can only render in HDR.

    The option to turn on Antialiasing in UnrealEngine3 in DX9 mode is not present because of fundamental flaws in DX9.

    There are just some things that some game designers cannot do.
    Not "true" antialiasing???

    That's funny, that "Bioshock.exe" trick and forcing 8XAA in Ntune seems to have worked for me. Actually, I don't care if it's true AA, false AA, blah AA, it's taken all the jaggies out of my game.

    I play at 1440x900, with everything maxed out. I am forcing 8xAA (I don't need to force AF since it engages automatically if you play with the game at max settings).

  24. #24

    Default

    Quote Originally Posted by gutter View Post
    Not "true" antialiasing???

    That's funny, that "Bioshock.exe" trick and forcing 8XAA in Ntune seems to have worked for me. Actually, I don't care if it's true AA, false AA, blah AA, it's taken all the jaggies out of my game.

    I play at 1440x900, with everything maxed out. I am forcing 8xAA (I don't need to force AF since it engages automatically if you play with the game at max settings).
    http://img.photobucket.com/albums/v3...bioshockaa.jpg

    It takes most out, but look above: not all.

    The problem is that you're attempting to antialias 0-255 intensity color that was derived from what may be 30-orders-of-magnitude different intensity. Once you throw away the extra data and work in "cropped" color space... you'll get things turning wrong shades... creating the jaggies again.

    In DirectX 10 with DirectX10-based cards... you have access to the original HDR data to antialias. Thus: It works in DX10 mode... and Antialiasing will be available in DX10.

    So: If you get Vista -- UT3 will antialias no problem.

    But yes -- the Bioshock trick is better than nothing... and definitely is good enough for most people. But the hack is just that -- a hack -- so the developers cannot implement it.
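    The "cropped color space" problem above fits in a few lines of toy Python (illustrative numbers only, not engine code; the Reinhard-style operator is an assumption):

    ```python
    # Toy numbers showing why AA applied after tone mapping gives the
    # wrong blend on a high-contrast edge.

    def tonemap(luminance):
        # Reinhard-style crop of unbounded HDR into [0, 1) -- chosen for
        # illustration, not the engine's actual operator.
        return luminance / (1.0 + luminance)

    sky, wall = 50.0, 0.1   # two HDR samples covering one edge pixel

    # Correct: average the HDR samples first (what DX10 exposes), then map.
    correct = tonemap((sky + wall) / 2)

    # Forced-driver hack: each sample is already cropped to display range,
    # so all you can do is average the cropped values.
    hacked = (tonemap(sky) + tonemap(wall)) / 2

    # The two disagree badly -- the blended edge comes out the wrong shade,
    # which is why some edges still look jagged with the bioshock.exe trick.
    ```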
    Last edited by Phopojijo; 10-19-2007 at 01:13 AM.

  25. #25
    MSgt. Shooter Person
    Join Date
    Oct 2007
    Posts
    48

    Default

    In my opinion, HDR sucks too. I hate what it does to the games. Look at Test Drive Unlimited...that game had potential, and even though you could disable HDR, the visuals were HORRIBLE with HDR on. Granted, Source handles HDR better, albeit without AA. (which is nvidia's fault, not the developer)
    Actually, Source can handle AA + HDR concurrently even on PS2.0 cards. Valve's implementation of HDR is somewhat different to most out there, I believe. Are you sure your nVidia card doesn't support it? (old ATI card here and it works).
    UWindows FTW!
    Adult content please!
    Motion blur would be awesome, too =)

  26. #26
    Banned
    Join Date
    May 2007
    Location
    Canada / Québec
    Posts
    300

    Default

    Stop complaining guys.

    1- They confirmed that AA will be implemented in the UE3 engine soon.

    2- AF is currently working without any problem and doesn't give any performance loss.

    3- Most people struggle to get 30-40fps so why would they want to activate 2-4x AA...

  27. #27
    Redeemer
    Join Date
    Oct 2002
    Location
    U.K
    Posts
    1,050
    Gamer IDs

    Gamertag: Sharpfish

    Default

    Yes HL2 EP2 with 16XAA + HDR + *SUBTLE* motion blur looks frigging awesome on my 8800GTX in widescreen (note Valve know how to do widescreen correctly too!).

    Honestly, the engine is older (but upgraded) but I saw stuff in EP2 that looked phenomenal and it has mostly to do with texture choice, level design and especially the lighting, it was almost real.. subtle but just right. Solid as f**K too, none of this 'crap control' and 'flaky performance' stuff like in Bioverhypedshock


    re the issue - as I've said elsewhere, on an 8800GTX I can play the UT3 beta with 4xAA on (forced in the drivers via the 'bioshock.exe' renaming) and full details comfortably.
    Last edited by Sharpfish; 10-19-2007 at 01:53 AM.
    >> MY MAPS << UT3: None! UT2k4: DM-MindGames2 UT2k3: FestiveWorlds, Mindgames, SharpfishPark, Starfall, Tropica][, FestiveRidge UT99: DM-Resistor, DM-Capacitor, CTF-Afinity, DM-EvilTavern, DM-8ballHero Unreal 98: A few...

    Download Games Spacehotel

  28. #28
    MSgt. Shooter Person
    Join Date
    May 2006
    Location
    Germany, NiedersaXen
    Posts
    252

    Default

    the lack of "AA" has nothing to do with "HDRR" (High Dynamic Range Rendering)!
    DX9 can show AA and HDRR together fine; only the GeForce 7xxx generation gfx cards cannot show AA and HDRR at the same time.

    the lack of AA in UE3 under DX9 is because UE3 uses a technique called "deferred shading", which is like a subset of what a full "deferred renderer" uses.

    the X-Ray engine that Stalker uses is a "deferred renderer", which means Stalker + DX9 = no AA; same goes for UE3.

    ok, there are tricks to activate AA under DX9, but sometimes there are some flaws.
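    The deferred-shading conflict can be sketched numerically in toy Python (Lambert lighting and a one-attribute G-buffer are simplifying assumptions here, not UE3 internals):

    ```python
    import math

    # Toy sketch of the deferred-shading vs. MSAA conflict: lighting is
    # computed from per-pixel G-buffer attributes (here, just a normal),
    # so averaging the G-buffer across an edge shades a surface that
    # doesn't actually exist in the scene.

    def shade(normal, light=(0.0, 0.0, 1.0)):
        """Lambert term, with the normal renormalized as a shader would."""
        length = math.sqrt(sum(c * c for c in normal))
        return max(0.0, sum((c / length) * l for c, l in zip(normal, light)))

    wall_n = (0.0, 0.0, 1.0)    # faces the light -> bright
    floor_n = (0.0, 1.0, 0.0)   # perpendicular to it -> dark

    # Forward-style MSAA: shade each covered surface, average the colors.
    correct = (shade(wall_n) + shade(floor_n)) / 2

    # Deferred resolve: average the stored normals, then shade once.
    avg_n = tuple((a + b) / 2 for a, b in zip(wall_n, floor_n))
    naive = shade(avg_n)   # noticeably brighter than the correct blend
    ```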

  29. #29
    MSgt. Shooter Person
    Join Date
    Sep 2006
    Location
    In a smooth antialiased world.
    Posts
    233
    Gamer IDs

    Gamertag: ConsolesSUCK

    Default

    Quote Originally Posted by MuLuNGuS View Post
    the lack of "AA" has nothing to do with "HDRR" (High Dynamic Range Rendering)!
    DX9 can show AA and HDRR together fine; only the GeForce 7xxx generation gfx cards cannot show AA and HDRR at the same time.

    the lack of AA in UE3 under DX9 is because UE3 uses a technique called "deferred shading", which is like a subset of what a full "deferred renderer" uses.

    the X-Ray engine that Stalker uses is a "deferred renderer", which means Stalker + DX9 = no AA; same goes for UE3.

    ok, there are tricks to activate AA under DX9, but sometimes there are some flaws.
    You're wrong. 8 series cards cannot render AA and HDR at the same time in DirectX 9/WinXP, unless the game engine supports the feature.

  30. #30
    MSgt. Shooter Person
    Join Date
    Sep 2006
    Location
    In a smooth antialiased world.
    Posts
    233
    Gamer IDs

    Gamertag: ConsolesSUCK

    Default

    devs?....anything?

    How about some feedback?

  31. #31
    MSgt. Shooter Person
    Join Date
    Oct 2007
    Location
    Spain
    Posts
    360
    Gamer IDs

    Gamertag: PC: Guasacaca

    Default

    Quote Originally Posted by gutter View Post
    devs?....anything?

    How about some feedback?
    too busy doing.... ehrm... something i guess.

    @Phopojijo
    Head to GameSpot and look at the screenies they have of forced AA in Bioshock, and then come back and rain some more knowledge on me/us

  32. #32
    MSgt. Shooter Person
    Join Date
    Oct 2007
    Location
    Brisbane, Australia
    Posts
    486

    Default

    FYI, HDR isn't an effect, it refers to pixel illuminance values higher than 1, or basically pixels that go from black (darkness), to white (say, a piece of paper), to 'super white' (the sun). What developers do with HDR is the effect, and sometimes it's not so nice. Subtle is usually the best choice.

    If you're referring to bloom, that's just one small effect. Simulating an aperture effect (entering or exiting dark/light areas and seeing the contrast readjust), as well as extremely realistic reflections are other advantages of HDR.

    Anyway, as I said, just an FYI.
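    The aperture/adaptation effect described above can be sketched in a few lines of toy Python (the luminance numbers and the simple clamp are made up for illustration):

    ```python
    # Toy sketch of the aperture effect: the same HDR scene shown at two
    # exposures, as when the eye adapts moving from a cave into sunlight.

    def expose(luminance, exposure):
        """Scale by exposure, then clamp to the displayable [0, 1] range."""
        return min(1.0, luminance * exposure)

    scene = {"cave wall": 0.02, "doorway": 1.0, "sky": 40.0}

    indoors = {k: expose(v, 8.0) for k, v in scene.items()}    # adapted to dark
    outdoors = {k: expose(v, 0.02) for k, v in scene.items()}  # adapted to light

    # Indoors the sky blows out to pure white; outdoors the cave wall goes
    # black -- the readjusting contrast between the two is the effect above.
    ```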

  33. #33
    MSgt. Shooter Person
    Join Date
    Sep 2006
    Location
    In a smooth antialiased world.
    Posts
    233
    Gamer IDs

    Gamertag: ConsolesSUCK

    Default

    Quote Originally Posted by faultymoose View Post
    FYI, HDR isn't an effect, it refers to pixel illuminance values higher than 1, or basically pixels that go from black (darkness), to white (say, a piece of paper), to 'super white' (the sun). What developers do with HDR is the effect, and sometimes it's not so nice. Subtle is usually the best choice.

    If you're referring to bloom, that's just one small effect. Simulating an aperture effect (entering or exiting dark/light areas and seeing the contrast readjust), as well as extremely realistic reflections are other advantages of HDR.

    Anyway, as I said, just an FYI.
    Oh no. Not another one.

Thanks for your definition of what HDR and bloom are, but you are wrong about one thing. High Dynamic Range lighting is a visual "effect". Anything that intentionally alters the visuals, for better or worse, can be considered an effect. You are thinking of "effect" as a side effect. That's not what I am referring to.

    Regardless, these "effects" have nothing to do with the topic.

    WILL I BE ABLE TO FORCE AA WITH A NVIDIA 8 SERIES CARD IN WIN XP?
    Last edited by gutter; 10-19-2007 at 12:47 PM.

  34. #34

    Default

    Quote Originally Posted by gutter View Post
    Oh no. Not another one.

Thanks for your definition of what HDR and bloom are, but you are wrong about one thing. High Dynamic Range lighting is a visual "effect". Anything that intentionally alters the visuals, for better or worse, can be considered an effect. You are thinking of "effect" as a side effect. That's not what I am referring to.

    Regardless, these "effects" have nothing to do with the topic.

    WILL I BE ABLE TO FORCE AA WITH A NVIDIA 8 SERIES CARD IN WIN XP?
    No you will not.

    Want more proof then?

    Interview with Tim Sweeney, Lead Programmer and CEO of Epic Games

    Q. Recently you spoke of Unreal Tournament 3 (UT3) using DirectX 10 rendering features. How will this differ from the DirectX 9.0 graphical effects, is this just for a speed increase? How is the geometry shader being used with DirectX 10 hardware and UT3?

    A. We’re primarily using DirectX 10 to improve rendering performance on Windows Vista. The most significant benefit is that, on Vista, DirectX 10 enables us to use video memory more efficiently than DirectX 9 and thus use higher-res textures.

    The most visible DirectX 10-exclusive feature is support for MSAA on high-end video cards. Once you max out the resolution your monitor supports natively, antialiasing becomes the key to achieving higher quality visuals.

    Q. Does Unreal Tournament 3 support HDR rendering with Anti-aliasing?

    A. Yes, on Windows Vista. On all PC platforms, we support running with 16-bit-per-component frame buffer (64 bits total). MSAA anti-aliasing support is only enabled on DirectX 10, because the deferred rendering techniques used by the engine require some capabilities not included in DirectX 9.
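For scale, a quick back-of-envelope sketch of what the 64-bit (16 bits per RGBA component) frame buffer mentioned in the interview costs in memory. The resolutions are arbitrary examples, not figures from the interview.

```python
def framebuffer_bytes(width: int, height: int,
                      bits_per_component: int = 16, components: int = 4) -> int:
    """Raw size of one color surface at the given per-component precision."""
    return width * height * components * bits_per_component // 8

for w, h in [(1280, 1024), (1920, 1200)]:
    mib = framebuffer_bytes(w, h) / (1024 * 1024)
    print(f"{w}x{h}: {mib:.1f} MiB per FP16 RGBA surface")
```

At 1280x1024 that is exactly 10 MiB for a single surface, which is why more efficient video memory management matters once several render targets of that size are in flight.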

    Quote Originally Posted by Ignotium View Post
    too busy doing.... ehrm... something i guess.

    @Phopojijo
    Head to gamespot and watch the screenies they have of forced AA in Bioshock, and then come again and rain some more knowledge to me/us
I did:

I posted one of them repeatedly because it best illustrates where the hack fails. Here it is again for those who have yet to see it.

It fails to antialias edges between objects of very different intensity... such as a bright light against a night-lit wall.
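A toy sketch (the sample values are invented) of why a forced resolve breaks down on exactly those edges: averaging the raw HDR samples before tone mapping lets one very bright sample dominate the pixel, whereas averaging after tone mapping blends the edge properly.

```python
def tonemap(luminance: float) -> float:
    """Reinhard operator: compresses HDR luminance into [0, 1)."""
    return luminance / (1.0 + luminance)

# 4x MSAA samples on an edge: one sun-bright sample, three dark-wall samples.
samples = [1000.0, 0.05, 0.05, 0.05]

resolve_then_tonemap = tonemap(sum(samples) / len(samples))             # forced order
tonemap_then_resolve = sum(tonemap(s) for s in samples) / len(samples)  # desired order

print(f"resolve first: {resolve_then_tonemap:.3f}")  # stays nearly full white
print(f"tonemap first: {tonemap_then_resolve:.3f}")  # a plausibly blended edge
```

In the first order the 1000.0 sample drags the average so high that tone mapping still outputs near-white, so the edge looks as jagged as before.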

    Quote Originally Posted by Pse View Post
    Actually, Source can handle AA + HDR concurrently even on PS2.0 cards. Valve's implementation of HDR is somewhat different to most out there, I believe. Are you sure your nVidia card doesn't support it? (old ATI card here and it works).
It wasn't a Valve fix, it was an ATI fix. Also -- if you run an ATI card on Unreal Engine 3, it will not antialias HDR properly either.

Valve's HDR implementation used a very low bit depth and therefore didn't need deferred rendering... and therefore could be antialiased. Real HDR implementations typically use 64-bit color, which cannot work with ATI's hack-fix.

    Quote Originally Posted by MuLuNGuS View Post
the lack of "AA" has nothing to do with "HDRR" (High Dynamic Range Rendering)!
DX9 can show AA and HDRR together fine; only the GeForce 7xxx generation of gfx cards cannot show AA and HDRR at the same time.

the lack of AA in UE3 under DX9 is because UE3 uses a technique called "deferred shading"; it's like a subset of the stuff a "deferred renderer" uses.

the X-Ray engine that Stalker uses is a "deferred renderer", which means Stalker + DX9 = no AA; same goes for UE3.

ok, there are tricks to activate AA under DX9, but sometimes there are some flaws.
    You're actually right.

ATI made HDR+AA work in DirectX 9 -- the problem is it was very low bit depth HDR (which defeats the purpose for some of the more advanced applications of HDR).

64-bit FP HDR (which is what I was referring to) requires deferred rendering... which is where AA breaks down. So -- you're right, but when we refer to HDR we typically mean 64-bit FP HDR... which again means deferred rendering.

Lower bit depth HDR just isn't precise enough for the applications Epic wanted to use HDR for.
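To make the bit-depth point concrete (the scene values are invented for illustration): a low-dynamic-range 8-bit channel clamps at 1.0, so every "brighter than white" source collapses to the same stored value, while higher-precision storage keeps them distinct for later exposure or bloom.

```python
def store_8bit(luminance: float) -> float:
    """Quantize to an 8-bit LDR channel: clamp to [0, 1], 256 levels."""
    clamped = min(max(luminance, 0.0), 1.0)
    return round(clamped * 255) / 255

scene = {"shadow": 0.02, "paper": 1.0, "bulb": 10.0, "sun": 1000.0}
for name, l in scene.items():
    print(f"{name}: HDR={l} -> 8-bit={store_8bit(l)}")

# The bulb and the sun both store as 1.0 -- the range information an HDR
# pipeline needs (re-exposure, bloom thresholds) is already lost.
```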
    Last edited by Phopojijo; 10-19-2007 at 09:22 PM.
    Core i7 920, Foxconn Renaissance, 6GB OCZ DDR3, GeForce GTX460 + GTX 260
    X-Fi Platinum, Bose QC15 Headphones, Windows 7 x86-64
    Rosewill RK-9000 (PS/2 Adapt. NKRO), Adesso Cybertablet 12000, LG Wide 20" LCD + Samsung 23" 1080p LED + LG Wide 22" Glossy LCD


 
