HDR bit option


    From what I've been hearing, 32-bit HDR will allow you to enable antialiasing at the same time, while 64-bit HDR won't. I could be wrong, but here was my thought: how about having an option in UT3 to switch which level of HDR lighting you want to use, from 32 to 64, in case you wanted to enable AA at the same time?

    #2
    HDR?

    Antialiasing at the same time as HDR?

    Since I have no idea what you're typing about, I'm obviously not the intended audience for your post... but I am curious. Could you enlighten me please?

    Comment


      #3
      Originally posted by Kronos View Post
      From what I've been hearing, 32-bit HDR will allow you to enable antialiasing at the same time, while 64-bit HDR won't. I could be wrong, but here was my thought: how about having an option in UT3 to switch which level of HDR lighting you want to use, from 32 to 64, in case you wanted to enable AA at the same time?
      Where are you hearing this?

      It only seems logical that antialiasing would always be available; I don't think anyone is going to put down $2k+ for a machine that won't run the best game with antialiasing. From a logistics standpoint this doesn't make much sense either (isn't the difference between 32 and 64 bit just the number of colors?).

      Comment


        #4
        Originally posted by Mort_Q View Post
        HDR?

        Antialiasing at the same time as HDR?

        Since I have no idea what you're typing about, I'm obviously not the intended audience for your post... but I am curious. Could you enlighten me please?
        HDR is fancy game lighting technology, and antialiasing is what makes straight lines not look jaggedy on your monitor. The regular version of HDR works with antialiasing, but the fancier version doesn't.

        (Click any of the images for better quality)

        [shot]http://upload.wikimedia.org/wikipedia/en/thumb/1/1d/Farcryhdr.jpg/800px-Farcryhdr.jpg[/shot]
        [shot]http://upload.wikimedia.org/wikipedia/en/thumb/c/cc/Hl2hdrcomparison.jpg/800px-Hl2hdrcomparison.jpg[/shot]
        [shot]http://upload.wikimedia.org/wikipedia/en/thumb/a/a9/Lostcoasttrnsmsn.jpg/800px-Lostcoasttrnsmsn.jpg[/shot]
        [shot]http://imgred.com/http://www.tweakguides.com/images/Antialiasing.gif[/shot]

        Comment


          #5
           Unreal Engine 3.0 uses "deferred rendering"; that means no AA on Direct3D 9 hardware.

           With D3D 10 hardware on Windows Vista, AA with UE3 will be possible... again.
           ®
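To make the deferred-rendering point concrete, here is a toy Python sketch (not engine code; the `shade` function is a made-up nonlinear lighting term) of why classic D3D9-era deferred shading can't benefit from MSAA: forward MSAA shades each sample and then averages the resulting colors, while deferred shading has to average the G-buffer attributes first and shade once, which gives the wrong answer at edges because lighting is nonlinear.

```python
# Toy illustration: MSAA vs. classic deferred shading at one edge pixel.
# Four samples straddle a silhouette: two face the light, two face away.

def shade(n_dot_l):
    # Hypothetical nonlinear lighting function (clamped, then squared).
    return max(0.0, n_dot_l) ** 2

samples = [1.0, 1.0, -1.0, -1.0]   # per-sample N.L values

# Forward MSAA: shade every sample, then resolve (average) the colors.
forward_msaa = sum(shade(s) for s in samples) / len(samples)

# Classic deferred: the G-buffer attributes are resolved first (there is
# no per-sample access to the G-buffer on D3D9-class hardware), then the
# lighting pass shades the averaged attributes once.
deferred = shade(sum(samples) / len(samples))

print(forward_msaa)  # 0.5 -> smoothly antialiased edge
print(deferred)      # 0.0 -> edge lighting is wrong; the AA is lost
```

D3D10-class hardware exposes multisampled render targets to shaders (per-sample reads), which is why the poster says AA with a deferred renderer becomes possible "again" there.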

          Comment


            #6
             High Dynamic Range lighting is just a method of storing more information about an image (>24-bit colour, so your monitor cannot display the image in all its glory).

             It allows images that have areas of high brightness as well as low brightness to be more accurately represented (hence the "high dynamic range"). You can use this extra stored information to create lighting effects like glare.
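As a rough illustration of that idea (a hypothetical sketch, not how any particular engine does it): a tone-mapping step such as the Reinhard operator compresses the stored HDR luminance into the monitor's displayable 0..1 range while keeping very bright values distinct instead of clipping them all to white.

```python
# Toy sketch: HDR stores luminance beyond the displayable 0..1 range;
# tone mapping compresses it for output while preserving contrast.

def reinhard(lum):
    # Reinhard tone-mapping operator: maps [0, inf) into [0, 1).
    return lum / (1.0 + lum)

hdr = [0.05, 1.0, 50.0]            # deep shadow, mid-tone, sun glare

ldr = [reinhard(l) for l in hdr]   # tone-mapped: glare stays distinct
clamped = [min(l, 1.0) for l in hdr]  # naive clamping: glare clips to 1.0

print(ldr)      # roughly [0.048, 0.5, 0.980]
print(clamped)  # [0.05, 1.0, 1.0] -- mid-tone and glare are now identical
```

With plain clamping, everything brighter than white is lost, which is why effects like glare and the squint-then-adapt moments in HL2: Lost Coast need the extra stored range.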

            Comment


              #7
              Originally posted by MuLuNGuS View Post
               Unreal Engine 3.0 uses "deferred rendering"; that means no AA on Direct3D 9 hardware.

               With D3D 10 hardware on Windows Vista, AA with UE3 will be possible... again.
               ®
               How about other games, such as Rainbow Six Vegas or Splinter Cell: Double Agent?

              Comment


                #8
                RoboBlitz doesn't support AA, and the game looked horrible because of it. I hope I can run AA in UT3... otherwise I will be very disappointed.

                Comment


                  #9
                  Originally posted by Krawler View Post
                  RoboBlitz doesn't support AA, and the game looked horrible because of it. I hope I can run AA in UT3... otherwise I will be very disappointed.
                  From what MuLuNGuS said, it sounds as if RoboBlitz would with DirectX 10.

                  Comment


                    #10
                    The HDR was awesome in HL2: Lost Coast... so awesome, in fact, that I would gladly give up my AA for HDR. Besides, at 1280x1024 I don't notice the jaggies anyway.

                    Also... my Nvidia cards don't support HDR and AA simultaneously anyway, so I don't really give a gee whillikers.

                    Comment


                      #11
                      I can still notice jaggies with ease without AA at 1440x900.

                      Comment


                        #12
                        Boohoo.
                        I don't give a rat's a** about AA if I can have HDR. Jaggies don't kill the game.
                        Sure, I put 16xQ in some games to be comfortable, but I'm on a 15" screen.

                        Comment


                          #13
                          Meh, I'll take better scene visibility (the direct result of HDR) over smoother edges. Besides, in my experience AA is pointless over 1024x768 unless you have a LOT of extra horsepower, which, looking at recent comments from Epic, nobody's computer will have.

                          Comment


                            #14
                            I have a 1280x1024 monitor but use AA in Splinter Cell: Double Agent instead of HDR with its few light effects. I HATE jaggies.

                            Comment


                              #15
                              AA is overrated, anyway. A few jagged lines versus actually having to squint when I stare into a bright light? I'll take HDR any day.

                              Comment
