    Newb question about netspeed....

    Even though I have been looking at these forums for almost 2 years, I haven't found anything specifically saying what netspeed I should have if my connection is <insert speed here>, so I have to ask the question:
    What should my netspeed be if I have an average of 2.4 Mbit down / 450 kbit up?
    Answers appreciated.

    #2
    Multiply your refresh rate by 64, and then round up or down to the nearest half-thousand.

    For me, that would be 85*64=5440, rounded up to 5500.
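
    A minimal Python sketch of that rule of thumb (the helper name is just for illustration; "nearest half-thousand" is taken to mean the nearest multiple of 500):

    [code]
    # Rule of thumb from above: netspeed ~= refresh rate * 64,
    # rounded to the nearest multiple of 500.
    def suggested_netspeed(refresh_hz):
        return int(round(refresh_hz * 64 / 500.0) * 500)

    print(suggested_netspeed(85))  # 5440 -> 5500
    print(suggested_netspeed(60))  # 3840 -> 4000
    [/code]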

      #3
      Why would refresh rate have anything to do with it?

      (It's 4500 btw)

        #4
        http://www.clanvikings.org/tnse/utstuff.html



        EDIT:
        Ugh beaten. :sour:

          #5
          Originally posted by slime73
          Why would refresh rate have anything to do with it?

          (It's 4500 btw)
          Follow Eclipse's link above.

            #6
            Nitro, do you think the information from that link is still valid today, since it was written what, 5 or 6 years ago? Just wondering, as we now have Win XP, more RAM, and CPUs that are Godzilla compared to a Pentium II 350 MHz... your opinion please.

              #7
              so I read that link, and I am still kinda confused...

              Is it recommending a netspeed of 64*refresh rate for all servers, regardless of server tick? Because it didn't seem to show a very strong correlation between a good netspeed and server tick (as it recommends netspeeds based on refresh, not tick).

              Another little question... if my monitor runs at 60 Hz, but my computer renders frames at the cap (say 85), which rate is this article referring to? And does my machine process 85 game states per second, or only 60?

                #8
                Originally posted by garcia_y_vega
                so I read that link, and I am still kinda confused...
                hehe people have been confused by that link for many years now, including myself.

                I'd say, unless you are having networking problems, try to ignore it. I fiddled around a bit and it only seemed to make things worse. When you install UT you specify your connection speed and it picks a default netspeed based on that. Usually the default is close enough to what you want.

                Is there a reason you think you need to change it? Life's too short to worry about adjusting low-level stuff like that. It's like the people who up their server tickrate because it makes the reported ping lower, only to make the server perform much worse. From experience, UT2k4 just doesn't perform well if you are far from the server - no matter what tick, netspeed, ping, etc. The netcode just isn't very forgiving, I'm afraid. Stick to local servers and you'll be fine!

                If the game plays OK, leave well alone! Having said that, I know some people have fixed problems with these tweaks, so who knows...

                  #9
                  Originally posted by Jezza101
                  I'd say, unless you are having networking problems, try to ignore it.
                  I do actually have very frustrating packet loss problems on many servers (local and otherwise), so if there is any way to further optimize my net settings on a per-server basis I would like to try it.

                    #10
                    Originally posted by garcia_y_vega
                    I do actually have very frustrating packet loss problems on many servers (local and otherwise), so if there is any way to further optimize my net settings on a per-server basis I would like to try it.
                    The only way to solve problems related to packet loss would be to either send everything reliably (which takes longer and costs more bandwidth) or send everything multiple times (which costs a ****load more bandwidth).
                    Changing your netspeed (which is practically synonymous with bandwidth allotment/limit) won't help here.
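
                    To put rough numbers on that trade-off, here's a back-of-envelope Python sketch (purely illustrative, not how the UT netcode actually works): with packet loss rate p, sending each update k times cuts the chance of losing it to p^k, but multiplies the bandwidth used by k.

                    [code]
                    # Illustration only: loss rate p, each update sent k times.
                    def duplicate_send_tradeoff(p, k):
                        return {"chance_update_lost": p ** k,
                                "bandwidth_multiplier": k}

                    print(duplicate_send_tradeoff(0.05, 1))  # 5% lost, 1x bandwidth
                    print(duplicate_send_tradeoff(0.05, 3))  # 0.0125% lost, 3x bandwidth
                    [/code]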

                    Originally posted by Eclipse
                    http://www.clanvikings.org/tnse/utstuff.html
                    That article was written for UT, and even for UT there are some things I quite strongly question.

                      #11
                      Originally posted by garcia_y_vega
                      so I read that link, and I am still kinda confused...

                      Is it recommending a netspeed of 64*refresh rate for all servers, regardless of server tick? Because it didn't seem to show a very strong correlation between a good netspeed and server tick (as it recommends netspeeds based on refresh, not tick).

                      Another little question... if my monitor runs at 60 Hz, but my computer renders frames at the cap (say 85), which rate is this article referring to? And does my machine process 85 game states per second, or only 60?
                      Refresh rate is how many times the video processing hardware updates your screen per second. Basically, it's the hardware limit of the FPS you can get. You can set the refresh rate in the advanced settings of your desktop properties. So if you're running at 60 Hz, your frame cap is 60. If you're running at 85 Hz, your frame cap is 85.

                      However, refresh rate and frames per second are not the same thing. Since refresh rate is the maximum rate at which the PC hardware can update the screen each second, your eyes will only see a maximum of 85 frames drawn every second, even if UT says you're getting, say, 93. When your FPS exceeds your refresh rate, you start getting two frames in one update. Frames are drawn from top to bottom, so this means that the top half of the screen will be one frame but the bottom half will be another. This is what we call "tearing." Sometimes it can be very noticeable. This is where VSync comes into play; I'm not entirely sure what it does, but I'm pretty sure it has something to do with capping your FPS to avoid getting two frames in one update.

                      As for actual netspeed, I've heard from a few people, some being experienced server admins, that the default 10000 netspeed is super-server speeds, and a home PC user would be hard-pressed to reach that. I figure trying this guide can't hurt.

                        #12
                        Those instructions are BS. If you use a netspeed much less than 9 or 10 thousand, you will receive less information from the servers, which, especially in ONS, can have a significant impact on your game.

                          #13
                          Originally posted by NiTrOcALyPsE
                          However, refresh rate and frames per second are not the same thing. Since refresh rate is the maximum rate at which the PC hardware can update the screen each second, your eyes will only see a maximum of 85 frames drawn every second, even if UT says you're getting, say, 93. When your FPS exceeds your refresh rate, you start getting two frames in one update. Frames are drawn from top to bottom, so this means that the top half of the screen will be one frame but the bottom half will be another. This is what we call "tearing." Sometimes it can be very noticeable. This is where VSync comes into play; I'm not entirely sure what it does, but I'm pretty sure it has something to do with capping your FPS to avoid getting two frames in one update.
                          Without VSync, you will ALWAYS get tearing, no matter how high or low your framerate is. It's also not true that with a refresh rate of 85, getting more than 85 FPS is wasted. As you said yourself, the game will start drawing the next frame before your monitor reaches the bottom of the screen (with VSync off this is true regardless of your framerate), but that does mean that the bottom of your screen is more up to date than it would have been if your PC took longer to render the next frame.


                          Originally posted by NiTrOcALyPsE
                          As for actual netspeed, I've heard from a few people, some being experienced server admins, that the default 10000 netspeed is super-server speeds, and a home PC user would be hard-pressed to reach that. I figure trying this guide can't hurt.
                          Either you misunderstood them, or they just discredited themselves as experienced server admins. I'm guessing the former, because it doesn't take a genius to figure out 10000 refers to 10000 bytes/second. That's actually well within reach for any connection with an upload above 128 kbit.
                          What I think they meant is either that the game will never download/upload that much because your FPS won't get that high, or that only a very fast server could provide 32 clients with 10000 bytes/s of data, so to run a server like that you'd have to limit the netspeed server-side.
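
                          For what it's worth, the arithmetic behind that checks out (taking 1 kbit as 1000 bits):

                          [code]
                          # 128 kbit/s of upload expressed in bytes per second:
                          upload_bytes_per_s = 128 * 1000 // 8
                          print(upload_bytes_per_s)           # 16000
                          print(upload_bytes_per_s >= 10000)  # True - a 10000 bytes/s netspeed fits
                          [/code]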

                            #14
                            Originally posted by Boksha
                            Without VSync, you will ALWAYS get tearing
                            No you won't. I play at 1024x768 at 150 Hz and I don't get any tearing or flickering at all, and I have VSync disabled in the nvidia drivers and the UT2004.ini.

                              #15
                              Originally posted by martinblank
                              No you won't. I play at 1024x768 at 150 Hz and I don't get any tearing or flickering at all, and I have VSync disabled in the nvidia drivers and the UT2004.ini.
                              It's there; you just don't see it because of your high refresh rate and the fact that tearing is barely visible on full-motion visuals anyway. (You would most likely see it if, instead of UT2k4, you were looking at a white circle moving on a black background.)
                              As I said, unless there is some sort of sync between screen refreshes and buffer flips, there WILL be tearing, because there's no guarantee the new frame won't be done halfway through the screen refresh. (In fact, it's pretty much guaranteed the new frame finishes during this process, because that's what your monitor is doing 99% of the time.)
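
                              A toy Python simulation of that point (the 95% figure is an assumption, not a measurement): if most of each refresh interval is spent actively scanning out and buffer flips happen at unsynchronised times, almost every flip lands mid-scan.

                              [code]
                              import random

                              ACTIVE_FRACTION = 0.95  # assumed share of each refresh spent scanning out

                              flips = 100_000
                              torn = sum(random.random() < ACTIVE_FRACTION for _ in range(flips))
                              print(f"{100 * torn / flips:.1f}% of flips land mid-scanout")  # ~95%
                              [/code]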
