
    Player count maximum

    I'm interested in potentially making a game with UDK which dials back the graphics and complexity of most regular games, but pushes player limit somewhat more than usual.

    After some research on the forums and reading the MMO FAQ thread, it appears the maximum number of players supported in multiplayer is 64. I don't know whether this has changed since then.

    Does anyone have any idea what is required to extend the number of players above 64? Is it even something that can be done with the UDK, or does it require more low-level control over the engine source?

    It seems like an odd, arbitrary limit to have. With hardware and bandwidth available to push easily into 100+ players (I'm thinking along the lines of even bigger battles for game ideas), I don't understand why this would be a restriction, unless it's specifically to stop the engine being used to build MMO-style games.

    I'm open to any suggestions and advice, thanks.

    No, you can't go over it. Don't really want to explain why right now so someone else can.


      Search it; it's been asked millions of times.


        Hardware and bandwidth are absolutely not easy problems to solve there, bandwidth especially. An AI-controlled Player Pawn takes about 70-80% of the bandwidth that a human-controlled pawn does (no incoming data to deal with), and if you put 100 AI-controlled pawns into a tight area with 3 clients connected, there will absolutely not be enough bandwidth to even remotely deal with it at normal settings. Now add that you have to mirror all this data to 97 more clients, and that you have to deal with all the INCOMING data too.

        When your network cable is warm to the touch, you know you've got a whole hell of a lot of data flying. Direct LAN connections at 100Mbit are not sufficient.


          So you're saying the network requirements are above 100Mbit/sec if we were to have 100 players? Even if the net code were really bad, I still don't believe that for a second.

          Some research shows that maxclientrate is the most the server will allow for each client; by default in the UDK's BaseEngine.ini it's set to 25000.

          100 Mbit/sec is 12.5 MByte/sec:

          12.5 * 1024 * 1024 = 13,107,200 bytes/sec

          13,107,200 / 25,000 ≈ 524 players...

          That's using old documentation from Epic on calculating server upload requirements.

          100 players should only need 100 * 25,000 = 2,500,000 bytes/sec; that's 2.38 MB/sec, or about 19 Mbit of upload. That's nothing these days; I'm getting 10 Mbit upload on my residential connection a little later this year...

          I don't know about UT3, but certainly the other Unreal games can use much lower rates than that; the 7000-8000 range was enough for UT2004 from what I've read, so 25000 is probably an overkill default?
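          The arithmetic above can be sketched as a quick script. The 25000 default and the binary-prefix conversions are taken straight from the thread; the assumption that maxclientrate is measured in bytes/sec follows the post rather than any official documentation:

```python
# Rough server-upload estimate from the post's figures.
# Assumes maxclientrate is in bytes/sec, as the post does.

MAXCLIENTRATE = 25_000          # UDK BaseEngine.ini default, per the post
LINK_BITS = 100 * 1024 * 1024   # 100 Mbit link, binary prefixes as in the post
LINK_BYTES = LINK_BITS // 8     # 13,107,200 bytes/sec

# How many clients a 100 Mbit uplink could serve at the full default rate:
max_players = LINK_BYTES // MAXCLIENTRATE
print(max_players)              # 524

# Upload needed for 100 players at the default rate:
upload_bytes = 100 * MAXCLIENTRATE            # 2,500,000 bytes/sec
upload_mbit = upload_bytes * 8 / (1024 * 1024)
print(round(upload_mbit, 1))                  # ~19.1 Mbit/sec
```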


            I don't think the problem is the network side of things, but mainly the server-side hardware. When you double the number of players, you roughly double the amount of data to process.


              New stephenchau?


                It just doesn't make much sense to me. We've had 64-player games for a very long time now; old games like BF1942 supported 64 players back in 2004, and there are loads of custom Counter-Strike: Source servers supporting 64 players, which is also a 2004 game.

                CPU power is doubling every 18-24 months and bandwidth is going up almost as fast, yet in all this time we've had zero improvement in the number of supported players. Six years since those games were released, with room for a good three generations of doubling, puts us at 8x more resources, yet we cannot support a higher player count?

                Even if it were a resource issue, why would you hard-cap the engine? It just narrows the scope of games you can make. Surely it's up to the developers to decide what hardware/connection specs are necessary for servers and clients?


                  You might want to have a read through that.


                    Originally posted by Mougli:
                    You might want to have a read through that.

                    I've just read through the majority of this article. I already understand a lot of the concepts, but it doesn't describe any real limitations on player count other than the computational power and network speed available.

                    Early in the article a statement is made about network traffic not advancing in speed as aggressively as computational power:

                    However, the Internet reality is that 28.8K modems have only perhaps 1% of the bandwidth necessary to communicate complete and exact updates. While consumers' Internet connections will become faster in the future, bandwidth is growing at a rate far lower than Moore's law which defines the rate of improvement in games and graphics.
                    That may have been true back in the 28.8k modem days, but since then a lot of high-speed internet solutions have been rolled out, with an increasing number of people having access to fibre and hybrid fibre/coax internet connections.

                    Since that article was written we've seen availability go from 28.8k to 50Mbit+ (you can get 100Mbit+ residential; I have friends with 120Mbit, actually, but 50Mbit seems more fair, as it has about 50% availability in the UK, and we'll have 100Mbit for most homes within a few years now that the FTTC roll-out has started).

                    28.8 * 1024 = 29,491.2 bits/sec
                    50 * 1024 * 1024 = 52,428,800 bits/sec

                    52,428,800 / 29,491.2 ≈ 1778x increase

                    That's close to a two-thousand-fold increase in a space of about 16 years (1994 for the first 28.8k standards). That's 8 generations of Moore's law (one every 2 years), which is 2^8, or something like a 256x increase in CPU power (give or take when translating to actual processing speed) in that time, so this initial assumption is actually quite wrong.
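                    The growth comparison works out as follows. It uses the same binary-prefix conversions as above; the 2-year doubling period for Moore's law is the post's own assumption:

```python
# Bandwidth growth vs. Moore's-law CPU growth over ~16 years,
# using the post's binary-prefix conversions.

modem = 28.8 * 1024                  # 29,491.2 bits/sec
fibre = 50 * 1024 * 1024             # 52,428,800 bits/sec
bandwidth_factor = fibre / modem     # ~1778x

moore_generations = 16 // 2          # one doubling every ~2 years
cpu_factor = 2 ** moore_generations  # 256x

print(round(bandwidth_factor), cpu_factor)  # 1778 256
```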

                    So the reason isn't available bandwidth or CPU power; these have both grown exponentially, bandwidth even more so than computational power.


                      I've seen people playing on high-end computers on game servers with 40+ players, and they are choppy as hell; the data rate is terrible.

                      Remember, when you limit the data rate per player to 10k or 20k, you have to drop data to do it. Now, under normal conditions, let's say you have 4 players standing next to each other, idling. Each idle player generates about 900 bytes per second; I'm going to round to 1K. So with 4 players, roughly 4K per second has to be transmitted to each player, for roughly 16K per second overall.

                      8 players = 8K per second per player, 64K per second overall
                      16 = 16K per second per player, 256K per second overall
                      32 = 32K per second per player, 1024K per second overall
                      64 = 64K per second per player, 4096K per second overall
                      128 = 128K per second per player, 16384K per second overall
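                      The list above follows a simple quadratic pattern: with roughly 1K/sec generated per idle player, each of n clients receives about n K/sec, and the server sends about n² K/sec in total. A quick sketch of that scaling:

```python
# The post's table: ~1 KB/sec generated per idle player means each of the
# n clients receives ~n K/sec, and the server sends n * n K/sec overall.

for n in [8, 16, 32, 64, 128]:
    per_player_kb = n       # ~1 K/sec of data about each player in view
    overall_kb = n * n      # sent to every one of the n clients
    print(f"{n} players: {per_player_kb}K/sec per client, "
          f"{overall_kb}K/sec overall")
```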

                      At around 10 players is where it starts to drop data destined for players with a 10k maxrate, and around 20 players with a 20k maxrate. Somewhere relatively close to 110 idle players, you're pushing the max throughput of 100Mbit Ethernet. Hopefully you've got MaxInternetSpeed configured in your INI file, because before you get to this point the engine has started dropping even more information. Now not only is the engine throttling the data rate to all of your players based on their network speed, it's probably also throttling the data rate based on your own network speed as well. The updates become very, very choppy, and people start skating around all over the place.

                      Now, there are also several other factors that come into play, many of which admittedly help the case, and some which do not. The big one is that you're likely in a situation where you don't have anywhere near all 100+ players within relevant distance of each other at any given time (depending on the map; a large open map will generate a ton more network data and CPU usage due to relevancy checks than a large but tightly closed-in city area).

                      The big one on the other side is that I'm only talking about idling players: sitting there doing nothing, not moving, not chatting, nothing. Once they all start moving around, everyone is dealing with another 1k or so of data per second each; double all those numbers. At 100 players, the server now wants to pump out 200k per second to each of 100 players, nearly double what a 100Mbit connection will deal with (and we're not even talking about adding things like spawning projectiles during gameplay and fun stuff like that).

                      OK, so, hey, everyone is throttled to 10k/sec and the server has its MaxInternetRate set to 11000000, a practical maximum for a 100mbit connection. It wants to send out 200k per second of data to each player, but it has to pick 10k of data that it can actually send. Great, now your server only wants to put out roughly 1 megabyte per second, or 10mbit. However, everyone's experience is that they're missing 19/20ths of all the data they should be getting. Hopefully network relevancy is cutting down that loss of data that can't be transmitted, but I'd be willing to bet that in the best of scenarios you're still only transmitting each user 20-30% of the data needed to get the same experience they'd have with a "normal" player count (like 12 or so), and in the worst case you're losing over 90% of the information. Hopefully the client-side prediction can keep up.

                      Now, there is one last bastion of hope: you can reduce the amount of information collected to be sent overall by adjusting the Network Tick Rate. I'm not sure offhand whether it defaults to 60 or 100, but that means the engine is basically checking all of your data that might need to be networked 60 or 100 times per second, and then sending it off to the network. You can reduce this and drastically decrease everyone's bandwidth requirements. However, doing so also tends to degrade the player experience: far more frequent client-side re-adjustments; basically, it appears to the player as if they were dropping tons of packets, just like when it's set to the default. IMO, for twitch games you can't really go much lower than about 40 before people start to notice the difference, although for other games it's possible to go as low as 10, if you don't mind pawn movements being tracked relatively slowly.
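                      Assuming the amount of data collected and sent scales roughly linearly with the tick rate (an assumption for illustration, not something the engine documentation confirms), the bandwidth saving is easy to estimate:

```python
# Rough model: if replicated data scales linearly with the net tick rate,
# dropping below a default of 60 ticks/sec cuts bandwidth proportionally.
# The 60/sec default is an assumption taken from the post's "60 or 100".

def relative_bandwidth(tick_rate, default_rate=60):
    """Bandwidth as a fraction of the default-tick-rate bandwidth."""
    return tick_rate / default_rate

for tr in [60, 40, 20, 10]:
    print(tr, round(relative_bandwidth(tr), 2))
```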

                      Now, of course, I'm talking about worst-case scenarios for the most part. My actual real-world testing involved a listen server and 3 network clients in a big, mostly otherwise empty room, with over a hundred AI-controlled pawns updating and the clients all set to a MaxRate of 30k.

                      Somewhat less relevantly, I've also done the same sort of thing in some mods for Unreal Engine 2 games, in which case having about 120 AI-controlled pawns will crash any player they all suddenly become relevant to at the same time, with an error in the network handling.

                      If you'd really like to test the boundaries, though, please feel free to open a standard dedicated server and try connecting 65 or more clients to it. I don't personally know of anyone who has bumped the 64 limit; all the stuff I'm working with is designed for 8 players or fewer, and I don't have the computing power, even amongst a half-dozen UDK-capable machines, to run anywhere near that many clients to test it. So it might just all be a rumor that it's hardcoded, because that's all you could get in earlier versions of the engine.

                      And yes, I realize there might be some faults in my math and assumptions. But what I've personally tested is actual network performance at the default Net Tick Rate with over a hundred AI pawns and 4 player pawns, and it was pretty much unplayable for everyone (including the server!). Knocking the tick rate down to 20 helped significantly decrease the bandwidth needed, but was a noticeable sacrifice in quality of play on all clients. And this was all on a LAN.



                        For some reason my response was sent to a mod and hasn't appeared; no idea why that is.


                          o.O ? I've not heard of this.


                            After I posted, instead of taking me back to this thread it gave me a message saying a mod had to approve the post, and it never appeared.

                            The short version is basically that I said I had read that networking article, and that the only thing it really tells us is that it's a resource issue.

                            I further pointed out that the author (I think it's Tim) says Moore's law is increasing CPU speed faster than networking is growing, so we'd never have the network bandwidth to sync everything in game, which I showed isn't the case with a little math on CPU increase over time since then versus bandwidth increase over the same period.

                            Bottom line, it's looked upon as a resource issue, and resources are something we have plenty of, so really we're back to the original question: why is the engine limited? Why can't the developers using the engine decide where to set the player limit for their game?


                              I'm pretty sure you're wrong.