Originally posted by Loki:
The pre-loading comes first. While Windows is doing that, it starts filling the cache. Open Task Manager and you will see the cache increasing and free memory decreasing. That free memory is what people mean when they say "unused memory is wasted memory". The point is that unused + cache is what is really available to programs. The 800MB thesimo mentions is memory that is genuinely not available for UT: if XP shows 500MB in use, there will be 1.5GB free, while Vista will have 1.2GB. So it's the bandwagon again: Vista really is a hog.
Well, sort of, because it isn't really 500 vs 800.
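If anyone wants to check the "unused + cache" point themselves, here is a quick untested sketch using psapi's GetPerformanceInfo (the call and the struct are real, the rest is my assumption of how to present it). "Physical available" already includes the cached standby pages, which is exactly why free memory alone understates what programs can get:

[code]
/* Print the numbers Task Manager splits into "free" and "cache".
 * Compile with psapi linked, e.g. cl memstat.c psapi.lib */
#include <windows.h>
#include <psapi.h>
#include <stdio.h>

int main(void)
{
    PERFORMANCE_INFORMATION pi = { sizeof(pi) };
    if (!GetPerformanceInfo(&pi, sizeof(pi)))
        return 1;

    unsigned long long page = pi.PageSize; /* counts below are in pages */

    /* "Available" = free pages plus cached standby pages, i.e. what
     * programs can actually claim, not just the raw "free" number. */
    printf("Physical total:     %llu MB\n", pi.PhysicalTotal     * page >> 20);
    printf("Physical available: %llu MB\n", pi.PhysicalAvailable * page >> 20);
    printf("System cache:       %llu MB\n", pi.SystemCache       * page >> 20);
    return 0;
}
[/code]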

Here is what I think happens. When you start UT2004 to join a game, what do you see? The splash screen, the main menu, the server browser, the loading screen, and then finally the game. While loading, it builds auxiliary data structures and so on.
What's the point? Most of that memory will never be touched again. So UT may look like it is using, say, 500MB, but it only needs 300MB while you are fragging. My assumption is that exactly when you join, UT calls a function named EmptyWorkingSet. That tells the system: OK, take all my pages now, and give them back to me when I touch them.
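To be clear, EmptyWorkingSet is a real psapi function; whether UT actually calls it is pure guesswork on my part. The trim I am imagining would look roughly like this (done_loading is a made-up hook name):

[code]
/* Rough sketch of the trim I am guessing UT does once loading finishes.
 * EmptyWorkingSet is real psapi; done_loading is hypothetical. */
#include <windows.h>
#include <psapi.h>

void done_loading(void)
{
    /* Hand every page of our working set back to the OS. The pages land
     * on the standby list, so touching one later is only a soft fault,
     * not a disk read. */
    EmptyWorkingSet(GetCurrentProcess());

    /* SetProcessWorkingSetSize(GetCurrentProcess(), (SIZE_T)-1, (SIZE_T)-1);
     * does the same thing without linking psapi.lib. */
}
[/code]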
What happens then? UT never touches the memory it used before you joined the game. So what are those 500MB? The absolute maximum UT ever needed. The 300MB is what UT needs right now. The 300 is called the "working set"; the 500 is called the "peak working set".
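Those two counters are exactly what a process can query about itself. Another untested sketch, but GetProcessMemoryInfo and PROCESS_MEMORY_COUNTERS are the real psapi API:

[code]
/* Print the two counters discussed above for the current process. */
#include <windows.h>
#include <psapi.h>
#include <stdio.h>

int main(void)
{
    PROCESS_MEMORY_COUNTERS pmc = { sizeof(pmc) };
    if (GetProcessMemoryInfo(GetCurrentProcess(), &pmc, sizeof(pmc))) {
        printf("Working set:      %llu MB\n",
               (unsigned long long)pmc.WorkingSetSize     >> 20);
        printf("Peak working set: %llu MB\n",
               (unsigned long long)pmc.PeakWorkingSetSize >> 20);
    }
    return 0;
}
[/code]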
Most programs don't do that, so the memory they appear to use stays just a bit below their absolute maximum. A program with a complex loading phase will therefore show a lot of memory in use, even though it is often not a memory hog at all once it is actually running.
If they all trimmed their working sets at the same time, reported memory usage would drop like a stone. But if they really are bloated, it would climb right back up as soon as they stop idling, e.g. the indexer keeps running, you load a new page in the browser, and so on.