Future of precipitation.
#21
Posted 24 June 2015 - 12:33 AM
#22
Posted 24 June 2015 - 02:33 AM
I split the figures for the 32-bit process into non-LAA usage and LAA usage. What would be the ideal memory usage? The critical situation is clearly when the LAA option is not used. The figures below are from the latest executable I uploaded for testing.
non LAA usage
Allocation for 472,799 particles:
75,647,840 B RAM vertex data
11,347,176 B RAM index data (temporary)
75,647,840 B VRAM DynamicVertexBuffer
11,347,176 B VRAM IndexBuffer
83 MB used.
LAA usage
Allocation for 984,605 particles:
157,536,800 B RAM vertex data
23,630,520 B RAM index data (temporary)
157,536,800 B VRAM DynamicVertexBuffer
23,630,520 B VRAM IndexBuffer
173 MB used.
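The figures above are consistent with a fixed per-particle footprint, and the "MB used" totals match the RAM side (vertex plus index) in binary megabytes. A quick sanity check — this is my own arithmetic sketch, not code from the project:

```csharp
using System;

class ParticleBudget
{
    // Per-particle sizes implied by the figures above:
    // 75,647,840 B / 472,799 particles = 160 B vertex data per particle
    // 11,347,176 B / 472,799 particles =  24 B index data per particle
    const long VertexBytesPerParticle = 160;
    const long IndexBytesPerParticle = 24;

    static void Main()
    {
        foreach (long particles in new long[] { 472799, 984605 })
        {
            long vertexBytes = particles * VertexBytesPerParticle;
            long indexBytes = particles * IndexBytesPerParticle;
            // The "MB used" totals count the RAM side (vertex + index) in
            // binary megabytes; the VRAM buffers mirror the same sizes.
            long ramMB = (long)Math.Round((vertexBytes + indexBytes) / 1048576.0);
            Console.WriteLine($"{particles:N0} particles: {vertexBytes:N0} B vertex, " +
                              $"{indexBytes:N0} B index, ~{ramMB} MB RAM");
        }
    }
}
```

Running this reproduces the 83 MB (non-LAA) and 173 MB (LAA) figures quoted above.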
Edward.
#23
Posted 24 June 2015 - 11:41 AM
Hamza97, on 24 June 2015 - 12:33 AM, said:
If you have any suggestions for open world games (train sim or not) that have good weather effects, I'm sure we can see what they're up to. :)
edwardk, on 24 June 2015 - 02:33 AM, said:
The ideal memory usage is 0. ;)
Seriously though, there isn't a fixed number any of us can say is the right limit. It's always a balance between good effects and everything else that needs to appear visually. That said, given we're only using a couple of MB currently, I don't think we want to increase it much beyond 25-50MB because that is 10-20x - a sizable increase.
You may be able to change the sizes of the precipitation area to further improve things without increasing the particles beyond the 25-50MB limit too - we may well not be making optimal use of the particles. You may have done this already, though.
And indeed, it's likely that we should be using different methods for near and far precipitation (such as textures with multiple particles drawn on) but that's something that will have to come later.
#24
Posted 24 June 2015 - 12:08 PM
James Ross, on 24 June 2015 - 11:41 AM, said:
The ideal memory usage is 0. :sweatingbullets:
Seriously though, there isn't a fixed number any of us can say is the right limit. It's always a balance between good effects and everything else that needs to appear visually. That said, given we're only using a couple of MB currently, I don't think we want to increase it much beyond 25-50MB because that is 10-20x - a sizable increase.
You may be able to change the sizes of the precipitation area to further improve things without increasing the particles beyond the 25-50MB limit too - we may well not be making optimal use of the particles. You may have done this already, though.
And indeed, it's likely that we should be using different methods for near and far precipitation (such as textures with multiple particles drawn on) but that's something that will have to come later.
You are correct that I have done it already. The precipitation area and intensity setting go hand in hand, so at this point it's a matter of balancing both. The reason I asked about memory usage is that some people's systems are unable to take advantage of LAA, and memory usage would already be tight enough with the other processes running. I can bring the non-LAA usage down further without sacrificing too much. I can do the same for the LAA usage, but it would be nice to leave enough eye candy. Besides, I have pushed my system using LAA and have not experienced any crashing. The area that will take a hit is the frame rate. In this situation, it's possible a high-end system may be able to work with the increased particle count.
As far as improvements are concerned, I have a feeling that we would have to switch to a different graphics engine for this.
Edward K.
#25
Posted 24 June 2015 - 12:30 PM
edwardk, on 24 June 2015 - 12:08 PM, said:
As far as improvements are concerned, I have a feeling that we would have to switch to a different graphics engine for this.
This statement makes no sense to me. What makes you think we cannot do better with XNA and DirectX 9? Don't forget that we're just using XNA to access DirectX, we're not using any of its game features - so we're not limited by them either. We can't use DirectX 10 and 11 features, but we're not exactly pushing DirectX 9 here at all (we're not even using instancing!).
#26
Posted 27 June 2015 - 10:02 PM
For the non-LAA process, I adjusted the box size and intensity value so that it's taking up about 54 MB to 56 MB. For LAA, I made minor changes even though there were no problems.
James, I left the trace command in place.
Edward K.
Attached File(s)
3180.zip (1.24MB)
Number of downloads: 138
#27
Posted 01 July 2015 - 09:48 AM
edwardk, on 27 June 2015 - 10:02 PM, said:
James, I left the the trace command in place.
That size seems good - though we will of course monitor for feedback. I've updated and approved the blueprint. Please remove the trace command or put it inside an #if which is undefined.
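For reference, the "#if which is undefined" approach looks like the sketch below. The symbol name DEBUG_PRECIPITATION is my own illustration, not from the Open Rails source; the guarded call is removed entirely at compile time unless the symbol is defined:

```csharp
using System;
using System.Diagnostics;

class PrecipitationStartup
{
    static void Main()
    {
        // Compiled out unless the symbol is defined, e.g. with:
        //   csc -define:DEBUG_PRECIPITATION PrecipitationStartup.cs
#if DEBUG_PRECIPITATION
        Trace.TraceInformation("Precipitation particle buffers allocated.");
#endif
        Console.WriteLine("startup continues");
    }
}
```

Building without the define gives a release executable with no trace output at all, which is the effect being asked for here.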
#28
Posted 01 July 2015 - 11:12 AM
Edward K.
#30
Posted 02 July 2015 - 11:27 AM
However there is an issue:
To decide whether the player has two or more GB of memory at their disposal, the bool Program.Simulator.Settings.UseLargeAddressAware is now used. In my opinion this does not reflect the real memory situation, as it is possible to use the LAA executable and yet have only 2 GB available, namely on 32-bit Windows when the LAA (/3GB) switch has not been set. Instead of the above bool, GC.GetTotalMemory(false) / 1024 / 1024 should be used, as it is in HUDWindow.cs, line 788.
I am a typical case of a person using the LAA executable (because it is the default) without having enabled my 32-bit Windows to extend applications to 3 GB (leaving that disabled is also the default, and may be advisable with only 4 GB of RAM). As I am using the defaults for my configuration, I suspect I am not the only one.
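A sketch of the kind of check being proposed — the helper name is my own, and note the caveat in the comments: GC.GetTotalMemory(false) reports bytes currently allocated on the managed heap, not the address space available to the process, so a threshold on it is at best a heuristic:

```csharp
using System;

class MemoryCheck
{
    // Hypothetical helper illustrating the suggestion above: gate the
    // larger particle count on a runtime memory measurement instead of
    // on the UseLargeAddressAware setting alone, since the LAA executable
    // may still be confined to 2 GB on 32-bit Windows without /3GB.
    public static bool CanUseLargeParticleCount()
    {
        // Caveat: this measures current managed-heap allocation, not the
        // address space actually available to the process.
        long managedMB = GC.GetTotalMemory(false) / 1024 / 1024;
        const long WorstCaseBudgetMB = 2048; // 2 GB without the /3GB switch
        return managedMB < WorstCaseBudgetMB / 2; // leave headroom for everything else
    }

    static void Main()
    {
        Console.WriteLine(CanUseLargeParticleCount());
    }
}
```

Querying the OS for actual available address space (rather than managed-heap usage) would be more robust, but would need a platform call rather than the GC API.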