Future of precipitation.
#16
Posted 21 June 2015 - 07:42 PM
Initially, I was able to improve the situation by changing MaxIntensityPPSPM2 to a lower value, and this worked until I took advantage of SLI's 50% off sale and purchased some train sets. The first set I ran was SLI's GP9u with their default consist. With clear skies and no precipitation, I started getting out-of-memory errors (LAA disabled), even though MaxIntensityPPSPM2 was set at 0.025f. Another consist with a different locomotive would work, but not this one. I am not blaming the locomotive, but I realized at this point that there is a resource issue. Lowering MaxIntensityPPSPM2 to 0.015f would work, but at that point we are defeating the purpose of the project.
The cause could be anything. I do know that the precipitation process must initialize the index no matter what, so even with clear skies, resources are still being used. There is always the possibility that OR is unable to manage resources properly, or that with everything else running, the LAA process is mandatory. Any thoughts or ideas?
Edit: I initially used 300 meters for the precipitation box height. Changing it to 250 would work, but there is still the question of how memory is being used.
Edward K.
#17
Posted 22 June 2015 - 01:10 PM
Edward K.
#18
Posted 22 June 2015 - 09:35 PM
Note: the test is only for the 32bit graphics device.
Here are the new values if Large Address Aware is not checked:
// 32bit Box Parameters for systems that are unable to take advantage of LAA
const float ParticleBoxLengthM_Limited_32 = 1970;
const float ParticleBoxWidthM_Limited_32 = 1000;
const float ParticleBoxHeightM_Limited_32 = 160;
public const float MaxIntensityPPSPM2_Limited = 0.015f;
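For what it's worth, here is a rough sketch of how these box dimensions and MaxIntensityPPSPM2 could determine the live particle count (and hence the memory footprint). The formula and the 8 m/s fall speed are my assumptions for illustration, not the actual Open Rails code:

```csharp
using System;

class ParticleBudgetSketch
{
    // Constants from the post above (non-LAA, 32bit case).
    const float ParticleBoxLengthM_Limited_32 = 1970f;
    const float ParticleBoxWidthM_Limited_32 = 1000f;
    const float ParticleBoxHeightM_Limited_32 = 160f;
    const float MaxIntensityPPSPM2_Limited = 0.015f;

    // Assumed fall speed; the real simulator may use a different value.
    const double FallSpeedMpS = 8.0;

    public static int MaxParticles(double intensityPPSPM2, double lengthM,
                                   double widthM, double heightM, double fallSpeedMpS)
    {
        // Particles spawned per second over the box footprint, multiplied by
        // how long each particle survives before falling out of the box.
        double spawnRatePPS = intensityPPSPM2 * lengthM * widthM;
        double lifetimeS = heightM / fallSpeedMpS;
        return (int)Math.Round(spawnRatePPS * lifetimeS);
    }

    static void Main()
    {
        Console.WriteLine("Max live particles: {0:N0}",
            MaxParticles(MaxIntensityPPSPM2_Limited, ParticleBoxLengthM_Limited_32,
                         ParticleBoxWidthM_Limited_32, ParticleBoxHeightM_Limited_32,
                         FallSpeedMpS));
    }
}
```

Under these assumptions, raising either the intensity or the box height raises the live particle count linearly, which is consistent with both knobs affecting the out-of-memory behaviour.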
Edward K.
Attached File(s)
-
Weather3169.zip (1.24MB)
Number of downloads: 187
#19
Posted 23 June 2015 - 01:04 PM
The current/existing code allocations look like this:
Allocation for 16,124 particles:
  2,579,840 B RAM vertex data
  193,488 B RAM index data (temporary)
  2,579,840 B VRAM DynamicVertexBuffer
  193,488 B VRAM IndexBuffer
That's 2.6MB, or 0.1% of the 32bit virtual address space.
With the patch applied and the larger index buffers in use, the allocations look like this:
Allocation for 1,627,614 particles:
  260,418,240 B RAM vertex data
  39,062,736 B RAM index data (temporary)
  260,418,240 B VRAM DynamicVertexBuffer
  39,062,736 B VRAM IndexBuffer
That's 285MB, or 14% of the 32bit virtual address space, and the same on the graphics card.
It's no wonder you're running out of memory. :)
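The per-particle costs implied by those figures can be cross-checked: 2,579,840 / 16,124 = 160 B of vertex data per particle (consistent with, say, 4 vertices of 40 B each, though that split is my assumption; only the 160 B total follows from the trace), and 193,488 / 16,124 = 12 B of index data (6 ushort indices), while the patched build uses 6 uint indices at 24 B per particle:

```csharp
using System;
using System.Diagnostics;

class AllocationCheck
{
    // 160 B of vertex data per particle, as implied by the trace output.
    public static long VertexBytes(long particles) => particles * 160;

    // 6 indices per particle; 2 B (ushort) for small buffers, 4 B (uint)
    // once the particle count exceeds the 16-bit index range.
    public static long IndexBytes(long particles, int bytesPerIndex)
        => particles * 6 * bytesPerIndex;

    static void Main()
    {
        Debug.Assert(VertexBytes(16_124) == 2_579_840);
        Debug.Assert(IndexBytes(16_124, sizeof(ushort)) == 193_488);
        Debug.Assert(VertexBytes(1_627_614) == 260_418_240);
        Debug.Assert(IndexBytes(1_627_614, sizeof(uint)) == 39_062_736);
        Console.WriteLine("All four figures match.");
    }
}
```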
The information can be shown with the following line (with the patch):
Trace.TraceInformation(String.Format(
    "Allocation for {0:N0} particles:\n"
    + " {1,13:N0} B RAM vertex data\n"
    + " {2,13:N0} B RAM index data (temporary)\n"
    + " {1,13:N0} B VRAM DynamicVertexBuffer\n"
    + " {2,13:N0} B VRAM IndexBuffer",
    MaxParticles,
    Marshal.SizeOf(typeof(ParticleVertex)) * MaxParticles * VerticiesPerParticle,
    (graphicsDevice.GraphicsDeviceCapabilities.MaxVertexIndex > 0xFFFF
        ? sizeof(uint) : sizeof(ushort)) * MaxParticles * IndiciesPerParticle));
#20
Posted 23 June 2015 - 01:17 PM
What steps do I have to take to do what you just did?
Edit: Never mind. The answer was right in front of me at the very bottom :)
Edward K.
#21
Posted 24 June 2015 - 12:33 AM
#22
Posted 24 June 2015 - 02:33 AM
I split the 32bit process into non-LAA usage and LAA usage. What would be the ideal memory usage? The critical situation is evidently when the LAA option is not used. The information below is from the latest executable that I uploaded for testing.
non LAA usage
Allocation for 472,799 particles:
  75,647,840 B RAM vertex data
  11,347,176 B RAM index data (temporary)
  75,647,840 B VRAM DynamicVertexBuffer
  11,347,176 B VRAM IndexBuffer
83MB used.
LAA usage
Allocation for 984,605 particles:
  157,536,800 B RAM vertex data
  23,630,520 B RAM index data (temporary)
  157,536,800 B VRAM DynamicVertexBuffer
  23,630,520 B VRAM IndexBuffer
173MB used.
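The quoted totals tally with the per-buffer figures (vertex plus index data, counted once per side):

```csharp
using System;

class TotalsCheck
{
    // Sum of one side's vertex and index buffer sizes in bytes.
    public static long TotalBytes(long vertexB, long indexB) => vertexB + indexB;

    static void Main()
    {
        // non-LAA: 75,647,840 + 11,347,176 B, about 83 MB
        Console.WriteLine("{0:N1} MB", TotalBytes(75_647_840, 11_347_176) / 1048576.0);
        // LAA: 157,536,800 + 23,630,520 B, about 173 MB
        Console.WriteLine("{0:N1} MB", TotalBytes(157_536_800, 23_630_520) / 1048576.0);
    }
}
```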
Edward.
#23
Posted 24 June 2015 - 11:41 AM
Hamza97, on 24 June 2015 - 12:33 AM, said:
If you have any suggestions for open world games (train sim or not) that have good weather effects, I'm sure we can see what they're up to. :)
edwardk, on 24 June 2015 - 02:33 AM, said:
The ideal memory usage is 0. ;)
Seriously though, there isn't a fixed number any of us can say is the right limit. It's always a balance between good effects and everything else that needs to appear visually. That said, given we're only using a couple of MB currently, I don't think we want to increase it much beyond 25-50MB because that is 10-20x - a sizable increase.
You may be able to change the sizes of the precipitation area to further improve things without increasing the particles beyond the 25-50MB limit too - we may well not be making optimal use of the particles. You may have done this already, though.
And indeed, it's likely that we should be using different methods for near and far precipitation (such as textures with multiple particles drawn on) but that's something that will have to come later.
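Using the per-particle costs from post #19 (160 B of vertex data plus 24 B of uint index data), the suggested 25-50 MB budget caps the particle count at very roughly 140,000-285,000. The arithmetic below is mine, not a figure from the thread:

```csharp
using System;

class BudgetCap
{
    // 160 B vertex + 24 B uint index data per particle, per post #19.
    const int BytesPerParticle = 160 + 24;

    // How many particles fit in a given per-buffer memory budget.
    public static long ParticleCap(long budgetMB)
        => budgetMB * 1024 * 1024 / BytesPerParticle;

    static void Main()
    {
        Console.WriteLine("25 MB -> {0:N0} particles", ParticleCap(25));
        Console.WriteLine("50 MB -> {0:N0} particles", ParticleCap(50));
    }
}
```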
#24
Posted 24 June 2015 - 12:08 PM
James Ross, on 24 June 2015 - 11:41 AM, said:
The ideal memory usage is 0. :sweatingbullets:
Seriously though, there isn't a fixed number any of us can say is the right limit. It's always a balance between good effects and everything else that needs to appear visually. That said, given we're only using a couple of MB currently, I don't think we want to increase it much beyond 25-50MB because that is 10-20x - a sizable increase.
You may be able to change the sizes of the precipitation area to further improve things without increasing the particles beyond the 25-50MB limit too - we may well not be making optimal use of the particles. You may have done this already, though.
And indeed, it's likely that we should be using different methods for near and far precipitation (such as textures with multiple particles drawn on) but that's something that will have to come later.
You are correct that I have done it already. Both the precipitation area and the intensity setting go hand in hand, so at this point it's a matter of balancing both. The reason I asked about memory usage is that there are systems in use that are unable to take advantage of LAA; memory would already be tight enough with the other processes running. I can bring the non-LAA usage down further without sacrificing too much. I can do the same for LAA usage, but it would be nice to leave enough eye candy. Besides, I have pushed my system using LAA and have not experienced any crashing. The area that will take a hit is the frame rate; in this situation, it's possible a high-end system may be able to work with the increased particle count.
As far as improvements are concerned, I have a feeling that we would have to switch to a different graphic engine for this.
Edward K.
#25
Posted 24 June 2015 - 12:30 PM
edwardk, on 24 June 2015 - 12:08 PM, said:
This statement makes no sense to me. What makes you think we cannot do better with XNA and DirectX 9? Don't forget that we're just using XNA to access DirectX, we're not using any of its game features - so we're not limited by them either. We can't use DirectX 10 and 11 features, but we're not exactly pushing DirectX 9 here at all (we're not even using instancing!).
#26
Posted 27 June 2015 - 10:02 PM
For the non-LAA process, I adjusted the box size and intensity value so that it's taking up about 54MB to 56MB. For LAA, I made minor changes even though there were no problems.
James, I left the trace command in place.
Edward K.
Attached File(s)
-
3180.zip (1.24MB)
Number of downloads: 181
#27
Posted 01 July 2015 - 09:48 AM
edwardk, on 27 June 2015 - 10:02 PM, said:
James, I left the the trace command in place.
That size seems good - though we will of course monitor for feedback. I've updated and approved the blueprint. Please remove the trace command or put it inside an #if which is undefined.
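One way to implement that request, sketched here with an arbitrary symbol name (DEBUG_PRECIPITATION is my invention, not an Open Rails convention): the trace only compiles when the symbol is defined, e.g. via csc /define:DEBUG_PRECIPITATION.

```csharp
using System;
using System.Diagnostics;

class PrecipitationTraceSketch
{
    // Reports whether the diagnostic trace was compiled into this build.
    public static bool TracingCompiledIn()
    {
#if DEBUG_PRECIPITATION
        return true;
#else
        return false;
#endif
    }

    static void Main()
    {
#if DEBUG_PRECIPITATION
        // Only present in builds where the symbol is defined.
        Trace.TraceInformation("Allocation for {0:N0} particles", 16124);
#endif
        Console.WriteLine("Tracing compiled in: {0}", TracingCompiledIn());
    }
}
```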
#28
Posted 01 July 2015 - 11:12 AM
Edward K.
#30
Posted 02 July 2015 - 11:27 AM
However there is an issue:
To decide whether the player has two or more GB of memory at his disposal, the bool Program.Simulator.Settings.UseLargeAddressAware is now used. In my opinion this does not reflect the real memory situation, as it is possible to use the LAA executable and yet have only two GB available, when on 32-bit Windows the LAA switch has not been set. Instead of the above bool, GC.GetTotalMemory(false) / 1024 / 1024 should be used, as it is in HUDWindow.cs, line 788.
I am a typical case of a person using the LAA executable (because it is the default) but not having enabled my 32-bit Windows to extend applications to 3 GB (not having it enabled is the default, and may also be advisable if you have only 4 GB of RAM). As I am using the defaults for my configuration, I think I am not the only one.
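A sketch of that proposal follows. The threshold, the helper name, and the Environment.Is64BitProcess shortcut are illustrative only; note also that GC.GetTotalMemory reports the managed heap currently in use, not the total address space available, so a real fix might need a different probe:

```csharp
using System;

class MemoryGateSketch
{
    // Decide from measured usage rather than the UseLargeAddressAware setting.
    public static bool IsConstrained(long managedHeapMB, bool is64BitProcess)
    {
        // Illustrative threshold: treat a 32-bit process already past 1 GB of
        // managed heap as too close to a possible 2 GB address-space ceiling.
        return !is64BitProcess && managedHeapMB >= 1024;
    }

    static void Main()
    {
        long managedMB = GC.GetTotalMemory(false) / 1024 / 1024;
        Console.WriteLine("Constrained: {0}",
            IsConstrained(managedMB, Environment.Is64BitProcess));
    }
}
```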