My game runs nicely at about 50-60 FPS (30/60 when fullscreened). The FPS is steady, and the triangle count is always around 250k-300k, which is perfectly OK for LE (yay us).
I have spectator code and third-person controller code, and I can switch between the two while exploring my scene by pressing Tab.
However, at RANDOM times (not coinciding with higher triangle counts, a regular interval, or a recurring operation), I get a major drop (about -20 FPS) lasting roughly one second. The movement gets jerky, then smooth again. This happens every 10 to 20 seconds.
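To timestamp the spikes precisely and correlate them with anything else going on, I'm adding a small frame-time logger along these lines (just a sketch: it assumes my main loop can call a per-frame method, and FrameTimer/Tick are placeholder names I made up):

```csharp
using System;
using System.Diagnostics;

// Logs any frame that takes noticeably longer than the recent average,
// so each spike gets an exact timestamp to compare against other events.
static class FrameTimer
{
    static readonly Stopwatch watch = Stopwatch.StartNew();
    static long lastTicks = watch.ElapsedTicks;
    static double averageMs = 16.0; // rough starting baseline at ~60 FPS

    // Call once per frame from the main loop.
    public static void Tick()
    {
        long now = watch.ElapsedTicks;
        double frameMs = (now - lastTicks) * 1000.0 / Stopwatch.Frequency;
        lastTicks = now;

        // Flag frames that take more than twice the recent average.
        if (frameMs > averageMs * 2.0)
            Console.WriteLine($"Spike: {frameMs:F1} ms at {DateTime.Now:HH:mm:ss.fff}");

        // Exponential moving average keeps the baseline current.
        averageMs = averageMs * 0.95 + frameMs * 0.05;
    }
}
```

If the spikes show up in the log every 10-20 seconds no matter what is on screen, at least I'll have exact timestamps to line up against everything else.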
It cannot be my scene or triangle count, because the drop does not correlate with the number of triangles on screen (I get the lag at 100K tris just as often as at 300K tris).
It cannot be my spectator code, because the drop occurs on the third person controller as well.
It cannot be my third person controller code, because the drop occurs on the spectator code as well.
It cannot be my networking, because it runs on another thread and nothing is happening in it at all: no packets sent or received.
It cannot be the creation of any class or object, as, well, nothing gets created in code at random intervals.
I highly doubt it is .NET or some kind of garbage collector acting on LE or my code. Again, nothing gets created at random intervals, and most of the code is static, so there shouldn't be much for the garbage collector to do. If you are a .NET performance hater, please ignore this sentence and move on as if I were coding in C++ or BlitzMax; I don't want the whole thread to turn into a .NET performance debate. I have had really smoothly running demos with C#/.NET before.
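That said, rather than guessing, I can rule the GC in or out for good by sampling GC.CollectionCount each frame and checking whether a collection ever lines up with a spike (again just a sketch; GcProbe and its Tick method are names I made up, only GC.CollectionCount is real .NET API):

```csharp
using System;

// Snapshots GC collection counts every frame; if a spike coincides with a
// jump in any generation's count, the garbage collector is the culprit.
static class GcProbe
{
    static int gen0, gen1, gen2;

    // Call once per frame, e.g. right after the frame-time logger above.
    public static void Tick()
    {
        int g0 = GC.CollectionCount(0);
        int g1 = GC.CollectionCount(1);
        int g2 = GC.CollectionCount(2);

        if (g0 != gen0 || g1 != gen1 || g2 != gen2)
            Console.WriteLine($"GC ran: gen0={g0} gen1={g1} gen2={g2}");

        gen0 = g0; gen1 = g1; gen2 = g2;
    }
}
```

If a GC line ever prints right when the movement goes jerky, the collector is the culprit after all; if the spikes happen with no GC line anywhere near them, I can drop this theory entirely.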
PLEASE (and I'm begging here) help me locate the source of this lag. It is extremely unpleasant and annoying, both visually and for gameplay, because it is not constant but comes and goes at random.