Your "clean" version contains 50 ms of data, while the "messy" version contains 2000 ms. If both carry the same kind of signal, the display is actually behaving correctly: in the messy version you simply have more peaks than pixel columns, so neighboring peaks blur together.
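To see why more peaks than pixel columns changes the look of a graph, here is a minimal sketch (plain Python, not LabVIEW; the function name and column count are illustrative assumptions) of the min/max-style decimation a plot effectively performs when several samples land in one pixel column:

```python
def minmax_decimate(samples, n_columns):
    """Collapse samples into per-pixel-column (min, max) pairs.

    When len(samples) <= n_columns, each sample gets its own column and
    individual peaks stay visible. When len(samples) > n_columns, each
    column only shows the envelope of several samples, so dense peaks
    merge into a solid band -- the "messy" look.
    """
    per_col = max(1, len(samples) // n_columns)
    return [
        (min(samples[i:i + per_col]), max(samples[i:i + per_col]))
        for i in range(0, len(samples), per_col)
    ]

# Example: 2000 samples squeezed into 500 pixel columns -> 4 samples
# per column; only the envelope of each group of 4 survives.
envelope = minmax_decimate(list(range(2000)), 500)
```

With 50 samples across the same 500 columns there is at most one sample per column, so nothing is lost; at 2000 samples the per-column envelopes dominate, which is why the longer trace looks different even if the underlying signal is the same.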
If writing an empty array produces 2 seconds' worth of real data, we need to figure out where that data is coming from; something is definitely wrong. Could you post a stripped-down demo VI that exhibits the problem? Also, which LabVIEW version are you using?