paulirish/memory-stats.js

Max 30MB?


Neat! Sadly the chart seems to max out at 30MB, which is ... well below my project's memory consumption :) It seems like it's tracking the max memory seen so far but not using it to scale the chart.

Hey @rsturgell ! I'm not getting the same behaviour on my end. Have you tried running the demo? What happens?

With the demo I see a nice increasing curve until it hits the top of the chart, then it just repeats the curve over and over. Looks like this is because as soon as the value is over 30MB it fails to reset the heights (because it tries to set a negative value, which is apparently ignored). In fact I can see this behavior in the animated gif in the readme as well.

Ah yes, I believe that is actually by design (see these lines). I'm pretty sure this is to ensure that you can see changes in memory more readily as the base amount of memory consumption gets higher. Going from a scale of 0 to the max value would not be as helpful for applications with a high consumption baseline, as the graph would only visibly register large memory changes. @paulirish can you confirm this behaviour is expected?
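To put rough (hypothetical) numbers on that argument: on a 30px graph scaled from 0 to the max seen, a small allocation all but disappears once the max is large.

var GRAPH_HEIGHT = 30;                         // px, as in the library
var maxMB = 500, deltaMB = 2;                  // hypothetical 500MB app, 2MB change
console.log( deltaMB / maxMB * GRAPH_HEIGHT ); // 0.12px: effectively invisible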

@rsturgell what are your suggestions for an "alternative scaling behaviour"?

I agree that this is working "as coded", but surely it's not "by design" that old values are recycled when you go over 30 MB! That's very misleading. If you desire a fixed max of 30MB for the graph, at least change:

var height = Math.min( 30, 30 - normValue * 30 );

to

var height = Math.max(0, Math.min( 30, 30 - normValue * 30 ));

so that the graph pegs at "max" instead of showing random data when you go over 30MB. I would personally go ahead and scale to the max seen so far. But if you rescale, you need to deal with the incorrectly scaled old data. You could just go through and rescale the old values whenever you adjust the max (to avoid doing that every frame, maybe grow the max by 1.5x when you hit it, or something).
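For illustration, a rescale-on-overflow variant of updateGraph might look something like this (a sketch only: the 1.5x factor and the bar-recycling shape are assumptions layered on the library's approach, not a tested patch):

var GRAPH_HEIGHT = 30; // px
var maxMB = 30;        // current full-scale value

function updateScaledGraph( dom, valueMB ) {
    if ( valueMB > maxMB ) {
        var oldMax = maxMB;
        maxMB = valueMB * 1.5; // headroom so we don't rescale every frame
        // Shrink the bars already on screen so old and new samples share
        // the same scale. Bar height encodes the empty space above the
        // sample, so recover each implied value and recompute its height.
        for ( var i = 0; i < dom.children.length; i ++ ) {
            var child = dom.children[ i ];
            var oldHeight = parseFloat( child.style.height ) || 0;
            var impliedMB = ( 1 - oldHeight / GRAPH_HEIGHT ) * oldMax;
            child.style.height = ( GRAPH_HEIGHT - impliedMB / maxMB * GRAPH_HEIGHT ) + 'px';
        }
    }
    // Clamped as suggested above, so an overflow can never go negative.
    var height = Math.max( 0, Math.min( GRAPH_HEIGHT, GRAPH_HEIGHT - valueMB / maxMB * GRAPH_HEIGHT ) );
    var child = dom.appendChild( dom.firstChild ); // recycle the oldest bar
    child.style.height = height + 'px';
}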

Ah, I see which part of the functionality you are referencing now :)

You are referencing the kind of "draw and forget" of older data that was set at a different scale, hence the sudden "dips" and the "34 MB = 64 MB" height collisions that happen when it crosses the 60 MB threshold (or any 30 MB increment), correct?

I think I'm not being clear about the behavior! The problem is that, once the memory goes over 30MB, all new samples are completely ignored. It's not a matter of a "different scale" (where in the code do you see the scale changing at all?).

The nature of the demo app (constantly and linearly increasing memory) makes it look like it's maybe a baseline or scale shift, but that's not the case. If the demo instead went up to, say, 33MB and then stopped allocating (so that you would expect the graph to flatline at some point), you would STILL see the sawtooth pattern in the graph, because at that point it's simply replaying the previous data.

Forgive my use of the word "scale", I was considering the changing baseline for height calculation and the difference from 30 as a changing "scale" when it comes to the presentation of the graph. Not a correct label at all (scale is always 1MB to 1px from what I'm seeing). Sorry about that.

So we can get on the same page, where in the code is the previous data being pulled from?

It re-uses the old data in updateGraph:

var child = dom.appendChild( dom.firstChild );
child.style.height = height + 'px';

dom.firstChild is the oldest sample. It gets moved to the end of the list to become the newest bar. Then it attempts to set the height of that bar, but when memory > 30MB a negative value is passed as the height, and setting the height to a negative value is apparently a no-op, so the previous (old) value is kept.
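You can see the no-op for yourself in a console (standard CSS behaviour: an invalid negative length is rejected and the previous value survives):

var el = document.createElement( 'div' );
el.style.height = '12px';
el.style.height = '-5px';       // negative height is invalid CSS, silently ignored
console.log( el.style.height ); // "12px" -- the stale height is what gets drawn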

Cool we're definitely on the same page now. Sorry it took so long to get here.

Moving on, it looks like the height calculation logic needs a fix.

@paulirish would you be able to provide some insight into what the "over 30MB" logic should be doing?

From the code it looks like the intent was what @rsturgell initially suggested: max out above 30MB. If the hard-limit approach is the way to go, there should probably be options for changing the limit and baseline for projects with high memory consumption.
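Something like this, say (a hypothetical options shape; the current constructor takes no options, and the names here are made up):

var stats = new MemoryStats( {
    maxMemoryMB: 512, // hard ceiling for the graph instead of 30
    baselineMB: 128   // known floor to subtract so small changes stay visible
} );
document.body.appendChild( stats.domElement );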

Seems like a bug. I only inherited the logic here, so I can't explain it.

I don't know that we need a hard limit; better to just scale it as needed.

fixed in #30

thanks @tshaddix.