Re: Memory Profiler for Heap Analysis
On 13.06.2007 12:54, Moritz Wissenbach wrote:
Hi Robert,
I don't think something like that exists. And if something like this
did exist, how would it properly calculate sizes? How many levels of
references would it follow when counting?
Until it hits a cycle? And what if it never does?
How do you count objects that are referenced by multiple other
instances, etc.?
Add the object's size to both instances...?
Remember that the number of referencing instances is not restricted to
2; it could be much higher.
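For what it's worth, one common answer to these questions is to count each object exactly once, no matter how many references point to it, by tracking visited objects with identity semantics; cycles then terminate naturally. Here is a minimal reflection-based sketch of that idea (the class names ReachableCounter and Node are made up for illustration; a real profiler would work on the heap via JVMTI, not reflection):

```java
import java.lang.reflect.Array;
import java.lang.reflect.Field;
import java.lang.reflect.Modifier;
import java.util.ArrayDeque;
import java.util.Deque;
import java.util.IdentityHashMap;

// Hypothetical sketch: count every object reachable from a root exactly once.
// The IdentityHashMap serves as the "visited" set, so cycles terminate and
// objects shared by several referrers are not attributed twice.
public class ReachableCounter {
    public static int countReachable(Object root) {
        IdentityHashMap<Object, Boolean> visited = new IdentityHashMap<>();
        Deque<Object> stack = new ArrayDeque<>();
        if (root != null) stack.push(root);
        int count = 0;
        while (!stack.isEmpty()) {
            Object o = stack.pop();
            if (visited.put(o, Boolean.TRUE) != null) continue; // already counted
            count++;
            Class<?> c = o.getClass();
            if (c.isArray()) {
                // Only object arrays hold further references.
                if (!c.getComponentType().isPrimitive()) {
                    for (int i = 0; i < Array.getLength(o); i++) {
                        Object e = Array.get(o, i);
                        if (e != null) stack.push(e);
                    }
                }
                continue;
            }
            // Walk all instance reference fields, including inherited ones.
            for (; c != null; c = c.getSuperclass()) {
                for (Field f : c.getDeclaredFields()) {
                    if (Modifier.isStatic(f.getModifiers())
                            || f.getType().isPrimitive()) continue;
                    f.setAccessible(true);
                    try {
                        Object e = f.get(o);
                        if (e != null) stack.push(e);
                    } catch (IllegalAccessException ex) {
                        throw new RuntimeException(ex);
                    }
                }
            }
        }
        return count;
    }
}

// Tiny demo type so the cycle case is easy to construct.
class Node { Node next; }
```

The same traversal could sum per-object sizes instead of counting, but computing accurate sizes (object headers, alignment) is exactly the part that needs VM support.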
Well, I was sceptical too as to whether this was even possible, but
when I asked for this feature in the TPTP (eclipse profiler) group, they
replied:
This is a valid and useful use-case for memory profiling. While
technically possible (both in theory and in practice), it is not
currently in the plan for TPTP.
So I thought this might be implemented, if not in a free, then in a
commercial tool.
Of course it's not a trivial task (and probably not a fast one, for
that matter), but I guess you could somehow work around the problems.
I assume the problems are only solvable on a theoretical level; I don't
see how they could be solved in practice. But maybe I'm missing
something - the TPTP folks surely have a better understanding of the
matter.
I don't want a 100% accurate size count, just an overview of which
classes use up the most memory. I find that kind of hard to see with the
standard "heap histogram", but maybe I'm missing something.
Does anyone have any other suggestions? As stated, I have an application
(an editor) that uses too much memory (about 21 MB for each MB of file
loaded) and I want to see where all of it goes (DOM, graphical
representation, etc.).
DOM is a good candidate. If I were you, I'd define an application
object model, throw away the DOM parsing, and write a SAX handler that
builds the model directly. Just my 0.02 EUR...
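To illustrate that suggestion: a minimal sketch of a SAX handler that builds a flat application model straight from the parse events, never materializing a DOM tree. The names ModelBuilder and Entry are hypothetical, and the flat element/text list is just a stand-in for whatever model the editor actually needs:

```java
import java.io.InputStream;
import java.util.ArrayList;
import java.util.List;
import javax.xml.parsers.SAXParserFactory;
import org.xml.sax.Attributes;
import org.xml.sax.helpers.DefaultHandler;

// Hypothetical sketch: collect (element, text) pairs for leaf elements
// directly from SAX events, instead of holding a full DOM in memory.
public class ModelBuilder extends DefaultHandler {
    public static final class Entry {
        public final String element;
        public final String text;
        Entry(String element, String text) { this.element = element; this.text = text; }
    }

    private final List<Entry> model = new ArrayList<>();
    private String current;                       // qName of the innermost open element
    private final StringBuilder buf = new StringBuilder();

    @Override public void startElement(String uri, String local, String qName,
                                       Attributes atts) {
        current = qName;
        buf.setLength(0);                         // text belongs to the new element
    }

    @Override public void characters(char[] ch, int start, int len) {
        buf.append(ch, start, len);
    }

    @Override public void endElement(String uri, String local, String qName) {
        // Record only leaf elements: the innermost open element is being closed.
        if (qName.equals(current)) model.add(new Entry(qName, buf.toString().trim()));
    }

    public static List<Entry> parse(InputStream in) {
        try {
            ModelBuilder handler = new ModelBuilder();
            SAXParserFactory.newInstance().newSAXParser().parse(in, handler);
            return handler.model;
        } catch (Exception e) {
            throw new RuntimeException(e);
        }
    }
}
```

Memory use then scales with the size of the model, not with the size of the document plus a DOM on top of it.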
Kind regards
robert