I am profiling an application with Instruments. The profiling is done with the Allocations tool in two ways:
- by choosing the Allocations template directly when I run the app for profiling
- by choosing the Leaks template when I run the app for profiling.
In both cases the Allocations instrument was enabled for testing, but surprisingly I got two different kinds of output from it in these two runs.
Are they supposed to behave differently, or is this a problem with Instruments?
When I profile with the Leaks template:
In the Allocations graph:
1. I get a lot of peaks in the graph, and Live Bytes and Overall Bytes are the same.
2. I get black flags (I believe they indicate memory warnings) after about one minute of usage. Then, after a series of flags appears, my app crashes. (This happens at times even when I run the app directly on the device.)
When I profile with the Allocations template:
In the Allocations graph:
1. I do not get peaks as often as in the case above, and Live Bytes is always far less than Overall Bytes.
2. I have used the app for over 20 minutes and never got black flags.
One fact I came across is that when Live Bytes and Overall Bytes are equal, NSZombieEnabled may be turned on, since zombies keep deallocated objects around and memory is never actually released.
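To rule this out in my own runs, I could log at launch whether zombies are active. This is just a minimal sketch, assuming NSZombieEnabled is passed to the process as an environment variable (as the scheme's zombie setting does); `logZombieState` is my own helper name:

```swift
import Foundation

// Minimal sketch: read the NSZombieEnabled environment variable at launch to see
// whether zombies are active for this run.
func logZombieState() {
    let zombies = ProcessInfo.processInfo.environment["NSZombieEnabled"] ?? "NO"
    // "YES" means deallocated objects are kept around as zombies and their memory is
    // never freed, which would explain Live Bytes staying equal to Overall Bytes.
    print("NSZombieEnabled = \(zombies)")
}
```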
Have any of you encountered this problem?
UPDATE 1:
I faced another problem with the first case. Whenever I profiled, after a short duration (compared to profiling in the second case) the app got lots of black flags and crashed due to a memory warning.
Yet when I used the application in the same step-by-step way, it did not crash and showed no flags.
Why this discrepancy?
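For reference, one way I could match the black flags against events seen inside the app is to log every memory warning with a timestamp. This is only a sketch, assuming a plain UIKit app; `MemoryWarningLogger` is my own name, and it simply observes `UIApplication.didReceiveMemoryWarningNotification`:

```swift
import UIKit

// Minimal sketch: log every memory warning with a timestamp so the black flags in
// Instruments can be compared with what the app actually receives.
final class MemoryWarningLogger {
    private var token: NSObjectProtocol?

    func start() {
        token = NotificationCenter.default.addObserver(
            forName: UIApplication.didReceiveMemoryWarningNotification,
            object: nil,
            queue: .main
        ) { _ in
            print("Memory warning received at \(Date())")
        }
    }

    deinit {
        if let token = token {
            NotificationCenter.default.removeObserver(token)
        }
    }
}

// Usage: create one logger early (e.g. in the app delegate), keep a strong reference,
// and call start() once.
// let memoryWarningLogger = MemoryWarningLogger()
// memoryWarningLogger.start()
```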