9 votes
#import <UIKit/UIKit.h>

int main(int argc, char *argv[]) {
    // Top-level pool that exists for the entire run of the app
    NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];
    int retVal = UIApplicationMain(argc, argv, nil, nil);
    // Only released once UIApplicationMain returns, i.e. when the app exits
    [pool release];
    return retVal;
}

The main method calls release on the pool after the application exits, which in turn sends release to every object in the pool. But autoreleased objects created while the application is running clearly don't stick around until it exits, so at some point during the run loop the pool must be drained or released (in the context of the iPhone, drain == release... unless I need to be corrected on this point!). But does anybody know for certain when this happens? It would seem logical for the pool to be drained at the end of a run loop cycle, and a new one to be allocated at the beginning of the next, but I can't find any definitive information on this. Here's a discussion on the Apple forums, but it seems highly speculative (not to mention contentious, towards the end). Can anyone give me an answer, ideally with evidence from documentation or source code (or even an experimental program)?

the particular pool that you reference isn't drained until the end of the application, but each invocation of the run loop creates its own pool, nested inside the application's pool. – cobbal
That makes sense. So, if I never create my own pool, a given application would have two nested pools by default? – jakev
Yup; the outer pool and the cycling pool within the run loop. UIApplicationMain() might create/drain pools as an implementation detail, too. – bbum
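As a side note on the nesting mentioned in these comments, here is a minimal sketch (my own, not from Apple's docs) of how a manually created pool nests inside the run loop's pool; the function name and the allocation pattern are just illustrative:

#import <Foundation/Foundation.h>

// Called from somewhere inside one run-loop cycle; anything autoreleased here
// would normally live until that cycle's pool is drained.
void processLotsOfItems(void) {
    NSAutoreleasePool *innerPool = [[NSAutoreleasePool alloc] init];
    for (int i = 0; i < 100000; i++) {
        // Autoreleased into innerPool, the innermost (most recently created) pool
        NSString *temp = [NSString stringWithFormat:@"item %d", i];
        (void)temp; // ... do something with temp ...
    }
    // Releases the temporaries now, rather than waiting for the run loop's
    // own pool to be drained at the end of the current cycle.
    [innerPool drain];
}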

1 Answer

11 votes

From NSAutoreleasePool Class Reference:

The Application Kit creates an autorelease pool on the main thread at the beginning of every cycle of the event loop, and drains it at the end, thereby releasing any autoreleased objects generated while processing an event.
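If you want to see this for yourself, here is a rough sketch of an experiment (the class and controller names are my own invention, not anything from UIKit): autorelease an object that logs its own dealloc, and watch where that log line appears relative to your event handler returning.

#import <UIKit/UIKit.h>

// Logs when it is deallocated, i.e. when the pool that held it is drained.
@interface PoolProbe : NSObject
@end

@implementation PoolProbe
- (void)dealloc {
    NSLog(@"PoolProbe dealloc -- enclosing autorelease pool was just drained");
    [super dealloc];
}
@end

// Drop this action into any view controller in a test project and wire it
// to a button (the controller name here is made up).
@interface ProbeViewController : UIViewController
@end

@implementation ProbeViewController
- (IBAction)buttonTapped:(id)sender {
    PoolProbe *probe = [[[PoolProbe alloc] init] autorelease];
    NSLog(@"autoreleased %@ in buttonTapped:", probe);
    // The dealloc message appears after this method has returned, once the
    // current pass of the event loop finishes and its pool is drained.
}
@end

In my understanding this should print the dealloc line shortly after the handler returns, consistent with the documentation quoted above; it does not wait until the application exits.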