
I have just begun learning Core Data programming. I tried to make an example in which a table view displays a list of persons (properties: first name, last name). The table view relies on an NSFetchedResultsController to display the list of persons.

I followed the nested contexts pattern as follows:

Root Context (NSPrivateQueueConcurrencyType) <---> Main Context (NSMainQueueConcurrencyType) <---> Children Contexts (NSPrivateQueueConcurrencyType).
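
For reference, that stack can be set up roughly like this (a sketch; `coordinator` stands in for whatever NSPersistentStoreCoordinator the app already owns):

    // Root context: talks to the persistent store coordinator on a private queue.
    NSManagedObjectContext *rootContext =
        [[NSManagedObjectContext alloc] initWithConcurrencyType:NSPrivateQueueConcurrencyType];
    rootContext.persistentStoreCoordinator = coordinator;

    // Main context: child of the root, used by the UI and the NSFetchedResultsController.
    NSManagedObjectContext *mainContext =
        [[NSManagedObjectContext alloc] initWithConcurrencyType:NSMainQueueConcurrencyType];
    mainContext.parentContext = rootContext;

    // Worker context: child of the main context, used for imports via performBlock:.
    NSManagedObjectContext *workerContext =
        [[NSManagedObjectContext alloc] initWithConcurrencyType:NSPrivateQueueConcurrencyType];
    workerContext.parentContext = mainContext;
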

The child contexts are used to perform huge insertions/fetches (with the performBlock: method). When I perform a huge insertion (about 5,000 rows) and then save the child context, the main context, and the root context, I see that my UI is blocked until the save finishes.

Could anyone please tell me the best approach for making the application performant? Could anyone provide simple code showing how to perform huge fetches/insertions in the background without blocking the UI?

    [_indicator startAnimating];

    NSManagedObjectContext *aContext = [[NSManagedObjectContext alloc] initWithConcurrencyType:NSPrivateQueueConcurrencyType];
    aContext.parentContext = [[SDCoreDataController sharedInstance] mainManagedObjectContext];

    [aContext performBlock:^{

        NSError *error = nil;

        for (int i = 0; i < 5000; i++)
        {
            FootBallCoach *backgroundCoach = [NSEntityDescription insertNewObjectForEntityForName:@"FootBallCoach" inManagedObjectContext:aContext];

            backgroundCoach.firstName = [NSString stringWithFormat:@"José %i", i];
            backgroundCoach.lastName = [NSString stringWithFormat:@"Morinho %i", i];
            backgroundCoach.cin = [NSString stringWithFormat:@"%i", i];

            if (i % 50 == 0)
            {
                [aContext save:&error];
                [aContext reset];
            }
        }

        [[SDCoreDataController sharedInstance] saveMainContext];
        [[SDCoreDataController sharedInstance] saveRootContext];

        dispatch_async(dispatch_get_main_queue(), ^{
            [_indicator stopAnimating];
            [self refreshCoaches:nil];
        });
    }];

1 Answer


Don't do "huge" imports.
Each time a write operation hits the store, the NSPersistentStoreCoordinator locks the store against any other kind of operation. So if your UI tries to fetch data during that time, it will be blocked.

Segment your saves into batches of 100-200 objects (depending on object size and complexity).
The right segmentation really depends on your object graph structure; pseudocode would be:

Edit: I've edited the code to reflect a correction to your save process.
Your save to the store (the actual file) should also be segmented; otherwise you still end up with one "huge" save operation.

for (NSUInteger i = 0; i < LARGE_N; i += BATCH_SIZE)
{
    @autoreleasepool {
        // Array slice covering the current batch.
        batchInfo = importInfos[i : MIN(i + BATCH_SIZE, LARGE_N) - 1];
        // Use existing objects, or create new ones if needed;
        // use batch fetching to reduce the time spent finding existing items.
        batchInfo = createOrReuseItemsForBatchInfo(batchInfo);
        // You can also practice "weeding":
        //   create all items as newly inserted, then, once the batch insertion
        //   completes, find existing items, replace them with the newly inserted
        //   ones, and delete the duplicate inserted objects.
        // Save all the way up to the store:
        NSManagedObjectContext *ctx = context;
        __block BOOL saveSuccessful = YES;
        while (ctx && saveSuccessful) {
            NSManagedObjectContext *currentCtx = ctx;
            [currentCtx performBlockAndWait:^{
                NSError *error = nil;
                saveSuccessful = [currentCtx save:&error];
            }];
            ctx = ctx.parentContext;
        }
        // Handle an unsuccessful save here.
        [context reset];
        // You can discard processed objects from the importInfos array if you like.
    }
}
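
Applied to the insertion loop from the question, the batching could look like this (a sketch; `FootBallCoach`, `aContext`, and the 5,000-row loop come from the question, and `BATCH_SIZE` is an assumed tuning value):

    static const NSUInteger BATCH_SIZE = 100;

    [aContext performBlock:^{
        for (NSUInteger i = 0; i < 5000; i++) {
            @autoreleasepool {
                FootBallCoach *coach =
                    [NSEntityDescription insertNewObjectForEntityForName:@"FootBallCoach"
                                                  inManagedObjectContext:aContext];
                coach.firstName = [NSString stringWithFormat:@"José %lu", (unsigned long)i];
                coach.lastName  = [NSString stringWithFormat:@"Morinho %lu", (unsigned long)i];
                coach.cin       = [NSString stringWithFormat:@"%lu", (unsigned long)i];

                if ((i + 1) % BATCH_SIZE == 0) {
                    // Push this batch all the way to the store, one parent at a time,
                    // so no single save has to write 5,000 objects at once.
                    NSManagedObjectContext *ctx = aContext;
                    __block BOOL ok = YES;
                    while (ctx && ok) {
                        NSManagedObjectContext *saveCtx = ctx;
                        [saveCtx performBlockAndWait:^{
                            NSError *error = nil;
                            ok = [saveCtx save:&error];
                        }];
                        ctx = ctx.parentContext;
                    }
                    // Free memory held by the objects that were just saved.
                    [aContext reset];
                }
            }
        }
        dispatch_async(dispatch_get_main_queue(), ^{
            [_indicator stopAnimating];
            [self refreshCoaches:nil];
        });
    }];

Because the UI's main context only ever merges small batches, the main queue is blocked for short moments instead of one long save.
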