
The problem is that I need to modify (update/create/delete) anywhere from 0 to 10,000 NSManagedObject subclasses. Of course, if it's <= 1000, everything works fine. I'm using this code:

+ (void)saveDataInBackgroundWithBlock:(void (^)(NSManagedObjectContext *))saveBlock completion:(void (^)(void))completion {
    NSManagedObjectContext *tempContext = [self newMergableBackgroundThreadContext];
    [tempContext performBlock:^{

        if (saveBlock) {
            saveBlock(tempContext);
        }

        if ([tempContext hasChanges]) {
            [tempContext saveWithCompletion:completion];
        } else {
            dispatch_async(dispatch_get_main_queue(), ^{
                if (completion) {
                    completion();
                }
            });
        }
    }]; 
}

- (void)saveWithCompletion:(void(^)(void))completion {
    [self performBlock:^{
        NSError *error = nil;
        if ([self save:&error]) {
            NSNumber *contextID = [self.userInfo objectForKey:@"contextID"];
            if (contextID.integerValue == VKCoreDataManagedObjectContextIDMainThread) {
                dispatch_async(dispatch_get_main_queue(), ^{
                    if (completion) {
                        completion();
                    }
                });
            }
            [[self class] logContextSaved:self];
            if (self.parentContext) {
                [self.parentContext saveWithCompletion:completion];
            }
        } else {
            [VKCoreData handleError:error];
            dispatch_async(dispatch_get_main_queue(), ^{
                if (completion) {
                    completion();
                }
            });
        }
    }];
}

completion will be fired only when the main-thread context has been saved. This solution works perfectly, but:

When I receive more than 1000 entities from the server, I would like to parallelize the object processing, because the update operation takes too long (for example, updating 4500 entities takes about 90 seconds, and less than a third of that is spent receiving the JSON, so I spend roughly 60 seconds just drilling through NSManagedObjects). Without Core Data this is easy: use a dispatch_group_t to divide the data into subarrays and process them on different threads at the same time. But does anybody know how to do something similar with Core Data and NSManagedObjectContexts? Is it possible to work with an NSManagedObjectContext created with NSPrivateQueueConcurrencyType (iOS 5 style) without performBlock:? And what is the best way to save and merge about 10 contexts? Thanks!
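For reference, the plain-GCD slicing pattern mentioned above (without Core Data) looks roughly like this. This is a sketch: `processSlice:` is a hypothetical worker method, not part of the original code.

```objectivec
// Sketch of the dispatch_group_t approach described above: split the items
// into slices, process each slice concurrently, then wait for all of them.
// `processSlice:` is a hypothetical per-slice worker.
- (void)processItems:(NSArray *)items sliceCount:(NSUInteger)sliceCount {
    dispatch_group_t group = dispatch_group_create();
    dispatch_queue_t queue =
        dispatch_get_global_queue(DISPATCH_QUEUE_PRIORITY_DEFAULT, 0);
    NSUInteger sliceSize = (items.count + sliceCount - 1) / sliceCount;

    for (NSUInteger start = 0; start < items.count; start += sliceSize) {
        NSRange range = NSMakeRange(start, MIN(sliceSize, items.count - start));
        NSArray *slice = [items subarrayWithRange:range];
        dispatch_group_async(group, queue, ^{
            [self processSlice:slice]; // CPU-bound work, no shared mutable state
        });
    }
    dispatch_group_wait(group, DISPATCH_TIME_FOREVER);
}
```

This works because the slices share nothing; the question below is what happens when each slice must touch the same Core Data store.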

What are you doing inside saveBlock? Are you saving once per added entity? (Daniel Eggert)

Basically I'm updating the data for all received objects, tracking which were deleted or added, and changing some statuses by simply enumerating the NSManagedObjects and updating them with the received JSON dictionaries. (iiFreeman)

3 Answers

2 votes

By your description, it appears you are grasping at straws to recover performance.

Core Data file I/O performance is dominated by the single threaded nature of SQLite. Having multiple contexts beating on the same store coordinator is not going to make things go faster.

To improve performance, you need to do things differently. For example, you could batch your background writes into larger operations. (How? You need to do more in each GCD block before the save.) You can use Core Data's debugging tools to see what kind of SQL is being emitted by your fetches and saves. (There are lots of ways to improve CD fetch performance, fewer to improve saving.)
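A sketch of what batching might look like, under the assumption that the import currently materializes and saves objects one at a time: save the background context every N objects so each SQLite transaction covers many rows, and reset the context to cap memory. `updateManagedObjectFromJSON:inContext:` is a hypothetical stand-in for the per-object update code.

```objectivec
// Sketch: batch the import so each -save: commits many objects in one
// SQLite transaction. kBatchSize and the helper method are assumptions.
static const NSUInteger kBatchSize = 250;

- (void)importJSONObjects:(NSArray *)jsonObjects
              intoContext:(NSManagedObjectContext *)context {
    [context performBlock:^{
        NSUInteger i = 0;
        for (NSDictionary *json in jsonObjects) {
            [self updateManagedObjectFromJSON:json inContext:context];
            if (++i % kBatchSize == 0) {
                NSError *error = nil;
                [context save:&error]; // one transaction per batch
                [context reset];       // release materialized objects
            }
        }
        NSError *error = nil;
        [context save:&error];         // commit the remainder
    }];
}
```

To see the SQL Core Data emits, run the app with the launch argument `-com.apple.CoreData.SQLDebug 1`.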

1 vote

OK people, after finishing everything I wanted to implement, I discovered the following:

dispatch_group_t with separate private queues and NSManagedObjectContexts:

The format is "number of entities / seconds":

  • 333 /6
  • 1447/27
  • 3982/77

A single background thread (one NSManagedObjectContext with NSPrivateQueueConcurrencyType and performBlock:):

  • 333 /1
  • 1447/8
  • 3982/47

So I don't think I should try that again. There are also a lot of other issues, such as the app freezing while merging a large number of contexts (even in the background). I will try something else to improve performance.

0 votes

You can create multiple contexts and process a slice of your data on each one...?
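A minimal sketch of that suggestion: one private-queue context per slice, all attached to the same persistent store coordinator, with a dispatch_group to signal overall completion. `processSlice:inContext:` is a hypothetical per-slice update routine, and note the caveat from the first answer: SQLite serializes the actual writes, so the measurements above suggest this may not end up faster.

```objectivec
// Sketch: one NSPrivateQueueConcurrencyType context per slice of data,
// all sharing one coordinator. processSlice:inContext: is hypothetical.
- (void)processSlices:(NSArray *)slices
      withCoordinator:(NSPersistentStoreCoordinator *)coordinator
           completion:(void (^)(void))completion {
    dispatch_group_t group = dispatch_group_create();
    for (NSArray *slice in slices) {
        dispatch_group_enter(group);
        NSManagedObjectContext *context = [[NSManagedObjectContext alloc]
            initWithConcurrencyType:NSPrivateQueueConcurrencyType];
        context.persistentStoreCoordinator = coordinator;
        [context performBlock:^{
            [self processSlice:slice inContext:context];
            NSError *error = nil;
            [context save:&error]; // writes still serialize at the SQLite level
            dispatch_group_leave(group);
        }];
    }
    dispatch_group_notify(group, dispatch_get_main_queue(), completion);
}
```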