4 votes

This is a two-part question. I have the following code working, which grabs the current display surface and creates a video out of the captured surfaces (everything happens in the background).

for(int i=0;i<100;i++){
        // NOTE: adaptor (AVAssetWriterInputPixelBufferAdaptor), frameCount,
        // m_width/m_height and sampleBuffer (a CVPixelBufferRef) are declared
        // and set up outside this loop.
        IOMobileFramebufferConnection connect;
        kern_return_t result;
        IOSurfaceRef screenSurface = NULL;

        // Look up the framebuffer service; the class name differs per device,
        // so try the known CLCD variants in turn.
        io_service_t framebufferService = IOServiceGetMatchingService(kIOMasterPortDefault, IOServiceMatching("AppleH1CLCD"));
        if(!framebufferService)
            framebufferService = IOServiceGetMatchingService(kIOMasterPortDefault, IOServiceMatching("AppleM2CLCD"));
        if(!framebufferService)
            framebufferService = IOServiceGetMatchingService(kIOMasterPortDefault, IOServiceMatching("AppleCLCD"));

        // Open a connection to the framebuffer and get the IOSurface that
        // backs the display (layer 0).
        result = IOMobileFramebufferOpen(framebufferService, mach_task_self(), 0, &connect);
        result = IOMobileFramebufferGetLayerDefaultSurface(connect, 0, &screenSurface);

        // Lock the screen surface read-only while reading from it.
        uint32_t aseed;
        IOSurfaceLock(screenSurface, kIOSurfaceLockReadOnly, &aseed);
        uint32_t width = IOSurfaceGetWidth(screenSurface);
        uint32_t height = IOSurfaceGetHeight(screenSurface);
        m_width = width;
        m_height = height;

        // Describe a destination surface with the same geometry as the screen.
        CFMutableDictionaryRef dict;
        int pitch = width*4, size = width*height*4;
        int bPE = 4;                                  // bytes per element (pixel)
        char pixelFormat[4] = {'A','R','G','B'};      // read as a little-endian uint32 this is the 'BGRA' fourcc
        dict = CFDictionaryCreateMutable(kCFAllocatorDefault, 0, &kCFTypeDictionaryKeyCallBacks, &kCFTypeDictionaryValueCallBacks);
        CFDictionarySetValue(dict, kIOSurfaceIsGlobal, kCFBooleanTrue);
        CFDictionarySetValue(dict, kIOSurfaceBytesPerRow, CFNumberCreate(kCFAllocatorDefault, kCFNumberSInt32Type, &pitch));
        CFDictionarySetValue(dict, kIOSurfaceBytesPerElement, CFNumberCreate(kCFAllocatorDefault, kCFNumberSInt32Type, &bPE));
        CFDictionarySetValue(dict, kIOSurfaceWidth, CFNumberCreate(kCFAllocatorDefault, kCFNumberSInt32Type, &width));
        CFDictionarySetValue(dict, kIOSurfaceHeight, CFNumberCreate(kCFAllocatorDefault, kCFNumberSInt32Type, &height));
        CFDictionarySetValue(dict, kIOSurfacePixelFormat, CFNumberCreate(kCFAllocatorDefault, kCFNumberSInt32Type, pixelFormat));
        CFDictionarySetValue(dict, kIOSurfaceAllocSize, CFNumberCreate(kCFAllocatorDefault, kCFNumberSInt32Type, &size));

        IOSurfaceRef destSurf = IOSurfaceCreate(dict);

        // Copy the screen contents into our own surface with a hardware blit.
        IOSurfaceAcceleratorRef outAcc;
        IOSurfaceAcceleratorCreate(NULL, 0, &outAcc);
        IOSurfaceAcceleratorTransferSurface(outAcc, screenSurface, destSurf, dict, NULL);

        IOSurfaceUnlock(screenSurface, kIOSurfaceLockReadOnly, &aseed);
        CFRelease(outAcc);

        // MOST RELEVANT PART OF CODE

        // Wrap the copied surface's memory in a CVPixelBuffer (no extra copy
        // is made) and hand it to the adaptor at a timescale of 5 (i.e. 5 fps).
        CVPixelBufferCreateWithBytes(NULL, width, height, kCVPixelFormatType_32BGRA, IOSurfaceGetBaseAddress(destSurf), IOSurfaceGetBytesPerRow(destSurf), NULL, NULL, NULL, &sampleBuffer);

        CMTime frameTime = CMTimeMake(frameCount, (int32_t)5);
        [adaptor appendPixelBuffer:sampleBuffer withPresentationTime:frameTime];

        CFRelease(sampleBuffer);
        CFRelease(destSurf);
        frameCount++;
    }

P.S.: The last 4-5 lines of the loop above are the most relevant (if you need to filter).
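
For completeness, the adaptor, frameCount and sampleBuffer used above come from the usual AVAssetWriter setup done before the loop. The following is only a rough sketch of that setup; the output path, codec settings and dimensions are placeholders rather than my exact code:

// Sketch of the writer/adaptor setup the loop assumes (placeholders, not exact code).
NSError *error = nil;
AVAssetWriter *writer = [[AVAssetWriter alloc] initWithURL:[NSURL fileURLWithPath:@"/tmp/capture.mov"]
                                                  fileType:AVFileTypeQuickTimeMovie
                                                     error:&error];

NSDictionary *videoSettings = @{ AVVideoCodecKey  : AVVideoCodecH264,
                                 AVVideoWidthKey  : @640,      // placeholder screen size
                                 AVVideoHeightKey : @960 };
AVAssetWriterInput *writerInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo
                                                                     outputSettings:videoSettings];
writerInput.expectsMediaDataInRealTime = YES;

NSDictionary *sourceAttributes = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
AVAssetWriterInputPixelBufferAdaptor *adaptor =
    [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:writerInput
                                                                      sourcePixelBufferAttributes:sourceAttributes];

[writer addInput:writerInput];
[writer startWriting];
[writer startSessionAtSourceTime:kCMTimeZero];

int64_t frameCount = 0;        // presentation time numerator; timescale 5 gives 5 fps
CVPixelBufferRef sampleBuffer; // filled per frame inside the loop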

1) The video that is produced has artefacts. I have worked with video before and have encountered this kind of issue; I can think of two possible reasons for it:
i. The pixel buffer that is passed to the adaptor is being modified or released before the processing (encoding + writing) is complete. This can happen because the append is asynchronous. I am not sure whether this is actually the problem, or how to resolve it (one way I could rule it out is sketched just after this list).
ii. The timestamps that are passed are inaccurate (e.g. two frames having the same timestamp, or a frame having a lower timestamp than the previous one). I logged the timestamp values and this does not seem to be the problem.
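
If cause (i) is to blame, one thing I have not tried yet is to stop wrapping the IOSurface memory directly and instead copy the bytes into a pixel buffer taken from the adaptor's pool, so the appended buffer owns its own copy of the pixels. The snippet below is only a sketch of that idea, meant to replace the CVPixelBufferCreateWithBytes()/appendPixelBuffer: lines in the loop above; it assumes the same adaptor, destSurf, width, height and frameTime variables.

// Sketch only: copy the captured surface into a buffer owned by the
// adaptor's pixel buffer pool, so later changes to destSurf cannot
// affect a frame that is still being encoded.
CVPixelBufferRef pooledBuffer = NULL;
CVReturn cvErr = CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault,
                                                    adaptor.pixelBufferPool,
                                                    &pooledBuffer);
if (cvErr == kCVReturnSuccess) {
    CVPixelBufferLockBaseAddress(pooledBuffer, 0);
    uint8_t *dst = (uint8_t *)CVPixelBufferGetBaseAddress(pooledBuffer);
    uint8_t *src = (uint8_t *)IOSurfaceGetBaseAddress(destSurf);
    size_t dstBPR = CVPixelBufferGetBytesPerRow(pooledBuffer);
    size_t srcBPR = IOSurfaceGetBytesPerRow(destSurf);
    // Copy row by row in case the two buffers use different row padding.
    for (uint32_t row = 0; row < height; row++) {
        memcpy(dst + row * dstBPR, src + row * srcBPR, width * 4);
    }
    CVPixelBufferUnlockBaseAddress(pooledBuffer, 0);

    [adaptor appendPixelBuffer:pooledBuffer withPresentationTime:frameTime];
    CVPixelBufferRelease(pooledBuffer);
}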

2) The code above is not able to grab surfaces while a video is playing or while a game is running; all I get is a blank screen in the output. This might be due to the hardware-accelerated decoding/rendering that happens in those cases.

Any input on either of the two parts of the question will be really helpful. Also, if you have any good links to read about IOSurfaces in general, please do post them here.

I was trying to get screenshots in the background; your code works great in most apps, but in games it returns black screens, as described in the second question. Have you got any ideas how to fix that? Also, RecordMyScreen works well in games, but it's not open source any more, so I can't figure out how it captures screens. - cloudycliff
The above code isn't able to capture screens rendered using OpenGL. RecordMyScreen uses the CARenderServerRenderDisplay() API. The code is on GitHub - github.com/coolstar/RecordMyScreen; the file CSScreenRecorder.m contains the core code. I compared the performance of both APIs and found that IOMobileFramebufferGetLayerDefaultSurface() is faster than CARenderServerRenderDisplay(). - Hrishikesh_Pardeshi
RecordMyScreen's code on GitHub is out of date now. I tried CARenderServerRenderDisplay() on iOS 7 and it only returns a black screen. - cloudycliff
You are right, CARenderServerRenderDisplay() doesn't work on iOS 7. Have you tried the above code on iOS 7? I was working on this project a year back and haven't checked it on iOS 7. - Hrishikesh_Pardeshi
The code above works well on iOS 7, but not with games... games only give me a black screen. I tried reverse engineering the latest RecordMyScreen; it seems they still use CARenderServerRenderDisplay() to capture screenshots, and I have no idea how that works. - cloudycliff
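
For reference, the CARenderServerRenderDisplay() route mentioned in the comments above is called roughly as shown below. This is only a sketch based on how the linked RecordMyScreen source uses it: the function is a private QuartzCore symbol, so the declaration has to be written by hand (treat the exact signature as an assumption), and as noted it apparently no longer works on iOS 7.

// Private QuartzCore function; there is no public header, so the
// declaration below is hand-written and may not be exact.
extern void CARenderServerRenderDisplay(kern_return_t a, CFStringRef displayName,
                                        IOSurfaceRef surface, int32_t x, int32_t y);

// Render the main display ("LCD") into an IOSurface we allocated ourselves
// (e.g. a surface created the same way as destSurf above).
CARenderServerRenderDisplay(0, CFSTR("LCD"), destSurf, 0, 0);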

1 Answer

5 votes

I did a bit of experimentation and concluded that the screen surface from which the content is copied is changing even before the transfer of its contents is complete (the call to IOSurfaceAcceleratorTransferSurface()). I am using a lock (I tried both the asynchronous and the read-only options) but it is being overridden by iOS. I reduced the code between the lock and unlock calls to the following minimum:

uint32_t aseed1, aseed2;
IOSurfaceLock(screenSurface, kIOSurfaceLockReadOnly, &aseed);
aseed1 = IOSurfaceGetSeed(screenSurface);    // content seed before the copy
IOSurfaceAcceleratorTransferSurface(outAcc, screenSurface, destSurf, dict, NULL);
aseed2 = IOSurfaceGetSeed(screenSurface);    // content seed after the copy
IOSurfaceUnlock(screenSurface, kIOSurfaceLockReadOnly, &aseed);

IOSurfaceGetSeed() returns a value that changes whenever the contents of the surface change, so comparing the seeds before and after the transfer tells you whether the surface was modified mid-copy. I logged a count of the frames for which the seed changed, and the count was non-zero. So the following code resolved the problem:

if(aseed1 != aseed2){
    // The screen changed while it was being copied, so this frame has artefacts.
    CFRelease(destSurf);  // release the surface created for this frame
    continue;             // do not use this surface/frame
}

This does, however, affect performance, since many frames/surfaces are rejected because of artefacts. Any additions or corrections to this will be helpful.
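
One refinement I have been thinking about, though I have not verified that it actually helps, is to retry the transfer a couple of times while the surface is locked and only reject the frame if the screen keeps changing underneath the copy:

// Untested idea: retry the blit instead of dropping the frame outright.
uint32_t aseed, aseed1, aseed2;
int attempts = 0;

IOSurfaceLock(screenSurface, kIOSurfaceLockReadOnly, &aseed);
do {
    aseed1 = IOSurfaceGetSeed(screenSurface);
    IOSurfaceAcceleratorTransferSurface(outAcc, screenSurface, destSurf, dict, NULL);
    aseed2 = IOSurfaceGetSeed(screenSurface);
    attempts++;
} while(aseed1 != aseed2 && attempts < 3);
IOSurfaceUnlock(screenSurface, kIOSurfaceLockReadOnly, &aseed);

if(aseed1 != aseed2){
    CFRelease(destSurf);
    continue; // still torn after three attempts, so drop this frame
}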