I'm trying to build a Mac broadcast client that encodes to H.264 with FFmpeg, but without the x264 library.
Basically, I can get raw frames out of AVFoundation as either CMSampleBufferRef or AVPicture. Is there a way to encode a series of those pictures into H.264 frames using an Apple framework, such as AVVideoCodecH264?
I know I can encode with AVAssetWriter, but that only saves the video to a file, and I don't want a file; instead, I'd like to end up with AVPackets so I can send them out using FFmpeg. Does anyone have any idea? Thank you.
After referring to the VideoCore project, I was able to use Apple's VideoToolbox framework to do the encoding in hardware.
Start a VTCompressionSession:
// Create the compression session
OSStatus err = VTCompressionSessionCreate(kCFAllocatorDefault,
                                          frameW,
                                          frameH,
                                          kCMVideoCodecType_H264,
                                          encoderSpecifications,
                                          NULL,                        // source image buffer attributes
                                          NULL,                        // compressed data allocator
                                          (VTCompressionOutputCallback)vtCallback,
                                          (__bridge void *)self,       // outputCallbackRefCon, handed back to vtCallback
                                          &compression_session);
if (err != noErr) {
    NSLog(@"VTCompressionSessionCreate failed: %d", (int)err);
}
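This isn't in the snippet above, but before encoding you will normally also want to configure the session. A minimal sketch, assuming a real-time stream and an arbitrary 1 Mbps target bitrate (both example values, adjust to your needs):

// Configure the session for low-latency streaming
VTSessionSetProperty(compression_session, kVTCompressionPropertyKey_RealTime, kCFBooleanTrue);
VTSessionSetProperty(compression_session, kVTCompressionPropertyKey_ProfileLevel, kVTProfileLevel_H264_Main_AutoLevel);

// 1 Mbps is just an example target bitrate
int bitrate = 1000 * 1000;
CFNumberRef bitrateRef = CFNumberCreate(kCFAllocatorDefault, kCFNumberIntType, &bitrate);
VTSessionSetProperty(compression_session, kVTCompressionPropertyKey_AverageBitRate, bitrateRef);
CFRelease(bitrateRef);

VTCompressionSessionPrepareToEncodeFrames(compression_session);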
Push the raw frames to the VTCompressionSession:
// Delegate method from AVCaptureVideoDataOutputSampleBufferDelegate
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    // Get the pixel buffer out of the sample buffer
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);

    // Set up the timing info for the frame
    CMTime pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    CMTime dur = CMSampleBufferGetDuration(sampleBuffer);

    // Ask VideoToolbox to encode the frame
    VTCompressionSessionEncodeFrame(compression_session, pixelBuffer, pts, dur, NULL, NULL, NULL);
}
Get the encoded frames in the VT callback. This is a C function that is passed as a parameter to VTCompressionSessionCreate():
void vtCallback(void *outputCallbackRefCon,
                void *sourceFrameRefCon,
                OSStatus status,
                VTEncodeInfoFlags infoFlags,
                CMSampleBufferRef sampleBuffer)
{
    // Do whatever you want with the sampleBuffer; the encoded H.264 data
    // is in the sample buffer's block buffer
    CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer);
}
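To get from there to the AVPackets the question asks about: the encoded data arrives in AVCC format, i.e. each NAL unit is prefixed with a big-endian length field rather than an Annex B start code, and the SPS/PPS live in the sample buffer's format description, not in the block buffer. Below is a rough sketch of pulling both out; it's my own addition rather than VideoCore code, extractH264Data is just a hypothetical helper you could call from vtCallback, error handling is omitted, and it assumes the usual 4-byte NAL length prefix.

static void extractH264Data(CMSampleBufferRef sampleBuffer)
{
    // SPS and PPS are stored in the format description
    CMFormatDescriptionRef fmt = CMSampleBufferGetFormatDescription(sampleBuffer);
    const uint8_t *sps = NULL, *pps = NULL;
    size_t spsSize = 0, ppsSize = 0, parmCount = 0;
    CMVideoFormatDescriptionGetH264ParameterSetAtIndex(fmt, 0, &sps, &spsSize, &parmCount, NULL);
    CMVideoFormatDescriptionGetH264ParameterSetAtIndex(fmt, 1, &pps, &ppsSize, NULL, NULL);
    // ... use sps/pps as the stream's extradata (or emit them as Annex B NAL units)

    // Walk the length-prefixed NAL units in the data buffer
    CMBlockBufferRef blockBuffer = CMSampleBufferGetDataBuffer(sampleBuffer);
    char *dataPointer = NULL;
    size_t totalLength = 0;
    CMBlockBufferGetDataPointer(blockBuffer, 0, NULL, &totalLength, &dataPointer);

    size_t offset = 0;
    while (offset + 4 < totalLength) {
        uint32_t nalLength = 0;
        memcpy(&nalLength, dataPointer + offset, 4);    // assumes a 4-byte length prefix
        nalLength = CFSwapInt32BigToHost(nalLength);    // prefix is big-endian
        // dataPointer + offset + 4, of length nalLength, is one NAL unit:
        // copy it into your AVPacket (or prepend 00 00 00 01 for Annex B) and send it with FFmpeg
        offset += 4 + nalLength;
    }
}

Each extracted NAL unit then maps naturally onto one packet for FFmpeg's muxer or your own network sender.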