8 votes

I'm developing an application for real-time streaming. The streaming has two parts: I use a capture card to capture a live source and stream it in real time, and I also need to stream a local video file.

To stream a local video file in real time, I use Emgu CV to capture the video frames as bitmaps. I create a list of bitmaps and save the captured bitmaps to it from one thread, while also displaying those frames in a picture box. The list stores one second of video: if the frame rate is 30, it holds 30 frames. Once the list is full, I start another thread to encode that one-second chunk of video.

For encoding I use an FFmpeg wrapper called NReco. I write the video frames to FFmpeg and start it encoding. After the task stops, I can get the encoded data as a byte array.

Then I send that data over UDP on the LAN.
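For reference, sending an encoded chunk over UDP could look roughly like this. This is a minimal sketch, not the question's actual sending code: the class name, target endpoint, and the 1400-byte datagram size are assumptions (UDP datagrams should stay well under the MTU to avoid IP fragmentation).

```csharp
using System;
using System.Net.Sockets;

class UdpChunkSender
{
    private readonly UdpClient _client = new UdpClient();
    private const int MaxDatagram = 1400; // stay under a typical Ethernet MTU

    public UdpChunkSender(string host, int port)
    {
        // Bind the client to a fixed destination so Send() needs no endpoint.
        _client.Connect(host, port);
    }

    // Split the encoded byte array into MTU-sized datagrams and send each one.
    public void Send(byte[] encodedChunk)
    {
        for (int offset = 0; offset < encodedChunk.Length; offset += MaxDatagram)
        {
            int len = Math.Min(MaxDatagram, encodedChunk.Length - offset);
            byte[] datagram = new byte[len];
            Buffer.BlockCopy(encodedChunk, offset, datagram, 0, len);
            _client.Send(datagram, datagram.Length);
        }
    }
}
```

Note that UDP gives no delivery or ordering guarantees, which is relevant to the frame loss described below.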

This works, but I cannot achieve smooth streaming. When I receive the stream in VLC, there is a delay of some milliseconds between packets, and I also noticed frame loss.

private Capture _capture = null;
Image<Bgr, Byte> frame;

// Capture each frame and store it in the list
private void ProcessFrame(object sender, EventArgs arg)
{
    frame = _capture.QueryFrame();
    frameBmp = frame.ToBitmap();

    twoSecondVideoBitmapFramesForEncode.Add(frameBmp);

    // Once one second of frames is buffered, start the encoding thread
    if (twoSecondVideoBitmapFramesForEncode.Count == (int)FrameRate)
    {
        isInitiate = false;
        thread = new Thread(new ThreadStart(encodeTwoSecondVideo));
        thread.IsBackground = true;
        thread.Start();
    }
}

public void encodeTwoSecondVideo()
{
    // Snapshot the buffered frames and clear the shared list
    List<Bitmap> copyOfTwoSecondVideo = twoSecondVideoBitmapFramesForEncode.ToList();
    twoSecondVideoBitmapFramesForEncode.Clear();

    int g = (int)FrameRate * 2; // GOP size: one keyframe every two seconds

    string outPutFrameSize = frameWidth.ToString() + "x" + frameHeight.ToString();
    ms = new MemoryStream();

    // Create the ffmpeg task; these are the parameters I use for H.264 encoding
    ffMpegTask = ffmpegConverter.ConvertLiveMedia(
        Format.raw_video,
        ms,
        Format.h264,
        new ConvertSettings()
        {
            // raw BGR24 input, matching the Windows bitmap pixel format
            CustomInputArgs = " -pix_fmt bgr24 -video_size " + frameWidth + "x" + frameHeight + " -framerate " + FrameRate + " ",
            CustomOutputArgs = " -threads 7 -preset ultrafast -profile:v baseline -level 3.0 -tune zerolatency -qp 0 -pix_fmt yuv420p -g " + g + " -keyint_min " + g + " -flags -global_header -sc_threshold 40 -qscale:v 1 -crf 25 -b:v 10000k -bufsize 20000k -s " + outPutFrameSize + " -r " + FrameRate + " -pass 1 -coder 1 -movflags frag_keyframe -movflags +faststart -c:a libfdk_aac -b:a 128k "
        });

    ffMpegTask.Start();

    // Write the two-second chunk of bitmaps to ffmpeg, frame by frame
    foreach (var item in copyOfTwoSecondVideo)
    {
        id++;
        Thread.Sleep((int)(1000.0 / FrameRate)); // pace writes to the frame rate

        BitmapData bd = item.LockBits(new Rectangle(0, 0, item.Width, item.Height), ImageLockMode.ReadOnly, PixelFormat.Format24bppRgb);
        byte[] buf = new byte[bd.Stride * item.Height];
        Marshal.Copy(bd.Scan0, buf, 0, buf.Length);
        ffMpegTask.Write(buf, 0, buf.Length);
        item.UnlockBits(bd);
    }
}

This is the process I use for live streaming, but the stream is not smooth. I tried a queue instead of a list to reduce the latency of filling the buffer, because I suspected the encoding thread encodes and sends each two-second chunk too quickly: when it finishes, the bitmap list is not yet full again, so the encoding thread stalls until the next two-second chunk is ready.

If anyone can help me figure this out, I would be very grateful. If my approach is wrong, please correct me. Thank you!

I'm not sure this will solve your problem, but I don't think it's a good idea to create a new thread every time in the capturing code; that is unneeded overhead for the capture routine. It's better to create a thread-safe queue of two-second frame lists and process it in a separate worker thread. The worker thread can be triggered to start processing using, for example, a ManualResetEvent. Given the original question, one worker thread should be enough. I would also consider using the TPL, but that is out of scope here. – sasha_gud
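The comment's suggestion (one long-lived worker thread fed by a thread-safe queue, instead of a new thread per chunk) can be sketched with a BlockingCollection, whose consuming enumerable blocks the worker until a chunk arrives, so no explicit ManualResetEvent is even needed. The class name and the EncodeChunk placeholder are assumptions; EncodeChunk stands in for the NReco encode-and-send logic from the question.

```csharp
using System.Collections.Concurrent;
using System.Collections.Generic;
using System.Drawing;
using System.Threading;

class ChunkEncoderWorker
{
    // Thread-safe queue of two-second frame lists.
    private readonly BlockingCollection<List<Bitmap>> _chunks =
        new BlockingCollection<List<Bitmap>>();

    public ChunkEncoderWorker()
    {
        // One worker thread for the whole session; it sleeps while the queue is empty.
        var worker = new Thread(() =>
        {
            foreach (var chunk in _chunks.GetConsumingEnumerable())
                EncodeChunk(chunk);
        });
        worker.IsBackground = true;
        worker.Start();
    }

    // Called from the capture handler each time a chunk is full.
    public void Enqueue(List<Bitmap> chunk)
    {
        _chunks.Add(chunk);
    }

    private void EncodeChunk(List<Bitmap> chunk)
    {
        // Placeholder for the NReco ConvertLiveMedia encode + UDP send.
    }
}
```

The capture handler then just calls Enqueue instead of `new Thread(...).Start()`, removing the per-chunk thread-creation overhead the comment describes.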

2 Answers

1 vote

It's hard to say much about the code, since your fragments don't show the whole process.

First of all, you can eliminate the frame buffer (the list of bitmaps) entirely. Just create one live-stream encoding process (creating a new process for every two-second chunk is a very bad idea) and push bitmaps to the VideoConverter with the Write method as they arrive. Since you get frames from the capture device in real time, you also don't need any manual delays ( Thread.Sleep((int)(1000.5 / FrameRate)) ). As a result you should get smooth video on the VLC side (some latency, usually about 200-500 ms, is unavoidable because of encoding and network transmission).
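Restructured that way, the question's own code shrinks to one encoder created at startup plus a capture handler that writes each frame immediately. This is a sketch reusing the question's variables (ffmpegConverter, ffMpegTask, _capture, frameWidth, frameHeight, FrameRate, ms), not a standalone program; the trimmed output arguments are an assumption for brevity.

```csharp
// Created ONCE at startup: a single live encoding process for the whole session.
ffMpegTask = ffmpegConverter.ConvertLiveMedia(
    Format.raw_video,
    ms,
    Format.h264,
    new ConvertSettings()
    {
        CustomInputArgs = " -pix_fmt bgr24 -video_size " + frameWidth + "x" + frameHeight +
                          " -framerate " + FrameRate + " ",
        CustomOutputArgs = " -preset ultrafast -tune zerolatency -pix_fmt yuv420p "
    });
ffMpegTask.Start();

// Capture handler: no list, no extra thread, no Thread.Sleep.
// Each frame is pushed to the encoder as soon as it is captured.
private void ProcessFrame(object sender, EventArgs arg)
{
    using (var f = _capture.QueryFrame())
    using (var bmp = f.ToBitmap())
    {
        BitmapData bd = bmp.LockBits(new Rectangle(0, 0, bmp.Width, bmp.Height),
                                     ImageLockMode.ReadOnly, PixelFormat.Format24bppRgb);
        byte[] buf = new byte[bd.Stride * bmp.Height];
        Marshal.Copy(bd.Scan0, buf, 0, buf.Length);
        bmp.UnlockBits(bd);
        ffMpegTask.Write(buf, 0, buf.Length);
    }
}
```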

If you're getting frames from the capture device in fits and starts, you can try FFmpeg's "-re" input option.
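With NReco's ConvertSettings, that would just mean prepending -re to the input arguments (a fragment reusing the question's variables, shown here only to illustrate where the flag goes):

```csharp
// "-re" tells ffmpeg to read the raw input at its native frame rate.
CustomInputArgs = " -re -pix_fmt bgr24 -video_size " + frameWidth + "x" + frameHeight +
                  " -framerate " + FrameRate + " "
```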

0 votes

I've changed my code. When the frame buffer is filled, I start the thread that encodes the video frames. Inside this thread I encode the frames and save the encoded data in a thread-safe queue. After that queue fills to a certain extent, I start a timer that fires every 200 milliseconds and sends the encoded data.
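The pacing described above (buffer encoded data, then drain it on a fixed 200 ms timer) can be sketched like this. The class name, the pre-fill threshold of 10 packets, and the SendOverUdp placeholder are assumptions; SendOverUdp stands in for the UDP send from the question.

```csharp
using System.Collections.Concurrent;
using System.Timers;

class PacedSender
{
    private readonly ConcurrentQueue<byte[]> _encoded = new ConcurrentQueue<byte[]>();
    private readonly Timer _timer = new Timer(200); // fire every 200 ms
    private const int PreFill = 10; // assumed threshold: buffer a little before sending
    private bool _started;

    public PacedSender()
    {
        // On each tick, send at most one queued packet.
        _timer.Elapsed += (s, e) =>
        {
            byte[] packet;
            if (_encoded.TryDequeue(out packet))
                SendOverUdp(packet);
        };
    }

    // The encoder thread pushes encoded chunks here.
    public void Enqueue(byte[] packet)
    {
        _encoded.Enqueue(packet);
        if (!_started && _encoded.Count >= PreFill)
        {
            _started = true;
            _timer.Start();
        }
    }

    private void SendOverUdp(byte[] packet)
    {
        // Placeholder for the UDP send from the question.
    }
}
```

The pre-fill acts as a small jitter buffer on the sender side; if the encoder ever falls behind the timer's drain rate, the receiver's buffer will still empty, which matches the 1080p symptom described below.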

This works very well and I get a smooth stream at the receiving end. I tested it with 720p video. But when I try to stream 1080p video, it streams well at the beginning, and after some time the stream displays part by part. I noticed this happens when my streaming application is not sending data fast enough, so the player's buffer empties for a few milliseconds at a time. I think this is because Emgu CV does not capture frames in real time: it captures very quickly for low-resolution video, but capturing slows down for 1080p HD video, even at the same frame rate. A frame is captured each time the "Application.Idle += ProcessFrame;" event fires.

I have a capture card that can capture video in real time when I have an HDMI source, but I don't know how to capture a local video file with it; that's why I used OpenCV. I also removed all the extra threads, as you suggested.