
From my experience watching streaming video online, it seems that as long as you have a fast enough connection, the video will play fine.

However, if there is anything between you and the video server slowing down your connection, the ubiquitous video buffering algorithm becomes apparent:

while(user is trying to enjoy video)
{
    if(at least 2 seconds of video have buffered)
    {
        play()
    }
    else
    {
        pause()
        //hope network conditions improve
    }
}

Depending on your mood, this can be hellishly frustrating to endure, or completely hilarious: the video player apparently thinks that playing a few seconds and then pausing, over and over again, is the right thing to do.

Is it possible to buffer videos in a way that would make it possible to watch a video with minimal stuttering?

A logical next step in the above algorithm would be something like this:

buffer_number = 2
annoyance_count = 0
while(user is trying to enjoy video)
{
    if(at least buffer_number seconds of video have buffered)
    {
        play()
    }
    else
    {
        annoyance_count++
        pause()

        if(annoyance_count > 1)
        {
            buffer_number++
        }
    }
}
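
For what it's worth, this loop can be turned into a runnable sketch. The Python below is a toy simulation, not a real player API: the network is reduced to a single `download_ratio` (seconds of video received per wall-clock second), and time advances in one-second ticks.

```python
def adaptive_playback(download_ratio, video_len_s, start_buffer_s=2):
    """Toy simulation of the adaptive loop above, in 1-second ticks.

    download_ratio: seconds of video received per wall-clock second
                    (0.5 means the link is half the video bitrate).
    Returns the number of ticks spent paused (the "annoyance count").
    """
    buffer_target = start_buffer_s
    buffered = 0.0   # seconds of video downloaded but not yet played
    played = 0.0     # seconds of video already played
    stalls = 0
    while played < video_len_s:
        # Once everything left to play has been downloaded, never stall again.
        done_downloading = buffered + played >= video_len_s
        if buffered >= buffer_target or done_downloading:
            played += 1
            buffered += download_ratio - 1  # play 1 s, download in parallel
        else:
            stalls += 1
            buffer_target += 1              # demand a bigger buffer next time
            buffered += download_ratio      # paused: buffer only grows
    return stalls
```

One detail this simulation exposes: because the target grows by one second per paused tick while the buffer grows by only `download_ratio`, on a slow link the loop effectively waits until almost the whole clip has downloaded before resuming (e.g. `adaptive_playback(0.5, 60)` spends 120 ticks paused up front).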

Is there a huge technical factor that has yet to be overcome to make videos watchable on slow connections?

Is there a better algorithm that is avoided for some reason (hard to implement, processing power, not well known etc)?

It seems like a human can easily work out how long to pause a video on any given connection speed to get a smooth playback experience. Why can't computers? It's just math, isn't it?


1 Answer


Is there a huge technical factor that has yet to be overcome to make videos watchable on slow connections?

It's a slow connection, and there's a lower limit to video quality, and thus also to the amount of bandwidth required to stream it. (Certain videos are a lot higher quality than they need to be to be watchable, though the definition of 'watchable' certainly differs, and perhaps the format isn't optimal either, but I'm not too clued up on that aspect.)

So either you're going to wait a long time before being able to watch the video, or you're not going to have smooth playback.

Is there a better algorithm that is avoided for some reason (hard to implement, processing power, not well known etc)?

It seems like humans are able to easily calculate how long you need to pause a video on any given connection speed for a smooth playback experience. Why can't computers? It's just math isn't it?

The reasons these things aren't implemented are perhaps laziness and/or ignorance; you could also argue simplicity.

It is just simple maths. Measure the download speed over some period; from that you can estimate how long it will take to download the remainder of the video. Compare that estimate to the remaining playback time, and you know whether to keep pausing or to resume. Once you resume, you should have smooth playback all the way to the end (unless the line speed is inconsistent, but adding a little slack to the estimate should account for that to a reasonable extent).
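
That calculation can be written down directly. Here is a minimal sketch in Python; the steady-throughput assumption and the example numbers are illustrative, not from any real player:

```python
def buffer_needed_s(remaining_s, download_bps, video_bps, margin_s=5.0):
    """Seconds of video to buffer before resuming so that, at a steady
    download speed, playback never catches up with the download."""
    # Fraction of a playback-second downloaded per wall-clock second.
    ratio = download_bps / video_bps
    if ratio >= 1.0:
        return 0.0  # the link outpaces the video: just play
    # The buffer must cover the shortfall over the rest of the video,
    # plus a little slack for speed fluctuations.
    return remaining_s * (1.0 - ratio) + margin_s

# 600 s of video left on a 1 Mbit/s link, streaming a 2 Mbit/s video:
print(buffer_needed_s(600, 1_000_000, 2_000_000))  # 305.0
```

The `remaining_s * (1.0 - ratio)` term falls out of requiring the buffer to stay non-negative for the whole remaining duration: while playing, each second drains one second of video and refills `ratio` seconds of it.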

The cost of this smooth playback is of course possibly having to wait quite a while before being able to start watching the video, which is certainly not desirable in all cases.

The above is pretty much what I do manually: I try to keep the video paused until I think it will play smoothly all the way to the end (well, when I'm willing to wait for that).