
We have a mobile-based video editor which uploads people's videos and thumbnails to an AWS S3 bucket. This then launches an AWS SWF process to encode, upload the video, upload the thumbnail, and wait for processing of said video before telling the user their video is finished. Each user uses their own access tokens, so videos are uploaded to their own YouTube channels.

This has been fine for the past 12 months or so, throwing the occasional backendError on thumbnail upload from the YouTube end, for which we have a retry strategy that waits 1, 3, 8, 15, etc. seconds. By the second or third retry the YouTube servers have sorted themselves out and everything is great again.
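A minimal sketch of that retry schedule (the function name and structure here are hypothetical illustrations, not our production code; only the 1, 3, 8, 15 second delays come from the description above):

```php
<?php
// Retry a callable on failure, sleeping for each delay in turn
// before the next attempt, then making one final attempt.
function retryWithDelays(callable $upload, array $delays = [1, 3, 8, 15])
{
    foreach ($delays as $delay) {
        try {
            return $upload();
        } catch (Exception $err) {
            sleep($delay); // back off before the next attempt
        }
    }
    return $upload(); // final attempt; let any exception propagate
}
```

In practice the callable would wrap the thumbnail-upload request, and the catch clause would inspect the exception to retry only on backendError-style responses.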

As of 2017-09-17 17:29 (AEST), none of our videos have managed to upload their thumbnail. The system throws a Google_Service_Exception with a status code of 503 and a response body of:

{ "error": { "errors": [ { "domain": "global", "reason": "backendError", "message": "Backend Error" } ], "code": 503, "message": "Backend Error" } }

At this time, no server packages were updated, nor was any of our own code or the Composer packages we use.

I tried revoking my access token and created a new project with a new API key/secret on a different account, and still got the same error processing the thumbnail on our test server (this should rule out rate limiting, which we are very far below any restrictions on).

Video upload still works fine. Editing a video title from our system still works fine (meaning the access token is functional and has the required scopes to set a thumbnail if the channel allows it). Manually uploading a thumbnail directly to YouTube still works fine (ruling out the channel itself having an issue).

I even re-downloaded the sample code and created a new test, and it still fails. The specific line it fails at in all cases is:

$status = $media->nextChunk($chunk);

I have no idea what could be causing this sudden issue when nothing changed on our end: all images are 1920x1080 JPGs well under 2 MB, and I have tried a completely different project and access tokens and completely fresh sample code. I assume that if everyone using the YouTube Data API had hit this error over the past 24 hours, there would be a lot more posts about it. So I am not sure what else to try from our end.

Any suggestions?

EDIT: Well... this is odd. I managed to upload via the command line from our server with cURL.

curl -X POST -F "image=@thumbnail_test.jpg" "https://www.googleapis.com/upload/youtube/v3/thumbnails/set?videoId=dCdQ2tJ5wIs&key={REMOVED}&access_token={REMOVED}"

And that worked fine. So that rules out an issue with our server (IP blocked or similar), an issue with our access tokens, or an issue with our client_id key. It also appears to rule out the YouTube backend behaving badly (assuming cURL and PHP connect via the same mechanisms).

That only leaves me with this: our uploads of thumbnails to YouTube (and only thumbnails) now fail from PHP with no server or server-code changes, while identical access tokens and keys work fine via cURL on the command line.

EDIT 2: I have uploaded a sample here on GitHub (including a sample image and composer.json). It has two samples in one: it shows the script failing with the YouTube PHP SDK, then working successfully with a cURL request using the same credentials. The script does not include obtaining an access or refresh token.

The main test.php script is as follows.

<?php 

define('GOOGLE_CLIENT_ID', 'REPLACE_ME');
define('GOOGLE_CLIENT_SECRET', 'REPLACE_ME');

require_once 'vendor/autoload.php';

$token = [
    'access_token' => 'REPLACE_ME',
    'refresh_token' => 'REPLACE_ME',
    'token_type' => 'Bearer',
    'expires_in' => '3600',
    'created' => REPLACE_ME
];

$youtube_id = 'REPLACE_ME';
$thumbnail_path = realpath('thumbnail.jpg');


/*
// Doing the exact same request in cURL works fine.
$data = [
    'filedata' => new CURLFile($thumbnail_path, 'image/jpeg', basename($thumbnail_path)),
];      
// Execute remote upload
$curl = curl_init();

curl_setopt($curl, CURLOPT_HTTPHEADER, [
    'Authorization: Bearer '.$token['access_token'],
]);

curl_setopt($curl, CURLOPT_URL, 'https://www.googleapis.com/upload/youtube/v3/thumbnails/set?videoId='.$youtube_id.'&key='.GOOGLE_CLIENT_SECRET);
curl_setopt($curl, CURLOPT_TIMEOUT, 30);
curl_setopt($curl, CURLOPT_POST, 1);
curl_setopt($curl, CURLOPT_POSTFIELDS, $data);
curl_setopt($curl, CURLOPT_RETURNTRANSFER, 1);

curl_setopt($curl, CURLOPT_VERBOSE, 1);
$response = curl_exec($curl);
curl_close($curl);
echo $response;
*/



// Based on php sample from https://developers.google.com/youtube/v3/docs/thumbnails/set
$client = new Google_Client();   
$client->setClientId(GOOGLE_CLIENT_ID);
$client->setClientSecret(GOOGLE_CLIENT_SECRET);
$client->setScopes([
    'https://www.googleapis.com/auth/yt-analytics.readonly', 
    'https://www.googleapis.com/auth/youtube', 
    'https://www.googleapis.com/auth/youtube.upload',
    'https://www.googleapis.com/auth/userinfo.email',
    'https://www.googleapis.com/auth/userinfo.profile',
]);

$client->setAccessType('offline');  

// If you want to use a refresh token instead to get an access token from it.
//$token = $client->fetchAccessTokenWithRefreshToken($refresh_token);
//var_dump($token);
$client->setAccessToken($token);

$service = new Google_Service_YouTube($client);

try
{
    // Specify the size of each chunk of data, in bytes. Set a higher value for
    // reliable connection as fewer chunks lead to faster uploads. Set a lower
    // value for better recovery on less reliable connections.
    $chunkSizeBytes = 1 * 1024 * 1024;

    // Create a MediaFileUpload object for resumable uploads.
    // Parameters to MediaFileUpload are:
    // client, request, mimeType, data, resumable, chunksize.
    $client->setDefer(true);
    $request = $service->thumbnails->set($youtube_id);
    $client->setDefer(false);
    $mimeType = 'image/jpeg';

    $media = new Google_Http_MediaFileUpload(
        $client,
        $request,
        $mimeType,
        null,
        true,
        $chunkSizeBytes
    );

    $filesize = filesize($thumbnail_path);
    echo "Filesize: $filesize\n";
    $media->setFileSize($filesize);

    // Read the media file and upload it chunk by chunk.
    $status = false;
    $handle = fopen($thumbnail_path, "rb");
    while (!$status && !feof($handle))
    {
        $chunk = fread($handle, $chunkSizeBytes);
        $status = $media->nextChunk($chunk); // The line where the Google_Service_Exception exception is thrown. 
    }

    fclose($handle);

    echo "Thumbnail uploaded success\n";                              
}
catch (Google_Service_Exception $err)
{
    echo $err->getMessage();
}
catch (Exception $err)
{
    echo $err->getMessage();
}
Comments:

- Questions seeking debugging help ("why isn't this code working?") must include the desired behavior, a specific problem or error and the shortest code necessary to reproduce it in the question itself. Questions without a clear problem statement are not useful to other readers. See: How to create a Minimal, Complete, and Verifiable example. – DaImTo
- Can you link to the Minimal, Complete, and Verifiable example for me so I can update the question with best practice for other readers? – Brad Moore
- stackoverflow.com/help/mcve You should not include your client id and secret; I have my own. – DaImTo

2 Answers

{  
   "error":{  
      "errors":[  
         {  
            "domain":"global",
            "reason":"backendError",
            "message":"Backend Error"
         }
      ],
      "code":503,
      "message":"Backend Error"
   }
}

This is a server error on Google's side. There is nothing you can really do besides wait a few minutes and try again. It is normally caused by sending requests too fast.

Suggested action: Use exponential backoff, include a check before retrying non-idempotent requests.

Implementing exponential backoff

Exponential backoff is a standard error handling strategy for network applications in which the client periodically retries a failed request over an increasing amount of time. If a high volume of requests or heavy network traffic causes the server to return errors, exponential backoff may be a good strategy for handling those errors. Conversely, it is not a relevant strategy for dealing with errors unrelated to rate-limiting, network volume or response times, such as invalid authorization credentials or file not found errors.

Used properly, exponential backoff increases the efficiency of bandwidth usage, reduces the number of requests required to get a successful response, and maximizes the throughput of requests in concurrent environments.

Create requests are not idempotent. A simple retry is insufficient and may result in duplicate entities. Check whether the entity exists before retrying.

The flow for implementing simple exponential backoff is as follows.

  1. Make a request to the API
  2. Receive an error response that has a retry-able error code
  3. Wait 1s + random_number_milliseconds seconds
  4. Retry request
  5. Receive an error response that has a retry-able error code
  6. Wait 2s + random_number_milliseconds seconds
  7. Retry request
  8. Receive an error response that has a retry-able error code
  9. Wait 4s + random_number_milliseconds seconds
  10. Retry request
  11. Receive an error response that has a retry-able error code
  12. Wait 8s + random_number_milliseconds seconds
  13. Retry request
  14. Receive an error response that has a retry-able error code
  15. Wait 16s + random_number_milliseconds seconds
  16. Retry request
  17. If you still get an error, stop and log the error.
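The flow above can be sketched in PHP as follows (the helper names and attempt limit are assumptions for illustration; only the doubling delays plus random milliseconds of jitter come from the steps above):

```php
<?php
// Exponential backoff with jitter: before retry n (0-based), wait
// 2^n seconds plus a random 0-1000 ms, as in the flow above.
const MAX_ATTEMPTS = 6;

function backoffDelayMs(int $attempt): int
{
    return (2 ** $attempt) * 1000 + random_int(0, 1000);
}

function withBackoff(callable $request)
{
    for ($attempt = 0; $attempt < MAX_ATTEMPTS; $attempt++) {
        try {
            return $request();
        } catch (Exception $err) {
            // Real code should decode the error body and only retry
            // retryable reasons such as backendError; non-idempotent
            // (create) requests should first check whether the entity
            // already exists, as noted above.
            if ($attempt === MAX_ATTEMPTS - 1) {
                throw $err; // still failing: stop and log the error
            }
            usleep(backoffDelayMs($attempt) * 1000);
        }
    }
}
```

The jitter spreads retries from concurrent clients apart so they do not all hammer the backend again at the same instant.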

And just out of nowhere, four days later, it suddenly fixed itself. It was an issue within YouTube, but, oddly, not one that was replicable across all their SDKs (as in, again: the server was not touched, the remote code suddenly works, and the test code used locally suddenly works).