
So I want to upload videos from a client desktop application to Azure Media Services (which of course uses Azure Storage).

I am trying to do a combination of:

The first one shows a perfect example of my scenario, while the second one illustrates how to use BlobTransferClient to upload multiple files and show a progress indicator.

The problem: the upload seems to run and I don't get any error afterwards, yet nothing shows up in the Azure portal / storage account. I say it seems to upload because the task takes a long time, Task Manager shows Wi-Fi upload activity, and Azure Storage reports that successful requests are being made.

So, server-side, I create a SAS locator that is valid for a short time:

public async Task<VideoUploadModel> GetSasLocator(string filename)
{

    var assetName = filename + DateTime.UtcNow;

    // create an empty asset to upload into
    IAsset asset = await _context.Assets.CreateAsync(assetName, AssetCreationOptions.None, CancellationToken.None);

    // short-lived write access policy for the upload
    IAccessPolicy accessPolicy = _context.AccessPolicies.Create(assetName, TimeSpan.FromMinutes(10),
        AccessPermissions.Write);

    // SAS locator pointing at the asset's blob container
    var locator = _context.Locators.CreateLocator(LocatorType.Sas, asset, accessPolicy);

    // append the filename so the client can upload the blob directly
    var blobUri = new UriBuilder(locator.Path);
    blobUri.Path += "/" + filename;

    var model = new VideoUploadModel()
    {
        Filename = filename,
        AssetName = assetName,
        SasLocator = blobUri.Uri.AbsoluteUri,
        AssetId = asset.Id
    };
    return model;
}

And client-side, I try to upload:

public async Task UploadVideoFileToBlobStorage(string[] files, string sasLocator, CancellationToken cancellationToken)
{
    var blobUri = new Uri(sasLocator);
    var sasCredentials = new StorageCredentials(blobUri.Query);

    //var blob = new CloudBlockBlob(new Uri(blobUri.GetComponents(UriComponents.SchemeAndServer | UriComponents.Path, UriFormat.UriEscaped)), sasCredentials);
    var blobClient = new CloudBlobClient(new Uri(blobUri.GetComponents(UriComponents.SchemeAndServer | UriComponents.Path, UriFormat.UriEscaped)), sasCredentials);
    var blobTransferClient = new BlobTransferClient(TimeSpan.FromMinutes(1))
    {
        NumberOfConcurrentTransfers = 2,
        ParallelTransferThreadCount = 2
    };
    //register events
    blobTransferClient.TransferProgressChanged += BlobTransferClient_TransferProgressChanged;
    //files
    var uploadTasks = new List<Task>();
    foreach (var filePath in files)
    {
        await blobTransferClient.UploadBlob(blobUri, filePath, new FileEncryption(), cancellationToken, blobClient, new NoRetry());
    }



    //StorageFile storageFile = null;

    //if (string.IsNullOrEmpty(file.FutureAccessToken))
    //{
    //    storageFile = await StorageFile.GetFileFromPathAsync(file.Path).AsTask(cancellationToken);
    //}
    //else
    //{
    //    storageFile = await StorageApplicationPermissions.FutureAccessList.GetFileAsync(file.FutureAccessToken).AsTask(cancellationToken);
    //}
    //cancellationToken.ThrowIfCancellationRequested();
    //await blob.UploadFromFileAsync(storageFile);
}

I know I'm probably not doing the asset naming correctly, and I should be using the progress indicator instead of just awaiting, but of course I first want to get this working before finishing it.

I configured Azure Media Services to "Connect to Media Services API with service principal", for which I created a new Azure AD app and generated keys, following that documentation page. I am not really sure how this exactly works; I'm inexperienced with Azure AD and Azure AD apps (guidance welcome?).
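
For reference, this is roughly how I build the server-side _context with those service principal credentials. It follows the pattern from the Media Services .NET SDK docs; the placeholder values stand for my own tenant, app and account:

// Server-side only: authenticate against the Media Services REST API with the AAD app (service principal).
// The placeholders stand for my tenant, app and account values.
var tokenCredentials = new AzureAdTokenCredentials(
    "{tenant-domain}.onmicrosoft.com",
    new AzureAdClientSymmetricKey("{client-id}", "{client-secret}"),
    AzureEnvironments.AzureCloudEnvironment);

var tokenProvider = new AzureAdTokenProvider(tokenCredentials);

// the REST API endpoint comes from the "API access" blade of the Media Services account
_context = new CloudMediaContext(new Uri("https://{account}.restv2.{region}.media.azure.net/api/"), tokenProvider);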

Uploading: (screenshot of the upload in progress)

Asset created but no files: (screenshot of the asset in Media Services)

Storage doesn't show any files either: (screenshot of the blob containers)

Storage does show a successful upload: (screenshot of the storage account statistics)

The reason I can't exactly follow the Upload multiple files with Media Services .NET SDK documentation is that it uses the _context (a Microsoft.WindowsAzure.MediaServices.Client.CloudMediaContext). I can use that _context server-side, but not client-side, because it requires the tenant domain, REST API endpoint, client ID and client secret.

I guess uploading via a SAS locator is the correct way (?).

UPDATE 1

When uploading using CloudBlockBlob it does upload, and the file shows up in my storage account within an asset container, yet when I go to Media Services in Azure and click on that particular asset, it doesn't show any files.

So the code for that:

var blob = new CloudBlockBlob(new Uri(blobUri.GetComponents(UriComponents.SchemeAndServer | UriComponents.Path, UriFormat.UriEscaped)), sasCredentials);
//files
var uploadTasks = new List<Task>();
foreach (var filePath in files)
{
    await blob.UploadFromFileAsync(filePath, CancellationToken.None);
}

I've also tried to upload an asset manually within Azure, i.e. clicking "Upload" in the asset menu and then encoding it. That all works fine.

UPDATE 2:

Digging deeper, I came up with the following (not yet production-proof) way to make it work for now:

1. Get a shared access signature directly from Storage and upload to it:

public static async Task<string> GetMediaSasLocator(string filename)
{
    CloudBlobContainer cont = await GetMediaContainerAsync(); // helper that returns my fixed upload container
    SharedAccessBlobPolicy policy = new SharedAccessBlobPolicy()
    {
        SharedAccessExpiryTime = DateTimeOffset.UtcNow.AddMinutes(60),
        Permissions = SharedAccessBlobPermissions.Write,
        SharedAccessStartTime = DateTimeOffset.UtcNow.AddMinutes(-5)
    };
    await cont.FetchAttributesAsync();

    return cont.Uri.AbsoluteUri + "/" + filename + cont.GetSharedAccessSignature(policy);
}

With this SAS I can upload just like I showed in UPDATE 1; nothing changed there.

2. Create an Azure Function (which was already planned) that handles asset creation, uploading the file to the asset, encoding and publishing. This was done by following the Azure Functions Tools for Visual Studio tutorial and then implementing the code illustrated in Upload multiple files with Media Services .NET SDK.

So this "works" but is not perfect yet, I still don't have my progress indicator within my client WPF application and the Azure Function takes quite a long time to complete because we basically "upload" the file again to an Asset after it is already in Azure Storage. I rather use a method to either copy from one container to an asset container.

I came to this point because an Azure Function blob trigger needs a fixed container name; since assets create their own containers within the storage account, you can't trigger an Azure Function on those. So to work with Azure Functions it seems I really have to upload to a fixed container name and do the rest from there.
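
For completeness, this is the kind of trigger I mean (Functions v1 style, matching the tutorial above); the function name, the "uploads" container and the "MediaStorageConnection" app setting are placeholders from my own setup. The point is that the container segment of the BlobTrigger path has to be a literal name; only the blob name can be a {name} binding expression:

public static class ProcessUploadedVideo
{
    // fires for every blob that lands in the fixed "uploads" container;
    // {name} binds to the blob (file) name, the container part cannot be dynamic
    [FunctionName("ProcessUploadedVideo")]
    public static void Run(
        [BlobTrigger("uploads/{name}", Connection = "MediaStorageConnection")] Stream videoBlob,
        string name,
        TraceWriter log)
    {
        log.Info($"New upload: {name}, size: {videoBlob.Length} bytes");
        // ...create the asset, copy the blob into the asset container, submit the encoding job
    }
}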

The question still remains: why does uploading a video file to Azure Storage via the BlobTransferClient not work? And if it does work, how do I trigger an Azure Function based on multiple containers? A "path" like asset-{name}/{name}.avi would be preferred.

1 Answer


Eventually it turned out that I needed to pass the base URL to the UploadBlob method: only the container part, without the filename that is included in the SAS locator URL.

Once I fixed that, I also noticed it didn't upload to the filename I had provided in the SAS locator I generated server-side (which includes a customerID prefix). I had to use one of the other method overloads to get the correct filename.

public async Task UploadVideoFilesToBlobStorage(List<VideoUploadModel> videos, CancellationToken cancellationToken)
{
    var blobTransferClient = new BlobTransferClient();
    //register events
    blobTransferClient.TransferProgressChanged += BlobTransferClient_TransferProgressChanged;
    //files
    _videoCount = _videoCountLeft = videos.Count;
    foreach (var video in videos)
    {
        var blobUri = new Uri(video.SasLocator);
        //create the sasCredentials
        var sasCredentials = new StorageCredentials(blobUri.Query);
        //get the URL without the SAS credentials, so only scheme, host, path and filename
        var blobUriBaseFile = new Uri(blobUri.GetComponents(UriComponents.SchemeAndServer | UriComponents.Path,
            UriFormat.UriEscaped));
        //get the URL without the filename (needed for BlobTransferClient; seems like an issue to me)
        var blobUriBase = new Uri(blobUriBaseFile.AbsoluteUri.Replace("/" + video.Filename, ""));

        var blobClient = new CloudBlobClient(blobUriBaseFile, sasCredentials);
        //upload using a stream; the other overload of UploadBlob forces the blob name to be the local filename
        using (FileStream fs = new FileStream(video.FilePath, FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
        {
            await blobTransferClient.UploadBlob(blobUriBase, video.Filename, fs, null, cancellationToken, blobClient, 
                new NoRetry(), "video/x-msvideo");
        }
        _videoCountLeft -= 1;
    }

    blobTransferClient.TransferProgressChanged -= BlobTransferClient_TransferProgressChanged;
}

private void BlobTransferClient_TransferProgressChanged(object sender, BlobTransferProgressChangedEventArgs e)
{
    Console.WriteLine("progress, seconds remaining:" + e.TimeRemaining.Seconds);
    double bytesTransfered = e.BytesTransferred;
    double bytesTotal = e.TotalBytesToTransfer;
    double thisProcent = bytesTransfered / bytesTotal;
    double procent = thisProcent;
    //devide by video amount
    int videosUploaded = _videoCount - _videoCountLeft;
    if (_videoCountLeft > 0)
    {
        procent = (thisProcent + videosUploaded) / _videoCount;
    }

    procent = procent * 100;//to real %
    UploadProgressChangedEvent?.Invoke((int)procent, videosUploaded, _videoCount);
}
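
And on the WPF side I consume that event roughly like this to drive the progress bar; the control names and the delegate signature of UploadProgressChangedEvent are just how I happen to have declared them, so treat it as a sketch:

// hypothetical wiring in the WPF window; the event is raised with (percent, videosUploaded, totalVideos)
_uploader.UploadProgressChangedEvent += (percent, uploaded, total) =>
{
    // the event fires on a background thread, so marshal back to the UI thread
    Application.Current.Dispatcher.Invoke(() =>
    {
        UploadProgressBar.Value = percent;                  // ProgressBar with Maximum = 100
        UploadStatusText.Text = $"{uploaded} of {total} videos uploaded";
    });
};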

Actually, Microsoft.WindowsAzure.MediaServices.Client.BlobTransferClient should be able to do concurrent uploads, but there is no method for uploading multiple files at once. It does have NumberOfConcurrentTransfers and ParallelTransferThreadCount properties, though I'm not sure how to use them.
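
Something like this might be one way (untested): start the same UploadBlob calls from the method above without awaiting each one and await them all together, letting those two properties throttle the actual transfers. Note that the per-video percentage math in my progress handler would need adjusting, because uploads would then overlap:

var uploadTasks = new List<Task>();
foreach (var video in videos)
{
    var blobUri = new Uri(video.SasLocator);
    var sasCredentials = new StorageCredentials(blobUri.Query);
    var blobUriBaseFile = new Uri(blobUri.GetComponents(UriComponents.SchemeAndServer | UriComponents.Path, UriFormat.UriEscaped));
    var blobUriBase = new Uri(blobUriBaseFile.AbsoluteUri.Replace("/" + video.Filename, ""));
    var blobClient = new CloudBlobClient(blobUriBaseFile, sasCredentials);

    var fs = new FileStream(video.FilePath, FileMode.Open, FileAccess.Read, FileShare.ReadWrite);
    uploadTasks.Add(blobTransferClient.UploadBlob(blobUriBase, video.Filename, fs, null, cancellationToken, blobClient,
        new NoRetry(), "video/x-msvideo"));
}
await Task.WhenAll(uploadTasks);
//the FileStreams should be disposed once Task.WhenAll has completed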

I didn't check whether this now works with assets as well, because I now upload every file to one single container and later use an Azure Function to process it into an asset, mainly because I can't trigger an Azure Function on a dynamic container name (every asset creates its own container).