6
votes

I'm trying to upload files from a local command-line client to Azure storage through a Web API, which I'm hosting on an Azure Web Site. Working with the client is not a problem, and I've got everything working fine locally. Here is the Web API code:

    public async Task<HttpResponseMessage> PostUpload()
    {
        // need a local resource to store uploaded files temporarily
        LocalResource localResource = null;
        try
        {
            // Azure web-site fails here
            localResource = RoleEnvironment.GetLocalResource("TempStorage");
        }
        catch (Exception e)
        {
            return Request.CreateErrorResponse(HttpStatusCode.InternalServerError, "Unable to get access to local resources");
        }

        var provider = new MultipartFormDataStreamProvider(localResource.RootPath);

        // Read the form data.
        await Request.Content.ReadAsMultipartAsync(provider);

        // snipped validation code
        var container = // code to get container

        foreach (var fileData in provider.FileData)
        {
            var filename = GetBlobName(fileData);
            var blob = container.GetBlockBlobReference(filename);
            using (var filestream = File.OpenRead(fileData.LocalFileName))
            {
                blob.UploadFromStream(filestream);
            }
            File.Delete(fileData.LocalFileName);
        }

        return Request.CreateResponse(HttpStatusCode.OK);
    }

Everything works fine when I run locally, but as soon as I deploy the web site to Azure I can't upload, because Azure Web Sites don't have access to LocalResource, and I'd need to switch to an Azure Web Role. I can switch, but relying on the local file system bothers me altogether.

And a LocalResource is required to construct a MultipartFormDataStreamProvider, and I have not found alternative ways to upload files to Web API. My plan was to channel the upload directly to Azure, without storing anything on a local disk.

Is there any other way to upload files?

p.s. I have seen usages of Shared Access Signatures where I can give the client application a URL with a signature and let the client upload directly to Azure Blob storage. But I'm not sure how secure that is going to be, and I'm not really comfortable (yet) with passing signatures down to the client. At the moment I presume the client is going to run in a very hostile environment, and nothing coming back from the client can be trusted.

UPD: My final solution involved using a write-only Shared Access Signature issued on the server and passed down to the client. The client then uploads files directly to Azure. This way I save a lot of hassle with managing uploaded files. A more detailed description of my solution is here.
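To make the update concrete, a minimal sketch of the server side is below. It assumes the classic Microsoft.WindowsAzure.Storage SDK; the container name "uploads", the setting name "StorageConnectionString", and the controller itself are illustrative, not part of the original code.

```csharp
using System;
using System.Configuration;
using System.Net;
using System.Net.Http;
using System.Web.Http;
using Microsoft.WindowsAzure.Storage;
using Microsoft.WindowsAzure.Storage.Blob;

public class UploadUrlController : ApiController
{
    // Issues a short-lived, write-only SAS for a single blob.
    public HttpResponseMessage GetUploadUrl(string filename)
    {
        var account = CloudStorageAccount.Parse(
            ConfigurationManager.AppSettings["StorageConnectionString"]);
        var container = account.CreateCloudBlobClient()
                               .GetContainerReference("uploads");

        // The server picks the blob name, so a hostile client
        // can't target or overwrite other blobs.
        var blob = container.GetBlockBlobReference(Guid.NewGuid() + "/" + filename);

        var sas = blob.GetSharedAccessSignature(new SharedAccessBlobPolicy
        {
            Permissions = SharedAccessBlobPermissions.Write,          // write-only
            SharedAccessExpiryTime = DateTime.UtcNow.AddMinutes(15)   // short-lived
        });

        // The client then PUTs the file bytes straight to this URL.
        return Request.CreateResponse(HttpStatusCode.OK, blob.Uri + sas);
    }
}
```

Because the signature is write-only, scoped to one blob, and expires quickly, an untrusted client can do little more than upload the one file it was handed a URL for.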

3
What exception is being thrown by the following code: `localResource = RoleEnvironment.GetLocalResource("TempStorage")`? – Brian Dishaw
Are you comfortable using the REST API? – dev2d
@BrianDishaw The exception is something along the lines of "Unable to acquire LocalResource, System.Runtime.InteropServices.SEHException (0x80004005): External component has thrown an exception." Also this page (msdn.microsoft.com/en-us/library/windowsazure/ee758708.aspx) suggests that local resources are only for web and worker roles, not for web sites. – trailmax
@VJD Yeah, kind of. I guess I will have to bite the bullet, generate a Shared Access Signature and pass it down to the client, and then the client uploads files directly to Azure via the signature. – trailmax
You could probably take a look at my answer in this post: stackoverflow.com/questions/15842496/… ... Here I don't specifically talk about uploading to Azure, but it should give you a good idea. – Kiran Challa

3 Answers

12
votes

This isn't exactly the answer you are looking for, but you can use local storage with Azure Web Sites using MultipartFileStreamProvider and Path.GetTempPath(). The code would look something like this:

public async Task<HttpResponseMessage> PostUpload()
{
    var provider = new MultipartFileStreamProvider(Path.GetTempPath());

    // Read the form data.
    await Request.Content.ReadAsMultipartAsync(provider);

    // do the rest the same     
}
2
votes

I found this StackOverflow question, which overrides MultipartFormDataStreamProvider so that files are not stored locally first but are written directly to a stream (an AWS stream in that example). See: Is it possible to override MultipartFormDataStreamProvider so that is doesn't save uploads to the file system?

But I have to say I also like trailmax's solution.
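The same idea applied to Azure would look roughly like the sketch below: a custom MultipartStreamProvider whose GetStream hands back a blob write stream, so file parts go straight into storage instead of a temp file. The class name and the container handling are illustrative, and it assumes the classic Microsoft.WindowsAzure.Storage SDK.

```csharp
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using Microsoft.WindowsAzure.Storage.Blob;

public class BlobStorageMultipartProvider : MultipartStreamProvider
{
    private readonly CloudBlobContainer _container;

    public BlobStorageMultipartProvider(CloudBlobContainer container)
    {
        _container = container;
    }

    public override Stream GetStream(HttpContent parent, HttpContentHeaders headers)
    {
        var disposition = headers.ContentDisposition;
        if (disposition == null || string.IsNullOrEmpty(disposition.FileName))
        {
            // Ordinary form fields: buffer them in memory as usual.
            return new MemoryStream();
        }

        // File parts: return a stream that writes directly into a block blob.
        var blob = _container.GetBlockBlobReference(disposition.FileName.Trim('"'));
        return blob.OpenWrite();
    }
}
```

The controller would then call `await Request.Content.ReadAsMultipartAsync(new BlobStorageMultipartProvider(container));` and could drop the File.OpenRead/File.Delete steps entirely, since nothing ever touches the local disk.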

1
votes

One possible way to get your code to work as-is with a LocalResource would be to host it inside a worker role that self-hosts Web API via OWIN.

You can find a simple walkthrough at: http://www.asp.net/web-api/overview/hosting-aspnet-web-api/host-aspnet-web-api-in-an-azure-worker-role

You just need to start the OWIN-hosted API within the OnStart() method of the RoleEntryPoint. Keep in mind that you can also return HTML from a Web API response, so a worker role can make a very flexible base project.

Here's a quick snippet showing how to set up the Owin host from the link above:

private IDisposable _webApp = null;

public override bool OnStart() {
   ServicePointManager.DefaultConnectionLimit = 5;

   var endpoint = RoleEnvironment.CurrentRoleInstance.InstanceEndpoints["DefaultEndpoint"];
   var baseUri = string.Format("{0}://{1}", endpoint.Protocol, endpoint.IPEndpoint);

   _webApp = WebApp.Start<Startup>(new StartOptions(baseUri));

   return base.OnStart();
}

public override void OnStop() {
    if (_webApp != null) {
        _webApp.Dispose();
    }
    base.OnStop();
}

...

using System.Web.Http;
using Owin;

class Startup {
    public void Configuration(IAppBuilder app) {
        var config = new HttpConfiguration();

        config.Routes.MapHttpRoute("Default", "{controller}/{id}",
                                   new { id = RouteParameter.Optional });

        app.UseWebApi(config);

    }
}