0
votes

The problem I am facing in the title is very similar to a question previously raised here (Azure storage: Uploaded files with size zero bytes), but that one was for .NET. The context for my Java scenario is that I am uploading small CSV files on a daily basis (less than about 5 KB per file). In addition, my code uses the latest version of the Azure API, in contrast to the 2010 version used in the other question.

I couldn't figure out what I have missed. The other alternative is to do it in File Storage, but of course the blob approach was recommended by a few of my peers.

So far, I have mostly based my code for uploading a file as a block blob on the sample shown on the Azure Samples git page (https://github.com/Azure-Samples/storage-blob-java-getting-started/blob/master/src/BlobBasics.java). I have already done the container setup and file renaming steps, which aren't a problem, but after uploading, the file in the blob storage container on my Azure account shows a size of 0 bytes.
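For completeness, the container setup follows that sample and looks roughly like this (using the com.microsoft.azure.storage SDK; storageConnectionString is a placeholder for my connection string):

    import com.microsoft.azure.storage.CloudStorageAccount;
    import com.microsoft.azure.storage.blob.CloudBlobClient;
    import com.microsoft.azure.storage.blob.CloudBlobContainer;

    // Parse the storage connection string and get a reference to the container.
    CloudStorageAccount account = CloudStorageAccount.parse(storageConnectionString);
    CloudBlobClient blobClient = account.createCloudBlobClient();
    CloudBlobContainer container = blobClient.getContainerReference("testing1");
    container.createIfNotExists();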

I've also tried converting the file into a FileInputStream and uploading it as a stream, but it still produces the same result.

fileName = event.getFilename(); // fileName is e.g. eod1234.csv
String tempdir = System.getProperty("java.io.tmpdir");
file = new File(tempdir + File.separator + fileName);
try {
    PipedOutputStream pos = new PipedOutputStream();
    stream = new PipedInputStream(pos);
    buffer = new byte[stream.available()];
    stream.read(buffer);

    FileInputStream fils = new FileInputStream(file);
    int content = 0;
    while ((content = fils.read()) != -1) {
        System.out.println((char) content);
    }

    // OutputStream was written as a test previously but didn't work
    OutputStream outStream = new FileOutputStream(file);
    outStream.write(buffer);
    outStream.close();

    // container name is "testing1"
    CloudBlockBlob blob = container.getBlockBlobReference(fileName);
    if (fileName.length() > 0) {
        blob.upload(fils, file.length()); // this is testing with FileInputStream
        blob.uploadFromFile(fileName);    // preferred, just upload from file
    }
} catch (Exception e) {
    e.printStackTrace();
}

There are no error messages shown; we just see that the file reaches the blob storage container and shows a size of 0 bytes. It's a one-way process that only uploads CSV-format files. In the blob container, those uploaded files should each show a size of 1-5 KB.

2
About that fils: if you run it as we see it here, it is no wonder it has zero bytes. You are reading fils until no bytes are left in it (until read() returns -1) and then you pass it to the upload method, which means there are zero bytes left to upload. You should rewind or reopen it. – RealSkeptic
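(A minimal sketch of what the comment suggests, assuming the CSV content is already on disk in file: open a fresh FileInputStream right before the upload so the stream position is still at the start of the file; the name freshStream is only illustrative.)

    // Reopen the file so the stream starts at position 0 before uploading.
    try (FileInputStream freshStream = new FileInputStream(file)) {
        blob.upload(freshStream, file.length());
    }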

2 Answers

1
votes

Instead of blob.uploadFromFile(fileName); you should use blob.uploadFromFile(file.getAbsolutePath()); because the uploadFromFile method requires an absolute path. And you don't need the blob.upload(fils, file.length()); call at all.
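Applied to the code in the question, the upload section would look roughly like this (keeping the container, file, and fileName variables from the question):

    // container name is "testing1"
    CloudBlockBlob blob = container.getBlockBlobReference(fileName);
    if (fileName.length() > 0) {
        // Pass the absolute path of the temp file, not just the file name.
        blob.uploadFromFile(file.getAbsolutePath());
    }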

Refer to Microsoft Docs: https://docs.microsoft.com/en-us/azure/storage/blobs/storage-quickstart-blobs-java#upload-blobs-to-the-container

0
votes

The Azure team replied to the same query I sent by mail, and I have confirmed that the problem was not in the API but in the Upload component in Vaadin, which behaves differently than usual (https://vaadin.com/blog/uploads-and-downloads-inputs-and-outputs). Either the CloudBlockBlob or the BlobContainerUrl approach works.

The out-of-the-box Upload component requires a manual implementation of a FileOutputStream to a temporary file, unlike the usual servlet upload handling seen everywhere. Since time was limited, I used one of their add-ons, EasyUpload, because it has the Viritin UploadFileHandler incorporated into it, rather than figuring out how to stream the object from scratch. Had there been more time, I would definitely try out the MultiFileUpload add-on, which has additional interesting features, in my sandbox workspace.
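If you do implement it by hand, the idea is roughly the following (a sketch assuming the Vaadin 7/8 Upload API and the same blob container as in the question; the class and variable names are illustrative):

    import com.microsoft.azure.storage.blob.CloudBlobContainer;
    import com.microsoft.azure.storage.blob.CloudBlockBlob;
    import com.vaadin.ui.Upload;
    import java.io.*;

    // The Receiver writes the uploaded bytes into a temp file; the blob upload
    // happens only after the Vaadin upload has finished successfully.
    class CsvUploadReceiver implements Upload.Receiver, Upload.SucceededListener {
        private final CloudBlobContainer container;
        private File file;

        CsvUploadReceiver(CloudBlobContainer container) {
            this.container = container;
        }

        @Override
        public OutputStream receiveUpload(String filename, String mimeType) {
            try {
                String tempdir = System.getProperty("java.io.tmpdir");
                file = new File(tempdir + File.separator + filename);
                return new FileOutputStream(file); // Vaadin streams the upload into this
            } catch (FileNotFoundException e) {
                throw new RuntimeException(e);
            }
        }

        @Override
        public void uploadSucceeded(Upload.SucceededEvent event) {
            try {
                // Only at this point does the temp file contain the full CSV content.
                CloudBlockBlob blob = container.getBlockBlobReference(event.getFilename());
                blob.uploadFromFile(file.getAbsolutePath());
            } catch (Exception e) {
                e.printStackTrace();
            }
        }
    }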