
I am using the latest Azure Storage SDK (azure-storage-blob 12.7.1). It works fine for smaller files, but it throws exceptions for larger files (> 30 MB):

azure.core.exceptions.ServiceResponseError: ('Connection aborted.', timeout('The write operation timed out'))

    from azure.storage.blob import BlobServiceClient, PublicAccess, BlobProperties, ContainerClient

    def upload(file):
        settings = read_settings()
        connection_string = settings['connection_string']
        container_client = ContainerClient.from_connection_string(connection_string, 'backup')
        blob_client = container_client.get_blob_client(file)
        with open(file, "rb") as data:
            blob_client.upload_blob(data)
            print(f'{file} uploaded to blob storage')
    
    upload('crashes.csv')
I think those are old and closed. – DevMonk
github.com/Azure/azure-sdk-for-python/issues/12166 seems an accurate description of your issue; please comment on it to help prioritize the work (I work at MS on the SDK team). – Laurent Mazuel
@LaurentMazuel I have commented on GitHub as you mentioned. – DevMonk

1 Answer


Everything works for me with your code when I tried uploading a ~180 MB .txt file. But since uploading small files works for you, uploading your big file in smaller parts could be a workaround. Try the code below:

    from azure.storage.blob import BlobClient

    storage_connection_string = ''  # your storage account connection string
    container_name = ''             # target container
    dest_file_name = ''             # name of the destination blob

    local_file_path = ''            # path of the local file to upload

    blob_client = BlobClient.from_connection_string(storage_connection_string, container_name, dest_file_name)

    # upload 4 MB per request
    chunk_size = 4 * 1024 * 1024

    # recreate the destination blob as an append blob
    if blob_client.exists():
        blob_client.delete_blob()
    blob_client.create_append_blob()

    with open(local_file_path, "rb") as stream:
        while True:
            read_data = stream.read(chunk_size)
            if not read_data:
                print('uploaded')
                break
            # append each chunk as a new block of the append blob
            blob_client.append_block(read_data)

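A side note on the design: the loop above works because append blobs let you add data block by block, so no single request has to carry the whole file. If you would rather keep a regular block blob and your original `upload_blob` call, a similar effect can be had by shrinking the chunk sizes the client uses. A minimal sketch, assuming the 12.x client accepts the `max_single_put_size`/`max_block_size` keyword arguments and reusing the `'backup'` container and `crashes.csv` file from the question:

    from azure.storage.blob import ContainerClient

    connection_string = ''  # same connection string as in the question

    # Keep every request small: uploads larger than 1 MB are split into
    # 1 MB blocks, so each individual write should finish before the
    # connection's write timeout is hit.
    container_client = ContainerClient.from_connection_string(
        connection_string,
        'backup',
        max_single_put_size=1 * 1024 * 1024,
        max_block_size=1 * 1024 * 1024,
    )

    blob_client = container_client.get_blob_client('crashes.csv')
    with open('crashes.csv', 'rb') as data:
        # max_concurrency uploads several of the small blocks in parallel
        blob_client.upload_blob(data, overwrite=True, max_concurrency=2)

Whether this is enough depends on how slow the connection is; the append-blob loop above keeps you in full control of each request, which is why it makes a good workaround.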