
I have videos saved in Azure Blob Storage and I want to upload them to Facebook. Facebook's video upload is a multipart/form-data POST request. The ordinary way of doing this is to download the Azure blob as bytes using the readall() method of the Azure Python SDK and set it as the requests POST data, as follows.

import requests
from azure.storage.blob import BlobClient

# download the video from Azure Blob Storage into memory
video = BlobClient.from_connection_string(AZURE_STORAGE_CONNECTION_STRING,
                                          AZURE_CONTAINER_NAME,
                                          f"{folder_id}/{file_name}")
video = video.download_blob().readall()

# upload video to facebook
url = f"{API_VIDEO_URL}/{page_id}/videos"
params = {
    "upload_phase": "transfer",
    "upload_session_id": session_id,
    "start_offset": start_offset,
    "access_token": access_token
}

response = requests.post(url, params=params, files={"video_file_chunk": video})

The bytes of the file are loaded into memory, which is not good for larger files. There is a method in the Azure SDK, readinto(stream), that downloads the file into a stream. Is there a way to connect a requests streaming upload with the readinto() method? Or is there another way to upload the file directly from Blob Storage?
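For reference, readinto() can write a ranged download into an in-memory stream roughly like this (a minimal sketch; the offset and length are just example values):

import io
from azure.storage.blob import BlobClient

blob = BlobClient.from_connection_string(AZURE_STORAGE_CONNECTION_STRING,
                                         AZURE_CONTAINER_NAME,
                                         f"{folder_id}/{file_name}")
buffer = io.BytesIO()
# download only the first 1 MB of the blob into the in-memory stream
downloader = blob.download_blob(offset=0, length=1024 * 1024)
bytes_read = downloader.readinto(buffer)
buffer.seek(0)  # rewind before handing the stream to another consumer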

Do you mind saving the file locally and then uploading it? – Jim Xu
@JimXu No, neither a local file nor in-memory. I need to pass it to Facebook directly from the Azure blob. – Sajith Herath
OK, I see. As far as I know, we can use the file_url parameter with the blob SAS URL (see the sketch after these comments). But it has limits: the video must be downloadable within 5 minutes and its size cannot exceed 1 GB. So your video cannot be large and the network should be good: developers.facebook.com/docs/graph-api/reference/page/videos/… – Jim Xu
If that is not useful for you, I think we need to write the content to a local file or an in-memory stream. – Jim Xu
@JimXu Since I use resumable video upload with chunks, file_url doesn't work for me. So I think the better way is in-memory with lazy loading: download and upload a chunk at a time. – Sajith Herath
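A minimal sketch of the file_url approach mentioned above, assuming azure-storage-blob v12 and a read-only SAS token (the account name and key variables are placeholders):

from datetime import datetime, timedelta
import requests
from azure.storage.blob import generate_blob_sas, BlobSasPermissions

# build a short-lived, read-only SAS URL for the blob
sas_token = generate_blob_sas(
    account_name=AZURE_ACCOUNT_NAME,           # placeholder
    container_name=AZURE_CONTAINER_NAME,
    blob_name=f"{folder_id}/{file_name}",
    account_key=AZURE_ACCOUNT_KEY,             # placeholder
    permission=BlobSasPermissions(read=True),
    expiry=datetime.utcnow() + timedelta(minutes=30),
)
sas_url = (f"https://{AZURE_ACCOUNT_NAME}.blob.core.windows.net/"
           f"{AZURE_CONTAINER_NAME}/{folder_id}/{file_name}?{sas_token}")

# let Facebook pull the video itself via file_url (non-resumable upload)
response = requests.post(
    f"{API_VIDEO_URL}/{page_id}/videos",
    params={"file_url": sas_url, "access_token": access_token},
)
print(response.json())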

1 Answer


Regarding how to upload the video in chunks with a stream, please refer to the following code:

from azure.storage.blob import BlobClient
import io
from requests_toolbelt import MultipartEncoder
import requests

# connect to the source blob (same connection details as in the question)
blob = BlobClient.from_connection_string(AZURE_STORAGE_CONNECTION_STRING,
                                         AZURE_CONTAINER_NAME,
                                         f"{folder_id}/{file_name}")

blob_properties = blob.get_blob_properties()
blob_size = blob_properties.size  # total size of the blob in bytes
access_token = ''
session_id = '675711696358783'
chunk_size = 1024 * 1024  # download and upload 1 MB at a time
bytes_remaining = blob_size
params = {
    "upload_phase": "transfer",
    "upload_session_id": session_id,
    "start_offset": 0,
    "access_token": access_token
}
url = "https://graph-video.facebook.com/v7.0/101073631699517/videos"
start = 0  # where to start downloading

while bytes_remaining > 0:
    bytes_to_fetch = min(chunk_size, bytes_remaining)
    with io.BytesIO() as f:
        # download only this chunk of the blob into the in-memory stream
        downloader = blob.download_blob(offset=start, length=bytes_to_fetch)
        downloader.readinto(f)
        f.seek(0)  # rewind so the encoder reads the chunk from the beginning

        # post the chunk to Facebook as multipart/form-data
        m = MultipartEncoder(fields={'video_file_chunk': ('file', f)})
        r = requests.post(url, params=params,
                          headers={'Content-Type': m.content_type}, data=m)
        s = r.json()
        print(s)

        # Facebook returns the offset the next chunk should start from
        start = int(s['start_offset'])
        bytes_remaining = blob_size - start
    params['start_offset'] = start
    print(params)

# end upload
params['upload_phase'] = 'finish'
r = requests.post(url, params=params)
print(r)
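For completeness, the hard-coded session_id above would come from the start phase of Facebook's resumable upload. A rough sketch of that step, based on the Graph API resumable-upload flow and reusing url, blob_size and access_token from the code above:

# start phase: create an upload session sized to the blob
start_params = {
    "upload_phase": "start",
    "file_size": blob_size,
    "access_token": access_token,
}
r = requests.post(url, params=start_params)
s = r.json()
session_id = s["upload_session_id"]
start = int(s["start_offset"])  # typically 0 for a new session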
