I have videos saved in Azure Blob Storage and I want to upload them to Facebook. The Facebook video upload is a multipart/form-data POST request. The ordinary way of doing this is to download the Azure blob as bytes using the readall() method in the Azure Python SDK and set it in the requests POST data as follows.
# download video from azure blob
from azure.storage.blob import BlobClient

blob = BlobClient.from_connection_string(AZURE_STORAGE_CONNECTION_STRING,
                                         AZURE_CONTAINER_NAME,
                                         f"{folder_id}/{file_name}")
video = blob.download_blob().readall()
# upload video to facebook
import requests

url = f"{API_VIDEO_URL}/{page_id}/videos"
params = {
    "upload_phase": "transfer",
    "upload_session_id": session_id,
    "start_offset": start_offset,
    "access_token": access_token
}
response = requests.post(url, params=params, files={"video_file_chunk": video})
The whole file is loaded into memory as bytes, which is not good for larger files. The Azure SDK has a readinto(stream) method that downloads the file into a stream. Is there a way to connect requests' streaming upload with the readinto() method? Or is there another way to upload the file directly from Blob Storage?
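One way to avoid holding the whole file in memory is to let Facebook's resumable-upload offsets drive ranged downloads: download_blob(offset=..., length=...) in the Azure SDK fetches only the requested byte range, so only one chunk is in memory at a time. The following is a minimal sketch of that loop, not a definitive implementation; upload_in_chunks and the two callables are hypothetical names, the 4 MiB chunk size is an assumption, and the real Azure/Facebook wrappers are left as comments while in-memory fakes stand in for them so the loop can actually run.

```python
def upload_in_chunks(download_range, post_chunk, start_offset, end_offset,
                     chunk_size=4 * 1024 * 1024):
    """Drive Facebook's resumable 'transfer' phase one chunk at a time.

    download_range(offset, length) -> bytes, e.g. wrapping
        blob.download_blob(offset=offset, length=length).readall()
    post_chunk(offset, data) -> (new_start, new_end), e.g. wrapping the
        requests.post(...) transfer call and reading start_offset /
        end_offset back from the JSON response.
    Only one chunk is held in memory at a time.
    """
    start, end = int(start_offset), int(end_offset)
    while start < end:
        data = download_range(start, min(chunk_size, end - start))
        start, end = post_chunk(start, data)
    return start

# In-memory stand-ins for Azure Blob Storage and the Graph API:
video = bytes(range(256)) * 10          # pretend this is the blob's content
uploaded = []

def fake_download(offset, length):
    return video[offset:offset + length]

def fake_post(offset, data):
    uploaded.append(data)
    return offset + len(data), len(video)  # server echoes the new offsets

final = upload_in_chunks(fake_download, fake_post, 0, len(video),
                         chunk_size=1024)
```

With the fakes, the loop downloads and "uploads" the 2560-byte blob in 1024-byte chunks and stops exactly when start_offset reaches end_offset, which is the same termination condition Facebook's transfer phase uses.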
You can try passing the file_url parameter with the blob SAS URL. But it has a limit: the video must be downloaded within 5 minutes and its size cannot exceed 1 GB. So your video cannot be large and the network should be good: developers.facebook.com/docs/graph-api/reference/page/videos/… – Jim Xu

file_url doesn't work for me. So I think a better way is in-memory with lazy loading: download and upload one chunk at a time. – Sajith Herath
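For readers whose videos fit under the 1 GB / 5 minute limit, the file_url route suggested above only needs a readable SAS URL for the blob. A minimal sketch, assuming the SAS token itself would come from azure.storage.blob's generate_blob_sas with read permission and a short expiry; the helper name blob_sas_url and every value below are placeholders:

```python
def blob_sas_url(account, container, blob_name, sas_token):
    """Build the time-limited blob URL Facebook will download from."""
    return (f"https://{account}.blob.core.windows.net/"
            f"{container}/{blob_name}?{sas_token}")

file_url = blob_sas_url("myaccount", "videos", "folder1/clip.mp4",
                        "sv=2020-08-04&sig=abc")
# Facebook then fetches the blob itself; no bytes pass through your app:
# requests.post(f"{API_VIDEO_URL}/{page_id}/videos",
#               data={"file_url": file_url, "access_token": access_token})
```

The appeal of this route is that the transfer happens directly between Azure and Facebook, so memory use in your application is constant regardless of file size.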