Google Cloud Storage doesn't actually have folders; what you see as folders is just a representation built from object name prefixes. You can see a more detailed explanation here.
What you have to do is fetch all the files inside the "folder" recursively, e.g.:
from google.cloud import storage

client = storage.Client()

for blob in client.list_blobs('mybucket', prefix='sofolder'):
    blob_name = str(blob.name)
    # Skip the zero-byte placeholder objects that represent "folders"
    if not blob_name.endswith('/'):
        # Keep only the file name (everything after the last '/')
        clean_name = blob_name.rsplit('/', 1)[-1]
        blob.download_to_filename('./' + clean_name)
        print(blob_name)
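Note that the snippet above flattens everything into the current directory, so two files with the same name in different subfolders would overwrite each other. If you also want to mirror the bucket's "folder" layout on disk, a minimal sketch could look like the following (the ./sofolder_download target directory is just a name I picked for illustration):

import os
from google.cloud import storage

client = storage.Client()
destination_root = './sofolder_download'  # hypothetical local target directory

for blob in client.list_blobs('mybucket', prefix='sofolder'):
    blob_name = str(blob.name)
    if not blob_name.endswith('/'):
        # Recreate the blob's "folder" path locally before downloading
        local_path = os.path.join(destination_root, blob_name)
        os.makedirs(os.path.dirname(local_path), exist_ok=True)
        blob.download_to_filename(local_path)
        print(blob_name)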
Update 1:
So I did a quick reproduction of your use case by just printing the blob names in my bucket to the console. I have a structure similar to yours:
bucketname
- filexxx
- folderyyy
- sofolder   <--- the folder I'm interested in
  - file1.png
  - folder_a
    - fileinfolder_a.png
  - folder_b
    - fileinfolder_b.png
  - folder_c
    - fileinfolder_c.png
and by running this:
from google.cloud import storage

client = storage.Client()

for blob in client.list_blobs('bucketname', prefix='sofolder'):
    blob_name = str(blob.name)
    if not blob_name.endswith('/'):
        print(blob_name)
I'm getting this output:
sofolder/
sofolder/file1.png
sofolder/folder_a/fileinfolder_a.png
sofolder/folder_b/fileinfolder_b.png
sofolder/folder_c/fileinfolder_c.png
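As a side note on the "folders are just a representation" point above: if you only want the immediate contents of sofolder rather than a recursive listing, you can pass a delimiter to list_blobs; the sub-"folders" then show up as prefixes on the iterator. A rough sketch, reusing the same bucket and prefix names from the example above:

from google.cloud import storage

client = storage.Client()

# With a delimiter, nested objects are grouped under iterator.prefixes
iterator = client.list_blobs('bucketname', prefix='sofolder/', delimiter='/')
blobs = list(iterator)  # consume the iterator so prefixes get populated

for blob in blobs:
    print('object:', blob.name)

for prefix in iterator.prefixes:
    print('sub-"folder":', prefix)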