How do I create an empty folder on Google Cloud Storage with the Google API? (Assume that / is the path separator.)
8 Answers
@SheRey - looking at folders created via the GCS web interface, the Content-Type is set to application/x-www-form-urlencoded;charset=UTF-8, but it doesn't really matter. Here's what worked for me in Python:
# pip install google-cloud-storage
from google.cloud import storage
gcs_client = storage.Client(project='some_project')
bucket = gcs_client.get_bucket('some_bucket')
blob = bucket.blob('some/folder/name/')
blob.upload_from_string('', content_type='application/x-www-form-urlencoded;charset=UTF-8')
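To confirm that the marker object was actually created, a small check can be layered on top of the snippet above. This is a sketch, not part of the original answer: marker_name and folder_exists are hypothetical helper names, and bucket is assumed to be the same google-cloud-storage Bucket as above.

```python
def marker_name(folder_name):
    """Normalize a folder path to the trailing-slash form used for markers."""
    return folder_name if folder_name.endswith('/') else folder_name + '/'

def folder_exists(bucket, folder_name):
    """Return True if the zero-byte marker object for folder_name exists.

    bucket is a google.cloud.storage Bucket, as created in the snippet above.
    """
    return bucket.blob(marker_name(folder_name)).exists()
```

Note that Blob.exists() issues a metadata request, so each check costs one API call.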
Google Cloud Storage does not have folders or subdirectories. However, there is some support for emulating them. gsutil's How Subdirectories Work is a good read for some background.
Google Cloud Storage objects live in a flat namespace, but many tools, including gsutil and the Google Cloud Storage UI, create the illusion of a hierarchical file tree.
There are two widely used conventions for creating the illusion of an empty subdirectory:
1. (recommended) Create an object whose name ends in a trailing slash. For example, to create a subdirectory called foo at the root of a bucket, you would create an empty object (size 0) called foo/.
2. (legacy) Create an object with _$folder$ appended to the name. For example, to create a subdirectory called foo at the root of a bucket, you would create an empty object (size 0) called foo_$folder$.
Note that most tools and utilities are using method 1 now. Method 2 is less frequently used.
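Both conventions reduce to a simple string transform on the subdirectory name. A minimal sketch (subdir_marker is a hypothetical helper name, not part of any library):

```python
def subdir_marker(name, legacy=False):
    """Object name for an empty-subdirectory marker.

    Method 1 (recommended): trailing slash.
    Method 2 (legacy): _$folder$ suffix.
    """
    base = name.rstrip('/')
    return base + '_$folder$' if legacy else base + '/'
```

Uploading a zero-byte object under either name makes the subdirectory appear in tools that emulate hierarchy.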
Node.js + @google-cloud/storage@^2.5.0:
All you need to do is assign a destination that follows the <folder>/<file name> pattern.
In the example below, I use a uuid as the folder name to simulate each user having a folder to store their own files.
const { Storage } = require('@google-cloud/storage');
const path = require('path');
const faker = require('faker');

it('should upload file and create a folder correctly', async () => {
  const myStorage = new Storage({ keyFilename: path.resolve(__dirname, '../../../.gcp/cloud-storage-admin.json') });
  const bucket = myStorage.bucket('ez2on');
  const fileName = 'mmczblsq.doc';
  const filePath = path.resolve(__dirname, `../../../tmp/${fileName}`);
  const uuid = faker.random.uuid();
  // The "<uuid>/" prefix in the destination creates the folder implicitly.
  await bucket.upload(filePath, {
    destination: `${uuid}/${fileName}`,
    gzip: true,
    metadata: {
      cacheControl: 'public, max-age=31536000'
    }
  });
});
The result is that the file is stored under the uuid-named folder (screenshot omitted).
Here are the API docs for @google-cloud/storage: https://googleapis.dev/nodejs/storage/latest/Bucket.html#upload
Go + cloud.google.com/go/storage:
package main

import (
	"context"
	"fmt"
	"io"
	"log"
	"os"

	"cloud.google.com/go/storage"
	"github.com/google/uuid"
	"google.golang.org/api/option"
)

func main() {
	ctx := context.Background()
	opts := option.WithCredentialsFile(os.Getenv("CredentialsFile"))
	client, err := storage.NewClient(ctx, opts)
	if err != nil {
		log.Fatalf("%v", err)
	}
	filename := "mmczblsq.doc"
	filepath := fmt.Sprintf("./tmp/%s", filename)
	file, err := os.Open(filepath)
	if err != nil {
		log.Fatalf("%v", err)
	}
	defer file.Close()
	uuidIns, err := uuid.NewUUID()
	if err != nil {
		log.Fatalf("%v", err)
	}
	// Prefixing the object name with "<uuid>/" creates the folder implicitly.
	object := fmt.Sprintf("%s/%s", uuidIns, filename)
	log.Printf("object name: %s", object)
	wc := client.Bucket("ez2on").Object(object).NewWriter(ctx)
	if _, err := io.Copy(wc, file); err != nil {
		log.Fatalf("%v", err)
	}
	if err := wc.Close(); err != nil {
		log.Fatalf("%v", err)
	}
}
Output on stdout:
☁ upload [master] ⚡ CredentialsFile=/Users/ldu020/workspace/github.com/mrdulin/nodejs-gcp/.gcp/cloud-storage-admin.json go run main.go
2019/07/08 14:47:59 object name: 532a2250-a14c-11e9-921d-8a002870ac01/mmczblsq.doc
Check the file in the Google Cloud Platform console (screenshot omitted).
Thank you for the question and the chosen best answer. Here is a code snippet that I wrote. Python method:
from google.cloud import storage

def create_folder(bucket_name, destination_folder_name):
    storage_client = storage.Client()
    bucket = storage_client.get_bucket(bucket_name)
    blob = bucket.blob(destination_folder_name)
    blob.upload_from_string('')
    print('Created {}.'.format(destination_folder_name))
Main code that calls the method:
create_folder(bucket_name, 'test-folder/')
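To confirm the folder shows up afterwards, you can list the objects under its prefix. A sketch only: list_folder is a hypothetical helper name, and storage_client is assumed to be a google.cloud.storage Client as in the method above.

```python
def list_folder(storage_client, bucket_name, folder):
    """Yield the names of objects under folder, including the marker itself."""
    prefix = folder if folder.endswith('/') else folder + '/'
    for blob in storage_client.list_blobs(bucket_name, prefix=prefix):
        yield blob.name
```

Right after create_folder, this should yield at least the zero-byte marker object itself.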
I was trying to create a unique subdirectory on a Google Storage bucket from a CircleCI instance. Here's how I did it using gsutil: I created a dummy file and appended my folder name, with a trailing slash, to the destination bucket path.
Usage: gsutil cp OBJECT_LOCATION gs://DESTINATION_BUCKET_NAME/
sudo gsutil cp ~/some-path/dummyFile.txt gs://my-bucket/unique-folder-name/
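The same copy-into-a-folder step can be done from Python, building the destination object name from the local path. A sketch under stated assumptions: destination_object and upload_into_folder are hypothetical helper names, and bucket is a google-cloud-storage Bucket.

```python
import posixpath

def destination_object(folder, local_path):
    """Join the folder prefix and the file's base name with exactly one slash."""
    return posixpath.join(folder.rstrip('/'), posixpath.basename(local_path))

def upload_into_folder(bucket, local_path, folder):
    """Upload local_path under folder; the prefix makes the folder appear."""
    blob = bucket.blob(destination_object(folder, local_path))
    blob.upload_from_filename(local_path)
```

As with the gsutil command, no separate "create folder" call is needed; the object name's prefix is enough.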