
I use Google's Firebase Storage through the Java Admin SDK to store files in a Spring Boot application. For security reasons, I create a separate storage bucket for each customer organization. Since I blow away my test environment frequently, I delete and recreate these storage buckets often, and I'm looking for a faster way to get these buckets into Firebase.

Here's what I currently do:

  1. My Spring Boot application creates a bucket with the Google Cloud Storage libraries per the Firebase Storage documentation.
  2. I add the Google Cloud Storage buckets to Firebase by importing them in the Firebase Storage web console. I can import multiple buckets at once.
  3. I apply the default security rules to each Firebase Storage bucket. I can only do this one bucket at a time.
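Step 1 can be sketched with the standard google-cloud-storage client, which is what the Firebase Storage docs point to for bucket creation from a server. The naming scheme (`<env>-<orgId>-files`) is my own placeholder, not something prescribed anywhere:

```java
import com.google.cloud.storage.Bucket;
import com.google.cloud.storage.BucketInfo;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;

public class BucketProvisioner {

    // Hypothetical naming scheme "<env>-<orgId>-files"; pure helper, easy to test.
    static String bucketNameFor(String env, String orgId) {
        return env + "-" + orgId + "-files";
    }

    // Creates the Cloud Storage bucket for one customer organization.
    // Requires application-default credentials with storage admin rights.
    static Bucket createBucketFor(String env, String orgId) {
        Storage storage = StorageOptions.getDefaultInstance().getService();
        return storage.create(BucketInfo.of(bucketNameFor(env, orgId)));
    }
}
```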

I want to automate steps 2 & 3:

  • From what I can see in the Firebase docs, I can't do steps 2 and 3 with the Java Admin API.
  • I can probably automate step 3 through the Firebase CLI tool. For that, I need to set up a "deployment target" that includes all the buckets. I have to add the buckets by name, one by one, and can't use any wildcards here. But my bucket names include database IDs, which stay pretty much the same across my environments. So I hope that this deployment target only changes when I add more customer organizations.
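For reference, the deployment-target setup for step 3 looks roughly like this in `firebase.json` (the target name, rules filename, and bucket names are placeholders):

```json
{
  "storage": [
    {
      "target": "customer-buckets",
      "rules": "storage.rules"
    }
  ]
}
```

The target itself is registered once per project, bucket by bucket, e.g. `firebase target:apply storage customer-buckets test-acme-files test-globex-files`, after which `firebase deploy --only storage` applies the rules file to every bucket in the target.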

Does anybody know of better ways to automate steps 2 and 3?


1 Answer


I decided to change my approach: I now put all customer organizations into one bucket.

  • It's just too much work to deal with separate buckets right now. With one bucket, I simply delete all the folders in it, which is easy enough for now in the Firebase Storage web console.
  • I'm not sure how my project would behave with hundreds or thousands of buckets. In contrast, having thousands of folders within a bucket seems to be a valid use case.
  • The lack of functionality for Storage buckets both in the Firebase CLI and the Google Cloud SDK indicates to me that you shouldn't have many buckets in your project.
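With the single-bucket approach, a customer "folder" is just a shared object-name prefix, so wiping one organization from code is a prefix-delete. A minimal sketch with the google-cloud-storage client, where the bucket layout (`orgs/<orgId>/`) is my assumption:

```java
import com.google.cloud.storage.Blob;
import com.google.cloud.storage.Storage;
import com.google.cloud.storage.StorageOptions;

public class OrgFolderCleaner {

    // Hypothetical layout: every object for an org lives under "orgs/<orgId>/".
    static String prefixFor(String orgId) {
        return "orgs/" + orgId + "/";
    }

    // Cloud Storage has no real folders, so "deleting a folder" means
    // listing and deleting every blob that shares the org's prefix.
    static void deleteOrgFolder(String bucketName, String orgId) {
        Storage storage = StorageOptions.getDefaultInstance().getService();
        for (Blob blob : storage.list(bucketName,
                Storage.BlobListOption.prefix(prefixFor(orgId))).iterateAll()) {
            blob.delete();
        }
    }
}
```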