
I'm looking to set up a transfer job to take files stored within an S3 bucket and load them to a GCS bucket. The credentials that I have give me access to the folder that contains the files that I need from S3 but not to the higher level folders.

When I try to set up the transfer job with the S3 bucket name under 'Amazon S3 bucket' and the access key ID & secret access key filled in, access is denied, as you would expect given the limits of my credentials. However, access is still denied even if I add the extra path information as a Prefix item (e.g. 'Production/FTP/CompanyName'), a folder I do have access to.

It seems as though I can't get past the fact that I do not have access to the root directory. Is there any way around this?

Can you provide the exact error you are getting? – pradeep
Hi Pradeep. Yes, the message is 'Failed to obtain the location of the source S3 bucket. Additional details: Access Denied'. – Paul

3 Answers

0
votes

According to the documentation:

The Storage Transfer Service uses the project-[$PROJECT_NUMBER]@storage-transfer-service.iam.gserviceaccount.com service account to move data from a Cloud Storage source bucket.

The service account must have the following permissions for the source bucket:

storage.buckets.get Allows the service account to get the location of the bucket. Always required.

storage.objects.list Allows the service account to list objects in the bucket. Always required.

storage.objects.get Allows the service account to read objects in the bucket. Always required.

storage.objects.delete Allows the service account to delete objects in the bucket. Required if you set deleteObjectsFromSourceAfterTransfer to true.

The roles/storage.objectViewer and roles/storage.legacyBucketReader roles together contain the permissions that are always required. The roles/storage.legacyBucketWriter role contains the storage.objects.delete permission. The service account used to perform the transfer must be assigned the desired roles.

These are Cloud Storage permissions; for an S3 source, you have to grant the equivalent permissions on your AWS bucket.
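For the Cloud Storage side, the roles above can be granted to the transfer service account on a bucket. A minimal sketch (the project number and bucket name are placeholders, not values from the question):

```shell
# Hypothetical example: grant the Storage Transfer Service account the
# roles that the documentation says are always required on a GCS bucket.
# Replace 123456789 with your project number and my-gcs-bucket with your bucket.
gsutil iam ch \
  serviceAccount:project-123456789@storage-transfer-service.iam.gserviceaccount.com:roles/storage.objectViewer \
  gs://my-gcs-bucket
gsutil iam ch \
  serviceAccount:project-123456789@storage-transfer-service.iam.gserviceaccount.com:roles/storage.legacyBucketReader \
  gs://my-gcs-bucket
```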

-1
votes

Paul,

Most likely your IAM role is missing the s3:ListBucket permission. Can you update your IAM role to include s3:ListBucket and s3:GetBucketLocation and try again?
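You can verify whether the credentials have these two permissions before retrying the transfer. A sketch using the AWS CLI (the bucket name is a placeholder; the prefix is the one from the question):

```shell
# Hypothetical check: both commands must succeed for Storage Transfer Service
# to locate the bucket and enumerate objects under the prefix.
# Replace my-bucket with the actual S3 bucket name.
aws s3api get-bucket-location --bucket my-bucket
aws s3api list-objects-v2 --bucket my-bucket \
  --prefix "Production/FTP/CompanyName/" --max-items 5
```

If the first command returns Access Denied, the error in the question ('Failed to obtain the location of the source S3 bucket') will reproduce regardless of the prefix setting.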

-1
votes

On AWS, the permission policy should look like the one below if you want to give access to a subfolder.

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "s3:GetBucketLocation"
            ],
            "Resource": "arn:aws:s3:::<bucketname>"
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:List*",
                "s3:Get*"
            ],
            "Resource": "arn:aws:s3:::<bucketname>",
            "Condition": {
                "StringLike": {
                    "s3:prefix": [
                        "<subfolder>/*"
                    ]
                }
            }
        },
        {
            "Effect": "Allow",
            "Action": [
                "s3:List*",
                "s3:Get*"
            ],
            "Resource": [
                "arn:aws:s3:::<bucketname>/<subfolder>",
                "arn:aws:s3:::<bucketname>/<subfolder>/*"
            ]
        }
    ]
}
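To apply it, the policy can be attached inline to the IAM user whose access keys you entered in the transfer job. A sketch (the user name, policy name, and file path are placeholders):

```shell
# Hypothetical example: save the policy JSON above as policy.json, then
# attach it to the IAM user used by Storage Transfer Service.
aws iam put-user-policy \
  --user-name transfer-user \
  --policy-name gcs-transfer-subfolder-access \
  --policy-document file://policy.json
```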