As an IAM user, I have been given full access to our S3 bucket.

When viewing AWS Management Console > My Security Credentials > Users > my_username, the policy for AmazonS3FullAccess is as follows:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": "s3:*",
            "Resource": "*"
        }
    ]
}

Yet I am unable to download objects uploaded by other IAM users via the AWS Management Console. I get the following XML error page when trying to download an object:

This XML file does not appear to have any style information associated with it. The document tree is shown below.
<Error>
<Code>AccessDenied</Code>
<Message>Access Denied</Message>
<RequestId>CB7861FFD8043D3D</RequestId>
<HostId>
whN/Ftumk8fRJcEyugQI7rpLC1F+00YJ7cM3bTILY9COmiuS0j2v7r37mbPf7B4tqYKTzNkjdfM=
</HostId>
</Error>

These other IAM users have the same permissions as I do, and they have not made any specific changes to the permissions on their objects.

The owner of the bucket (who also created the IAM user accounts) tried to add my Canonical ID to the permissions on the objects in question, but it didn't take effect. He is, however, able to download objects uploaded by any of the IAM users.

Are the permissions set up in the JSON above not enough to grant full access to objects created by other IAM users? All IAM users are created under the same account. How can we easily set things up so that any of the IAM users in our group can access each other's objects?

EDIT: I thought it was a cross-account problem, but I believe all IAM users are created within the same account. I have changed the wording accordingly.

Some additional info: here is what the actual bucket policy looks like:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "Bucket Policy",
            "Effect": "Allow",
            "Principal": {
                "AWS": [
                    "arn:aws:iam::123456789012:root",
                    "arn:aws:iam::987654321012:user/my_username"
                ]
            },
            "Action": [
                "s3:GetObject",
                "s3:GetObjectVersion",
                "s3:PutObject",
                "s3:DeleteObject",
                "s3:ListBucket",
                "s3:GetBucketLocation"
            ],
            "Resource": [
                "arn:aws:s3:::my_bucket/*",
                "arn:aws:s3:::my_bucket"
            ]
        }
    ]
}

I added my user ARN to the Principal, but after reading more about IAM users it doesn't sound like that was necessary. It doesn't seem like it would cause a problem, either.

Ah! You mentioned "cross-account objects"! If an object is uploaded to an Amazon S3 bucket that belongs to a different account, you should specify --acl bucket-owner-full-control. If this is not done, the object is 'owned' by the source account, not the destination account. Things can get quite messy, and it can even be difficult to remove the objects, even for the bucket owner! – John Rotenstein

Also check that the IAM user has programmatic access. Your MFA policy could be blocking this! – joe

Moreover, there may be a bucket policy assigned to the bucket preventing such actions. Whenever an AWS principal issues a request to S3, the authorization decision depends on the union of all the IAM policies, S3 bucket policies, and S3 ACLs that apply. – dsumsky

I don't know if I misspoke with "cross-account" access. It may actually be same-account access, just with objects created by different IAM users in the same account. I'm not sure about the terminology here. – Korean_Of_the_Mountain

@user1964692 You should also check whether the object you are trying to download is encrypted with a KMS key; if so, you also need to add kms:Decrypt to the IAM role that is used to perform the download. – congbaoguier
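
Following up on John Rotenstein's comment above: for uploads that really do cross accounts, a minimal boto3 sketch of setting that ACL at upload time might look like this (the bucket and key names are placeholders):

import boto3

s3 = boto3.client("s3")

# Placeholder bucket/key names. The ACL grants the bucket owner's
# account full control over the uploaded object.
s3.upload_file(
    Filename="report.csv",
    Bucket="my_bucket",
    Key="reports/report.csv",
    ExtraArgs={"ACL": "bucket-owner-full-control"},
)

In our case the users appear to all be in the same account, so this alone wouldn't explain the denial, but it is worth ruling out.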

1 Answer

Apparently this issue is due to the fact that I'm creating the files with Databricks, and they are stored in the DBFS root directory (https://kb.databricks.com/dbfs/dbfs-root-permissions.html). I didn't think it mattered that I was using Databricks, but it does.

The objects with the access issues were all uploaded using the DBFS CLI. After using dbutils to copy the objects over to FileStore (https://docs.databricks.com/user-guide/advanced/filestore.html), I am now able to access the objects.
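
For reference, the copy step was essentially the following, run inside a Databricks notebook where dbutils is predefined (the paths here are illustrative):

# Run in a Databricks notebook; `dbutils` is available there by default.
# Source and destination paths are illustrative.
dbutils.fs.cp(
    "dbfs:/my_dir/my_object.parquet",     # object originally written under the DBFS root
    "dbfs:/FileStore/my_object.parquet",  # FileStore destination
)

Pass recurse=True as a third argument to copy a whole directory.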