I have written a Java program, invoked from an Oozie workflow, that puts files from HDFS into an S3 bucket. However, I am getting the following error:
com.amazonaws.services.s3.model.AmazonS3Exception: Access Denied (Service: Amazon S3; Status Code: 403; Error Code: AccessDenied; Request ID: 310F08CD4FF8B5D9), S3 Extended Request ID: fAysD1vgtriV8x+sf1zqHk58eAT89Y6HD+ziEokaPvFPKwaPrHDxt5yygsiA1ktNVsyj+GTmbQ0=
I am creating the key path in the S3 bucket dynamically in the Oozie workflow. For example, if the file name is abc_20171009.tsv.gz, it should be uploaded to the bucket under the following key:
tsvFile/year=2017/month=10/day=09/abc_20171009.tsv.gz
Files for other days should be uploaded the same way, based on their dates.
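To show what I mean by building the key dynamically, here is a rough sketch of the logic (the class and method names are just illustrative, and it assumes the file name always ends in _yyyyMMdd before the extension):

```java
// Illustrative sketch: derive the S3 key prefix from the date embedded
// in a file name like abc_20171009.tsv.gz.
public class S3KeyBuilder {

    public static String buildKey(String fileName) {
        // The yyyyMMdd date sits right after the last underscore.
        int underscore = fileName.lastIndexOf('_');
        String date = fileName.substring(underscore + 1, underscore + 9);
        String year = date.substring(0, 4);
        String month = date.substring(4, 6);
        String day = date.substring(6, 8);
        return "tsvFile/year=" + year + "/month=" + month
                + "/day=" + day + "/" + fileName;
    }

    public static void main(String[] args) {
        // Prints: tsvFile/year=2017/month=10/day=09/abc_20171009.tsv.gz
        System.out.println(buildKey("abc_20171009.tsv.gz"));
    }
}
```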
My question is: does this key path (prefix) need to preexist in the bucket before uploading the files, or can it be created dynamically at upload time?
// Request server-side encryption.
BasicAWSCredentials awsCredentials = new BasicAWSCredentials(awsAccessKeyId, awsSecretKey);
AmazonS3Client s3Client = new AmazonS3Client(awsCredentials);
// The third argument must be the File to upload; passing "" hits the
// (bucket, key, redirectLocation) constructor and uploads no content.
PutObjectRequest request = new PutObjectRequest("bucket_name", "key_name", new File(localFilePath));
ObjectMetadata objectMetadata = new ObjectMetadata();
objectMetadata.setSSEAlgorithm(ObjectMetadata.AES_256_SERVER_SIDE_ENCRYPTION);
request.setMetadata(objectMetadata);
PutObjectResult response = s3Client.putObject(request);
LOGGER.info("Server-side encryption successful: " + response.getSSEAlgorithm());
Note: I am able to put files manually and connect to the S3 bucket through the AWS CLI.