1 vote

We are facing an error while trying to load a huge zip file from an S3 bucket into Redshift, both from an EC2 instance and from Aginity. What is the real issue here?

As far as we have checked, this could be because of the VPC NACL rules, but we are not sure.

Error: ERROR: Connection timed out after 50000 milliseconds


4 Answers

0 votes

I think you are correct; it might be because of the bucket access rules or the secret/access keys.

Here are some pointers to debug it further if the above doesn't work.

  1. Create a small zip file and try again, to check whether the size is the cause (although I don't think that is the likely case).

  2. Split your zip file into multiple zip files and create a manifest file for loading, rather than loading one single file (a rough sketch of a manifest-based load is below).
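
As a rough illustration of point 2 (not the exact commands from this question), here is a minimal Python sketch that builds a manifest listing the split files, uploads it with boto3, and runs a manifest-based COPY. The bucket name, keys, table name, IAM role ARN, and connection details are placeholders, and the parts are assumed to be gzip-compressed files loaded with COPY's GZIP option rather than a .zip archive.

    import json

    import boto3
    import psycopg2  # any PostgreSQL driver that can reach Redshift works

    # Hypothetical bucket, split files and table -- replace with your own.
    BUCKET = "my-bucket"
    PARTS = ["data/part1.gz", "data/part2.gz", "data/part3.gz"]

    # Build a manifest that lists every split file explicitly.
    manifest = {
        "entries": [
            {"url": f"s3://{BUCKET}/{key}", "mandatory": True} for key in PARTS
        ]
    }

    # Upload the manifest next to the data files.
    s3 = boto3.client("s3")
    s3.put_object(
        Bucket=BUCKET,
        Key="data/load.manifest",
        Body=json.dumps(manifest).encode("utf-8"),
    )

    # Run a COPY that points at the manifest instead of a single file.
    conn = psycopg2.connect(
        host="my-cluster.abc123.us-east-1.redshift.amazonaws.com",  # placeholder
        port=5439,
        dbname="dev",
        user="awsuser",
        password="<password>",
    )
    with conn, conn.cursor() as cur:
        cur.execute(f"""
            COPY my_table
            FROM 's3://{BUCKET}/data/load.manifest'
            IAM_ROLE 'arn:aws:iam::123456789012:role/MyRedshiftRole'
            GZIP
            MANIFEST;
        """)
    conn.close()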

I hope you will find this useful.

0 votes

You should create an IAM role that authorizes Amazon Redshift to access other AWS services such as S3 on your behalf. You must associate that role with your Amazon Redshift cluster before you can use the role to load or unload data.

Check the link below for setting up the IAM role:

https://docs.aws.amazon.com/redshift/latest/mgmt/copy-unload-iam-role.html
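
As a hedged sketch of that association step (the cluster identifier and role ARN below are invented for illustration), the boto3 call attaches an existing role to a cluster so that COPY can reference it instead of access keys:

    import boto3

    # Placeholder cluster identifier and role ARN -- substitute your own.
    CLUSTER_ID = "my-redshift-cluster"
    ROLE_ARN = "arn:aws:iam::123456789012:role/MyRedshiftRole"

    redshift = boto3.client("redshift", region_name="us-east-1")

    # Associate the role (which must allow S3 read access) with the cluster
    # so COPY and UNLOAD can assume it.
    redshift.modify_cluster_iam_roles(
        ClusterIdentifier=CLUSTER_ID,
        AddIamRoles=[ROLE_ARN],
    )

    # A COPY command can then use the role instead of access keys, e.g.:
    # COPY my_table FROM 's3://my-bucket/data/' IAM_ROLE '<role ARN>' GZIP;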

0 votes

I got this error when the Redshift cluster had Enhanced VPC Routing enabled, but no route in the route table for S3. Adding the S3 endpoint fixed the issue. Link to docs.
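
For reference, a minimal boto3 sketch of that fix might look like the following; the region, VPC ID, and route table ID are placeholders, not values from this answer.

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Create a gateway VPC endpoint for S3 and attach it to the route table
    # used by the Redshift cluster's subnets. The IDs below are placeholders.
    ec2.create_vpc_endpoint(
        VpcEndpointType="Gateway",
        VpcId="vpc-0123456789abcdef0",
        ServiceName="com.amazonaws.us-east-1.s3",
        RouteTableIds=["rtb-0123456789abcdef0"],
    )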

0 votes

I also got this error with Enhanced VPC Routing enabled; check the routing from your Redshift cluster to S3.

There are several ways to let the Redshift cluster reach S3; see the link below:

https://docs.aws.amazon.com/redshift/latest/mgmt/enhanced-vpc-routing.html

I solved this error by setting up a NAT gateway for the private subnet used by my Redshift cluster; a rough sketch of that setup follows.
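
As a loose illustration of that NAT setup (all IDs are placeholders I made up, and it assumes a public subnet already exists for the gateway), the boto3 calls might look like this:

    import boto3

    ec2 = boto3.client("ec2", region_name="us-east-1")

    # Allocate an Elastic IP and create a NAT gateway in a *public* subnet.
    # All IDs here are placeholders.
    eip = ec2.allocate_address(Domain="vpc")
    nat = ec2.create_nat_gateway(
        SubnetId="subnet-0aaa1111bbbb2222c",          # public subnet
        AllocationId=eip["AllocationId"],
    )
    nat_id = nat["NatGateway"]["NatGatewayId"]

    # Wait until the NAT gateway is available before adding the route.
    ec2.get_waiter("nat_gateway_available").wait(NatGatewayIds=[nat_id])

    # Send the private subnet's outbound traffic (including S3) through the NAT.
    ec2.create_route(
        RouteTableId="rtb-0ddd3333eeee4444f",         # private subnet's route table
        DestinationCidrBlock="0.0.0.0/0",
        NatGatewayId=nat_id,
    )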