
I am trying to import a JSON file that has been uploaded to S3 into DynamoDB.

I followed the tutorial Amazon provides:

http://docs.aws.amazon.com/datapipeline/latest/DeveloperGuide/dp-importexport-ddb-console-start.html

But when I try to activate the pipeline, the TableLoadActivity component fails and DDBDestinationTable reports CASCADE_FAILED.

Both give this error (stack trace truncated):

    at org.apache.hadoop.mapreduce.JobSubmitter.writeOldSplits(JobSubmitter.java:520)
    at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:512)
    at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:394)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1285)
    at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1282)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
    at org.apache.hadoop.mapreduce.Job.submit(Job.java:1282)
    at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:562)
    at org.apache.hadoop.mapred.JobClient$1.run(JobClient.java:557)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:415)
    at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
    at org.apache.hadoop.mapred.JobClient.submitJobInternal(

Any help would be appreciated.

I was getting a similar error because I was selecting the exact JSON file. The solution was to select the folder in which the JSON and manifest files are present. – Anshul

1 Answer


I discovered that you have to select the inner folder, the one where the 'manifest' file is present. This is usually the folder named with the backup's date and time.

I received this error when I selected the parent folder instead, the one I had named after my table.
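For illustration, the S3 layout of a typical DynamoDB export looks roughly like this (the bucket, table, and timestamp names here are hypothetical examples, not from the question):

    s3://my-bucket/MyTable/                          <- selecting this parent folder causes the failure
    s3://my-bucket/MyTable/2015-07-20-01-00-00/      <- select this inner folder instead
        manifest
        part-r-00000                                 <- the exported data

In other words, the pipeline's input S3 folder parameter should point at the inner timestamped folder that directly contains the manifest file, not at its parent.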