
I have written a Sqoop job to import data from a table in Netezza to HDFS. The job is created successfully and, when executed, starts a MapReduce job. The job runs until map 100% reduce 0% and then gets stuck. It never completes and no data is transferred at all. No error or exception is observed.

I have a few similar jobs for other tables of the same database. Those execute properly and transfer data. What could be the possible reason for this behavior?

Below is the configuration for the Sqoop job, given in an options file.

--direct --connect jdbc:netezza://url/database_name --username abcd --password xyz --table table_name --split-by primary_key_column --target-dir hdfs_path -m 8
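
As a side note, Sqoop options files expect each option and its value on separate lines, and the file is then passed with --options-file. Below is a minimal sketch of the same configuration in that layout; the file name import.options is arbitrary and the values are the placeholders from the question.

# import.options -- placeholder values carried over from the question above
--direct
--connect
jdbc:netezza://url/database_name
--username
abcd
--password
xyz
--table
table_name
--split-by
primary_key_column
--target-dir
hdfs_path
-m
8

It would then be invoked as: sqoop import --options-file import.options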

Can you post your sqoop import command? – OneCricketeer

1 Answer


I removed the --direct option and the job worked as expected. With Netezza, the --direct mode fails when the data itself contains the ',' delimiter character. Below is the exception encountered with --direct:

Unable to execute external table export org.netezza.error.NzSQLException: ERROR: found delim ',' in a data field, specify escapeChar '\' option in the external table definition
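
For reference, the working configuration is simply the original option set with --direct removed (same placeholder values as in the question), so the import goes through the generic JDBC path instead of the Netezza external-table export:

--connect jdbc:netezza://url/database_name --username abcd --password xyz --table table_name --split-by primary_key_column --target-dir hdfs_path -m 8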