I have written a Sqoop job to import data from a table in Netezza to HDFS. The job is created successfully and, when executed, starts a MapReduce job. The job runs until map 100% reduce 0% and then gets stuck. It never completes and no data is transferred at all. No error or exception is observed.
I have a few similar jobs for other tables in the same database; those execute properly and transfer data. What could be the possible reason for this behavior?
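For reference, the job is created and executed along these lines (the job name and options-file path here are placeholders):

    sqoop job --create nz_import_job -- import --options-file /path/to/import_options.txt
    sqoop job --exec nz_import_job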
Below is the configuration for the Sqoop job, given in an options file:
    --direct
    --connect
    jdbc:netezza://url/database_name
    --username
    abcd
    --password
    xyz
    --table
    table_name
    --split-by
    primary_key_column
    --target-dir
    hdfs_path
    -m
    8
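For what it's worth, the same import expressed as a one-off command (same options, line breaks added only for readability) would be:

    sqoop import \
      --direct \
      --connect jdbc:netezza://url/database_name \
      --username abcd \
      --password xyz \
      --table table_name \
      --split-by primary_key_column \
      --target-dir hdfs_path \
      -m 8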