I am trying to copy data from HDFS to Teradata through Spark. When I run it through the spark-shell I get an UnknownHostException, and when I run the same code through spark-submit I get a "teradata.main not found" error (I have added the Teradata jars to the spark-submit command as well). The same Teradata connection URL and credentials work fine when Sqooping.
I have added the Teradata jars to the executor and driver classpath, and also in spark-defaults.conf. My Spark Teradata connection code is below:
val jdbcDF = sqlContext.load("jdbc", Map(
  "url" -> "jdbc:teradata://teradataservername, user=***###, password=***###",
  "dbtable" -> "query",
  "driver" -> "com.teradata.jdbc.TeraDriver"))
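For comparison, here is a sketch of the same call with the credentials kept out of the URL string (the host name, credentials, and "query" table value are placeholders copied from above; this assumes the Teradata JDBC convention that optional URL parameters such as TMODE follow the host after a "/", and that Spark forwards extra options like "user" and "password" to the driver as connection properties):

```scala
// Sketch only: "teradataservername" and the masked credentials are
// placeholders, not a verified working configuration.
// Teradata URL parameters go after a "/" following the host, comma-separated.
val url = "jdbc:teradata://teradataservername/TMODE=TERA"

val jdbcDF = sqlContext.load("jdbc", Map(
  "url"      -> url,
  "user"     -> "***###",   // passed as a connection property, not in the URL
  "password" -> "***###",
  "dbtable"  -> "query",
  "driver"   -> "com.teradata.jdbc.TeraDriver"))
```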
The exception I get when running from the spark-shell is below:
TERAJDBC4 ERROR [main] com.teradata.jdbc.jdk6.JDK6_SQL_Connection@219f9031 Connection to (teradata server), TMODE=TERA, username=###, password=### Sun Aug 06 22:43:40 EDT 2017 socket orig=(teradata server), TMODE=TERA, username=###, password=### cid=742ff968 sess=0
java.net.UnknownHostException: (teradata server), TMODE=TERA, username=###, password=###: unknown error
    at java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method)
    at java.net.InetAddress$2.lookupAllHostAddr(InetAddress.java:928)
    at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1323)
    at java.net.InetAddress.getAllByName0(InetAddress.java:1276)