1 vote

I am trying to test an application on a standalone cluster. Here is my scenario: I started a Spark master on node A and also one worker on the same node A.

I am trying to run the application from node B (which, I think, means node B acts as the driver).

I have added the jars to the SparkConf using setJars(Seq("jar1", "jar2")).
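For reference, a minimal sketch of how the driver is configured, assuming a standalone master URL like spark://nodeA:7077 and placeholder jar paths (both are assumptions, not values from the actual setup):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Master URL and jar paths below are placeholders for illustration only.
val conf = new SparkConf()
  .setAppName("MyApp")
  .setMaster("spark://nodeA:7077")                 // standalone master on node A
  .setJars(Seq("/path/on/nodeB/jar1.jar",          // jars present on the driver (node B)
               "/path/on/nodeB/jar2.jar"))

val sc = new SparkContext(conf)
```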

When I start the application, I see the following INFO messages saying that the jars were added:

16/12/16 07:45:56 INFO SparkContext: Added JAR jar1.jar at spark://nodeb:48151/jars/jar1.jar with timestamp 1481899556375

and

16/12/16 07:45:56 INFO SparkContext: Added JAR jar2.jar at spark://nodeb:48151/jars/jar2.jar with timestamp 1481899556376

But I get the following exception from node A, apparently during the Netty fetch:

16/12/16 07:46:00 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0, nodeA): java.lang.RuntimeException: Stream '/jars/node2.jar' was not found
    at org.apache.spark.network.client.TransportResponseHandler.handle(TransportResponseHandler.java:222)
    at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:121)
    at org.apache.spark.network.server.TransportChannelHandler.channelRead0(TransportChannelHandler.java:51)
    at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:103)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
    at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308)
    at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294)
    at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846)
    at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
    at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
    at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
    at io.netty.util.concurrent.SingleThreadEven
I have the same problem. Any luck with this issue? – Mahmoud Khaled

2 Answers

0
votes

Check where your jars are located.

It should be located somewhere in HDFS. In my case I just put it under /tmp/hive/ and referenced it as setJars(List("/tmp/hive/myJar")), and everything worked like a charm.
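A minimal sketch of what this answer describes, assuming the jar has already been copied to /tmp/hive/ at a location the driver can resolve (the app name and jar file name are placeholders):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Assumes myJar.jar was copied to /tmp/hive/ beforehand; the key point is that
// the path given to setJars must still exist when executors fetch the jar.
val conf = new SparkConf()
  .setAppName("MyApp")
  .setJars(List("/tmp/hive/myJar.jar"))

val sc = new SparkContext(conf)
```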

0
votes

This happens when the jars you add to the Spark Java classpath are no longer available at that path.
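One way to catch this early, as a hedged sketch (the jar paths below are hypothetical), is to verify on the driver that every jar passed to setJars actually exists before creating the SparkContext:

```scala
import java.nio.file.{Files, Paths}

// Hypothetical pre-flight check: fail fast on the driver if a jar is missing,
// instead of hitting a "Stream '/jars/...' was not found" error on the executors.
val jars = Seq("/path/to/jar1.jar", "/path/to/jar2.jar")
val missing = jars.filterNot(p => Files.exists(Paths.get(p)))
require(missing.isEmpty, s"Missing jars on the driver: ${missing.mkString(", ")}")
```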