I have a problem with the Spark job-server (spark-jobserver) and my own .jar containing a SparkJob.
I have a VirtualBox VM with DataStax Enterprise, which gives me Cassandra and Spark. I installed the job-server by cloning the spark-jobserver git repository. To run the bundled examples I build them with sbt job-server-tests/package and then start the job-server from a terminal with sbt re-start. The bundled examples work:
curl --data-binary @/home/job-server/job-server-tests/target/job.jar localhost:8090/jars/test
curl -d "" 'localhost:8090/jobs?appName=test&classPath=spark.jobserver.LongPiJob'
The problem starts when I build my own .jar.
I use Eclipse with the Scala IDE on Windows. I installed the sbteclipse plugin and created the folder C:\Users\user\scalaWorkspace\LongPiJob with a Scala project. From cmd I go to this folder and run sbt eclipse, sbt compile and sbt package.
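My build.sbt is roughly the minimal sketch below (simplified; the job-server-api coordinates and the %% are an assumption on my side, maybe I instead need to run sbt job-server-api/publishLocal from the cloned spark-jobserver repo):

// build.sbt - minimal sketch, versions matched to my environment
name := "LongPiJob"

scalaVersion := "2.10.4"

libraryDependencies ++= Seq(
  // "provided": both jars are already on the job-server/DSE classpath at runtime
  "org.apache.spark" %% "spark-core"     % "1.1.0" % "provided",
  "spark.jobserver"  %% "job-server-api" % "0.5.1" % "provided"  // coordinates assumed, see above
)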
Then I copy the .jar to the VirtualBox and upload it with the first curl command. When I run the second curl command I get this error:
job-server[ERROR] Exception in thread "pool-25-thread-1" java.lang.AbstractMethodError: com.forszpaniak.LongPiJob$.validate(Ljava/lang/Object;Lcom/typesafe/config/Config;)Lspark/jobserver/SparkJobValidation;
job-server[ERROR]     at spark.jobserver.JobManagerActor$$anonfun$spark$jobserver$JobManagerActor$$getJobFuture$4.apply(JobManagerActor.scala:225)
job-server[ERROR]     at scala.concurrent.impl.Future$PromiseCompletingRunnable.liftedTree1$1(Future.scala:24)
job-server[ERROR]     at scala.concurrent.impl.Future$PromiseCompletingRunnable.run(Future.scala:24)
job-server[ERROR]     at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
job-server[ERROR]     at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
job-server[ERROR]     at java.lang.Thread.run(Thread.java:745)
in the terminal where I started the server. In the curl terminal I get:
[root@localhost spark-jobserver]# curl -d "stress.test.longpijob.duration=15" 'localhost:8090/jobs?appName=testJob1.5&classPath=com.forszpaniak.LongPiJob'
{
  "status": "ERROR",
  "result": {
    "message": "Ask timed out on [Actor[akka://JobServer/user/context-supervisor/4538158c-com.forszpaniak.LongPiJob#-713999361]] after [10000 ms]",
    "errorClass": "akka.pattern.AskTimeoutException",
    "stack": ["akka.pattern.PromiseActorRef$$anonfun$1.apply$mcV$sp(AskSupport.scala:333)", "akka.actor.Scheduler$$anon$7.run(Scheduler.scala:117)", "scala.concurrent.Future$InternalCallbackExecutor$.scala$concurrent$Future$InternalCallbackExecutor$$unbatchedExecute(Future.scala:694)", "scala.concurrent.Future$InternalCallbackExecutor$.execute(Future.scala:691)", "akka.actor.LightArrayRevolverScheduler$TaskHolder.executeTask(Scheduler.scala:467)", "akka.actor.LightArrayRevolverScheduler$$anon$8.executeBucket$1(Scheduler.scala:419)", "akka.actor.LightArrayRevolverScheduler$$anon$8.nextTick(Scheduler.scala:423)", "akka.actor.LightArrayRevolverScheduler$$anon$8.run(Scheduler.scala:375)", "java.lang.Thread.run(Thread.java:745)"]
  }
}
In my .jar I use the code from the example LongPiJob.scala. I have searched for information about this server error and I think it may be a version problem:
java.lang.AbstractMethodError: com.forszpaniak.LongPiJob$.validate(Ljava/lang/Object;Lcom/typesafe/config/Config;)Lspark/jobserver/SparkJobValidation;
I think that instead of Object there should be SparkContext...
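For comparison, this is the skeleton of my class (taken from the LongPiJob.scala example; the runJob body is left out here), and it only implements validate with SparkContext:

package com.forszpaniak

import com.typesafe.config.Config
import org.apache.spark.SparkContext
import spark.jobserver.{SparkJob, SparkJobValid, SparkJobValidation}

object LongPiJob extends SparkJob {

  // body copied from the bundled example: estimates pi for
  // stress.test.longpijob.duration seconds
  override def runJob(sc: SparkContext, config: Config): Any = {
    // ...
  }

  // compiles fine with SparkContext here, but at runtime the server seems to
  // call validate(Object, Config)
  override def validate(sc: SparkContext, config: Config): SparkJobValidation =
    SparkJobValid
}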
Versions: DataStax Enterprise 4.6, job-server 0.5.1, Scala 2.10.4, sbt 0.13, Spark 1.1.0.