I just started working with Drools and I'm trying to integrate it with my Spark Streaming job. I'm using Drools 6.3.0.Final with kie-ci so that I can pull my kjar from a remote repository inside the Spark job and use the KieScanner to pick up newer versions when they are released. However, I ran into the following exception:
2016-04-27 10:22:32,360 WARN [streaming-job-executor-0] Sisu (Logs.java:warn(394)) - Error injecting: org.apache.maven.execution.DefaultMavenExecutionRequestPopulator
com.google.inject.ProvisionException: Guice provision errors:
1) No implementation for org.apache.maven.repository.RepositorySystem was bound.
while locating org.apache.maven.execution.DefaultMavenExecutionRequestPopulator
1 error
at com.google.inject.internal.InjectorImpl$3.get(InjectorImpl.java:974)
at com.google.inject.internal.InjectorImpl.getInstance(InjectorImpl.java:1000)
at org.eclipse.sisu.space.AbstractDeferredClass.get(AbstractDeferredClass.java:48)
at com.google.inject.internal.ProviderInternalFactory.provision(ProviderInternalFactory.java:84)
at com.google.inject.internal.InternalFactoryToInitializableAdapter.provision(InternalFactoryToInitializableAdapter.java:52)
at com.google.inject.internal.ProviderInternalFactory$1.call(ProviderInternalFactory.java:70)
at com.google.inject.internal.ProvisionListenerStackCallback$Provision.provision(ProvisionListenerStackCallback.java:100)
at org.eclipse.sisu.plexus.PlexusLifecycleManager.onProvision(PlexusLifecycleManager.java:133)
at com.google.inject.internal.ProvisionListenerStackCallback$Provision.provision(ProvisionListenerStackCallback.java:108)
at com.google.inject.internal.ProvisionListenerStackCallback.provision(ProvisionListenerStackCallback.java:55)
at com.google.inject.internal.ProviderInternalFactory.circularGet(ProviderInternalFactory.java:68)
at com.google.inject.internal.InternalFactoryToInitializableAdapter.get(InternalFactoryToInitializableAdapter.java:45)
at com.google.inject.internal.ProviderToInternalFactoryAdapter$1.call(ProviderToInternalFactoryAdapter.java:46)
at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1018)
at com.google.inject.internal.ProviderToInternalFactoryAdapter.get(ProviderToInternalFactoryAdapter.java:40)
at com.google.inject.Scopes$1$1.get(Scopes.java:59)
at com.google.inject.internal.InternalFactoryToProviderAdapter.get(InternalFactoryToProviderAdapter.java:41)
at com.google.inject.internal.InjectorImpl$3$1.call(InjectorImpl.java:965)
at com.google.inject.internal.InjectorImpl.callInContext(InjectorImpl.java:1011)
at com.google.inject.internal.InjectorImpl$3.get(InjectorImpl.java:961)
at org.eclipse.sisu.inject.LazyBeanEntry.getValue(LazyBeanEntry.java:82)
at org.eclipse.sisu.plexus.LazyPlexusBean.getValue(LazyPlexusBean.java:51)
at org.codehaus.plexus.DefaultPlexusContainer.lookup(DefaultPlexusContainer.java:260)
at org.codehaus.plexus.DefaultPlexusContainer.lookup(DefaultPlexusContainer.java:252)
at org.codehaus.plexus.DefaultPlexusContainer.lookup(DefaultPlexusContainer.java:246)
at org.kie.scanner.embedder.PlexusComponentProvider.lookup(PlexusComponentProvider.java:42)
at org.kie.scanner.embedder.MavenEmbedder.buildMavenExecutionRequest(MavenEmbedder.java:111)
at org.kie.scanner.embedder.MavenEmbedder.<init>(MavenEmbedder.java:84)
at org.kie.scanner.embedder.MavenEmbedder.<init>(MavenEmbedder.java:75)
at org.kie.scanner.embedder.MavenEmbedder.<init>(MavenEmbedder.java:69)
at org.kie.scanner.embedder.MavenProjectLoader.parseMavenPom(MavenProjectLoader.java:55)
at org.kie.scanner.embedder.MavenProjectLoader.parseMavenPom(MavenProjectLoader.java:49)
at org.kie.scanner.ArtifactResolver.getResolverFor(ArtifactResolver.java:127)
at org.kie.scanner.ArtifactResolver.getResolverFor(ArtifactResolver.java:90)
at org.kie.scanner.KieRepositoryScannerImpl.setKieContainer(KieRepositoryScannerImpl.java:88)
at org.drools.compiler.kie.builder.impl.KieServicesImpl.newKieScanner(KieServicesImpl.java:139)
I've set up the remote repository correctly in the settings.xml on the machine where I'm running Spark, because the newest jar is pulled when the KieContainer is created. It only fails once I add the KieScanner, for some reason.
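For reference, a minimal sketch of how I understand kie-ci's embedded Maven can be pointed at an explicit settings.xml (the kie.maven.settings.custom property name and the path below are assumptions on my part, not something from my actual setup):

// Assumption: set this before KieServices/KieScanner are created so the
// embedded Maven reads the same settings.xml as the Spark driver.
// The path is illustrative only.
System.setProperty("kie.maven.settings.custom", "/path/to/settings.xml");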
My code looks like this:
import org.kie.api.KieServices;
import org.kie.api.builder.KieScanner;
import org.kie.api.builder.ReleaseId;
import org.kie.api.runtime.KieContainer;
import org.kie.api.runtime.KieSession;

KieServices ks = KieServices.Factory.get();
// "LATEST" so the newest version of the kjar is resolved from the remote repository
ReleaseId releaseId = ks.newReleaseId("test", "test", "LATEST");
KieContainer kieContainer = ks.newKieContainer(releaseId);
KieSession kSession = kieContainer.newKieSession();
// This call is where the exception above is thrown
KieScanner kScanner = ks.newKieScanner(kieContainer);
kScanner.scanNow();
kSession.fireAllRules();
kSession.dispose();
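The plan is to have the scanner keep the container up to date as newer versions of the kjar are released, e.g. by polling instead of calling scanNow() by hand. A rough sketch continuing the snippet above (the interval is arbitrary):

// Poll the remote repository every 60 seconds and update the KieContainer
// whenever a newer version of the kjar is deployed.
kScanner.start(60000L);

// Later, when the streaming job shuts down:
kScanner.stop();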
I'm running Spark 1.6.1 built for Hadoop 1. If I use one of the Spark packages built against a newer Hadoop version, it fails with a different exception instead:
java.lang.NoSuchMethodError: com.google.inject.Binder.bindListener
Does anybody know why this happens or how I can fix it?