
Hello Flink Community,

Following the documentation on troubleshooting unloading of dynamically loaded classes in Flink, I added the database driver library to the /opt/flink/lib folder on both the Flink JobManager container and the TaskManager containers running on K8s (Flink Session Cluster, version 1.11).

I marked the library as provided in my build.sbt file. The rest of the user code is part of the fat jar built by sbt assembly.
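For reference, the dependency is declared roughly like this in build.sbt (the Vertica artifact coordinates and versions below are from memory and may not be exact):

// build.sbt (excerpt) -- the driver is expected to come from /opt/flink/lib at runtime,
// so it is excluded from the fat jar via the provided scope
libraryDependencies ++= Seq(
  "org.apache.flink" %% "flink-streaming-scala" % "1.11.2" % Provided,
  "com.vertica"       % "vertica-jdbc"          % "9.3.1-0" % Provided
)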

Now, when I submit a job to the Flink cluster via the REST API (jar upload and run endpoints), it won't accept the job and fails with the following error:

java.lang.ClassNotFoundException: com.vertica.jdbc.Driver
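For context, this is roughly where the driver is referenced in the job. It is a simplified sketch; the sink class and connection details are placeholders, not the actual code:

import java.sql.{Connection, DriverManager}
import org.apache.flink.configuration.Configuration
import org.apache.flink.streaming.api.functions.sink.RichSinkFunction

// Simplified placeholder sink -- only shows where the driver class is needed.
class VerticaSink(jdbcUrl: String, user: String, pass: String)
    extends RichSinkFunction[String] {

  @transient private var conn: Connection = _

  override def open(parameters: Configuration): Unit = {
    // com.vertica.jdbc.Driver must be resolvable here
    Class.forName("com.vertica.jdbc.Driver")
    conn = DriverManager.getConnection(jdbcUrl, user, pass)
  }

  override def invoke(value: String): Unit = {
    // write `value` to Vertica -- omitted for brevity
  }

  override def close(): Unit = if (conn != null) conn.close()
}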

Why is the jar not picked up by the Flink classloader?

I even added the class pattern to the classloader config option, but it made no difference:

classloader.parent-first-patterns-additional: com.vertica.jdbc.;

Link: https://ci.apache.org/projects/flink/flink-docs-release-1.12/ops/debugging/debugging_classloading.html#unloading-of-dynamically-loaded-classes-in-user-code

Any recommendation would be highly appreciated.

Cheers

@Tamir1989 Sorry, not a duplicate: the Vertica JDBC driver is not a plugin. I already tried the solution advised in the answer you posted; it does not work in this case. – gem_freak

1 Answer


Please confirm that your JDBC dependency is not marked as provided. When a library is in the provided scope it is only on the classpath during compilation and testing; it is not packaged into the fat jar by sbt assembly and is therefore not available to the job at runtime.
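If it helps, a minimal sketch of that change in build.sbt, assuming the same hypothetical artifact coordinates as in the question:

// build.sbt (excerpt) -- default (compile) scope, so sbt-assembly bundles the
// driver into the fat jar and Flink's user-code classloader can find it
libraryDependencies += "com.vertica" % "vertica-jdbc" % "9.3.1-0"

With the default scope the driver ends up inside the fat jar, so it does not have to be picked up from the cluster's lib folder at all.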