
I am new to Spark and Kafka. We have a requirement to integrate Kafka + Spark + HBase (with Phoenix).

ERROR:

Exception in thread "main" java.sql.SQLException: ERROR 2007 (INT09): Outdated jars. The following servers require an updated phoenix.jar to be put in the classpath of HBase:

I ended up with the above error. Could anybody please help me resolve this issue?

Below is the error log:

jdbc:phoenix:localhost.localdomain:2181:/hbase-unsecure
testlocalhost.localdomain:6667
18/03/05 16:18:52 INFO Metrics: Initializing metrics system: phoenix
18/03/05 16:18:52 WARN MetricsConfig: Cannot locate configuration: tried hadoop-metrics2-phoenix.properties,hadoop-metrics2.properties
18/03/05 16:18:52 INFO MetricsSystemImpl: Scheduled snapshot period at 10 second(s).
18/03/05 16:18:52 INFO MetricsSystemImpl: phoenix metrics system started
18/03/05 16:18:52 INFO ConnectionManager$HConnectionImplementation: Closing master protocol: MasterService
18/03/05 16:18:52 INFO ConnectionManager$HConnectionImplementation: Closing zookeeper sessionid=0x161f6fc5e4800a3
18/03/05 16:18:52 INFO ZooKeeper: Session: 0x161f6fc5e4800a3 closed
18/03/05 16:18:52 INFO ClientCnxn: EventThread shut down
Exception in thread "main" java.sql.SQLException: ERROR 2007 (INT09): Outdated jars. The following servers require an updated phoenix.jar to be put in the classpath of HBase: region=SYSTEM.CATALOG,,1519831518459.b16e566d706c68469922eba74844a444., hostname=localhost,16020,1520282812066, seqNum=59
    at org.apache.phoenix.exception.SQLExceptionCode$Factory$1.newException(SQLExceptionCode.java:476)
    at org.apache.phoenix.exception.SQLExceptionInfo.buildException(SQLExceptionInfo.java:150)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.checkClientServerCompatibility(ConnectionQueryServicesImpl.java:1272)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.ensureTableCreated(ConnectionQueryServicesImpl.java:1107)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.createTable(ConnectionQueryServicesImpl.java:1429)
    at org.apache.phoenix.schema.MetaDataClient.createTableInternal(MetaDataClient.java:2574)
    at org.apache.phoenix.schema.MetaDataClient.createTable(MetaDataClient.java:1024)
    at org.apache.phoenix.compile.CreateTableCompiler$2.execute(CreateTableCompiler.java:212)
    at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:358)
    at org.apache.phoenix.jdbc.PhoenixStatement$2.call(PhoenixStatement.java:341)
    at org.apache.phoenix.call.CallRunner.run(CallRunner.java:53)
    at org.apache.phoenix.jdbc.PhoenixStatement.executeMutation(PhoenixStatement.java:339)
    at org.apache.phoenix.jdbc.PhoenixStatement.executeUpdate(PhoenixStatement.java:1492)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:2437)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl$12.call(ConnectionQueryServicesImpl.java:2382)
    at org.apache.phoenix.util.PhoenixContextExecutor.call(PhoenixContextExecutor.java:76)
    at org.apache.phoenix.query.ConnectionQueryServicesImpl.init(ConnectionQueryServicesImpl.java:2382)
    at org.apache.phoenix.jdbc.PhoenixDriver.getConnectionQueryServices(PhoenixDriver.java:255)
    at org.apache.phoenix.jdbc.PhoenixEmbeddedDriver.createConnection(PhoenixEmbeddedDriver.java:149)
    at org.apache.phoenix.jdbc.PhoenixDriver.connect(PhoenixDriver.java:221)
    at java.sql.DriverManager.getConnection(DriverManager.java:664)
    at java.sql.DriverManager.getConnection(DriverManager.java:270)
    at com.spark.kafka.PhoenixJdbcClient.getConnection(PhoenixJdbcClient.scala:41)
    at com.spark.kafka.PhoenixJdbcClient.currentTableSchema(PhoenixJdbcClient.scala:595)
    at com.spark.kafka.SparkHBaseClient$.main(SparkHBaseClient.scala:47)
    at com.spark.kafka.SparkHBaseClient.main(SparkHBaseClient.scala)
18/03/05 16:18:52 INFO SparkContext: Invoking stop() from shutdown hook
18/03/05 16:18:52 INFO SparkUI: Stopped Spark web UI at http://192.168.1.103:4040
18/03/05 16:18:53 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
18/03/05 16:18:53 INFO MemoryStore: MemoryStore cleared
18/03/05 16:18:53 INFO BlockManager: BlockManager stopped
18/03/05 16:18:53 INFO BlockManagerMaster: BlockManagerMaster stopped
18/03/05 16:18:53 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
18/03/05 16:18:53 INFO SparkContext: Successfully stopped SparkContext
18/03/05 16:18:53 INFO ShutdownHookManager: Shutdown hook called
18/03/05 16:18:53 INFO ShutdownHookManager: Deleting directory /tmp/spark-c8dd26fc-74dd-40fb-a339-8c5dda36b973

We are using Ambari Server 2.6.1.3 with HDP-2.6.3.0 and the following components:

  • HBase-1.1.2
  • Kafka-0.10.1
  • Spark-2.2.0
  • Phoenix

Below are the POM artifacts I have added for HBase and Phoenix.

        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase-client</artifactId>
            <version>1.3.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase-common</artifactId>
            <version>1.3.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase-protocol</artifactId>
            <version>1.3.1</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hbase</groupId>
            <artifactId>hbase-server</artifactId>
            <version>1.3.1</version>
        </dependency>

        <dependency>
            <groupId>org.apache.phoenix</groupId>
            <artifactId>phoenix-spark</artifactId>
            <version>4.10.0-HBase-1.2</version>
        </dependency>
Comments:

  • How have you attempted to fix the problem yourself? – Litty
  • I changed the title and improved your question's formatting. Please put more effort into creating an easily readable question in the future. – zx485
  • I have seen somewhere (community.hortonworks.com/questions/39449/…) that we need to copy the Phoenix server jar file into HBase's lib folder. I have done the same but am still facing the same issue. – I Soft
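A side note on the POM above: the Apache versions listed (HBase 1.3.1, phoenix-spark 4.10.0-HBase-1.2) do not match what HDP 2.6.3 ships (HBase 1.1.2 and Phoenix 4.7.x), so the client can be newer than the server. A sketch of aligned dependencies, assuming the HDP public Maven repository is configured; the 2.6.3.0-235 build suffix shown here is an assumption and should be read from your cluster (e.g. from /usr/hdp):

```xml
<!-- Hypothetical HDP-aligned versions; verify the exact build suffix
     on your cluster before using these. -->
<dependency>
    <groupId>org.apache.hbase</groupId>
    <artifactId>hbase-client</artifactId>
    <version>1.1.2.2.6.3.0-235</version>
</dependency>
<dependency>
    <groupId>org.apache.phoenix</groupId>
    <artifactId>phoenix-spark</artifactId>
    <version>4.7.0.2.6.3.0-235</version>
</dependency>
```

Keeping the client artifacts on the same Phoenix line as the server avoids the compatibility check failing in either direction.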

1 Answer


This error is raised by Phoenix's client/server compatibility check (see checkClientServerCompatibility in the stack trace): the Phoenix client jar used by your Spark application is newer than the Phoenix server jar available to HBase. Try the following:

1. Copy the Phoenix server jar to all HBase region servers (into HBase's lib folder).
2. Restart the HBase master (and region servers) so the new jar is loaded.
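The two steps above can be sketched as shell commands on an HDP node. The paths below follow the standard HDP layout but are assumptions; verify them on your cluster before copying anything:

```shell
# Phoenix server jar shipped with the HDP Phoenix package (path is an
# assumption -- locate the jar on your cluster if it differs):
PHOENIX_JAR=/usr/hdp/current/phoenix-client/phoenix-server.jar

# HBase lib directory that the region server builds its classpath from:
HBASE_LIB=/usr/hdp/current/hbase-regionserver/lib

# Repeat on every region server (and the master node):
if [ -f "$PHOENIX_JAR" ]; then
    cp "$PHOENIX_JAR" "$HBASE_LIB/"
else
    echo "phoenix-server.jar not found at $PHOENIX_JAR; locate it first"
fi

# Then restart HBase so the servers pick up the jar, e.g. via Ambari:
# Services > HBase > Restart All.
```

Restarting only the master is usually not enough; the region servers are the processes that answer the compatibility check, so they must be restarted with the new jar on their classpath as well.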