
root> sh import-hive.sh
Using Hive configuration directory [/opt/cloudera/parcels/CDH/lib/hive//conf]
/opt/cloudera/parcels/CDH/lib/hive/conf:/opt/cloudera/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/libexec/../../hadoop/lib/:/opt/cloudera/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/libexec/../../hadoop/.//:/opt/cloudera/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/libexec/../../hadoop-hdfs/./:/opt/cloudera/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/libexec/../../hadoop-hdfs/lib/:/opt/cloudera/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/libexec/../../hadoop-hdfs/.//:/opt/cloudera/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/libexec/../../hadoop-yarn/lib/:/opt/cloudera/parcels/CDH-5.13.0-1.cdh5.13.0.p0.29/lib/hadoop/libexec/../../hadoop-yarn/.//:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/lib/:/opt/cloudera/parcels/CDH/lib/hadoop-mapreduce/.//
Log file for import is /root/apache-atlas-sources-1.0.0/addons/hive-bridge/src/logs/import-hive.log
log4j:WARN No such property [maxFileSize] in org.apache.log4j.PatternLayout.
log4j:WARN No such property [maxBackupIndex] in org.apache.log4j.PatternLayout.
Enter username for atlas :- admin
Enter password for atlas :-
Exception in thread "main" java.lang.NoSuchMethodError: com.fasterxml.jackson.databind.util.BeanUtil.okNameForGetter(Lcom/fasterxml/jackson/databind/introspect/AnnotatedMethod;Z)Ljava/lang/String;
    at com.fasterxml.jackson.module.jaxb.JaxbAnnotationIntrospector.findNameForSerialization(JaxbAnnotationIntrospector.java:936)
    at com.fasterxml.jackson.databind.introspect.AnnotationIntrospectorPair.findNameForSerialization(AnnotationIntrospectorPair.java:498)
    at com.fasterxml.jackson.databind.introspect.POJOPropertiesCollector._addGetterMethod(POJOPropertiesCollector.java:502)
    at com.fasterxml.jackson.databind.introspect.POJOPropertiesCollector._addMethods(POJOPropertiesCollector.java:465)
    at com.fasterxml.jackson.databind.introspect.POJOPropertiesCollector.collect(POJOPropertiesCollector.java:233)
    at com.fasterxml.jackson.databind.introspect.BasicClassIntrospector.collectProperties(BasicClassIntrospector.java:142)
    at com.fasterxml.jackson.databind.introspect.BasicClassIntrospector.forSerialization(BasicClassIntrospector.java:68)
    at com.fasterxml.jackson.databind.introspect.BasicClassIntrospector.forSerialization(BasicClassIntrospector.java:11)
    at com.fasterxml.jackson.databind.SerializationConfig.introspect(SerializationConfig.java:490)
    at com.fasterxml.jackson.databind.ser.BeanSerializerFactory.createSerializer(BeanSerializerFactory.java:133)
    at com.fasterxml.jackson.databind.SerializerProvider._createUntypedSerializer(SerializerProvider.java:873)
    at com.fasterxml.jackson.databind.SerializerProvider._createAndCacheUntypedSerializer(SerializerProvider.java:833)
    at com.fasterxml.jackson.databind.SerializerProvider.findValueSerializer(SerializerProvider.java:387)
    at com.fasterxml.jackson.databind.SerializerProvider.findTypedValueSerializer(SerializerProvider.java:478)
    at com.fasterxml.jackson.databind.ser.DefaultSerializerProvider.serializeValue(DefaultSerializerProvider.java:97)
    at com.fasterxml.jackson.databind.ObjectWriter.writeValue(ObjectWriter.java:494)
    at com.fasterxml.jackson.jaxrs.base.ProviderBase.writeTo(ProviderBase.java:625)
    at com.sun.jersey.api.client.RequestWriter.writeRequestEntity(RequestWriter.java:300)
    at com.sun.jersey.client.urlconnection.URLConnectionClientHandler._invoke(URLConnectionClientHandler.java:204)
    at com.sun.jersey.client.urlconnection.URLConnectionClientHandler.handle(URLConnectionClientHandler.java:147)
    at com.sun.jersey.api.client.filter.HTTPBasicAuthFilter.handle(HTTPBasicAuthFilter.java:81)
    at com.sun.jersey.api.client.Client.handle(Client.java:648)
    at com.sun.jersey.api.client.WebResource.handle(WebResource.java:670)
    at com.sun.jersey.api.client.WebResource.access$200(WebResource.java:74)
    at com.sun.jersey.api.client.WebResource$Builder.method(WebResource.java:623)
    at org.apache.atlas.AtlasBaseClient.callAPIWithResource(AtlasBaseClient.java:356)
    at org.apache.atlas.AtlasBaseClient.callAPIWithResource(AtlasBaseClient.java:327)
    at org.apache.atlas.AtlasBaseClient.callAPI(AtlasBaseClient.java:212)
    at org.apache.atlas.AtlasClientV2.createEntity(AtlasClientV2.java:285)
    at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.registerInstance(HiveMetaStoreBridge.java:446)
    at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.registerDatabase(HiveMetaStoreBridge.java:398)
    at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.importDatabases(HiveMetaStoreBridge.java:277)
    at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.importHiveMetadata(HiveMetaStoreBridge.java:247)
    at org.apache.atlas.hive.bridge.HiveMetaStoreBridge.main(HiveMetaStoreBridge.java:168)
Failed to import Hive Meta Data!!!


1 Answer


First of all, note that Apache Atlas is normally not used in CDH.

Apache Atlas is the governance solution that ships with HDP.

In CDH, the governance solution is Cloudera Navigator.


That being said, Atlas is an open-source project, so you are free to use it in any Hadoop setup. I am not sure this will be easy to achieve; the most likely source of trouble is conflicting component dependencies.
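
The NoSuchMethodError on com.fasterxml.jackson.databind.util.BeanUtil.okNameForGetter in your trace is a typical symptom of exactly that kind of clash: the Hadoop/Hive classpath that import-hive.sh builds contributes a different jackson-databind version than the jackson-module-jaxb-annotations jar bundled with the Atlas hive-bridge. A quick way to confirm (a sketch only; the two search roots are taken from the paths in your log and may need adjusting for your layout):

# List every jackson-databind jar visible to the import
# (CDH parcel plus the Atlas build tree); -L follows the parcel symlink.
find -L /opt/cloudera/parcels/CDH /root/apache-atlas-sources-1.0.0 \
     -name 'jackson-databind-*.jar' 2>/dev/null | sort
# More than one version in that list means the bridge and the cluster disagree,
# which surfaces at runtime as exactly this kind of NoSuchMethodError.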

If you want to make Atlas work on CDH, make sure you have exactly the right supporting components and compatible versions of everything.
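
As a concrete starting point, a sketch of that kind of check, assuming the Atlas 1.0.0 source tree from your log (the pom property names may differ between Atlas releases):

# Versions the cluster actually runs.
hadoop version | head -1
hive --version 2>/dev/null | head -1

# Versions the Atlas 1.0.0 build expects (property names may vary by release).
grep -E '<(hadoop|hive|jackson)\.version>' /root/apache-atlas-sources-1.0.0/pom.xml

If those do not line up, rebuilding Atlas against the versions your cluster ships, or isolating the bridge classpath so only one Jackson version is loaded, is the usual way forward.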