1 vote

I have created a Spark application that processes lat/long coordinates and identifies the zone defined in custom shapefiles provided by the client. Given this requirement, I have created a shadow (fat) jar using Gradle. But when I run the application via spark-submit it throws the following error:

    WARNING: User-defined SPARK_HOME (/opt/cloudera/parcels/CDH-5.13.2-1.cdh5.13.2.p0.3/lib/spark) overrides detected (/app/cloudera/parcels/CDH-5.13.2-1.cdh5.13.2.p0.3/lib/spark).
    WARNING: Running spark-class from user-defined location.
    18/10/19 17:41:58 INFO SparkContext: Running Spark version 1.6.0
    18/10/19 17:41:59 ERROR Configuration: error parsing conf core-default.xml
    javax.xml.parsers.ParserConfigurationException: Feature 'http://apache.org/xml/features/xinclude' is not recognized.
        at org.apache.xerces.jaxp.DocumentBuilderFactoryImpl.newDocumentBuilder(Unknown Source)
        at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2694)
        at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2653)
        at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2559)
        at org.apache.hadoop.conf.Configuration.get(Configuration.java:1078)
        at org.apache.hadoop.conf.Configuration.getTrimmed(Configuration.java:1132)
        at org.apache.hadoop.conf.Configuration.getBoolean(Configuration.java:1540)
        at org.apache.hadoop.security.SecurityUtil.setConfigurationInternal(SecurityUtil.java:85)
        at org.apache.hadoop.security.SecurityUtil.<clinit>(SecurityUtil.java:74)
        at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:316)
        at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:304)
        at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:891)
        at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:857)
        at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:724)
        at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2214)
        at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2214)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2214)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:324)
        at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
        at com.abc.xyz.ShapeFileDataProcessor.main(ShapeFileDataProcessor.java:36)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:730)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    Exception in thread "main" java.lang.ExceptionInInitializerError
        at org.apache.hadoop.security.UserGroupInformation.initialize(UserGroupInformation.java:316)
        at org.apache.hadoop.security.UserGroupInformation.ensureInitialized(UserGroupInformation.java:304)
        at org.apache.hadoop.security.UserGroupInformation.loginUserFromSubject(UserGroupInformation.java:891)
        at org.apache.hadoop.security.UserGroupInformation.getLoginUser(UserGroupInformation.java:857)
        at org.apache.hadoop.security.UserGroupInformation.getCurrentUser(UserGroupInformation.java:724)
        at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2214)
        at org.apache.spark.util.Utils$$anonfun$getCurrentUserName$1.apply(Utils.scala:2214)
        at scala.Option.getOrElse(Option.scala:120)
        at org.apache.spark.util.Utils$.getCurrentUserName(Utils.scala:2214)
        at org.apache.spark.SparkContext.<init>(SparkContext.scala:324)
        at org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:59)
        at com.abc.xyz.ShapeFileDataProcessor.main(ShapeFileDataProcessor.java:36)
        at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
        at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
        at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
        at java.lang.reflect.Method.invoke(Method.java:606)
        at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:730)
        at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
        at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
        at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
        at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
    Caused by: java.lang.RuntimeException: javax.xml.parsers.ParserConfigurationException: Feature 'http://apache.org/xml/features/xinclude' is not recognized.
        at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2820)
        at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2653)
        at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2559)
        at org.apache.hadoop.conf.Configuration.get(Configuration.java:1078)
        at org.apache.hadoop.conf.Configuration.getTrimmed(Configuration.java:1132)
        at org.apache.hadoop.conf.Configuration.getBoolean(Configuration.java:1540)
        at org.apache.hadoop.security.SecurityUtil.setConfigurationInternal(SecurityUtil.java:85)
        at org.apache.hadoop.security.SecurityUtil.<clinit>(SecurityUtil.java:74)
        ... 21 more
    Caused by: javax.xml.parsers.ParserConfigurationException: Feature 'http://apache.org/xml/features/xinclude' is not recognized.
        at org.apache.xerces.jaxp.DocumentBuilderFactoryImpl.newDocumentBuilder(Unknown Source)
        at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2694)
        ... 28 more

Here is the spark-submit command:

    spark-submit --name ShapeFileProcessor \
      --master yarn-client \
      --files application.properties \
      --conf "spark.driver.extraJavaOptions=-XX:+UseConcMarkSweepGC -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/adp-spark-stream/ " \
      --conf "spark.eventLog.enabled=true" \
      --conf "spark.executor.extraJavaOptions=-XX:+UseConcMarkSweepGC -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/adp-spark-stream/ " \
      --class com.abc.xyz.ShapeFileDataProcessor \
      CustomShapeFileAggregator-0.0.1.jar

Here is the Gradle snippet for the repositories, the shadow-jar task, and the dependencies:

    repositories {
        mavenLocal()
        maven { url 'http://maven.geo-solutions.it' }
        maven { url 'http://download.java.net/maven/2' }
        maven { url 'http://download.osgeo.org/webdav/geotools/' }
    }


    task shadowJar(type: Jar) {
        manifest {
            attributes 'Implementation-Title': 'My Application',
                       'Implementation-Version': version
        }
        baseName = project.name
        // bundle every compile dependency (including its transitive jars) into the archive
        from {
            configurations.compile.collect {
                it.isDirectory() ? it : zipTree(it)
            }
        }
        with jar
    }

    dependencies {
        compile  group: 'org.geotools',     name: 'gt-shapefile',    version: '14.5'
        compile  group: 'org.geotools',     name: 'gt-swing',        version: '14.5'

        provided group: 'org.apache.spark', name: 'spark-core_2.10', version: '1.6.0'
        provided group: 'org.apache.spark', name: 'spark-sql_2.10',  version: '1.6.0'
        provided group: 'org.apache.spark', name: 'spark-hive_2.10', version: '1.6.0'
    }
I'm not seeing any GeoTools code in there? Did you mis-tag the question? – Ian Turton

Does your shadow jar include any XML parsing libraries? Perhaps one of them is conflicting with Hadoop's XML parsing libraries... – GeoMesaJim

@GeoMesaJim GeoTools reads the shape file, which is in XML format. Maybe it has something to do with that when all the code is combined into the shadow jar. – Faisal Ahmed Siddiqui

@IanTurton I have GeoTools code that I integrated with the Spark application. When I build the shadow jar without the GeoTools dependency, this error does not occur. So it certainly has something to do with GeoTools once all of its dependencies are combined into the shadow jar. – Faisal Ahmed Siddiqui

Which GeoTools dependencies do you have in your pom file? How did you build the shadow jar? – Ian Turton
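
A quick way to verify the suspicion raised in the comments is to print the dependency tree and look for xerces artifacts arriving transitively (a standard Gradle invocation; the `compile` configuration matches the build script above):

    gradle dependencies --configuration compile

Any line showing `xerces:xercesImpl` or `xerces:xmlParserAPIs` identifies the dependency that drags a conflicting XML parser into the shadow jar.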

1 Answer

0 votes

For me it was a dependency issue. Jars of xerces were being pulled in transitively by some other dependencies. Excluding the xerces artifacts in my pom.xml solved the issue:

    <exclusions>
        <exclusion>
            <artifactId>xercesImpl</artifactId>
            <groupId>xerces</groupId>
        </exclusion>
        <exclusion>
            <artifactId>xmlParserAPIs</artifactId>
            <groupId>xerces</groupId>
        </exclusion>
    </exclusions>
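
Since the question's build is Gradle rather than Maven, the equivalent fix there is a per-dependency exclude. Below is a sketch under the assumption that the xerces jars arrive via the GeoTools artifacts; confirm with the dependency tree first, as the actual source may differ:

    dependencies {
        // assumed placement: drop the bundled xerces parser so that Hadoop's
        // Configuration falls back to the JDK's built-in JAXP implementation,
        // which recognizes the XInclude feature from the stack trace
        compile(group: 'org.geotools', name: 'gt-shapefile', version: '14.5') {
            exclude group: 'xerces', module: 'xercesImpl'
            exclude group: 'xerces', module: 'xmlParserAPIs'
        }
        compile(group: 'org.geotools', name: 'gt-swing', version: '14.5') {
            exclude group: 'xerces', module: 'xercesImpl'
            exclude group: 'xerces', module: 'xmlParserAPIs'
        }
    }

Alternatively, a blanket `configurations.all { exclude group: 'xerces', module: 'xercesImpl' }` removes the artifact from every configuration in one line.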