
I'm trying to install Oozie on my Ubuntu machine.

Here is my core-site.xml

<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<!--
  Licensed under the Apache License, Version 2.0 (the "License");
  you may not use this file except in compliance with the License.
  You may obtain a copy of the License at

    http://www.apache.org/licenses/LICENSE-2.0

  Unless required by applicable law or agreed to in writing, software
  distributed under the License is distributed on an "AS IS" BASIS,
  WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
  See the License for the specific language governing permissions and
  limitations under the License. See accompanying LICENSE file.
-->

<!-- Put site-specific property overrides in this file. -->


<configuration>
 <property>
  <name>hadoop.tmp.dir</name>
  <value>/app/hadoop/tmp</value>
  <description>A base for other temporary directories.</description>
 </property>

 <property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:54310</value>
  <description>The name of the default file system.  A URI whose
  scheme and authority determine the FileSystem implementation.  The
  uri's scheme determines the config property (fs.SCHEME.impl) naming
  the FileSystem implementation class.  The uri's authority is used to
  determine the host, port, etc. for a filesystem.</description>
 </property>
</configuration>

While installing Oozie, when I do this step:

Creating the sharelib directory in HDFS with the following command:

./oozie-setup.sh sharelib create -fs hdfs://localhost:9000

I am getting the below error:

java.lang.IllegalArgumentException: Wrong FS: hdfs://localhost:54310/user/hduser/share/lib/lib_20190803003111, expected: hdfs://localhost:9000
    at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:644)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getPathName(DistributedFileSystem.java:187)
    at org.apache.hadoop.hdfs.DistributedFileSystem.access$000(DistributedFileSystem.java:98)
    at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1112)
    at org.apache.hadoop.hdfs.DistributedFileSystem$17.doCall(DistributedFileSystem.java:1108)
    at org.apache.hadoop.fs.FileSystemLinkResolver.resolve(FileSystemLinkResolver.java:81)
    at org.apache.hadoop.hdfs.DistributedFileSystem.getFileStatus(DistributedFileSystem.java:1108)
    at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:1399)
    at org.apache.hadoop.fs.FileUtil.checkDest(FileUtil.java:496)
    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:348)
    at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:338)
    at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1904)
    at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1872)
    at org.apache.oozie.tools.OozieSharelibCLI.run(OozieSharelibCLI.java:165)
    at org.apache.oozie.tools.OozieSharelibCLI.main(OozieSharelibCLI.java:56)

Why is it giving me this error? Why does it mention port 54310? I'm not even using it!

Any suggestions, please!


1 Answer


In your core-site.xml, you have set:

  <name>fs.default.name</name>
  <value>hdfs://localhost:54310</value>

Set fs.default.name in your core-site.xml to hdfs://localhost:9000 (if that is the address of the NameNode), restart the Oozie server, and then try installing the sharelib again with oozie-setup.sh sharelib create -fs hdfs://localhost:9000 -locallib share (assuming the oozie-sharelib tar.gz has been extracted to the share directory).
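
A minimal sketch of the corrected property, assuming the NameNode really does listen on localhost:9000 (you can check the value actually in effect with hdfs getconf -confKey fs.defaultFS):

 <property>
  <name>fs.default.name</name>
  <value>hdfs://localhost:9000</value>
  <description>The name of the default file system.</description>
 </property>

Alternatively, if the NameNode actually listens on port 54310 as your current core-site.xml suggests, leave the file as it is and run the setup against the matching URI instead:

./oozie-setup.sh sharelib create -fs hdfs://localhost:54310

Either way, the URI passed with -fs must match fs.default.name (fs.defaultFS in newer Hadoop versions): the HDFS client checks every path against the default filesystem and raises the "Wrong FS ... expected ..." error when the two disagree, which is exactly what your stack trace shows.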