My system environment is as follows:
I am using a 2-node HDP 2.5 cluster with Kafka/ZK running on each node.
Node1: Ambari-server, Ambari-agent
Node2: Ambari-agent
Both nodes are set up as Kafka brokers, with the ZooKeeper server running on the first. I have also installed and started NiFi successfully as a service on both nodes.
But when I try to add NiFi as an Ambari service, the NiFi server install fails with the error below (see Error Log).
As far as I can tell, the root cause of the failure is:
**resource_management.core.exceptions.Fail: Configuration parameter 'kafka_broker_hosts' was not found in configurations dictionary!**
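From the traceback, the failing line in the NiFi service script does a hard lookup (master_configs['kafka_broker_hosts'][0]), which raises Fail whenever that key is absent from the cluster host info Ambari sends to the agent. One way to check whether the key is actually being sent is to inspect the latest cached command JSON on the agent. Below is a minimal sketch; the /var/lib/ambari-agent/data/command-<N>.json location is my assumption, not something from the error log:

import glob
import json
import os

# Assumption: the Ambari agent caches the command context it hands to the
# service scripts as /var/lib/ambari-agent/data/command-<N>.json.
paths = glob.glob('/var/lib/ambari-agent/data/command-*.json')
if not paths:
    print 'No cached command JSONs found'
else:
    latest = max(paths, key=os.path.getmtime)
    with open(latest) as f:
        command = json.load(f)
    # clusterHostInfo is the dictionary the kafka_broker_hosts lookup reads from.
    info = command.get('clusterHostInfo', {})
    print 'kafka_broker_hosts:', info.get('kafka_broker_hosts', 'MISSING')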
I checked /usr/hdp/2.5.0.0-1245/nifi/conf/nifi.properties and changed the NiFi web port from the default 8080 to 9090, following the suggestion in
https://community.hortonworks.com/questions/44042/adding-nifi-server-to-hdp-25-sandbox.html
Then I restarted the Ambari server and stopped and started the NiFi service, but the same problem remained.
# web properties #
nifi.web.war.directory=./lib
nifi.web.http.host=
nifi.web.http.port=9090
nifi.web.https.host=
nifi.web.https.port=
nifi.web.jetty.working.directory=./work/jetty
nifi.web.jetty.threads=200
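A quick probe like the following can confirm whether NiFi is actually serving its UI on the new port (a sketch only; it assumes NiFi runs on node1 and serves the UI at the default /nifi path):

import urllib2

# Assumption: NiFi runs on node1 and exposes its web UI at /nifi on port 9090.
try:
    resp = urllib2.urlopen('http://node1:9090/nifi', timeout=10)
    print 'NiFi web UI reachable, HTTP', resp.getcode()
except Exception as e:
    print 'NiFi web UI not reachable on port 9090:', e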
I also searched for the term "kafka_broker_host" in the file below, which reads the configuration parameters:
/var/lib/ambari-server/resources/common-services/ATLAS/0.1.0.2.3/package/scripts/params.py
These are the relevant lines I could find (around line 245):
# ToDo: Kafka port to Atlas
# Used while upgrading the stack in a kerberized cluster and running kafka-acls.sh
hosts_with_kafka = default('/clusterHostInfo/kafka_broker_hosts', [])
host_with_kafka = hostname in hosts_with_kafka
However, the patch at https://issues.apache.org/jira/secure/attachment/12834513/AMBARI-16693.patch touches different code.
I am confused and struggling to understand whether some code change is required in params.py as per the AMBARI-16693 patch in the Apache JIRA, or whether Apache NiFi is simply not compatible as an Ambari service.
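If a code change is the answer, is it something like replacing the hard lookup at line 90 of the NIFI params.py with the guarded default() call that the Atlas script uses? This is only a sketch of what I have in mind (the empty-string fallback is my assumption for the case where no Kafka broker is registered):

from resource_management.libraries.functions.default import default

# Guard the lookup the way the Atlas params.py does, instead of
# master_configs['kafka_broker_hosts'][0], which raises Fail when the key is absent.
kafka_broker_hosts = default('/clusterHostInfo/kafka_broker_hosts', [])
kafka_broker_host = str(kafka_broker_hosts[0]) if kafka_broker_hosts else ''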
Error Log:
stderr:
Traceback (most recent call last):
File "/var/lib/ambari-agent/cache/stacks/HDP/2.5/services/NIFI/package/scripts/master.py", line 130, in <module>
Master().execute()
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/script.py", line 280, in execute
method(env)
File "/var/lib/ambari-agent/cache/stacks/HDP/2.5/services/NIFI/package/scripts/master.py", line 14, in install
import params
File "/var/lib/ambari-agent/cache/stacks/HDP/2.5/services/NIFI/package/scripts/params.py", line 90, in <module>
kafka_broker_host = str(master_configs['kafka_broker_hosts'][0])
File "/usr/lib/python2.6/site-packages/resource_management/libraries/script/config_dictionary.py", line 73, in __getattr__
raise Fail("Configuration parameter '" + self.name + "' was not found in configurations dictionary!")
resource_management.core.exceptions.Fail: Configuration parameter 'kafka_broker_hosts' was not found in configurations dictionary!
stdout:
2016-12-08 19:42:46,802 - The hadoop conf dir /usr/hdp/current/hadoop-client/conf exists, will call conf-select on it for version 2.5.0.0-1245
2016-12-08 19:42:46,806 - Checking if need to create versioned conf dir /etc/hadoop/2.5.0.0-1245/0
2016-12-08 19:42:46,811 - call[('ambari-python-wrap', u'/usr/bin/conf-select', 'create-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.0.0-1245', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False, 'stderr': -1}
2016-12-08 19:42:46,855 - call returned (1, '/etc/hadoop/2.5.0.0-1245/0 exist already', '')
2016-12-08 19:42:46,856 - checked_call[('ambari-python-wrap', u'/usr/bin/conf-select', 'set-conf-dir', '--package', 'hadoop', '--stack-version', '2.5.0.0-1245', '--conf-version', '0')] {'logoutput': False, 'sudo': True, 'quiet': False}
2016-12-08 19:42:46,897 - checked_call returned (0, '')
2016-12-08 19:42:46,898 - Ensuring that hadoop has the correct symlink structure
2016-12-08 19:42:46,898 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-12-08 19:42:46,901 - Group['hadoop'] {}
2016-12-08 19:42:46,904 - Group['nifi'] {}
2016-12-08 19:42:46,904 - Group['users'] {}
2016-12-08 19:42:46,905 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-12-08 19:42:46,907 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-12-08 19:42:46,908 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2016-12-08 19:42:46,910 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2016-12-08 19:42:46,911 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-12-08 19:42:46,912 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-12-08 19:42:46,914 - User['nifi'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-12-08 19:42:46,915 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-12-08 19:42:46,917 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-12-08 19:42:46,919 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2016-12-08 19:42:46,923 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2016-12-08 19:42:46,936 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2016-12-08 19:42:46,937 - Group['hdfs'] {}
2016-12-08 19:42:46,938 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': [u'hadoop', u'hdfs']}
2016-12-08 19:42:46,939 - FS Type:
2016-12-08 19:42:46,939 - Directory['/etc/hadoop'] {'mode': 0755}
2016-12-08 19:42:46,970 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2016-12-08 19:42:46,972 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2016-12-08 19:42:46,999 - Initializing 2 repositories
2016-12-08 19:42:47,001 - Repository['HDP-2.5'] {'base_url': 'http://bigdata.persistent.co.in/HDP/HDP-2.5.0.0-centos7/', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP', 'mirror_list': None}
2016-12-08 19:42:47,016 - File['/etc/yum.repos.d/HDP.repo'] {'content': '[HDP-2.5]\nname=HDP-2.5\nbaseurl=http://bigdata.persistent.co.in/HDP/HDP-2.5.0.0-centos7/\n\npath=/\nenabled=1\ngpgcheck=0'}
2016-12-08 19:42:47,018 - Repository['HDP-UTILS-1.1.0.21'] {'base_url': 'http://bigdata.persistent.co.in/HDP/HDP-UTILS-1.1.0.21-centos7/', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2016-12-08 19:42:47,025 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': '[HDP-UTILS-1.1.0.21]\nname=HDP-UTILS-1.1.0.21\nbaseurl=http://bigdata.persistent.co.in/HDP/HDP-UTILS-1.1.0.21-centos7/\n\npath=/\nenabled=1\ngpgcheck=0'}
2016-12-08 19:42:47,026 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2016-12-08 19:42:47,161 - Skipping installation of existing package unzip
2016-12-08 19:42:47,162 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2016-12-08 19:42:47,181 - Skipping installation of existing package curl
2016-12-08 19:42:47,182 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2016-12-08 19:42:47,199 - Skipping installation of existing package hdp-select
Command failed after 1 tries
