
2017-12-21 13:46:55,297 - Stack Feature Version Info: Cluster Stack=2.6, Cluster Current Version=None, Command Stack=None, Command Version=None -> 2.6
2017-12-21 13:46:55,317 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
User Group mapping (user_group) is missing in the hostLevelParams
2017-12-21 13:46:55,319 - Group['hadoop'] {}
2017-12-21 13:46:55,320 - Group['users'] {}
2017-12-21 13:46:55,321 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-12-21 13:46:55,323 - call['/var/lib/ambari-agent/tmp/changeUid.sh zookeeper'] {}
2017-12-21 13:46:55,334 - call returned (0, '1002')
2017-12-21 13:46:55,335 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1002}
2017-12-21 13:46:55,337 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-12-21 13:46:55,338 - call['/var/lib/ambari-agent/tmp/changeUid.sh infra-solr'] {}
2017-12-21 13:46:55,349 - call returned (0, '1013')
2017-12-21 13:46:55,350 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1013}
2017-12-21 13:46:55,351 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-12-21 13:46:55,351 - call['/var/lib/ambari-agent/tmp/changeUid.sh oozie'] {}
2017-12-21 13:46:55,360 - call returned (0, '1003')
2017-12-21 13:46:55,360 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': 1003}
2017-12-21 13:46:55,363 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-12-21 13:46:55,364 - call['/var/lib/ambari-agent/tmp/changeUid.sh ams'] {}
2017-12-21 13:46:55,375 - call returned (0, '1004')
2017-12-21 13:46:55,376 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1004}
2017-12-21 13:46:55,378 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': None}
2017-12-21 13:46:55,380 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-12-21 13:46:55,381 - call['/var/lib/ambari-agent/tmp/changeUid.sh kafka'] {}
2017-12-21 13:46:55,390 - call returned (0, '1020')
2017-12-21 13:46:55,390 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1020}
2017-12-21 13:46:55,391 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-12-21 13:46:55,392 - call['/var/lib/ambari-agent/tmp/changeUid.sh tez'] {}
2017-12-21 13:46:55,401 - call returned (0, '1006')
2017-12-21 13:46:55,402 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users'], 'uid': 1006}
2017-12-21 13:46:55,403 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-12-21 13:46:55,404 - call['/var/lib/ambari-agent/tmp/changeUid.sh hdfs'] {}
2017-12-21 13:46:55,415 - call returned (0, '1007')
2017-12-21 13:46:55,416 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1007}
2017-12-21 13:46:55,418 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-12-21 13:46:55,418 - call['/var/lib/ambari-agent/tmp/changeUid.sh sqoop'] {}
2017-12-21 13:46:55,426 - call returned (0, '1008')
2017-12-21 13:46:55,427 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1008}
2017-12-21 13:46:55,429 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-12-21 13:46:55,429 - call['/var/lib/ambari-agent/tmp/changeUid.sh yarn'] {}
2017-12-21 13:46:55,439 - call returned (0, '1009')
2017-12-21 13:46:55,440 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1009}
2017-12-21 13:46:55,441 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-12-21 13:46:55,442 - call['/var/lib/ambari-agent/tmp/changeUid.sh mapred'] {}
2017-12-21 13:46:55,452 - call returned (0, '1010')
2017-12-21 13:46:55,452 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop'], 'uid': 1010}
2017-12-21 13:46:55,453 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2017-12-21 13:46:55,456 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2017-12-21 13:46:55,462 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa 0'] due to not_if
2017-12-21 13:46:55,462 - Group['hdfs'] {}
2017-12-21 13:46:55,463 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': [u'hadoop', u'hdfs']}
2017-12-21 13:46:55,463 - FS Type:
2017-12-21 13:46:55,464 - Directory['/etc/hadoop'] {'mode': 0755}
2017-12-21 13:46:55,465 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2017-12-21 13:46:55,483 - Initializing 2 repositories
2017-12-21 13:46:55,484 - Repository['HDP-2.6'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP/ubuntu16/2.x/updates/2.6.3.0', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '{{package_type}} {{base_url}} {{components}}', 'repo_file_name': 'HDP', 'mirror_list': None}
2017-12-21 13:46:55,491 - File['/tmp/tmp8SS8_V'] {'content': 'deb http://public-repo-1.hortonworks.com/HDP/ubuntu16/2.x/updates/2.6.3.0 HDP main'}
2017-12-21 13:46:55,492 - Writing File['/tmp/tmp8SS8_V'] because contents don't match
2017-12-21 13:46:55,492 - File['/tmp/tmpceZ8Dh'] {'content': StaticFile('/etc/apt/sources.list.d/HDP.list')}
2017-12-21 13:46:55,493 - Writing File['/tmp/tmpceZ8Dh'] because contents don't match
2017-12-21 13:46:55,570 - Repository['HDP-UTILS-1.1.0.21'] {'base_url': 'http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/ubuntu16', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '{{package_type}} {{base_url}} {{components}}', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2017-12-21 13:46:55,573 - File['/tmp/tmpabpRd8'] {'content': 'deb http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/ubuntu16 HDP-UTILS main'}
2017-12-21 13:46:55,573 - Writing File['/tmp/tmpabpRd8'] because contents don't match
2017-12-21 13:46:55,574 - File['/tmp/tmpAKCJ9S'] {'content': StaticFile('/etc/apt/sources.list.d/HDP-UTILS.list')}
2017-12-21 13:46:55,574 - Writing File['/tmp/tmpAKCJ9S'] because contents don't match
2017-12-21 13:46:55,616 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-12-21 13:46:55,671 - Skipping installation of existing package unzip
2017-12-21 13:46:55,672 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-12-21 13:46:55,720 - Skipping installation of existing package curl
2017-12-21 13:46:55,721 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-12-21 13:46:55,767 - Skipping installation of existing package hdp-select
2017-12-21 13:46:56,025 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2017-12-21 13:46:56,028 - Stack Feature Version Info: Cluster Stack=2.6, Cluster Current Version=None, Command Stack=None, Command Version=None -> 2.6
2017-12-21 13:46:56,067 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2017-12-21 13:46:56,091 - checked_call['dpkg -s hdp-select | grep Version | awk '{print $2}''] {'stderr': -1}
2017-12-21 13:46:56,136 - checked_call returned (0, '2.6.1.0-129', '')
2017-12-21 13:46:56,146 - Package['hadoop-2-6-1-0-129-client'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-12-21 13:46:56,195 - Skipping installation of existing package hadoop-2-6-1-0-129-client
2017-12-21 13:46:56,196 - Package['hadoop-2-6-1-0-129-hdfs-datanode'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-12-21 13:46:56,245 - Skipping installation of existing package hadoop-2-6-1-0-129-hdfs-datanode
2017-12-21 13:46:56,246 - Package['hadoop-2-6-1-0-129-hdfs-journalnode'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-12-21 13:46:56,293 - Skipping installation of existing package hadoop-2-6-1-0-129-hdfs-journalnode
2017-12-21 13:46:56,294 - Package['hadoop-2-6-1-0-129-hdfs-namenode'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-12-21 13:46:56,348 - Skipping installation of existing package hadoop-2-6-1-0-129-hdfs-namenode
2017-12-21 13:46:56,349 - Package['hadoop-2-6-1-0-129-hdfs-secondarynamenode'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-12-21 13:46:56,396 - Skipping installation of existing package hadoop-2-6-1-0-129-hdfs-secondarynamenode
2017-12-21 13:46:56,397 - Package['hadoop-2-6-1-0-129-hdfs-zkfc'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-12-21 13:46:56,443 - Skipping installation of existing package hadoop-2-6-1-0-129-hdfs-zkfc
2017-12-21 13:46:56,444 - Package['libsnappy1'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-12-21 13:46:56,489 - Installing package libsnappy1 ('/usr/bin/apt-get -q -o Dpkg::Options::=--force-confdef --allow-unauthenticated --assume-yes install libsnappy1')
2017-12-21 13:47:05,876 - Package['libsnappy-dev'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-12-21 13:47:05,922 - Installing package libsnappy-dev ('/usr/bin/apt-get -q -o Dpkg::Options::=--force-confdef --allow-unauthenticated --assume-yes install libsnappy-dev')
2017-12-21 13:47:14,593 - Package['libhdfs0-2-6-1-0-129'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2017-12-21 13:47:14,640 - Skipping installation of existing package libhdfs0-2-6-1-0-129
2017-12-21 13:47:14,642 - Directory['/etc/security/limits.d'] {'owner': 'root', 'create_parents': True, 'group': 'root'}
2017-12-21 13:47:14,658 - File['/etc/security/limits.d/hdfs.conf'] {'content': Template('hdfs.conf.j2'), 'owner': 'root', 'group': 'root', 'mode': 0644}
2017-12-21 13:47:14,658 - XmlConfig['hadoop-policy.xml'] {'owner': 'hdfs', 'group': 'hadoop', 'conf_dir': '/usr/hdp/current/hadoop-client/conf', 'configuration_attributes': {}, 'configurations': ...}
2017-12-21 13:47:14,670 - Generating config: /usr/hdp/current/hadoop-client/conf/hadoop-policy.xml
2017-12-21 13:47:14,671 - File['/usr/hdp/current/hadoop-client/conf/hadoop-policy.xml'] {'owner': 'hdfs', 'content': InlineTemplate(...), 'group': 'hadoop', 'mode': None, 'encoding': 'UTF-8'}

Command failed after 1 tries
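Before digging further, it can help to confirm that the two repositories initialized in the log are reachable from the failing host. A minimal sketch (the URLs are copied from the Repository[...] entries above; adjust them for your stack version):

```shell
# Probe each HDP repository base URL from the log; a FAIL line suggests a
# network/proxy problem or an unreachable repo rather than a package issue.
for url in \
  http://public-repo-1.hortonworks.com/HDP/ubuntu16/2.x/updates/2.6.3.0 \
  http://public-repo-1.hortonworks.com/HDP-UTILS-1.1.0.21/repos/ubuntu16; do
  if curl -fsIL --max-time 10 "$url" > /dev/null 2>&1; then
    echo "OK   $url"
  else
    echo "FAIL $url"
  fi
done
```

If either URL fails here, apt-get on this host cannot fetch packages from it either.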


1 Answer


It looks like a problem with the repository. Try configuring and using a local repo instead.

Check this: Setting Up a Local Repository with No Internet Access.
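On an Ubuntu 16 host like the one in the log, pointing at a local repository amounts to replacing the public Hortonworks base URLs with the URL of a mirror on your own network. A sketch of what the apt source files would contain, assuming a hypothetical local mirror host `repo.internal` that has synced the same repository layout:

```
# /etc/apt/sources.list.d/HDP.list
deb http://repo.internal/HDP/ubuntu16/2.x/updates/2.6.3.0 HDP main

# /etc/apt/sources.list.d/HDP-UTILS.list
deb http://repo.internal/HDP-UTILS-1.1.0.21/repos/ubuntu16 HDP-UTILS main
```

Note that Ambari regenerates these files (the Repository[...] entries in the log show it writing them), so change the Base URLs in Ambari itself rather than only editing the files by hand, then run `apt-get update` on the hosts.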