Std Err: /usr/sbin/hst: line 460: install-activity-analyzer.sh: command not found

This post records a persistent error hit while uninstalling and reinstalling Ambari 2.7.3, including the detailed error log and the troubleshooting attempts. While installing HDP through Ambari, an unresolvable error appeared, and community help was sought.

Error after uninstalling and reinstalling Ambari 2.7.3:

While trying to install HDP with Ambari, I keep hitting this error.
I have tried multiple times, and even downloaded the tar.gz file, extracted it per the documentation, and performed an offline installation, but the error persists.
 
Any help will be appreciated. Below are the logs:
2016-09-12 16:34:16,665 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2016-09-12 16:34:16,666 - Group['livy'] {}
2016-09-12 16:34:16,667 - Group['spark'] {}
2016-09-12 16:34:16,668 - Group['zeppelin'] {}
2016-09-12 16:34:16,668 - Group['hadoop'] {}
2016-09-12 16:34:16,668 - Group['users'] {}
2016-09-12 16:34:16,668 - Group['knox'] {}
2016-09-12 16:34:16,668 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-09-12 16:34:16,669 - User['storm'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-09-12 16:34:16,670 - User['infra-solr'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-09-12 16:34:16,671 - User['zookeeper'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-09-12 16:34:16,671 - User['atlas'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-09-12 16:34:16,672 - User['oozie'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2016-09-12 16:34:16,673 - User['ams'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-09-12 16:34:16,674 - User['falcon'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2016-09-12 16:34:16,674 - User['tez'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2016-09-12 16:34:16,675 - User['zeppelin'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-09-12 16:34:16,676 - User['accumulo'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-09-12 16:34:16,676 - User['livy'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-09-12 16:34:16,677 - User['mahout'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-09-12 16:34:16,678 - User['spark'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-09-12 16:34:16,678 - User['ambari-qa'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'users']}
2016-09-12 16:34:16,679 - User['flume'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-09-12 16:34:16,680 - User['kafka'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-09-12 16:34:16,680 - User['hdfs'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-09-12 16:34:16,681 - User['sqoop'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-09-12 16:34:16,682 - User['yarn'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-09-12 16:34:16,682 - User['hbase'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-09-12 16:34:16,683 - User['hcat'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-09-12 16:34:16,683 - User['mapred'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-09-12 16:34:16,684 - User['knox'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': [u'hadoop']}
2016-09-12 16:34:16,685 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2016-09-12 16:34:16,686 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] {'not_if': '(test $(id -u ambari-qa) -gt 1000) || (false)'}
2016-09-12 16:34:16,691 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh ambari-qa /tmp/hadoop-ambari-qa,/tmp/hsperfdata_ambari-qa,/home/ambari-qa,/tmp/ambari-qa,/tmp/sqoop-ambari-qa'] due to not_if
2016-09-12 16:34:16,691 - Directory['/tmp/hbase-hbase'] {'owner': 'hbase', 'create_parents': True, 'mode': 0775, 'cd_access': 'a'}
2016-09-12 16:34:16,692 - File['/var/lib/ambari-agent/tmp/changeUid.sh'] {'content': StaticFile('changeToSecureUid.sh'), 'mode': 0555}
2016-09-12 16:34:16,693 - Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] {'not_if': '(test $(id -u hbase) -gt 1000) || (false)'}
2016-09-12 16:34:16,697 - Skipping Execute['/var/lib/ambari-agent/tmp/changeUid.sh hbase /home/hbase,/tmp/hbase,/usr/bin/hbase,/var/log/hbase,/tmp/hbase-hbase'] due to not_if
2016-09-12 16:34:16,697 - Group['hdfs'] {}
2016-09-12 16:34:16,697 - User['hdfs'] {'fetch_nonlocal_groups': True, 'groups': [u'hadoop', u'hdfs']}
2016-09-12 16:34:16,698 - FS Type: 
2016-09-12 16:34:16,698 - Directory['/etc/hadoop'] {'mode': 0755}
2016-09-12 16:34:16,712 - File['/usr/hdp/current/hadoop-client/conf/hadoop-env.sh'] {'content': InlineTemplate(...), 'owner': 'hdfs', 'group': 'hadoop'}
2016-09-12 16:34:16,712 - Directory['/var/lib/ambari-agent/tmp/hadoop_java_io_tmpdir'] {'owner': 'hdfs', 'group': 'hadoop', 'mode': 01777}
2016-09-12 16:34:16,728 - Initializing 2 repositories
2016-09-12 16:34:16,728 - Repository['HDP-2.5'] {'base_url': 'http://centos7.satyam.biz/hdp/HDP/centos7', 'action': ['create'], 'components': [u'HDP', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP', 'mirror_list': None}
2016-09-12 16:34:16,736 - File['/etc/yum.repos.d/HDP.repo'] {'content': '[HDP-2.5]\nname=HDP-2.5\nbaseurl=http://centos7.satyam.biz/hdp/HDP/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2016-09-12 16:34:16,736 - Repository['HDP-UTILS-1.1.0.21'] {'base_url': 'http://centos7.satyam.biz/hdp/HDP-UTILS-1.1.0.21/repos/centos7', 'action': ['create'], 'components': [u'HDP-UTILS', 'main'], 'repo_template': '[{{repo_id}}]\nname={{repo_id}}\n{% if mirror_list %}mirrorlist={{mirror_list}}{% else %}baseurl={{base_url}}{% endif %}\n\npath=/\nenabled=1\ngpgcheck=0', 'repo_file_name': 'HDP-UTILS', 'mirror_list': None}
2016-09-12 16:34:16,740 - File['/etc/yum.repos.d/HDP-UTILS.repo'] {'content': '[HDP-UTILS-1.1.0.21]\nname=HDP-UTILS-1.1.0.21\nbaseurl=http://centos7.satyam.biz/hdp/HDP-UTILS-1.1.0.21/repos/centos7\n\npath=/\nenabled=1\ngpgcheck=0'}
2016-09-12 16:34:16,740 - Package['unzip'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2016-09-12 16:34:16,822 - Skipping installation of existing package unzip
2016-09-12 16:34:16,822 - Package['curl'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2016-09-12 16:34:16,843 - Skipping installation of existing package curl
2016-09-12 16:34:16,843 - Package['hdp-select'] {'retry_on_repo_unavailability': False, 'retry_count': 5}
2016-09-12 16:34:16,863 - Skipping installation of existing package hdp-select
installing using command: {sudo} rpm -qa | grep smartsense- || {sudo} yum -y install smartsense-hst || {sudo} rpm -i /var/lib/ambari-agent/cache/stacks/HDP/2.1/services/SMARTSENSE/package/files/rpm/*.rpm
Command:  rpm -qa | grep smartsense- ||  yum -y install smartsense-hst ||  rpm -i /var/lib/ambari-agent/cache/stacks/HDP/2.1/services/SMARTSENSE/package/files/rpm/*.rpm
Exit code: 0
Std Out: smartsense-hst-1.3.0.0-1.x86_64
 
Std Err: None
2016-09-12 16:34:18,905 - User['activity_analyzer'] {'gid': 'hadoop', 'groups': [u'hdfs']}
Deploying activity analyzer
Command:  /usr/sbin/hst activity-analyzer setup root:root '/etc/rc.d/init.d'
Exit code: 127
Std Out: None
Std Err: /usr/sbin/hst: line 321: install-activity-analyzer.sh: command not found
 
Command failed after 1 tries
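The failing step is /usr/sbin/hst shelling out to install-activity-analyzer.sh, a helper script that no longer exists on disk. A minimal sketch of how to confirm which expected files have gone missing (report_missing is a hypothetical helper, not part of hst):

```shell
# report_missing is a hypothetical helper (not part of hst): read one
# expected file path per line on stdin and print each path that is no
# longer present on disk.
report_missing() {
    while IFS= read -r path; do
        [ -e "$path" ] || printf 'missing: %s\n' "$path"
    done
}
```

On an RPM host the expected file list comes straight from the package database, e.g. `rpm -ql smartsense-hst | report_missing`; `rpm -V smartsense-hst` performs the same verification natively and flags deleted files as `missing`.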

Root cause: too many files were deleted by hand during the uninstall. The SmartSense removal commands:

yum remove smartsense-hst
rm -rf /var/log/smartsense/
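Because the helper scripts were deleted by hand rather than by the package manager, rpm still believes smartsense-hst is installed (the `rpm -qa | grep smartsense-` check above succeeds), so the installer never re-lays the missing files. A minimal recovery sketch, assuming a yum-based host (reinstall_smartsense is a hypothetical helper, not an Ambari tool):

```shell
# Hypothetical recovery for a yum-based host: remove the package cleanly,
# clear only its log directory, then install it again so every owned file
# (including install-activity-analyzer.sh) is restored before retrying
# the Ambari install.
reinstall_smartsense() {
    yum -y remove smartsense-hst
    rm -rf /var/log/smartsense/
    yum -y install smartsense-hst
}
```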