1. The installation went fairly smoothly overall, but a few issues came up along the way: hostnames must be lowercase, and the database connection should use an IP address rather than a hostname.
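A quick way to check the hostname requirement is to compare the current hostname against its lowercased form. This is a minimal sketch; the `hostnamectl set-hostname` suggestion assumes a systemd-based host:

```shell
# Ambari expects lowercase hostnames; detect uppercase characters
# in the current hostname and suggest a fix (sketch).
h=$(hostname -f)
lower=$(printf '%s' "$h" | tr '[:upper:]' '[:lower:]')
if [ "$h" != "$lower" ]; then
  echo "hostname '$h' contains uppercase; fix with: hostnamectl set-hostname $lower"
else
  echo "hostname '$h' is already lowercase"
fi
```

The same `tr '[:upper:]' '[:lower:]'` trick works for normalizing any host list before feeding it to the cluster installer.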
An error was reported while installing the services:
stderr:
Traceback (most recent call last):
  File "/var/lib/ambari-agent/cache/stack-hooks/before-INSTALL/scripts/hook.py", line 37, in <module>
    BeforeInstallHook().execute()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 363, in execute
    self.save_component_version_to_structured_out(self.command_name)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/script/script.py", line 222, in save_component_version_to_structured_out
    stack_select_package_name = stack_select.get_package_name()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 109, in get_package_name
    package = get_packages(PACKAGE_SCOPE_STACK_SELECT, service_name, component_name)
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 223, in get_packages
    supported_packages = get_supported_packages()
  File "/usr/lib/ambari-agent/lib/resource_management/libraries/functions/stack_select.py", line 147, in get_supported_packages
    raise Fail("Unable to query for supported packages using {0}".format(stack_selector_path))
resource_management.core.exceptions.Fail: Unable to query for supported packages using /usr/bin/hdp-select
stdout:
2018-07-15 13:37:13,688 - Stack Feature Version Info: Cluster Stack=3.0, Command Stack=None, Command Version=None -> 3.0
2018-07-15 13:37:13,694 - Using hadoop conf dir: /usr/hdp/current/hadoop-client/conf
2018-07-15 13:37:13,696 - Group['kms'] {}
2018-07-15 13:37:13,697 - Group['livy'] {}
2018-07-15 13:37:13,697 - Group['spark'] {}
2018-07-15 13:37:13,697 - Group['ranger'] {}
2018-07-15 13:37:13,697 - Group['hdfs'] {}
2018-07-15 13:37:13,697 - Group['zeppelin'] {}
2018-07-15 13:37:13,698 - Group['hadoop'] {}
2018-07-15 13:37:13,698 - Group['users'] {}
2018-07-15 13:37:13,698 - Group['knox'] {}
2018-07-15 13:37:13,711 - User['yarn-ats'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['hadoop'], 'uid': None}
2018-07-15 13:37:13,721 - User['hive'] {'gid': 'hadoop', 'fetch_nonlocal_groups': True, 'groups': ['had
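The traceback above fails inside `get_supported_packages`, where the agent shells out to `/usr/bin/hdp-select`. A first diagnostic step (a sketch, assuming the HDP stack tooling is supposed to be installed on this host) is to check that the binary exists and runs by hand; `versions` is a standard hdp-select subcommand:

```shell
# Manually reproduce the agent's query: if /usr/bin/hdp-select is
# missing or broken, the before-INSTALL hook raises the Fail above.
if [ -x /usr/bin/hdp-select ]; then
  /usr/bin/hdp-select versions
else
  echo "/usr/bin/hdp-select is missing or not executable; reinstall the hdp-select package"
fi
```

If the binary is present but errors out, running it directly like this surfaces the underlying message that the Ambari log swallows.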