1. When I create a Hive table in Hue, an error comes up (the NameNode is in safe mode, so HDFS rejects the write).

Solution: force the NameNode out of safe mode:
# hadoop dfsadmin -safemode leave
http://www.linkedin.com/groups/Creating-table-in-Hive-getting-4547204.S.225243871
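A quick way to confirm the diagnosis before forcing the mode off is the companion command (standard dfsadmin usage; on Hadoop 2.x and later the same options are also available via hdfs dfsadmin):
# hadoop dfsadmin -safemode get
This prints "Safe mode is ON" or "Safe mode is OFF". Note that leaving safe mode manually only clears the symptom; if the NameNode keeps re-entering safe mode, check for missing or under-replicated blocks.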
2. Error: a Hadoop daemon runs out of heap memory (e.g. java.lang.OutOfMemoryError: Java heap space).

Solution: edit hadoop-env.sh and raise the daemon heap size:
# The maximum amount of heap to use, in MB. Default is 1000.
export HADOOP_HEAPSIZE=2000
#export HADOOP_NAMENODE_INIT_HEAPSIZE=""
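After restarting the daemons, one sanity check (assuming the stock start scripts, which expand HADOOP_HEAPSIZE into a -Xmx flag) is to verify that the new maximum heap shows up in the process arguments:
# ps -ef | grep NameNode
The java command line should now contain -Xmx2000m. Keep in mind that HADOOP_HEAPSIZE applies to every daemon started by these scripts, so size it against the host's total memory.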
3. Error: using the non-printing character \001 (Ctrl-A) as a Hadoop delimiter fails when the job configuration is serialized to job.xml, with a parse error of the form: Character reference "&#..." is an invalid XML character.
https://hadoopified.wordpress.com/2011/06/24/unicode-charactersctrl-g-or-ctrl-a-as-textoutputformat-hadoop-delimiter/
Another hack would be to provide the delimiter through an XML resource file. The XML version needs to be declared as 1.1, since 1.0 fails to recognize these special Unicode characters: the XML 1.0 spec explicitly omits most of the non-printing characters in the range 0x00 to 0x1F.
<?xml version="1.1"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>hadoop.user</name>
    <value>${user.name}</value>
  </property>
  <property>
    <name>mapred.textoutputformat.separator</name>
    <!-- Ctrl-G (0x07) as a numeric character reference; legal in XML 1.1 but not in 1.0 -->
    <value>&#x7;</value>
  </property>
</configuration>
job.xml: this file is never created explicitly by the user. The MapReduce application creates a JobConf, which is serialized to job.xml when the job is submitted.
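As a usage sketch: if the job driver goes through ToolRunner/GenericOptionsParser, the XML 1.1 resource file can be attached at submission time with the generic -conf option (the jar, class, file, and path names below are placeholders):
# hadoop jar myjob.jar com.example.MyJob -conf separator.xml /input /output
Alternatively, load it programmatically with Configuration.addResource("separator.xml") before submitting, so the property is merged into the JobConf the same way.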