Common errors and solutions

This article resolves errors that occur when creating Hive tables in Hue, including using the `hadoop dfsadmin -safemode leave` command, editing the `hadoop-env.sh` configuration file to increase the heap size, and supplying a custom delimiter through an XML resource file to work around a Unicode-character problem.

1. Creating a Hive table in Hue fails because HDFS is in safe mode.



 

Solution: run `hadoop dfsadmin -safemode leave` (on current Hadoop releases, `hdfs dfsadmin -safemode leave`) to take the NameNode out of safe mode.

http://www.linkedin.com/groups/Creating-table-in-Hive-getting-4547204.S.225243871

 

2. A Hadoop process fails with a Java heap space (out of memory) error.



 

Solution: edit `hadoop-env.sh` and raise the heap size:

```sh
# The maximum amount of heap to use, in MB. Default is 1000.
export HADOOP_HEAPSIZE=2000
#export HADOOP_NAMENODE_INIT_HEAPSIZE=""
```

 

3. A Hadoop job using `\001` as the output delimiter fails to parse `job.xml` with `Character reference "&#`

https://hadoopified.wordpress.com/2011/06/24/unicode-charactersctrl-g-or-ctrl-a-as-textoutputformat-hadoop-delimiter/
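For context, `\001` (Ctrl-A) is Hive's default field delimiter, and `\u0007` (Ctrl-G, the BEL character) is the delimiter used in the example below; both are non-printing control characters. A minimal Python sketch of what such delimited records look like (the field values are made up for illustration):

```python
# Hive's default field delimiter is the non-printing control character
# \u0001 (Ctrl-A); \u0007 (Ctrl-G, BEL) is another common choice.
# Sample field values below are hypothetical.
fields = ["1", "alice", "engineer"]

record_ctrl_a = "\u0001".join(fields)  # what Hive writes by default
record_ctrl_g = "\u0007".join(fields)  # the delimiter used in this example

print(repr(record_ctrl_a))  # '1\x01alice\x01engineer'
print(repr(record_ctrl_g))  # '1\x07alice\x07engineer'
```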

 

Another hack would be to provide the delimiter through an XML resource file. The XML version needs to be declared as 1.1, since 1.0 fails to recognize these special Unicode characters: the XML 1.0 spec explicitly omits most of the non-printing characters in the range 0x00 to 0x1F.
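This restriction is easy to confirm with any XML 1.0 parser; here is a small check using Python's standard-library parser, which implements XML 1.0:

```python
import xml.etree.ElementTree as ET

# XML 1.0 forbids character references to most control characters in the
# 0x00-0x1F range, so a reference to U+0007 (BEL) is rejected outright.
doc = '<?xml version="1.0"?><value>&#x7;</value>'

try:
    ET.fromstring(doc)
    parsed = True
except ET.ParseError:
    parsed = False

print(parsed)  # False: the XML 1.0 parser refuses the control character
```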

 

 

Name: `mapred.textoutputformat.separator`
Value: `\u0007`

 

 

 

```xml
<?xml version="1.1"?>
<?xml-stylesheet type="text/xsl" href="configuration.xsl"?>
<configuration>
  <property>
    <name>hadoop.user</name>
    <value>${user.name}</value>
  </property>
  <property>
    <name>mapred.textoutputformat.separator</name>
    <value>\u0007</value>
  </property>
</configuration>
```

 

 

`job.xml`: this file is never created explicitly by the user. The MapReduce application builds a JobConf, which is serialized to `job.xml` when the job is submitted.
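To illustrate the idea, here is a toy sketch of serializing a property map into the `<configuration>` format shown above; this is not Hadoop's actual implementation, only the shape of its output:

```python
import xml.etree.ElementTree as ET

def to_configuration_xml(props):
    """Serialize a dict into Hadoop's <configuration> XML shape.

    A toy illustration of what JobConf serialization produces;
    not the real Hadoop code.
    """
    root = ET.Element("configuration")
    for name, value in props.items():
        prop = ET.SubElement(root, "property")
        ET.SubElement(prop, "name").text = name
        ET.SubElement(prop, "value").text = value
    return ET.tostring(root, encoding="unicode")

# The separator is written as the literal six-character escape string,
# matching the <value>\u0007</value> entry in the resource file above.
xml_text = to_configuration_xml(
    {"mapred.textoutputformat.separator": r"\u0007"}
)
print(xml_text)
```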

 
