Hadoop 2.x installation: unable to load native library - java.library.path error

While starting Hadoop 2.x, the error `unable to load native library - java.library.path` appears. Adding DEBUG logging in hadoop-env.sh and reading the output shows that the library path is wrong. Pointing `java.library.path` at the correct directory fixes the path problem, but a new error, `GLIBC_2.14` not found, then appears and has to be resolved separately.


When starting Hadoop 2.x you may see the following warning:

WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

This warning means that the native Hadoop library could not be loaded. It can have many different causes, so this post only walks through one way to diagnose and fix it.
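Before changing anything, it is worth checking which native libraries Hadoop can actually find. Hadoop 2.x ships a checknative command for exactly this; a minimal check, assuming the hadoop binary is on the PATH:

hadoop checknative -a
# Lists each native component (hadoop, zlib, snappy, lz4, bzip2) and whether it loaded,
# together with the path of the library when it was found.

If the hadoop entry shows false, the steps below help locate the reason.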

1. Add DEBUG logging in hadoop-env.sh
At this point we cannot yet tell where the problem is, so on tiny1 (the master) we turn on DEBUG output by adding the following line to hadoop-env.sh:

export HADOOP_ROOT_LOGGER=DEBUG,console
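A sketch of how to append that line from the shell, assuming the installation path used throughout this post (/home/hadoop/hadoop/hadoop-2.7.2) and the standard etc/hadoop layout:

HADOOP_HOME=/home/hadoop/hadoop/hadoop-2.7.2
echo 'export HADOOP_ROOT_LOGGER=DEBUG,console' >> $HADOOP_HOME/etc/hadoop/hadoop-env.sh

Remember to remove or comment out this line once the problem is solved; otherwise every Hadoop command keeps printing DEBUG output to the console.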

2. Restart Hadoop and check the log output

17/03/09 18:49:14 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
17/03/09 18:49:14 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: no hadoop in java.library.path
17/03/09 18:49:14 DEBUG util.NativeCodeLoader: java.library.path=/usr/java/packages/lib/amd64:/usr/lib64:/lib64:/lib:/usr/lib
17/03/09 18:49:14 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

The log shows that java.library.path only contains the system default directories; the native library actually lives in hadoop-2.7.2/lib/native, which is not on that path, so the JVM cannot find it.
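To confirm where the library really is, list the native directory (path assumed to match this installation):

ls /home/hadoop/hadoop/hadoop-2.7.2/lib/native/
# A stock 2.7.2 build ships libhadoop.so, libhadoop.so.1.0.0, libhdfs.so and related files here.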

3. Modify hadoop-env.sh to fix the path problem
Add:

export HADOOP_COMMON_LIB_NATIVE_DIR=/home/hadoop/hadoop/hadoop-2.7.2/lib/native

Modify:

export HADOOP_OPTS="-Djava.library.path=/home/hadoop/hadoop/hadoop-2.7.2/lib/native"

At this point the three lines added or modified in hadoop-env.sh are as follows:

export HADOOP_COMMON_LIB_NATIVE_DIR=/home/hadoop/hadoop/hadoop-2.7.2/lib/native
export HADOOP_OPTS="-Djava.library.path=/home/hadoop/hadoop/hadoop-2.7.2/lib/native"
export HADOOP_ROOT_LOGGER=DEBUG,console
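These edits were made on the master, but the warning is printed by every node that loads the native library, so the same change usually has to be propagated to the slaves as well. A minimal sketch, assuming hypothetical slave hostnames tiny2 and tiny3 and the same installation path on every machine:

for host in tiny2 tiny3; do
  scp /home/hadoop/hadoop/hadoop-2.7.2/etc/hadoop/hadoop-env.sh hadoop@$host:/home/hadoop/hadoop/hadoop-2.7.2/etc/hadoop/
done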

4. Restart Hadoop
Stop Hadoop and start it again:

hadoop/hadoop-2.7.2/sbin/stop-all.sh
hadoop/hadoop-2.7.2/sbin/start-all.sh
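(stop-all.sh and start-all.sh are deprecated in Hadoop 2.x in favour of stop-dfs.sh/stop-yarn.sh and start-dfs.sh/start-yarn.sh, but they still work.) After the restart, a quick sanity check with jps should show the expected daemons before going back to the logs; the process list below is only illustrative and depends on the cluster layout:

jps
# 12001 NameNode
# 12210 SecondaryNameNode
# 12455 ResourceManager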

Checking the log again:

17/03/09 19:11:22 DEBUG util.NativeCodeLoader: Trying to load the custom-built native-hadoop library...
17/03/09 19:11:22 DEBUG util.NativeCodeLoader: Failed to load native-hadoop with error: java.lang.UnsatisfiedLinkError: /home/hadoop/hadoop/hadoop-2.7.2/lib/native/libhadoop.so.1.0.0: /lib64/libc.so.6: version `GLIBC_2.14' not found (required by /home/hadoop/hadoop/hadoop-2.7.2/lib/native/libhadoop.so.1.0.0)
17/03/09 19:11:22 DEBUG util.NativeCodeLoader: java.library.path=/home/hadoop/hadoop/hadoop-2.7.2/lib/native
17/03/09 19:11:22 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable

There is still an error. The path is now correct, but the message has changed to version `GLIBC_2.14' not found: the native library was built against a newer glibc than the one installed on this system. That problem is dealt with in a separate post.
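To see which glibc the system actually provides, one can ask libc directly; a quick sketch, using the paths from the log above:

ldd --version | head -n 1               # overall glibc version of the system
strings /lib64/libc.so.6 | grep ^GLIBC_  # version symbols the installed libc exports

If GLIBC_2.14 is missing from that list, the usual options are to upgrade glibc or to rebuild the Hadoop native library on this machine against the local glibc.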
