Environment:
Ubuntu Linux 16.04
spark-2.3.1-bin-hadoop2.7
hadoop-2.7.7
Possible causes:
1. The .so file was built for the wrong architecture/version.
Check with:
file libhadoop.so.1.0.0
libhadoop.so.1.0.0: ELF 64-bit LSB shared object, x86-64, version 1 (SYSV), dynamically linked, BuildID[sha1]=c08a9ec9d1c3cf9bccf3ea87ed51d077b5651b1a, not stripped
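As a quick sanity check, you can compare the library's architecture (from `file`) with the host's (from `uname -m`). This is a sketch; the path below is the one used in this article, so adjust it to your install:

```shell
# Sketch: confirm the native library matches the host architecture.
# The library path is this article's example; adjust to your install.
lib=~/bigdata/hadoop-2.7.7/lib/native/libhadoop.so.1.0.0
uname -m    # host architecture, e.g. "x86_64" on a 64-bit machine
if [ -e "$lib" ]; then
    file "$lib"   # a matching build reports "ELF 64-bit ... x86-64"
else
    echo "library not found: $lib"
fi
```

A 32-bit library (`ELF 32-bit`) on an `x86_64` host is exactly the mismatch this cause describes.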
2. ldd reports unresolved dependencies.
Check with:
(python2.7) appleyuchi@ubuntu:~/bigdata/hadoop-2.7.7/lib/native$ ldd libhdfs.so.0.0.0
linux-vdso.so.1 => (0x00007fff80b18000)
libjvm.so => /home/appleyuchi/Java/jdk1.8.0_131/jre/lib/amd64/server/libjvm.so (0x00007febd438b000)
libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007febd4187000)
libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007febd3f6a000)
libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007febd3ba0000)
/lib64/ld-linux-x86-64.so.2 (0x00007febd558d000)
libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007febd3897000)
(python2.7) appleyuchi@ubuntu:~/bigdata/hadoop-2.7.7/lib/native$ ldd libhdfs.so
linux-vdso.so.1 => (0x00007ffc710e4000)
libjvm.so => /home/appleyuchi/Java/jdk1.8.0_131/jre/lib/amd64/server/libjvm.so (0x00007f744d082000)
libdl.so.2 => /lib/x86_64-linux-gnu/libdl.so.2 (0x00007f744ce7e000)
libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007f744cc61000)
libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007f744c897000)
/lib64/ld-linux-x86-64.so.2 (0x00007f744e284000)
libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007f744c58e000)
Make sure that no entry on the right-hand side of => says "not found";
otherwise, add the missing path to LD_LIBRARY_PATH in .bashrc.
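This check can be scripted. The sketch below scans every native library for unresolved dependencies; the directory is this article's, so adjust it to your install:

```shell
# Sketch: flag any native library with unresolved dependencies.
# A "not found" on the right of => means LD_LIBRARY_PATH is missing a path.
dir=~/bigdata/hadoop-2.7.7/lib/native   # adjust to your install
missing=0
for lib in "$dir"/*.so*; do
    [ -e "$lib" ] || continue
    if ldd "$lib" 2>/dev/null | grep -q "not found"; then
        echo "unresolved dependencies in $lib:"
        ldd "$lib" | grep "not found"
        missing=$((missing + 1))
    fi
done
echo "$missing libraries with unresolved dependencies"
```

Any library listed here needs its missing dependency's directory appended to LD_LIBRARY_PATH in .bashrc.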
3. If you want to compile Hadoop yourself and need to install protobuf, see the following links:
https://blog.youkuaiyun.com/blue_it/article/details/53996216
https://blog.youkuaiyun.com/appleyuchi/article/details/81667992
If Spark prints "Unable to load native-hadoop library for your platform",
edit spark-env.sh as follows:
export HADOOP_HOME=~/bigdata/hadoop-2.7.7
export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native
#export SPARK_MASTER_IP=master
export SPARK_LOCAL_IP=127.0.0.1
export HIVE_HOME=~/bigdata/apache-hive-3.0.0-bin
export SPARK_CLASSPATH=$HIVE_HOME/lib/*:$SPARK_CLASSPATH
If Hadoop prints "Unable to load native-hadoop library for your platform" at startup,
edit hadoop-env.sh; editing .bashrc or /etc/profile does not help here.
export HADOOP_HOME=~/bigdata/hadoop-2.7.7
export JAVA_HOME=~/Java/jdk1.8.0_131
export LD_LIBRARY_PATH=$HADOOP_HOME/lib/native
export HADOOP_COMMON_LIB_NATIVE_DIR=$HADOOP_HOME/lib/native
export HADOOP_OPTS="-Djava.library.path=${HADOOP_HOME}/lib/native"
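After editing hadoop-env.sh and restarting, you can verify the result with `hadoop checknative`, which Hadoop 2.x ships for exactly this purpose. The snippet is environment-dependent (it needs `hadoop` on the PATH), so treat it as a sketch:

```shell
# Sketch: verify native-library loading after restarting Hadoop.
if command -v hadoop >/dev/null 2>&1; then
    hadoop checknative -a
    # Each component (hadoop, zlib, snappy, lz4, bzip2, ...) should
    # report "true" with a resolved library path; "false" means the
    # native library still was not loaded.
    status="checked"
else
    status="hadoop not on PATH"
    echo "$status; run this on a node where Hadoop is installed"
fi
```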