1. Download. Zeppelin ships in two packages: one with every interpreter bundled, and one where you install interpreters yourself (though it still bundles the Spark and Python interpreters). I downloaded the second.
2. Edit zeppelin-env.sh. My setup is Spark on YARN in client mode, and I need PySpark:

export JAVA_HOME=/home/java
export MASTER=yarn-client
export SPARK_HOME=/home/spark
export SPARK_SUBMIT_OPTIONS="--deploy-mode client --driver-memory 512M --executor-memory 1G --executor-cores 1"
export HADOOP_CONF_DIR=/home/hadoop/etc/hadoop
export PYTHONPATH=/app/anaconda3/bin
export PYSPARK_PYTHON=/app/anaconda3/bin/python3.7
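The PYSPARK_PYTHON path is easy to mistype (I originally had "pathon3.7"), and a bad path only surfaces later as an opaque interpreter failure inside Zeppelin. A minimal sanity check before starting Zeppelin, using this setup's anaconda path (adjust to yours):

```shell
# Verify the Python binary configured in zeppelin-env.sh exists and is executable.
PYSPARK_PYTHON=/app/anaconda3/bin/python3.7
if [ -x "$PYSPARK_PYTHON" ]; then
    echo "ok: $PYSPARK_PYTHON"
else
    echo "not executable: $PYSPARK_PYTHON (fix zeppelin-env.sh before starting)"
fi
```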
3. Edit zeppelin-site.xml to set the bind address and port:

<property>
  <name>zeppelin.server.addr</name>
  <value>192.168.188.18</value>
  <description>Server address</description>
</property>
<property>
  <name>zeppelin.server.port</name>
  <value>9090</value>
  <description>Server port.</description>
</property>
4. In the Interpreter page of the Zeppelin web UI, add spark.home and point zeppelin.pyspark.python at the right binary:

zeppelin.pyspark.python: /app/anaconda3/bin/python3.7
5. Errors I ran into:

java.lang.NoSuchMethodError: io.netty.buffer.PooledByteBufAllocator.metric()Lio/netty/buffer/PooledByteBufAllocatorMetric;
    at org.apache.spark.network.util.NettyMemoryMetrics.registerMetrics(NettyMemoryMetrics.java:80)
    at org.apache.spark.network.util.NettyMemoryMetrics.<init>(NettyMemoryMetrics.java:76)
    at org.apache.spark.network.client.TransportClientFactory.<init>(TransportClientFactory.java:109)
    at org.apache.spark.network.TransportContext.createClientFactory(TransportContext.java:99)
    at org.apache.spark.rpc.netty.NettyRpcEnv.<init>(NettyRpcEnv.scala:71)
    at org.apache.spark.rpc.netty.NettyRpcEnvFactory.create(NettyRpcEnv.scala:461)
Cause: Spark and Zeppelin ship different versions of the netty jar.
Fix: copy Zeppelin's netty-all-4.0.23.Final.jar into Spark's jars directory, and delete Spark's original netty jar.
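The swap above can be scripted. A sketch, assuming Zeppelin lives at /home/zeppelin with its jars under lib/ (check where your build keeps them, and match the exact netty jar name your distribution ships):

```shell
ZEPPELIN_HOME=${ZEPPELIN_HOME:-/home/zeppelin}   # assumed install dir
SPARK_HOME=${SPARK_HOME:-/home/spark}            # matches step 2

# Remove Spark's own netty jar(s) first so only one version ends up on the
# classpath, then copy Zeppelin's netty jar into Spark's jars directory.
rm -f "$SPARK_HOME"/jars/netty-all-*.jar
if [ -f "$ZEPPELIN_HOME/lib/netty-all-4.0.23.Final.jar" ]; then
    cp "$ZEPPELIN_HOME/lib/netty-all-4.0.23.Final.jar" "$SPARK_HOME/jars/"
else
    echo "netty jar not found under $ZEPPELIN_HOME/lib - check the version suffix"
fi
```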
6. Another error you may hit:

JsonMappingException: Incompatible Jackson version: 2.11.8

Cause: Zeppelin's Jackson jars and Spark's Jackson jars are different versions.
Fix: copy Spark's Jackson jars into Zeppelin, and delete Zeppelin's original Jackson jars.
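This goes in the opposite direction from the netty fix: here Spark's jars win. A sketch, assuming Zeppelin keeps its Jackson jars under lib/ (verify the location in your build; core, databind, annotations, and module-scala all need to move together):

```shell
ZEPPELIN_HOME=${ZEPPELIN_HOME:-/home/zeppelin}   # assumed install dir
SPARK_HOME=${SPARK_HOME:-/home/spark}            # matches step 2

# Remove Zeppelin's Jackson jars, then bring over Spark's set.
rm -f "$ZEPPELIN_HOME"/lib/jackson-*.jar
cp "$SPARK_HOME"/jars/jackson-*.jar "$ZEPPELIN_HOME/lib/" 2>/dev/null \
    || echo "no jackson jars found under $SPARK_HOME/jars - check the path"
```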
7. To add a Hive interpreter, copy hive-exec-2.1.1.jar, hive-service-2.1.1.jar, and hive-jdbc-2.1.1.jar from Hive's lib directory into ${ZEPPELIN_HOME}/interpreter/jdbc/, along with the dependency hadoop-common-2.6.0.jar.
Note: if hive-exec-2.1.1.jar, hive-service-2.1.1.jar, and hive-jdbc-2.1.1.jar are not all the same version, you get java.lang.NoSuchFieldError: HIVE_CLI_SERVICE_PROTOCOL_V8.
8. To add a JDBC interpreter, copy the dependency mysql-connector-java-5.1.35.jar into ${ZEPPELIN_HOME}/interpreter/jdbc/.
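Steps 7 and 8 follow the same pattern: drop driver jars into the jdbc interpreter directory. A sketch with this post's versions; HIVE_HOME is an assumed variable, and hadoop-common-2.6.0.jar and the MySQL connector are expected in the current directory (adjust the paths to wherever you keep them):

```shell
ZEPPELIN_HOME=${ZEPPELIN_HOME:-/home/zeppelin}   # assumed install dir
HIVE_HOME=${HIVE_HOME:-/home/hive}               # assumed install dir

# The three hive-* jars must all be the same version (see the
# HIVE_CLI_SERVICE_PROTOCOL_V8 note above); hadoop-common and the MySQL
# connector round out the jdbc interpreter's dependencies.
for jar in \
    "$HIVE_HOME"/lib/hive-exec-2.1.1.jar \
    "$HIVE_HOME"/lib/hive-service-2.1.1.jar \
    "$HIVE_HOME"/lib/hive-jdbc-2.1.1.jar \
    hadoop-common-2.6.0.jar \
    mysql-connector-java-5.1.35.jar
do
    if [ -f "$jar" ]; then
        cp "$jar" "$ZEPPELIN_HOME/interpreter/jdbc/"
    else
        echo "missing: $jar"
    fi
done
```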
9. If Hive is integrated into your Spark build, you may also run into this error:
java.lang.NoSuchMethodError: com.facebook.fb303.FacebookService$Client.sendBaseOneway(Ljava/lang/String;Lorg/apache/thrift/TBas...
    at com.facebook.fb303.FacebookService$Client.sendBaseOneway(...)