Attempts to make Spark 3.1.x support (be compatible with) Hive 1.2.x and the CDH Hadoop version

This article describes attempts to use Hive 1.2.x with Spark 3.1.x and the problems encountered along the way. Compatibility was tested by modifying the POM file and by building against the CDH Hadoop version, and the feasibility of dynamically loading the matching Hive jars at runtime is discussed.

Versions

Spark 3.1.x
Hive 1.2.x
Hadoop 2.6.0-cdh5.13.1

Background

Many companies run their YARN clusters on the CDH distribution, managed with Cloudera Manager. As of this writing, the latest Spark release is 3.1.1, while CDH 5.13.1 (Hadoop 2.6.0-cdh5.13.1) ships with Hive 1.2.1. So we try the following approaches:

  • Modify the Hive version directly in the pom file
  • Modify the Hadoop version used for the build
  • Dynamically load the matching Hive jars when Spark runs

Modify the Hive version directly in the pom file

Add the following profile to Spark's parent pom file:

<profile>
  <id>hive-1.2</id>
  <properties>
    <hive.version>1.2.1</hive.version>
    <!-- Version used for internal directory structure -->
    <hive.version.short>1.2</hive.version.short>
    <hive.storage.version>2.6.0</hive.storage.version>
    <datanucleus-core.version>3.2.10</datanucleus-core.version>
  </properties>
</profile>

Run:

./dev/make-distribution.sh --name 2.6.0-cdh5.13.1  --pip  --tgz  -Phive-1.2 -Phive-thriftserver -Pyarn

It fails with:

[INFO] --- scala-maven-plugin:4.3.0:compile (scala-compile-first) @ spark-hive_2.12 ---
[INFO] Using incremental compilation using Mixed compile order
[INFO] Compiler bridge file: .sbt/1.0/zinc/org.scala-sbt/org.scala-sbt-compiler-bridge_2.12-1.3.1-bin_2.12.10__52.0-1.3.1_20191012T045515.jar
[INFO] compiler plugin: BasicArtifact(com.github.ghik,silencer-plugin_2.12.10,1.6.0,null)
[INFO] Compiling 29 Scala sources and 2 Java sources to spark/sql/hive/target/scala-2.12/classes ...
[ERROR] [Error] spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveShim.scala:29: object SerializationUtilities is not a member of package org.apache.hadoop.hive.ql.exec
[ERROR] [Error] spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveShim.scala:150: not found: value SerializationUtilities
[ERROR] [Error] spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/HiveShim.scala:154: not found: value SerializationUtilities
[ERROR] [Error] spark/sql/hive/src/main/scala/org/apache/spark/sql/hive/hiveUDFs.scala:350: too many arguments (4) for constructor SimpleGenericUDAFParameterInfo: (x$1: Array[org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector], x$2: Boolean, x$3: Boolean)org.apache.hadoop.hive.ql.udf.generic.SimpleGenericUDAFParameterInfo
[ERROR] four errors found

This shows that the Hive 1.2.1 client is source-incompatible with Spark 3.1.x's sql/hive module: SerializationUtilities only exists in Hive 2.x (in Hive 1.2 the equivalent helpers still live on Utilities), and Hive 1.2's SimpleGenericUDAFParameterInfo constructor takes three arguments, not four.
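
As an illustration, here is a minimal sketch of the kind of reflection shim the sql/hive module would need in order to compile against both Hive lines (class and method names assumed from the Hive 1.2.x and 2.x sources; this is not code from Spark itself):

import org.apache.hadoop.hive.ql.plan.ExprNodeGenericFuncDesc

object ExprSerdeShim {
  // Hive 2.x+ moved these static helpers to SerializationUtilities;
  // Hive 1.2.x still has them on Utilities.
  private val clazz =
    try Class.forName("org.apache.hadoop.hive.ql.exec.SerializationUtilities")
    catch {
      case _: ClassNotFoundException =>
        Class.forName("org.apache.hadoop.hive.ql.exec.Utilities")
    }

  def serializeExpression(expr: ExprNodeGenericFuncDesc): String =
    clazz.getMethod("serializeExpression", classOf[ExprNodeGenericFuncDesc])
      .invoke(null, expr).asInstanceOf[String]

  def deserializeExpression(s: String): ExprNodeGenericFuncDesc =
    clazz.getMethod("deserializeExpression", classOf[String])
      .invoke(null, s).asInstanceOf[ExprNodeGenericFuncDesc]
}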

Modify the Hadoop version used for the build

Change the Hadoop version to 2.6.0-cdh5.13.1 and run the following command:

./dev/make-distribution.sh --name 2.6.0-cdh5.13.1  --pip  --tgz  -Phive-1.2 -Phive-thriftserver -Pyarn -Dhadoop.version=2.6.0-cdh5.13.1

It fails with:

[INFO] --- scala-maven-plugin:4.3.0:compile (scala-compile-first) @ spark-core_2.12 ---
[INFO] Using incremental compilation using Mixed compile order
[INFO] Compiler bridge file: .sbt/1.0/zinc/org.scala-sbt/org.scala-sbt-compiler-bridge_2.12-1.3.1-bin_2.12.10__52.0-1.3.1_20191012T045515.jar
[INFO] compiler plugin: BasicArtifact(com.github.ghik,silencer-plugin_2.12.10,1.6.0,null)
[INFO] Compiling 560 Scala sources and 99 Java sources to spark/core/target/scala-2.12/classes ...
[ERROR] [Error] spark/core/src/main/scala/org/apache/spark/ui/HttpSecurityFilter.scala:107: type mismatch;
 found   : K where type K
 required: String
[ERROR] [Error] spark/core/src/main/scala/org/apache/spark/ui/HttpSecurityFilter.scala:107: value map is not a member of V
[ERROR] [Error] spark/core/src/main/scala/org/apache/spark/ui/HttpSecurityFilter.scala:107: missing argument list for method stripXSS in class XssSafeRequest
Unapplied methods are only converted to functions when a function type is expected.
You can make this conversion explicit by writing `stripXSS _` or `stripXSS(_)` instead of `stripXSS`.
[ERROR] [Error] spark/core/src/main/scala/org/apache/spark/ui/PagedTable.scala:307: value startsWith is not a member of K
[ERROR] [Error] spark/core/src/main/scala/org/apache/spark/util/Utils.scala:580: value toLowerCase is not a member of object org.apache.hadoop.util.StringUtils
[ERROR] 5 errors found

This shows that Spark 3.1.x does not compile against Hadoop 2.6.0-cdh5.13.1 either. The errors are consistent with the older dependencies Hadoop 2.6 pulls in: the raw-typed getParameterMap of the old Servlet API loses the String/Array[String] parameter types used in HttpSecurityFilter and PagedTable, and org.apache.hadoop.util.StringUtils in Hadoop 2.6 has no toLowerCase helper yet.
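
The Utils.scala error, for instance, is just a missing helper, so building against Hadoop 2.6 would mean patching such call sites by hand. A sketch of the kind of drop-in replacement involved (assuming the call site only needs locale-stable lower-casing, which is what the later Hadoop helper provides):

import java.util.Locale

// Spark 3.1.x calls org.apache.hadoop.util.StringUtils.toLowerCase(...),
// which Hadoop 2.6 does not have. A JDK-only stand-in with a fixed locale,
// so the result does not depend on the JVM's default locale:
def toLowerCase(s: String): String = s.toLowerCase(Locale.ROOT)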

Dynamically load the matching Hive jars when Spark runs

According to the official documentation, Spark has been able to talk to different versions of the Hive metastore since 1.4.0. In other words, the Hive version Spark is compiled against and the Hive metastore version it talks to are independent: you can configure Spark to access whichever Hive version your metastore runs. See the official configuration reference for the details; a sketch follows below.
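
A minimal sketch, assuming the cluster metastore is Hive 1.2.1 and that downloading the client jars from Maven at startup is acceptable (spark.sql.hive.metastore.version and spark.sql.hive.metastore.jars are the standard Spark SQL options; the application name is made up):

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder()
  .appName("hive-1.2-metastore-test")
  // Talk to a Hive 1.2.1 metastore while keeping Spark's built-in
  // Hive execution classes.
  .config("spark.sql.hive.metastore.version", "1.2.1")
  // "maven" downloads the matching client jars at startup; alternatively
  // set this to a classpath containing pre-downloaded Hive 1.2.1 jars.
  .config("spark.sql.hive.metastore.jars", "maven")
  .enableHiveSupport()
  .getOrCreate()

// Queries against the 1.2.1 metastore now go through the loaded client.
spark.sql("SHOW DATABASES").show()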
