Compile Spark 2.0.0 with Maven

This post walks through building Spark 2.0.0 from source with Maven, with step-by-step instructions covering the Maven configuration and the build profiles to enable, plus a fix for a build failure on Debian GNU/Linux (in particular 8/jessie).
Spark version: spark-2.0.0
Maven version: Apache Maven 3.5.0

1. Change into the Spark source directory and give Maven more memory:

cd /usr/local/spark-2.0.0
export MAVEN_OPTS="-Xmx2g -XX:MaxPermSize=512M -XX:ReservedCodeCacheSize=512m"

Note:
On Java 8 and above this step is not required: the permanent generation was removed, so -XX:MaxPermSize is ignored.
If you use build/mvn with no MAVEN_OPTS set, the script will set these options for you.
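For example, the Maven wrapper that ships in the Spark source tree can be used in place of a system-wide Maven; a minimal sketch of running the build through it (profiles as in step 2 below):

```bash
# build/mvn fetches a suitable Maven and sets MAVEN_OPTS itself
./build/mvn -Pyarn -Phadoop-2.7 -Phive -DskipTests clean package
```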

2. Build Spark with the desired profiles (-Pyarn for YARN support, -Phadoop-2.7 to build against Hadoop 2.7, -Pspark-ganglia-lgpl for the Ganglia metrics sink, -Pkinesis-asl for Kinesis support, -Phive for Hive integration; -DskipTests skips the test run):

mvn -Pyarn -Phadoop-2.7 -Pspark-ganglia-lgpl -Pkinesis-asl -Phive -DskipTests clean package
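
Once this build succeeds, a quick smoke test is to start a local shell from the build tree; a minimal sketch, assuming the build completed without errors:

```bash
# Launch the freshly built Spark shell on a local master with 2 threads
./bin/spark-shell --master local[2]
```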


3. Build a runnable distribution (a tarball that can be deployed on other machines):

./dev/make-distribution.sh --name custom-spark --tgz -Psparkr -Phadoop-2.7 -Phive -Phive-thriftserver -Pyarn
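
If the script succeeds, it leaves a tarball named after the --name flag in the source root; a minimal sketch of unpacking and sanity-checking it (the exact file name may differ):

```bash
# make-distribution.sh produces spark-<version>-bin-<name>.tgz
tar -xzf spark-2.0.0-bin-custom-spark.tgz
cd spark-2.0.0-bin-custom-spark
./bin/spark-submit --version
```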

Note:
If you come across an error like spark-core_2.11: Command execution failed, one likely cause is that the -Psparkr profile needs R and it is not installed. On Debian GNU/Linux (in particular 8/jessie) you would need to run
apt-get install r-base r-base-dev
or
aptitude install r-base r-base-dev
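After installing, it may be worth confirming that R is actually on the PATH before re-running the build; a minimal check:

```bash
# The -Psparkr build needs the R toolchain available
which Rscript && Rscript --version
```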
Note:
A related failure mode: the compile aborts with illegal character '\u200b' (an invisible zero-width space, usually pasted in with code copied from a web page), alongside a warning about mixed Scala library versions. A sample build log:

```
Scanning for projects...
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building Spark 1.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ Spark ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /data/workspace/myshixun/Spark/src/main/resources
[INFO]
[INFO] --- scala-maven-plugin:3.1.5:add-source (scala-compile-first) @ Spark ---
[INFO] Add Source directory: /data/workspace/myshixun/Spark/src/main/scala
[INFO] Add Test Source directory: /data/workspace/myshixun/Spark/src/test/scala
[INFO]
[INFO] --- scala-maven-plugin:3.1.5:compile (scala-compile-first) @ Spark ---
[WARNING] Expected all dependencies to require Scala version: 2.11.8
[WARNING] com.twitter:chill_2.10:0.8.0 requires scala version: 2.10.5
[WARNING] Multiple versions of scala libraries detected!
[INFO] /data/workspace/myshixun/Spark/src/main/scala:-1: info: compiling
[INFO] Compiling 6 source files to /data/workspace/myshixun/Spark/target/classes at 1761898075413
[ERROR] /data/workspace/myshixun/Spark/src/main/scala/EduCoder1.scala:3: error: illegal character '\u200b'
[ERROR] ​
[ERROR] ^
... (the same error repeats at lines 14, 19, 21, 32 and 34 of EduCoder1.scala) ...
[ERROR] 6 errors found
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 12.415 s
[INFO] Finished at: 2025-10-31T08:07:57Z
[INFO] Final Memory: 25M/832M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal net.alchim31.maven:scala-maven-plugin:3.1.5:compile (scala-compile-first) on project Spark: wrap: org.apache.commons.exec.ExecuteException: Process exited with an error: 1 (Exit value: 1) -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoExecutionException
```
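Before applying any fix, it can help to see which artifacts pull in which Scala runtime; a minimal diagnostic with the standard Maven dependency plugin:

```bash
# Show every path through which scala-library enters the build
mvn dependency:tree -Dincludes=org.scala-lang:scala-library
```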
### Resolving the Scala version conflict

#### 1. Check and unify the Scala version

Pin the Scala version explicitly in `pom.xml` so that every dependency in the project agrees on it. Declare it in `<properties>`; note that Scala artifact suffixes use the *binary* version (e.g. `2.11`), not the full version, so it is worth defining both (here matching the 2.11.8 the build above expects):

```xml
<properties>
    <scala.version>2.11.8</scala.version>
    <scala.binary.version>2.11</scala.binary.version>
</properties>
```

Then use the same binary version in every dependency, for example:

```xml
<dependencies>
    <!-- Spark dependency -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_${scala.binary.version}</artifactId>
        <version>2.0.0</version> <!-- adjust the Spark version as needed -->
    </dependency>
    <!-- chill dependency, on the same Scala binary version -->
    <dependency>
        <groupId>com.twitter</groupId>
        <artifactId>chill_${scala.binary.version}</artifactId>
        <version>0.8.0</version>
    </dependency>
</dependencies>
```

#### 2. Exclude conflicting transitive dependencies

If a dependency drags in an incompatible Scala version, exclude its Scala artifacts with an `<exclusions>` block:

```xml
<dependency>
    <groupId>some.group</groupId>
    <artifactId>some-artifact</artifactId>
    <version>1.0.0</version>
    <exclusions>
        <exclusion>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
        </exclusion>
    </exclusions>
</dependency>
```

#### 3. Use the Maven Enforcer plugin

The Maven Enforcer plugin can fail the build early whenever the Scala version property is not set. Add the following to `pom.xml`:

```xml
<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-enforcer-plugin</artifactId>
            <version>3.0.0-M3</version>
            <executions>
                <execution>
                    <id>enforce-scala-version</id>
                    <goals>
                        <goal>enforce</goal>
                    </goals>
                    <configuration>
                        <rules>
                            <requireProperty>
                                <property>scala.version</property>
                                <message>Scala version must be specified in properties.</message>
                            </requireProperty>
                        </rules>
                    </configuration>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
```

### Fixing the illegal character errors

#### 1. Check the file encoding

Make sure the project files (the `.scala` sources in particular) are saved as UTF-8. This can be set in the IDE; in IntelliJ IDEA, for example, go to `File -> Settings -> File Encodings` and set both `Project Encoding` and `Default encoding for properties files` to UTF-8.

#### 2. Delete the illegal characters by hand

Open the offending file in a text editor and remove the `\u200b` characters (zero-width spaces, typically picked up when pasting code from a web page). Since they are invisible, the reliable approach is search-and-replace: search for the character and replace it with an empty string.

#### 3. Pin the resource encoding in the build

Configuring `maven-resources-plugin` with an explicit UTF-8 encoding (and excluding `.scala` files from resource filtering) keeps the build's handling of encodings consistent. Note that this does not remove the zero-width characters themselves; they still have to be deleted from the sources as described above:

```xml
<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-resources-plugin</artifactId>
            <version>3.2.0</version>
            <configuration>
                <encoding>UTF-8</encoding>
                <nonFilteredFileExtensions>
                    <nonFilteredFileExtension>scala</nonFilteredFileExtension>
                </nonFilteredFileExtensions>
            </configuration>
        </plugin>
    </plugins>
</build>
```
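For the invisible characters specifically, a shell one-liner can strip them from all sources in bulk; a minimal sketch, assuming GNU find and Perl are available (the path is the one from the log above; back up the sources first):

```bash
# Remove zero-width spaces (U+200B) from every Scala source, in place
find /data/workspace/myshixun/Spark/src -name '*.scala' \
  -exec perl -CSD -i -pe 's/\x{200B}//g' {} +
```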