No configuration setting found for key 'akka.version' (the key akka.version cannot be found in the configuration settings)
Solution 1:
Akka's configuration relies heavily on the convention that every module/jar ships its own reference.conf file, and all of these files are discovered and loaded by the config library. Unfortunately, this also means that if you merge several jars into a single jar, you must merge all of their reference.conf files as well; otherwise all default settings (including akka.version) are lost and Akka cannot run.
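As a quick sanity check, the snippet below loads the merged configuration the same way Akka does and reports whether akka.version survived the packaging step. This is a minimal sketch, assuming the Typesafe Config library that Akka depends on is on your classpath; the class name AkkaConfigCheck is made up for illustration.

import com.typesafe.config.Config;
import com.typesafe.config.ConfigFactory;

public class AkkaConfigCheck {
    public static void main(String[] args) {
        // ConfigFactory.load() merges application.conf with every reference.conf
        // found on the classpath -- the same mechanism Akka relies on at startup.
        Config config = ConfigFactory.load();
        if (config.hasPath("akka.version")) {
            System.out.println("akka.version = " + config.getString("akka.version"));
        } else {
            System.out.println("akka.version is missing: the reference.conf defaults "
                    + "were probably lost when the jars were merged");
        }
    }
}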
Add the following plugin to pom.xml.
Version 1.5 of maven-shade-plugin may not be available in your Maven repository; if so, switch to another version such as 2.3. Note that the Main-Class entry below (akka.Main) comes from the Akka documentation example; if you run the shaded jar directly, replace it with your own main class.
<plugin>
    <groupId>org.apache.maven.plugins</groupId>
    <artifactId>maven-shade-plugin</artifactId>
    <version>1.5</version>
    <executions>
        <execution>
            <phase>package</phase>
            <goals>
                <goal>shade</goal>
            </goals>
            <configuration>
                <shadedArtifactAttached>true</shadedArtifactAttached>
                <shadedClassifierName>allinone</shadedClassifierName>
                <artifactSet>
                    <includes>
                        <include>*:*</include>
                    </includes>
                </artifactSet>
                <transformers>
                    <transformer
                            implementation="org.apache.maven.plugins.shade.resource.AppendingTransformer">
                        <resource>reference.conf</resource>
                    </transformer>
                    <transformer
                            implementation="org.apache.maven.plugins.shade.resource.ManifestResourceTransformer">
                        <manifestEntries>
                            <Main-Class>akka.Main</Main-Class>
                        </manifestEntries>
                    </transformer>
                </transformers>
            </configuration>
        </execution>
    </executions>
</plugin>
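With shadedArtifactAttached and shadedClassifierName set as above, mvn package produces an additional shaded jar whose file name carries the allinone classifier (e.g. something like artifactId-version-allinone.jar; the exact name depends on your project). That is the jar whose reference.conf entries were merged by the AppendingTransformer, so upload and run that artifact rather than the plain one.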
If you add the plugin above, repackage, upload the jar to the server, and the problem still occurs, see Solution 2.
Solution 2:
Go into $SPARK_HOME/lib/ and add spark-assembly-x.x.x-hadoopx.x.x.jar to the classpath with java -classpath (i.e. java -cp); on Linux the classpath entries are separated by a colon:
java -cp spark-assembly-x.x.x-hadoopx.x.x.jar:xxx.jar com.xxx.xxx
Example:
java -cp spark-assembly-1.6.2-hadoop2.6.0.jar:/opt/jars/weather-analysis-1.0-SNAPSH-jar-with-dependencies.jar com.weather.syrtweather.SyRealTimeWeather_Ops
If the job then fails while running with an insufficient-memory error such as: System memory xxxxxx must be at least 4.718592E8. Please use a larger heap size.
specify a larger maximum heap size for the JVM in the command:
java -Xmx512m -cp spark-assembly-1.6.2-hadoop2.6.0.jar:/opt/jars/weather-analysis-1.0-SNAPSH-jar-with-dependencies.jar com.weather.syrtweather.SyRealTimeWeather_Ops
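For reference, 4.718592E8 bytes is exactly 450 MiB, so a 512 MiB heap (-Xmx512m) should be just enough to pass that check; if the job itself needs more memory, simply raise the -Xmx value.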
References:
https://stackoverflow.com/questions/31011243/no-configuration-setting-found-for-key-akka-version
https://blog.youkuaiyun.com/ouyang111222/article/details/50583756