Reading *.properties configuration files in Scala

I was temporarily assigned a task: change the project's configuration files from .conf to .properties.

The reason: the API that reads the .conf format requires values of the form xx,xx to be quoted, otherwise it errors out, which conflicted with the Nanjing project team. Since that API is frozen inside a jar, the configuration format had to change instead. I looked up the Java API and adapted it slightly into Scala.

import java.util.Properties
import java.io.FileInputStream

// test.properties contains: ddd=5.6,1.2

  def loadProperties(): Unit = {
    val properties = new Properties()
    // The file must be placed under the resources folder so it is on the classpath
    val path = Thread.currentThread().getContextClassLoader.getResource("test.properties").getPath
    val in = new FileInputStream(path)
    try properties.load(in) finally in.close() // always close the stream
    println(properties.getProperty("ddd"))             // read the value for key "ddd"
    println(properties.getProperty("ddd", "no value")) // if "ddd" is absent, the second argument is returned
    properties.setProperty("ddd", "123")               // add or update a property (in memory only)
  }
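Since the whole motivation was storing unquoted values like 5.6,1.2, here is a small sketch of splitting such a value into numbers once it is read. The object and variable names are illustrative, and StringReader stands in for the real file:

```scala
import java.util.Properties
import java.io.StringReader

object SplitDemo extends App {
  val props = new Properties()
  // Same shape as the test.properties line: ddd=5.6,1.2
  props.load(new StringReader("ddd=5.6,1.2"))
  // Split the comma-separated value and parse each part as a Double
  val values: Array[Double] = props.getProperty("ddd").split(",").map(_.trim.toDouble)
  println(values.mkString(", "))
}
```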

The implementation is simple; I'll look up further operations when I need them.
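One caveat worth noting: getResource(...).getPath only works while the resource is a plain file on disk; once the project is packaged into a jar, the returned path is no longer a valid filesystem path and new FileInputStream(path) fails. Loading through getResourceAsStream works in both cases. A minimal sketch (the helper name is my own, not from the original post):

```scala
import java.util.Properties
import java.io.FileNotFoundException

// Load a properties file from the classpath (works both from a
// resources directory and from inside a packaged jar).
def loadFromClasspath(name: String): Properties = {
  val props = new Properties()
  val in = Thread.currentThread().getContextClassLoader.getResourceAsStream(name)
  if (in == null) throw new FileNotFoundException(s"$name not found on classpath")
  try props.load(in) finally in.close()
  props
}
```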
