Scala: the difference between defining values with val and def

This post looks at the difference between val and def definitions in Scala. The code examples below show that a val is evaluated and assigned only once, when its definition is executed, whereas a def is re-evaluated every time it is referenced.

First, consider def p:

object Json4sTest {
    def main(args: Array[String]): Unit = {
        def p = f // p is a parameterless method: its body (a call to f) runs on every reference
        
        println(p)
        println(p)
    }
    
    def f() = {
        println("function f run...")
        "f return"
    }
}
/*
Output:
function f run...
f return
function f run...
f return
*/

Now the same code with val p:

object Json4sTest {
    def main(args: Array[String]): Unit = {
        val p = f // f runs exactly once here; p then holds the returned value
        
        println(p)
        println(p)
    }
    
    def f() = {
        println("function f run...")
        "f return"
    }
}
/*
Output:
function f run...
f return
f return
*/
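
As an extra illustration (a minimal sketch that is not part of the original post; the object name EvalCount and the variable names are made up), a call counter makes the evaluation counts explicit:

object EvalCount {
    def main(args: Array[String]): Unit = {
        var calls = 0
        def nextDef = { calls += 1; calls }   // body re-evaluated on every reference
        val fixedVal = { calls += 1; calls }  // body evaluated exactly once, right here

        println(nextDef)   // 2 (fixedVal already used up call number 1)
        println(nextDef)   // 3
        println(fixedVal)  // 1
        println(fixedVal)  // 1
    }
}
/*
Expected output:
2
3
1
1
*/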
Summary:

A val is evaluated and assigned once, when its definition is executed; every later use simply reads the stored value.
A def defines a method, so its body is re-evaluated (and its side effects repeated) every time it is referenced.

The next example makes this visible with a mutable object:

object Test {
    def main(args: Array[String]): Unit = {
        def p = { // a new Properties instance is built on every reference to p
            println("p..."); 
            val prop = new java.util.Properties()
            prop.put("k1", "v1")
            prop
        }
        
        println(p)
        println(p)
        
        p.put("k2", "v2") // this mutates a fresh, throwaway instance, so "k2" is lost
        
        println(p)
        println(p)
    }
}
/*
Output:
p...
{k1=v1}
p...
{k1=v1}
p...
p...
{k1=v1}
p...
{k1=v1}
*/
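
For comparison (a hedged sketch, not from the original post, mirroring the example above), if p is declared with val instead, only one Properties instance is ever created, so the later put("k2", "v2") is retained:

object Test {
    def main(args: Array[String]): Unit = {
        val p = {
            println("p...")
            val prop = new java.util.Properties()
            prop.put("k1", "v1")
            prop
        }

        println(p)
        println(p)

        p.put("k2", "v2") // mutates the single instance held by p

        println(p)
        println(p)
    }
}
/*
Expected output (Properties does not guarantee entry order):
p...
{k1=v1}
{k1=v1}
{k2=v2, k1=v1}
{k2=v2, k1=v1}
*/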

For a more detailed discussion, see: https://blog.youkuaiyun.com/qq_29343201/article/details/56281777

Reposted from: https://www.cnblogs.com/xuejianbest/p/10285082.html
