Scala Learning Notes

This post looks at one core idea of the Scala language: everything is an object. Through REPL examples it shows that even values have methods, how readLine reads input, and how to work with Array and ArrayBuffer, illustrating Scala's type system and object-oriented style.


I. In Scala, everything is an object; even variables have methods.

1. Even numbers have methods

scala> 0.to(5)
res1: scala.collection.immutable.Range.Inclusive = Range(0, 1, 2, 3, 4, 5)
Scala has no ++ or -- operators; use += instead:
scala> var age = 10
age: Int = 10
scala> age += 1
scala> age
res5: Int = 11

2. apply is the method Scala uses to construct instances
scala> Array(1,2,3,4)
res7: Array[Int] = Array(1, 2, 3, 4)
scala> val array = Array(1,2,3,4)
array: Array[Int] = Array(1, 2, 3, 4)
scala> array
res8: Array[Int] = Array(1, 2, 3, 4)
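
Under the hood, `Array(1,2,3,4)` above is shorthand for calling `apply` on the Array companion object. A minimal sketch of defining apply on a companion object of your own (the class name Person is made up for illustration, not from the transcript):

class Person(val name: String)
object Person {
  // factory method: Person("scala") expands to Person.apply("scala")
  def apply(name: String): Person = new Person(name)
}
val p = Person("scala")
println(p.name)   // prints: scala
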
scala> val age = 19
age: Int = 19
scala> if(age >= 18) "man" else "boy"
res9: String = man
scala> val result = if(age >= 18) "man" else "boy"
result: String = man
scala> result
res10: String = man
In Scala, the value of the last expression in a block is the value of the whole block, as in the sketch below.
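
A minimal sketch (the names a, b, x are made up for illustration):

val x = {
  val a = 1
  val b = 2
  a + b          // last expression: the whole block evaluates to 3
}
// x: Int = 3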


3. Reading input with readLine

scala> readLine
warning: there was one deprecation warning; re-run with -deprecation for details
res11: String = please enter u
scala> readLine("how are u")
warning: there was one deprecation warning; re-run with -deprecation for details
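
The deprecation warnings appear because the bare readLine inherited from Predef is deprecated; the supported entry point is scala.io.StdIn. A minimal sketch:

import scala.io.StdIn

val reply = StdIn.readLine("how are u? ")   // prints the prompt, then reads one line
println(reply)
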
scala> val n = 10
n: Int = 10
scala> def f2 : Any = {
     | for(t <- 1 to n){
     | if(t==n) return t
     | println(t)
     | }
     | }
f2: Any
scala> f2
1
2
3
4
5
6
7
8
9
res20: Any = 10
scala>
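
The result type of f2 is Any because the `return t` branch produces an Int while falling out of the loop produces Unit. A minimal, more idiomatic sketch that avoids `return` (the name f3 is made up):

def f3(n: Int): Int = {
  (1 until n).foreach(println)   // print 1 .. n-1
  n                              // last expression is the result
}
// f3(10) prints 1..9 and returns 10
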

4. Variable-length arguments
scala> def sum(numbers : Int*) = {var result = 0; for(elsement <- numbers) result += elsement; result}
sum: (numbers: Int*)Int
scala> sum(1,2,3,4,5,6,7,8,9)
res26: Int = 45
scala> sum(1 to 100 : _*)
res27: Int = 5050
Passing a Range directly is a type error:
scala> sum(1 to 100)
<console>:11: error: type mismatch;
 found   : scala.collection.immutable.Range.Inclusive
 required: Int
              sum(1 to 100)
                    ^
A procedure is simply a function with no return value; its result type is Unit.
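A minimal sketch of a procedure (the name greet is made up):

def greet(name: String): Unit = {
  println("hello, " + name)   // side effect only, result is ()
}
greet("spark")                // prints: hello, spark
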
5. lazy (and fixed-length arrays)
A lazy val is evaluated only when it is first accessed.
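A minimal sketch (the name answer is made up):

lazy val answer = { println("computing..."); 42 }   // not evaluated yet
println(answer)   // first access: prints "computing...", then 42
println(answer)   // cached: prints 42 only

The rest of this section uses plain fixed-length arrays; note that Scala arrays are indexed with parentheses rather than square brackets, as the error below shows.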
scala> val arr = new Array[Int](5)
arr: Array[Int] = Array(0, 0, 0, 0, 0)
scala> arr[3]
<console>:1: error: identifier expected but integer literal found.
       arr[3]
           ^
scala> arr(3)
res30: Int = 0
scala> val arr1 = Array("scala", "spark")
arr1: Array[String] = Array(scala, spark)
scala> arr1
res31: Array[String] = Array(scala, spark)
scala> array
res32: Array[Int] = Array(1, 2, 3, 4)


6. ArrayBuffer

scala> val arrBuffer = ArrayBuffer[Int]()
<console>:7: error: not found: value ArrayBuffer
       val arrBuffer = ArrayBuffer[Int]()
                       ^
scala> import scala.collection.mutable.ArrayBuffer
import scala.collection.mutable.ArrayBuffer
scala> val arrBuffer = ArrayBuffer[Int]()
arrBuffer: scala.collection.mutable.ArrayBuffer[Int] = ArrayBuffer()
scala> arrBuffer = (20)
<console>:9: error: reassignment to val
       arrBuffer = (20)
                 ^
scala> arrBuffer += (20)
res33: arrBuffer.type = ArrayBuffer(20)
scala> arrBuffer += (20,1,2,3,4)
res34: arrBuffer.type = ArrayBuffer(20, 20, 1, 2, 3, 4)
scala> arrBuffer.trimEnd(3)
scala> arrBuffer
res36: scala.collection.mutable.ArrayBuffer[Int] = ArrayBuffer(20, 20, 1)
scala> arrBuffer.insert(4,200)
java.lang.IndexOutOfBoundsException: 4
  at scala.collection.mutable.ArrayBuffer.insertAll(ArrayBuffer.scala:139)
  at scala.collection.mutable.BufferLike$class.insert(BufferLike.scala:167)
  at scala.collection.mutable.AbstractBuffer.insert(Buffer.scala:49)
  ... 33 elided
scala> arrBuffer.insert(3,200)
scala> arrBuffer
res39: scala.collection.mutable.ArrayBuffer[Int] = ArrayBuffer(20, 20, 1, 200)
scala> arrBuffer.remove(10)
res42: Int = 200
scala> arrBuffer
res43: scala.collection.mutable.ArrayBuffer[Int] = ArrayBuffer(20, 5, 2, 7, 3, 8, 4, 9, 20, 1)
scala> arrBuffer.remove(6,5)
java.lang.IndexOutOfBoundsException: 6
  at scala.collection.mutable.ArrayBuffer.remove(ArrayBuffer.scala:158)
  ... 33 elided
scala> arrBuffer.remove(6,4)
scala> arrBuffer
res46: scala.collection.mutable.ArrayBuffer[Int] = ArrayBuffer(20, 5, 2, 7, 3, 8)
scala> arrBuffer.toArray
res47: Array[Int] = Array(20, 5, 2, 7, 3, 8)
scala> arrBuffer
res48: scala.collection.mutable.ArrayBuffer[Int] = ArrayBuffer(20, 5, 2, 7, 3, 8)
scala> arrBuffer.toArray
res49: Array[Int] = Array(20, 5, 2, 7, 3, 8)
scala> arrBuffer
res50: scala.collection.mutable.ArrayBuffer[Int] = ArrayBuffer(20, 5, 2, 7, 3, 8)
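
A minimal recap sketch of the operations used above, on a fresh buffer (the name buf is made up):

import scala.collection.mutable.ArrayBuffer

val buf = ArrayBuffer[Int]()
buf += 1                 // ArrayBuffer(1)
buf += (2, 3, 4)         // append several: ArrayBuffer(1, 2, 3, 4)
buf.insert(2, 99)        // insert at index 2: ArrayBuffer(1, 2, 99, 3, 4)
buf.remove(0)            // removes and returns 1: ArrayBuffer(2, 99, 3, 4)
buf.trimEnd(2)           // drop the last two: ArrayBuffer(2, 99)
val arr9 = buf.toArray   // Array(2, 99)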


scala> val arr2 = Array(1,4,6,78,9,0)
arr2: Array[Int] = Array(1, 4, 6, 78, 9, 0)
scala> for(t <- 0 until (arr.length,2)) println(arr2(t))
1
6
9
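(Note: the loop bound above uses arr.length, which is 5, from the earlier array while indexing arr2, which has 6 elements; arr2.length was presumably intended, though the stepped indices 0, 2, 4 happen to coincide here.)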
scala> val arr2 = Array(1, 1, 3, 4, 10, 11, 100)
arr2: Array[Int] = Array(1, 1, 3, 4, 10, 11, 100)
scala> arr2.mkString
res0: String = 11341011100
scala> arr2.mkString(", ")
res1: String = 1, 1, 3, 4, 10, 11, 100
scala> val arr3 = for(t <- arr2) yield t * t
arr3: Array[Int] = Array(1, 1, 9, 16, 100, 121, 10000)
scala> val arr3 = for(t <- arr2 if t % 3 == 0) yield t * t
arr3: Array[Int] = Array(9)
scala> arr2.filter(_%3 == 0).map(t +> t * t)
<console>:9: error: not found: value t
              arr2.filter(_%3 == 0).map(t +> t * t)
                                        ^
<console>:9: error: not found: value t
              arr2.filter(_%3 == 0).map(t +> t * t)
                                           ^
<console>:9: error: not found: value t
              arr2.filter(_%3 == 0).map(t +> t * t)
                                                 
scala> arr2.filter(_%3 == 0).map(t => t * t)
res3: Array[Int] = Array(9)
scala> arr2.filter{_%3 == 0}.map(t => t * t)
res4: Array[Int] = Array(9)
scala> arr2.filter{_%3 == 0}map(t => t * t)
res5: Array[Int] = Array(9)
scala> val person = Map("hadoop" -> 11, "spark" -> 6)
person: scala.collection.immutable.Map[String,Int] = Map(hadoop -> 11, spark -> 6)

scala> person("hadoop")
res53: Int = 11


scala> person += ("Flink" -> 5)
<console>:10: error: value += is not a member of scala.collection.immutable.Map[String,Int]
              person += ("Flink" -> 5)
                     ^
scala> val sparkValue = if(person.contains("spark")) person("spark") else 100
sparkValue: Int = 6
scala> val sparkValue = erson.getOrElse("spark", 1000)
<console>:8: error: not found: value erson
       val sparkValue = erson.getOrElse("spark", 1000)
                        ^
scala> val sparkValue = person.getOrElse("spark", 1000)
sparkValue: Int = 6
scala> for ((key , value) <- person) println (key ":" value)
<console>:1: error: ')' expected but string literal found.
       for ((key , value) <- person) println (key ":" value)
                                                  ^
scala> for ((key , value) <- person) println (key + ":" + value)
hadoop:11
spark:6
scala> for (key  <- person.keySet) println (key)
hadoop
spark
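
The += error above occurs because Map defaults to the immutable implementation. A minimal sketch with scala.collection.mutable.Map, where in-place updates do work (the name counts is made up):

import scala.collection.mutable

val counts = mutable.Map("hadoop" -> 11, "spark" -> 6)
counts += ("Flink" -> 5)                // adds a new entry in place
counts("spark") = 7                     // updates an existing entry
println(counts.getOrElse("storm", 0))   // prints 0, key absent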

7. Tuples
scala> val tuple = ("Spark", 6, 99.9)
tuple: (String, Int, Double) = (Spark,6,99.9)
scala> tuple._1
res6: String = Spark
scala> tuple._2
res7: Int = 6
scala> tuple._3
res8: Double = 99.9
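
A tuple can also be unpacked into named values in one step; a minimal sketch (the names are made up):

val (name, version, score) = tuple   // name = "Spark", version = 6, score = 99.9
println(name + " " + version + " " + score)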