The difference between Array(20) and Array.apply(null, {length: 20})


Background: while reading the official Vue documentation, I came across a piece of code written as Array.apply(null, {length: 20}).map(() => {}).

Explanation: at face value, this code iterates over a 20-element array and returns a new 20-element array.

Questions: 1. Why write Array.apply(null, {length: 20}) instead of just Array(20)?

2. Shouldn't the second argument b in xxx.apply(a, b) be an array? Why does an object work too?

Answers: 1. var arr = Array(20) creates an array of length 20, but none of its indices actually hold a value: the result is [empty × 20], twenty placeholder slots (holes). If you assigned arr[0] = undefined through arr[19] = undefined, the array would become exactly the same as the result of Array.apply(null, {length: 20}).

Because map() skips holes, Array(20).map() never invokes its callback. Array.apply(null, {length: 20}) also creates an array of length 20, but it returns [undefined, undefined, undefined, ...]: every index genuinely holds a value, it just happens to be undefined, so the array can be traversed with map().

2. {length: 20} looks like a plain object, but it is really an array-like: it has a length of 20, and indices 0 through 19 are simply absent, so reading each of them yields undefined. An array-like is a valid second argument to apply, for example:

function a() {
  var collected = [];
  // arguments has indexed entries and a length, so it works as
  // apply's second argument just like a real array would
  Array.prototype.push.apply(collected, arguments);
  return collected;
}

Here arguments is an array-like object.

With that, both questions are answered.

In testing I also found that Array.apply(null, {name: 'a'}) returns []. The reason is that apply reads the length property of its second argument and coerces it to an integer; for a plain object with no length, that coercion yields 0, so Array is called with zero arguments and returns []. In short, any object without a length property produces [].
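A few quick experiments (my own, run in Node) show how this length coercion behaves:

```javascript
// apply coerces the second argument's length property to an integer;
// a missing length coerces to 0, so zero arguments get passed along.
console.log(Array.apply(null, { name: 'a' }));         // []
console.log(Array.apply(null, { length: '3' }));       // [undefined, undefined, undefined]
console.log(Array.apply(null, { length: 2, 0: 'x' })); // ['x', undefined]
```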
