SPARK : failure: ``)'' expected but `(' found

This post describes an error hit when running the ROW_NUMBER() window function through Spark SQL, and how to fix it. The error occurs when the SQL is executed outside of a HiveContext, and applies to Spark versions >= 1.4 and <= 1.6. The steps for creating a HiveContext are given below.


Problem:

When executing a ROW_NUMBER() OVER() window function through Spark SQL, the query below threw an error:

select data from (SELECT *, ROW_NUMBER() OVER (PARTITION BY id ORDER BY time, data) num FROM operate_test) a where num = 1

The error was:

[1.29] failure: failure: ``)'' expected but `(' found

Solution:

After some research: in Spark 2.0+ this SQL can be executed directly through a SQLContext, but for Spark versions >= 1.4 and <= 1.6 it must be executed through a HiveContext, because in those versions window functions are only supported by the Hive query parser.

A HiveContext is created as follows, where sc is an existing SparkContext object:

import org.apache.spark.sql.hive.HiveContext

val sqlContext = new HiveContext(sc) // sc: an existing SparkContext
sqlContext.sql("xxx")                // run the window-function query through this context
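For readers unsure what the query itself computes: within each id partition it orders rows by (time, data) and keeps only the first one (num = 1), i.e. one row per id. The same logic can be sketched with plain Scala collections, no Spark required (the Rec case class and sample rows here are hypothetical, just for illustration):

```scala
// Plain-Scala sketch of what the ROW_NUMBER() query does:
// within each id group, order by (time, data) and keep the first row.
case class Rec(id: Int, time: Int, data: String)

object RowNumberDemo {
  def firstPerId(rows: Seq[Rec]): Seq[Rec] =
    rows
      .groupBy(_.id)                             // PARTITION BY id
      .map { case (_, group) =>
        group.sortBy(r => (r.time, r.data)).head // ORDER BY time, data; keep num = 1
      }
      .toSeq
      .sortBy(_.id)

  def main(args: Array[String]): Unit = {
    val rows = Seq(Rec(1, 2, "b"), Rec(1, 1, "a"), Rec(2, 5, "c"))
    println(firstPerId(rows).map(_.data).mkString(",")) // prints a,c
  }
}
```

This is only a mental model of the SQL semantics; in a real job the deduplication runs distributed inside Spark.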

 
