java.sql.SQLException: No suitable driver -- spark-shell fails to load MySQL data, unresolved (but the Properties approach works)

This post records the problems hit while reading data through Spark's JDBC format and what each configuration attempt did, from setting options directly on the reader to passing connection properties through a java.util.Properties object.

When Spark reads data in JDBC format,
the driver jar must be copied into Spark's jars directory.
Mind whether it is a single-node Spark or a cluster:
on a cluster, the jar must be copied to every node.
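On a small cluster the copying can be scripted; a sketch, assuming every node uses the same installation path (192.168.37.112 is the worker node that shows up later in this post):

[root@hadoop01 spark-2.2.0-bin-hadoop2.7]#
scp jars/mysql-connector-java-5.1.7-bin.jar root@192.168.37.112:/usr/local/spark-2.2.0-bin-hadoop2.7/jars/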

There are a few ways to load the driver jar:

1. Pass --jars when starting spark-shell

[root@hadoop01 spark-2.2.0-bin-hadoop2.7]#
bin/spark-shell --jars mysql-connector-java-5.1.7-bin.jar --driver-class-path mysql-connector-java-5.1.7-bin.jar (write the full path)

bin/spark-shell --jars /usr/local/spark-2.2.0-bin-hadoop2.7/mysql-connector-java-5.1.7-bin.jar --driver-class-path /usr/local/spark-2.2.0-bin-hadoop2.7/mysql-connector-java-5.1.7-bin.jar

(--jars ships the jar to the executors; --driver-class-path puts it on the driver's own classpath, which is where java.sql.DriverManager looks.)
2. Configure through option

val jdbcDF = spark.read.format("jdbc")
.option("driver","com.mysql.jdbc.Driver")
.option("url", "jdbc:mysql//hadoop01:3306/test")
.option("dbtable", "u")
.option("user","root")
.option("password","root").load()

But in the end this was still no use.


The failing command:

scala> val jdbcDF = spark.read.format("jdbc").option("url", "jdbc:mysql//hadoop01:3306/test").option("dbtable", "u").option("user","root").option("password","root").load()

Error: java.sql.SQLException: No suitable driver

java.sql.SQLException: No suitable driver
  at java.sql.DriverManager.getDriver(DriverManager.java:315)
  at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$7.apply(JDBCOptions.scala:84)
  at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$7.apply(JDBCOptions.scala:84)
  at scala.Option.getOrElse(Option.scala:121)
  at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:83)
  at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:34)
  at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:32)
  at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:306)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:178)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:146)
  ... 48 elided

Then I copied the MySQL jar into Spark's jars directory.
But: the next launch hit the same error.

Adding .option("driver","com.mysql.jdbc.Driver") (the same snippet as above) still can't find the driver; that variant dies with a NullPointerException instead. Time to try another approach.

So, for the second launch: start spark-shell with --jars

[root@hadoop01 spark-2.2.0-bin-hadoop2.7]# 
bin/spark-shell --jars mysql-connector-java-5.1.7-bin.jar

Error:

java.io.FileNotFoundException:
 Jar /usr/local/spark-2.2.0-bin-hadoop2.7/mysql-connector-java-5.1.7-bin.jar 
 not found

It turns out it looked for the jar under Spark's root directory, not under jars: the relative path is resolved against the current working directory. (Passing an absolute path, as in method 1 above, avoids this.)

So I copied the MySQL driver jar into Spark's root directory as well.

Launch again: bin/spark-shell --jars mysql-connector-java-5.1.7-bin.jar

Success: this time the shell started.

Reading the data again, though, still raised the missing-driver error.

Finally, while shutting Spark down, I noticed:

Note that it is node 112, while I had started spark-shell on node 111.

scala> 19/11/19 09:33:58 ERROR TaskSchedulerImpl: Lost executor 0 on 192.168.37.112: Remote RPC client disassociated. Likely due to containers exceeding thresholds, or network issues. Check driver logs for WARN messages.

So:
because I had started the Spark cluster before launching spark-shell, the shell was not necessarily talking to the local Spark at all but to executors on other nodes, and those nodes had no MySQL driver. That is why the error kept coming back no matter what I did on node 111. What a trap.
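An alternative to hand-copying the jar onto every node is declaring it once in conf/spark-defaults.conf, which Spark reads at each launch; a sketch, assuming the jar sits at the path below (spark.jars is shipped to the executors, spark.driver.extraClassPath puts it on the driver's classpath for DriverManager):

spark.jars                    /usr/local/spark-2.2.0-bin-hadoop2.7/jars/mysql-connector-java-5.1.7-bin.jar
spark.driver.extraClassPath   /usr/local/spark-2.2.0-bin-hadoop2.7/jars/mysql-connector-java-5.1.7-bin.jar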

Even after copying the driver to the other nodes, the error persists!!!! Unresolved.

Two ways for Spark SQL to read over JDBC. The first one fails no matter how I tweak it, and I have no idea what to do:

val jdbcDF = spark.read.format("jdbc").option("driver","com.mysql.jdbc.Driver").option("url", "jdbc:mysql//hadoop01:3306/test").option("dbtable", "u").option("user","root").option("password","root").load()
val jdbcDF = spark.read.format("jdbc").option("url", "jdbc:mysql//hadoop01:3306/test").option("dbtable", "u").option("user","root").option("password","root").load()



val jdbcDF = spark.read.format("jdbc")
.option("url", "jdbc:mysql//hadoop01:3306/test")
.option("dbtable", "u")
.option("user","root")
.option("password","root")
.load()

But configured this way, through a Properties object, it works!!


val connectionProperties = new java.util.Properties()
connectionProperties.put("user", "root")
connectionProperties.put("password", "root")
val jdbcDF2 = spark.read.jdbc("jdbc:mysql://hadoop01:3306/test", "u", connectionProperties)
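Comparing the two closely: every failing call above uses the URL jdbc:mysql//hadoop01:3306/test, with no colon after mysql, while this working call uses jdbc:mysql://hadoop01:3306/test. java.sql.DriverManager chooses a driver by asking each registered driver whether it accepts the URL, so the malformed scheme alone is enough to produce "No suitable driver". A minimal sketch of the option-based read with the scheme corrected (untested here, on the assumption that the typo is the culprit):

val jdbcDF3 = spark.read.format("jdbc")
  .option("url", "jdbc:mysql://hadoop01:3306/test") // note the "://" after jdbc:mysql
  .option("driver", "com.mysql.jdbc.Driver")        // Connector/J 5.1.x driver class
  .option("dbtable", "u")
  .option("user", "root")
  .option("password", "root")
  .load()

The same "driver" key can also be set in the Properties variant if it ever needs forcing: connectionProperties.put("driver", "com.mysql.jdbc.Driver"). The full spark-shell session follows for reference.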

Using Scala version 2.11.8 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_181)
Type in expressions to have them evaluated.
Type :help for more information.

scala> val jdbcDF = spark.read.format("jdbc").option("url", "jdbc:mysql//hadoop01:3306/test").option("dbtable", "u").option("user","root").option("password","root").load()
java.sql.SQLException: No suitable driver
  at java.sql.DriverManager.getDriver(DriverManager.java:315)
  at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$7.apply(JDBCOptions.scala:84)
  at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$7.apply(JDBCOptions.scala:84)
  at scala.Option.getOrElse(Option.scala:121)
  at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:83)
  at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:34)
  at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:32)
  at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:306)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:178)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:146)
  ... 48 elided

scala> val jdbcDF = spark.read.format("jdbc").option("driver","com.mysql.jdbc.Driver").option("url", "jdbc:mysql//hadoop01:3306/test").option("dbtable", "u").option("user","root").option("password","root").load()
java.lang.NullPointerException
  at org.apache.spark.sql.execution.datasources.jdbc.JDBCRDD$.resolveTable(JDBCRDD.scala:72)
  at org.apache.spark.sql.execution.datasources.jdbc.JDBCRelation.<init>(JDBCRelation.scala:113)
  at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:47)
  at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:306)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:178)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:146)
  ... 48 elided

scala> val connectionProperties = new java.util.Properties()
connectionProperties: java.util.Properties = {}

scala> connectionProperties.put("user", "root")
res0: Object = null

scala> connectionProperties.put("password", "root")
res1: Object = null

scala> val jdbcDF2 = spark.read.jdbc("jdbc:mysql://hadoop01:3306/test", "u", connectionProperties)
jdbcDF2: org.apache.spark.sql.DataFrame = [id: int, name: string]

scala> val jdbcDF = spark.read.format("jdbc")
jdbcDF: org.apache.spark.sql.DataFrameReader = org.apache.spark.sql.DataFrameReader@399ac1a3

scala> .option("url", "jdbc:mysql//hadoop01:3306/test")
res2: org.apache.spark.sql.DataFrameReader = org.apache.spark.sql.DataFrameReader@399ac1a3

scala> .option("dbtable", "u")
res3: org.apache.spark.sql.DataFrameReader = org.apache.spark.sql.DataFrameReader@399ac1a3

scala> .option("user","root")
res4: org.apache.spark.sql.DataFrameReader = org.apache.spark.sql.DataFrameReader@399ac1a3

scala> .option("password","root")
res5: org.apache.spark.sql.DataFrameReader = org.apache.spark.sql.DataFrameReader@399ac1a3

scala> .load()
java.sql.SQLException: No suitable driver
  at java.sql.DriverManager.getDriver(DriverManager.java:315)
  at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$7.apply(JDBCOptions.scala:84)
  at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions$$anonfun$7.apply(JDBCOptions.scala:84)
  at scala.Option.getOrElse(Option.scala:121)
  at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:83)
  at org.apache.spark.sql.execution.datasources.jdbc.JDBCOptions.<init>(JDBCOptions.scala:34)
  at org.apache.spark.sql.execution.datasources.jdbc.JdbcRelationProvider.createRelation(JdbcRelationProvider.scala:32)
  at org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:306)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:178)
  at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:146)
  ... 48 elided

scala>