Value cannot be null. Parameter name: dataSet

I hit this problem while building member login authentication; I posted it on the CSDN forum and nobody could solve it.
I really don't know what's going on.
I'm not sure whether it's a Microsoft issue.
Fetching the data with an OleDbDataReader fails in the same way.

I wrote a simple member login check,
but it throws the error above,
even though the same code runs fine on another page.

sqlStr = "SELECT id,userName,usergroup FROM [admin] WHERE userName='" & userName & "' AND userPass='" & md5Pass & "'"

        dbconn = New OleDb.OleDbConnection(dbLink.ConnectionString)
        dbAdapter = New OleDb.OleDbDataAdapter(sqlStr, dbconn)
        dbAdapter.Fill(dst)

The failing line is the last one:
dbAdapter.Fill(dst)

I can't figure out why.

Running the query in the Access SQL view returns results,
so the SQL statement itself isn't wrong.
The problem is probably in the OleDbDataAdapter.
Could anyone help me figure this out?
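If I read the DbDataAdapter.Fill behaviour correctly, the "dataSet" in the message is Fill's own parameter name, and passing Nothing as the DataSet raises exactly this ArgumentNullException. A minimal sketch of that contract (hypothetical, not my page code; the connection string is just a placeholder):

Imports System.Data
Imports System.Data.OleDb

Module FillNullSketch
    Sub Main()
        ' Placeholder Access connection string, not my real dbLink.ConnectionString.
        Dim connStr As String = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=db.mdb"
        Dim adapter As New OleDbDataAdapter("SELECT id FROM [admin]", New OleDbConnection(connStr))

        Dim goodSet As New DataSet()
        adapter.Fill(goodSet)      ' Works: a DataSet instance is supplied.

        Dim badSet As DataSet = Nothing
        adapter.Fill(badSet)       ' Throws ArgumentNullException: "Value cannot be null. Parameter name: dataSet".
    End Sub
End Module

So my guess is that dst is somehow Nothing at the point Fill is called on the failing page, but I can't see where it differs from the working page.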

The MSDN example:

Public Function SelectOleDbSrvRows(dataSet As DataSet, connection As String, query As String) As DataSet
    Dim conn As New OleDbConnection(connection)
    Dim adapter As New OleDbDataAdapter()
    adapter.SelectCommand = New OleDbCommand(query, conn)
    adapter.Fill(dataSet)
    Return dataSet
End Function

Even when I follow that pattern, it still fails.
Strange.
The same code runs fine in another ASP.NET web application.
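Roughly how I call that MSDN-style function (the connection string and query below are placeholders):

' Placeholder values for illustration only.
Dim connStr As String = "Provider=Microsoft.Jet.OLEDB.4.0;Data Source=db.mdb"
Dim query As String = "SELECT id, userName, usergroup FROM [admin]"
' Fill only succeeds if this DataSet argument is an actual instance, not Nothing.
Dim result As DataSet = SelectOleDbSrvRows(New DataSet(), connStr, query)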
