Step into Scala - 07 - Exceptions

This installment looks at Scala's exception-handling mechanism: how to use the try...catch construct, how exceptions are thrown and matched by type, and how a finally block ensures cleanup code always runs. The example code shows how to catch and handle different exception types.


Exceptions

  • Scala has no checked exceptions: the compiler never forces you to catch an exception or declare that a method throws one.
  • A throw expression has type Nothing, so it can appear wherever an expression is expected.
```scala
import java.io.IOException

try {
  throw new IOException()
} catch {
  // the exception object itself is not needed, so match with `_`
  case _: IOException => println("io exception")
  // the exception object is needed, so bind it to a name
  case e: Exception => e.printStackTrace()
} finally {
  // cleanup code here runs whether or not an exception was thrown
}
```

In the snippet above, `_` stands in for the exception object when it is not needed. Note that catch clauses are ordinary pattern matches, tried in order, so list the more specific exception types first.
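Two consequences of the rules above are easy to miss, so here is a minimal sketch (the `ExceptionDemo` object, the `half` function, and the sample inputs are illustrative, not from the original): try...catch is itself an expression that yields a value, and because throw has type Nothing it unifies with the type of any other branch.

```scala
object ExceptionDemo {
  // `throw` has type Nothing, so both branches unify to Int.
  def half(n: Int): Int =
    if (n % 2 == 0) n / 2
    else throw new IllegalArgumentException(s"$n is odd")

  def main(args: Array[String]): Unit = {
    // try...catch is an expression: it evaluates to the body's value,
    // or to the value of the matching case handler.
    val parsed: Int = try {
      "not a number".toInt
    } catch {
      case _: NumberFormatException => -1 // fallback value
    }
    println(parsed)  // -1
    println(half(4)) // 2
  }
}
```

One caveat: the value of a finally block is discarded. It is meant for side effects such as closing resources, not for producing a result.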

"C:\Program Files\Java\jdk1.8.0_261\bin\java.exe" "-javaagent:C:\Program Files\JetBrains\IntelliJ IDEA Community Edition 2022.2.2\lib\idea_rt.jar=62562:C:\Program Files\JetBrains\IntelliJ IDEA Community Edition 2022.2.2\bin" -Dfile.encoding=UTF-8 -classpath "C:\Program Files\Java\jdk1.8.0_261\jre\lib\charsets.jar;C:\Program Files\Java\jdk1.8.0_261\jre\lib\deploy.jar;C:\Program Files\Java\jdk1.8.0_261\jre\lib\ext\access-bridge-64.jar;C:\Program Files\Java\jdk1.8.0_261\jre\lib\ext\cldrdata.jar;C:\Program Files\Java\jdk1.8.0_261\jre\lib\ext\dnsns.jar;C:\Program Files\Java\jdk1.8.0_261\jre\lib\ext\jaccess.jar;C:\Program Files\Java\jdk1.8.0_261\jre\lib\ext\jfxrt.jar;C:\Program Files\Java\jdk1.8.0_261\jre\lib\ext\localedata.jar;C:\Program Files\Java\jdk1.8.0_261\jre\lib\ext\nashorn.jar;C:\Program Files\Java\jdk1.8.0_261\jre\lib\ext\sunec.jar;C:\Program Files\Java\jdk1.8.0_261\jre\lib\ext\sunjce_provider.jar;C:\Program Files\Java\jdk1.8.0_261\jre\lib\ext\sunmscapi.jar;C:\Program Files\Java\jdk1.8.0_261\jre\lib\ext\sunpkcs11.jar;C:\Program Files\Java\jdk1.8.0_261\jre\lib\ext\zipfs.jar;C:\Program Files\Java\jdk1.8.0_261\jre\lib\javaws.jar;C:\Program Files\Java\jdk1.8.0_261\jre\lib\jce.jar;C:\Program Files\Java\jdk1.8.0_261\jre\lib\jfr.jar;C:\Program Files\Java\jdk1.8.0_261\jre\lib\jfxswt.jar;C:\Program Files\Java\jdk1.8.0_261\jre\lib\jsse.jar;C:\Program Files\Java\jdk1.8.0_261\jre\lib\management-agent.jar;C:\Program Files\Java\jdk1.8.0_261\jre\lib\plugin.jar;C:\Program Files\Java\jdk1.8.0_261\jre\lib\resources.jar;C:\Program Files\Java\jdk1.8.0_261\jre\lib\rt.jar;D:\yunjisuan\lcy\target\classes;D:\yunjisuan\.m2\repository\org\apache\spark\spark-core_2.11\2.4.6\spark-core_2.11-2.4.6.jar;D:\yunjisuan\.m2\repository\com\thoughtworks\paranamer\paranamer\2.8\paranamer-2.8.jar;D:\yunjisuan\.m2\repository\org\apache\avro\avro\1.8.2\avro-1.8.2.jar;D:\yunjisuan\.m2\repository\org\codehaus\jackson\jackson-core-asl\1.9.13\jackson-core-asl-1.9.13.jar;D:\yunjisuan\.m2\repository\org\codehaus\jackson\jackson-mapper-asl\1.9.13\jackson-mapper-asl-1.9.13.jar;D:\yunjisuan\.m2\repository\org\apache\commons\commons-compress\1.8.1\commons-compress-1.8.1.jar;D:\yunjisuan\.m2\repository\org\tukaani\xz\1.5\xz-1.5.jar;D:\yunjisuan\.m2\repository\org\apache\avro\avro-mapred\1.8.2\avro-mapred-1.8.2-hadoop2.jar;D:\yunjisuan\.m2\repository\org\apache\avro\avro-ipc\1.8.2\avro-ipc-1.8.2.jar;D:\yunjisuan\.m2\repository\com\twitter\chill_2.11\0.9.3\chill_2.11-0.9.3.jar;D:\yunjisuan\.m2\repository\com\esotericsoftware\kryo-shaded\4.0.2\kryo-shaded-4.0.2.jar;D:\yunjisuan\.m2\repository\com\esotericsoftware\minlog\1.3.0\minlog-1.3.0.jar;D:\yunjisuan\.m2\repository\org\objenesis\objenesis\2.5.1\objenesis-2.5.1.jar;D:\yunjisuan\.m2\repository\com\twitter\chill-java\0.9.3\chill-java-0.9.3.jar;D:\yunjisuan\.m2\repository\org\apache\xbean\xbean-asm6-shaded\4.8\xbean-asm6-shaded-4.8.jar;D:\yunjisuan\.m2\repository\org\apache\hadoop\hadoop-client\2.6.5\hadoop-client-2.6.5.jar;D:\yunjisuan\.m2\repository\org\apache\hadoop\hadoop-common\2.6.5\hadoop-common-2.6.5.jar;D:\yunjisuan\.m2\repository\commons-cli\commons-cli\1.2\commons-cli-1.2.jar;D:\yunjisuan\.m2\repository\xmlenc\xmlenc\0.52\xmlenc-0.52.jar;D:\yunjisuan\.m2\repository\commons-httpclient\commons-httpclient\3.1\commons-httpclient-3.1.jar;D:\yunjisuan\.m2\repository\commons-collections\commons-collections\3.2.2\commons-collections-3.2.2.jar;D:\yunjisuan\.m2\repository\commons-configuration\commons-configuration\1.6\commons-configuration-1.6.jar;D:\yunjisuan\.m2\repository\commons-dige
ster\commons-digester\1.8\commons-digester-1.8.jar;D:\yunjisuan\.m2\repository\commons-beanutils\commons-beanutils\1.7.0\commons-beanutils-1.7.0.jar;D:\yunjisuan\.m2\repository\com\google\code\gson\gson\2.2.4\gson-2.2.4.jar;D:\yunjisuan\.m2\repository\org\apache\hadoop\hadoop-auth\2.6.5\hadoop-auth-2.6.5.jar;D:\yunjisuan\.m2\repository\org\apache\httpcomponents\httpclient\4.2.5\httpclient-4.2.5.jar;D:\yunjisuan\.m2\repository\org\apache\httpcomponents\httpcore\4.2.4\httpcore-4.2.4.jar;D:\yunjisuan\.m2\repository\org\apache\directory\server\apacheds-kerberos-codec\2.0.0-M15\apacheds-kerberos-codec-2.0.0-M15.jar;D:\yunjisuan\.m2\repository\org\apache\directory\server\apacheds-i18n\2.0.0-M15\apacheds-i18n-2.0.0-M15.jar;D:\yunjisuan\.m2\repository\org\apache\directory\api\api-asn1-api\1.0.0-M20\api-asn1-api-1.0.0-M20.jar;D:\yunjisuan\.m2\repository\org\apache\directory\api\api-util\1.0.0-M20\api-util-1.0.0-M20.jar;D:\yunjisuan\.m2\repository\org\apache\curator\curator-client\2.6.0\curator-client-2.6.0.jar;D:\yunjisuan\.m2\repository\org\htrace\htrace-core\3.0.4\htrace-core-3.0.4.jar;D:\yunjisuan\.m2\repository\org\apache\hadoop\hadoop-hdfs\2.6.5\hadoop-hdfs-2.6.5.jar;D:\yunjisuan\.m2\repository\org\mortbay\jetty\jetty-util\6.1.26\jetty-util-6.1.26.jar;D:\yunjisuan\.m2\repository\xerces\xercesImpl\2.9.1\xercesImpl-2.9.1.jar;D:\yunjisuan\.m2\repository\xml-apis\xml-apis\1.3.04\xml-apis-1.3.04.jar;D:\yunjisuan\.m2\repository\org\apache\hadoop\hadoop-mapreduce-client-app\2.6.5\hadoop-mapreduce-client-app-2.6.5.jar;D:\yunjisuan\.m2\repository\org\apache\hadoop\hadoop-mapreduce-client-common\2.6.5\hadoop-mapreduce-client-common-2.6.5.jar;D:\yunjisuan\.m2\repository\org\apache\hadoop\hadoop-yarn-client\2.6.5\hadoop-yarn-client-2.6.5.jar;D:\yunjisuan\.m2\repository\org\apache\hadoop\hadoop-yarn-server-common\2.6.5\hadoop-yarn-server-common-2.6.5.jar;D:\yunjisuan\.m2\repository\org\apache\hadoop\hadoop-mapreduce-client-shuffle\2.6.5\hadoop-mapreduce-client-shuffle-2.6.5.jar;D:\yunjisuan\.m2\repository\org\apache\hadoop\hadoop-yarn-api\2.6.5\hadoop-yarn-api-2.6.5.jar;D:\yunjisuan\.m2\repository\org\apache\hadoop\hadoop-mapreduce-client-core\2.6.5\hadoop-mapreduce-client-core-2.6.5.jar;D:\yunjisuan\.m2\repository\org\apache\hadoop\hadoop-yarn-common\2.6.5\hadoop-yarn-common-2.6.5.jar;D:\yunjisuan\.m2\repository\javax\xml\bind\jaxb-api\2.2.2\jaxb-api-2.2.2.jar;D:\yunjisuan\.m2\repository\javax\xml\stream\stax-api\1.0-2\stax-api-1.0-2.jar;D:\yunjisuan\.m2\repository\org\codehaus\jackson\jackson-jaxrs\1.9.13\jackson-jaxrs-1.9.13.jar;D:\yunjisuan\.m2\repository\org\codehaus\jackson\jackson-xc\1.9.13\jackson-xc-1.9.13.jar;D:\yunjisuan\.m2\repository\org\apache\hadoop\hadoop-mapreduce-client-jobclient\2.6.5\hadoop-mapreduce-client-jobclient-2.6.5.jar;D:\yunjisuan\.m2\repository\org\apache\hadoop\hadoop-annotations\2.6.5\hadoop-annotations-2.6.5.jar;D:\yunjisuan\.m2\repository\org\apache\spark\spark-launcher_2.11\2.4.6\spark-launcher_2.11-2.4.6.jar;D:\yunjisuan\.m2\repository\org\apache\spark\spark-kvstore_2.11\2.4.6\spark-kvstore_2.11-2.4.6.jar;D:\yunjisuan\.m2\repository\org\fusesource\leveldbjni\leveldbjni-all\1.8\leveldbjni-all-1.8.jar;D:\yunjisuan\.m2\repository\com\fasterxml\jackson\core\jackson-core\2.6.7\jackson-core-2.6.7.jar;D:\yunjisuan\.m2\repository\com\fasterxml\jackson\core\jackson-annotations\2.6.7\jackson-annotations-2.6.7.jar;D:\yunjisuan\.m2\repository\org\apache\spark\spark-network-common_2.11\2.4.6\spark-network-common_2.11-2.4.6.jar;D:\yunjisuan\.m2\repository\org\apache\spark\spark-network
-shuffle_2.11\2.4.6\spark-network-shuffle_2.11-2.4.6.jar;D:\yunjisuan\.m2\repository\org\apache\spark\spark-unsafe_2.11\2.4.6\spark-unsafe_2.11-2.4.6.jar;D:\yunjisuan\.m2\repository\javax\activation\activation\1.1.1\activation-1.1.1.jar;D:\yunjisuan\.m2\repository\org\apache\curator\curator-recipes\2.6.0\curator-recipes-2.6.0.jar;D:\yunjisuan\.m2\repository\org\apache\curator\curator-framework\2.6.0\curator-framework-2.6.0.jar;D:\yunjisuan\.m2\repository\com\google\guava\guava\16.0.1\guava-16.0.1.jar;D:\yunjisuan\.m2\repository\org\apache\zookeeper\zookeeper\3.4.6\zookeeper-3.4.6.jar;D:\yunjisuan\.m2\repository\javax\servlet\javax.servlet-api\3.1.0\javax.servlet-api-3.1.0.jar;D:\yunjisuan\.m2\repository\org\apache\commons\commons-lang3\3.5\commons-lang3-3.5.jar;D:\yunjisuan\.m2\repository\org\apache\commons\commons-math3\3.4.1\commons-math3-3.4.1.jar;D:\yunjisuan\.m2\repository\com\google\code\findbugs\jsr305\1.3.9\jsr305-1.3.9.jar;D:\yunjisuan\.m2\repository\org\slf4j\slf4j-api\1.7.16\slf4j-api-1.7.16.jar;D:\yunjisuan\.m2\repository\org\slf4j\jul-to-slf4j\1.7.16\jul-to-slf4j-1.7.16.jar;D:\yunjisuan\.m2\repository\org\slf4j\jcl-over-slf4j\1.7.16\jcl-over-slf4j-1.7.16.jar;D:\yunjisuan\.m2\repository\log4j\log4j\1.2.17\log4j-1.2.17.jar;D:\yunjisuan\.m2\repository\org\slf4j\slf4j-log4j12\1.7.16\slf4j-log4j12-1.7.16.jar;D:\yunjisuan\.m2\repository\com\ning\compress-lzf\1.0.3\compress-lzf-1.0.3.jar;D:\yunjisuan\.m2\repository\org\xerial\snappy\snappy-java\1.1.7.5\snappy-java-1.1.7.5.jar;D:\yunjisuan\.m2\repository\org\lz4\lz4-java\1.4.0\lz4-java-1.4.0.jar;D:\yunjisuan\.m2\repository\com\github\luben\zstd-jni\1.3.2-2\zstd-jni-1.3.2-2.jar;D:\yunjisuan\.m2\repository\org\roaringbitmap\RoaringBitmap\0.7.45\RoaringBitmap-0.7.45.jar;D:\yunjisuan\.m2\repository\org\roaringbitmap\shims\0.7.45\shims-0.7.45.jar;D:\yunjisuan\.m2\repository\commons-net\commons-net\3.1\commons-net-3.1.jar;D:\yunjisuan\.m2\repository\org\json4s\json4s-jackson_2.11\3.5.3\json4s-jackson_2.11-3.5.3.jar;D:\yunjisuan\.m2\repository\org\json4s\json4s-core_2.11\3.5.3\json4s-core_2.11-3.5.3.jar;D:\yunjisuan\.m2\repository\org\json4s\json4s-ast_2.11\3.5.3\json4s-ast_2.11-3.5.3.jar;D:\yunjisuan\.m2\repository\org\json4s\json4s-scalap_2.11\3.5.3\json4s-scalap_2.11-3.5.3.jar;D:\yunjisuan\.m2\repository\org\scala-lang\modules\scala-xml_2.11\1.0.6\scala-xml_2.11-1.0.6.jar;D:\yunjisuan\.m2\repository\org\glassfish\jersey\core\jersey-client\2.22.2\jersey-client-2.22.2.jar;D:\yunjisuan\.m2\repository\javax\ws\rs\javax.ws.rs-api\2.0.1\javax.ws.rs-api-2.0.1.jar;D:\yunjisuan\.m2\repository\org\glassfish\hk2\hk2-api\2.4.0-b34\hk2-api-2.4.0-b34.jar;D:\yunjisuan\.m2\repository\org\glassfish\hk2\hk2-utils\2.4.0-b34\hk2-utils-2.4.0-b34.jar;D:\yunjisuan\.m2\repository\org\glassfish\hk2\external\aopalliance-repackaged\2.4.0-b34\aopalliance-repackaged-2.4.0-b34.jar;D:\yunjisuan\.m2\repository\org\glassfish\hk2\external\javax.inject\2.4.0-b34\javax.inject-2.4.0-b34.jar;D:\yunjisuan\.m2\repository\org\glassfish\hk2\hk2-locator\2.4.0-b34\hk2-locator-2.4.0-b34.jar;D:\yunjisuan\.m2\repository\org\javassist\javassist\3.18.1-GA\javassist-3.18.1-GA.jar;D:\yunjisuan\.m2\repository\org\glassfish\jersey\core\jersey-common\2.22.2\jersey-common-2.22.2.jar;D:\yunjisuan\.m2\repository\javax\annotation\javax.annotation-api\1.2\javax.annotation-api-1.2.jar;D:\yunjisuan\.m2\repository\org\glassfish\jersey\bundles\repackaged\jersey-guava\2.22.2\jersey-guava-2.22.2.jar;D:\yunjisuan\.m2\repository\org\glassfish\hk2\osgi-resource-locator\1.0.1\osgi-resource-locator-1.0.1.jar
;D:\yunjisuan\.m2\repository\org\glassfish\jersey\core\jersey-server\2.22.2\jersey-server-2.22.2.jar;D:\yunjisuan\.m2\repository\org\glassfish\jersey\media\jersey-media-jaxb\2.22.2\jersey-media-jaxb-2.22.2.jar;D:\yunjisuan\.m2\repository\javax\validation\validation-api\1.1.0.Final\validation-api-1.1.0.Final.jar;D:\yunjisuan\.m2\repository\org\glassfish\jersey\containers\jersey-container-servlet\2.22.2\jersey-container-servlet-2.22.2.jar;D:\yunjisuan\.m2\repository\org\glassfish\jersey\containers\jersey-container-servlet-core\2.22.2\jersey-container-servlet-core-2.22.2.jar;D:\yunjisuan\.m2\repository\io\netty\netty-all\4.1.47.Final\netty-all-4.1.47.Final.jar;D:\yunjisuan\.m2\repository\io\netty\netty\3.9.9.Final\netty-3.9.9.Final.jar;D:\yunjisuan\.m2\repository\com\clearspring\analytics\stream\2.7.0\stream-2.7.0.jar;D:\yunjisuan\.m2\repository\io\dropwizard\metrics\metrics-core\3.1.5\metrics-core-3.1.5.jar;D:\yunjisuan\.m2\repository\io\dropwizard\metrics\metrics-jvm\3.1.5\metrics-jvm-3.1.5.jar;D:\yunjisuan\.m2\repository\io\dropwizard\metrics\metrics-json\3.1.5\metrics-json-3.1.5.jar;D:\yunjisuan\.m2\repository\io\dropwizard\metrics\metrics-graphite\3.1.5\metrics-graphite-3.1.5.jar;D:\yunjisuan\.m2\repository\com\fasterxml\jackson\core\jackson-databind\2.6.7.3\jackson-databind-2.6.7.3.jar;D:\yunjisuan\.m2\repository\com\fasterxml\jackson\module\jackson-module-scala_2.11\2.6.7.1\jackson-module-scala_2.11-2.6.7.1.jar;D:\yunjisuan\.m2\repository\org\scala-lang\scala-reflect\2.11.8\scala-reflect-2.11.8.jar;D:\yunjisuan\.m2\repository\com\fasterxml\jackson\module\jackson-module-paranamer\2.7.9\jackson-module-paranamer-2.7.9.jar;D:\yunjisuan\.m2\repository\org\apache\ivy\ivy\2.4.0\ivy-2.4.0.jar;D:\yunjisuan\.m2\repository\oro\oro\2.0.8\oro-2.0.8.jar;D:\yunjisuan\.m2\repository\net\razorvine\pyrolite\4.13\pyrolite-4.13.jar;D:\yunjisuan\.m2\repository\net\sf\py4j\py4j\0.10.7\py4j-0.10.7.jar;D:\yunjisuan\.m2\repository\org\apache\spark\spark-tags_2.11\2.4.6\spark-tags_2.11-2.4.6.jar;D:\yunjisuan\.m2\repository\org\apache\commons\commons-crypto\1.0.0\commons-crypto-1.0.0.jar;D:\yunjisuan\.m2\repository\org\spark-project\spark\unused\1.0.0\unused-1.0.0.jar;D:\yunjisuan\.m2\repository\org\apache\spark\spark-mllib_2.11\2.4.6\spark-mllib_2.11-2.4.6.jar;D:\yunjisuan\.m2\repository\org\scala-lang\modules\scala-parser-combinators_2.11\1.1.0\scala-parser-combinators_2.11-1.1.0.jar;D:\yunjisuan\.m2\repository\org\apache\spark\spark-streaming_2.11\2.4.6\spark-streaming_2.11-2.4.6.jar;D:\yunjisuan\.m2\repository\org\apache\spark\spark-graphx_2.11\2.4.6\spark-graphx_2.11-2.4.6.jar;D:\yunjisuan\.m2\repository\com\github\fommil\netlib\core\1.1.2\core-1.1.2.jar;D:\yunjisuan\.m2\repository\net\sourceforge\f2j\arpack_combined_all\0.1\arpack_combined_all-0.1.jar;D:\yunjisuan\.m2\repository\org\apache\spark\spark-mllib-local_2.11\2.4.6\spark-mllib-local_2.11-2.4.6.jar;D:\yunjisuan\.m2\repository\org\scalanlp\breeze_2.11\0.13.2\breeze_2.11-0.13.2.jar;D:\yunjisuan\.m2\repository\org\scalanlp\breeze-macros_2.11\0.13.2\breeze-macros_2.11-0.13.2.jar;D:\yunjisuan\.m2\repository\net\sf\opencsv\opencsv\2.3\opencsv-2.3.jar;D:\yunjisuan\.m2\repository\com\github\rwl\jtransforms\2.4.0\jtransforms-2.4.0.jar;D:\yunjisuan\.m2\repository\org\spire-math\spire_2.11\0.13.0\spire_2.11-0.13.0.jar;D:\yunjisuan\.m2\repository\org\spire-math\spire-macros_2.11\0.13.0\spire-macros_2.11-0.13.0.jar;D:\yunjisuan\.m2\repository\org\typelevel\machinist_2.11\0.6.1\machinist_2.11-0.6.1.jar;D:\yunjisuan\.m2\repository\com\chuusai\shapeless_2.11\2.3.2\
shapeless_2.11-2.3.2.jar;D:\yunjisuan\.m2\repository\org\typelevel\macro-compat_2.11\1.1.1\macro-compat_2.11-1.1.1.jar;D:\yunjisuan\.m2\repository\org\apache\spark\spark-sql_2.11\2.4.6\spark-sql_2.11-2.4.6.jar;D:\yunjisuan\.m2\repository\com\univocity\univocity-parsers\2.7.3\univocity-parsers-2.7.3.jar;D:\yunjisuan\.m2\repository\org\apache\spark\spark-sketch_2.11\2.4.6\spark-sketch_2.11-2.4.6.jar;D:\yunjisuan\.m2\repository\org\apache\spark\spark-catalyst_2.11\2.4.6\spark-catalyst_2.11-2.4.6.jar;D:\yunjisuan\.m2\repository\org\codehaus\janino\janino\3.0.16\janino-3.0.16.jar;D:\yunjisuan\.m2\repository\org\codehaus\janino\commons-compiler\3.0.16\commons-compiler-3.0.16.jar;D:\yunjisuan\.m2\repository\org\antlr\antlr4-runtime\4.7\antlr4-runtime-4.7.jar;D:\yunjisuan\.m2\repository\org\apache\orc\orc-core\1.5.5\orc-core-1.5.5-nohive.jar;D:\yunjisuan\.m2\repository\org\apache\orc\orc-shims\1.5.5\orc-shims-1.5.5.jar;D:\yunjisuan\.m2\repository\com\google\protobuf\protobuf-java\2.5.0\protobuf-java-2.5.0.jar;D:\yunjisuan\.m2\repository\commons-lang\commons-lang\2.6\commons-lang-2.6.jar;D:\yunjisuan\.m2\repository\io\airlift\aircompressor\0.10\aircompressor-0.10.jar;D:\yunjisuan\.m2\repository\org\apache\orc\orc-mapreduce\1.5.5\orc-mapreduce-1.5.5-nohive.jar;D:\yunjisuan\.m2\repository\org\apache\parquet\parquet-column\1.10.1\parquet-column-1.10.1.jar;D:\yunjisuan\.m2\repository\org\apache\parquet\parquet-common\1.10.1\parquet-common-1.10.1.jar;D:\yunjisuan\.m2\repository\org\apache\parquet\parquet-encoding\1.10.1\parquet-encoding-1.10.1.jar;D:\yunjisuan\.m2\repository\org\apache\parquet\parquet-hadoop\1.10.1\parquet-hadoop-1.10.1.jar;D:\yunjisuan\.m2\repository\org\apache\parquet\parquet-format\2.4.0\parquet-format-2.4.0.jar;D:\yunjisuan\.m2\repository\org\apache\parquet\parquet-jackson\1.10.1\parquet-jackson-1.10.1.jar;D:\yunjisuan\.m2\repository\org\apache\arrow\arrow-vector\0.10.0\arrow-vector-0.10.0.jar;D:\yunjisuan\.m2\repository\org\apache\arrow\arrow-format\0.10.0\arrow-format-0.10.0.jar;D:\yunjisuan\.m2\repository\org\apache\arrow\arrow-memory\0.10.0\arrow-memory-0.10.0.jar;D:\yunjisuan\.m2\repository\joda-time\joda-time\2.9.9\joda-time-2.9.9.jar;D:\yunjisuan\.m2\repository\com\carrotsearch\hppc\0.7.2\hppc-0.7.2.jar;D:\yunjisuan\.m2\repository\com\vlkan\flatbuffers\1.2.0-3f79e055\flatbuffers-1.2.0-3f79e055.jar;D:\yunjisuan\.m2\repository\org\apache\maven\plugins\maven-assembly-plugin\3.0.0\maven-assembly-plugin-3.0.0.jar;D:\yunjisuan\.m2\repository\org\apache\maven\maven-plugin-api\3.0\maven-plugin-api-3.0.jar;D:\yunjisuan\.m2\repository\org\sonatype\sisu\sisu-inject-plexus\1.4.2\sisu-inject-plexus-1.4.2.jar;D:\yunjisuan\.m2\repository\org\sonatype\sisu\sisu-inject-bean\1.4.2\sisu-inject-bean-1.4.2.jar;D:\yunjisuan\.m2\repository\org\sonatype\sisu\sisu-guice\2.1.7\sisu-guice-2.1.7-noaop.jar;D:\yunjisuan\.m2\repository\org\apache\maven\maven-core\3.0\maven-core-3.0.jar;D:\yunjisuan\.m2\repository\org\apache\maven\maven-settings\3.0\maven-settings-3.0.jar;D:\yunjisuan\.m2\repository\org\apache\maven\maven-settings-builder\3.0\maven-settings-builder-3.0.jar;D:\yunjisuan\.m2\repository\org\apache\maven\maven-repository-metadata\3.0\maven-repository-metadata-3.0.jar;D:\yunjisuan\.m2\repository\org\apache\maven\maven-model-builder\3.0\maven-model-builder-3.0.jar;D:\yunjisuan\.m2\repository\org\apache\maven\maven-aether-provider\3.0\maven-aether-provider-3.0.jar;D:\yunjisuan\.m2\repository\org\sonatype\aether\aether-impl\1.7\aether-impl-1.7.jar;D:\yunjisuan\.m2\repository\org\sonatype\aether\aeth
er-spi\1.7\aether-spi-1.7.jar;D:\yunjisuan\.m2\repository\org\sonatype\aether\aether-api\1.7\aether-api-1.7.jar;D:\yunjisuan\.m2\repository\org\sonatype\aether\aether-util\1.7\aether-util-1.7.jar;D:\yunjisuan\.m2\repository\org\codehaus\plexus\plexus-classworlds\2.2.3\plexus-classworlds-2.2.3.jar;D:\yunjisuan\.m2\repository\org\codehaus\plexus\plexus-component-annotations\1.5.5\plexus-component-annotations-1.5.5.jar;D:\yunjisuan\.m2\repository\org\sonatype\plexus\plexus-sec-dispatcher\1.3\plexus-sec-dispatcher-1.3.jar;D:\yunjisuan\.m2\repository\org\sonatype\plexus\plexus-cipher\1.4\plexus-cipher-1.4.jar;D:\yunjisuan\.m2\repository\org\apache\maven\maven-artifact\3.0\maven-artifact-3.0.jar;D:\yunjisuan\.m2\repository\org\apache\maven\maven-model\3.0\maven-model-3.0.jar;D:\yunjisuan\.m2\repository\org\apache\maven\shared\maven-common-artifact-filters\3.0.1\maven-common-artifact-filters-3.0.1.jar;D:\yunjisuan\.m2\repository\org\apache\maven\shared\maven-shared-utils\3.1.0\maven-shared-utils-3.1.0.jar;D:\yunjisuan\.m2\repository\org\apache\maven\shared\maven-artifact-transfer\0.9.0\maven-artifact-transfer-0.9.0.jar;D:\yunjisuan\.m2\repository\org\codehaus\plexus\plexus-interpolation\1.24\plexus-interpolation-1.24.jar;D:\yunjisuan\.m2\repository\org\codehaus\plexus\plexus-archiver\3.4\plexus-archiver-3.4.jar;D:\yunjisuan\.m2\repository\org\iq80\snappy\snappy\0.4\snappy-0.4.jar;D:\yunjisuan\.m2\repository\org\apache\maven\shared\file-management\3.0.0\file-management-3.0.0.jar;D:\yunjisuan\.m2\repository\org\apache\maven\shared\maven-shared-io\3.0.0\maven-shared-io-3.0.0.jar;D:\yunjisuan\.m2\repository\org\apache\maven\maven-compat\3.0\maven-compat-3.0.jar;D:\yunjisuan\.m2\repository\org\apache\maven\wagon\wagon-provider-api\2.10\wagon-provider-api-2.10.jar;D:\yunjisuan\.m2\repository\commons-io\commons-io\2.5\commons-io-2.5.jar;D:\yunjisuan\.m2\repository\org\apache\maven\shared\maven-filtering\3.1.1\maven-filtering-3.1.1.jar;D:\yunjisuan\.m2\repository\org\sonatype\plexus\plexus-build-api\0.0.7\plexus-build-api-0.0.7.jar;D:\yunjisuan\.m2\repository\org\codehaus\plexus\plexus-io\2.7.1\plexus-io-2.7.1.jar;D:\yunjisuan\.m2\repository\org\apache\maven\maven-archiver\3.1.1\maven-archiver-3.1.1.jar;D:\yunjisuan\.m2\repository\org\codehaus\plexus\plexus-utils\3.0.24\plexus-utils-3.0.24.jar;D:\yunjisuan\.m2\repository\commons-codec\commons-codec\1.6\commons-codec-1.6.jar;D:\yunjisuan\.m2\repository\org\scala-lang\scala-library\2.11.12\scala-library-2.11.12.jar;D:\yunjisuan\scala-2.11.12\scala-2.11.12\lib\scala-actors-2.11.0.jar;D:\yunjisuan\scala-2.11.12\scala-2.11.12\lib\scala-actors-migration_2.11-1.1.0.jar;D:\yunjisuan\scala-2.11.12\scala-2.11.12\lib\scala-library.jar;D:\yunjisuan\scala-2.11.12\scala-2.11.12\lib\scala-parser-combinators_2.11-1.0.4.jar;D:\yunjisuan\scala-2.11.12\scala-2.11.12\lib\scala-reflect.jar;D:\yunjisuan\scala-2.11.12\scala-2.11.12\lib\scala-swing_2.11-1.0.2.jar;D:\yunjisuan\scala-2.11.12\scala-2.11.12\lib\scala-xml_2.11-1.0.5.jar" wazi.waye Exception in thread "main" org.apache.spark.sql.AnalysisException: cannot resolve '`date`' given input columns: [货号, 规格, 日期, 出库];; 'Project [货号#10, 规格#11, 日期#12, 出库#13, to_date('date, Some(yyyy-MM-dd)) AS date#18] +- Relation[货号#10,规格#11,日期#12,出库#13] csv at org.apache.spark.sql.catalyst.analysis.package$AnalysisErrorAt.failAnalysis(package.scala:42) at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1$$anonfun$apply$3.applyOrElse(CheckAnalysis.scala:111) at 
org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1$$anonfun$apply$3.applyOrElse(CheckAnalysis.scala:108) at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:280) at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$transformUp$1.apply(TreeNode.scala:280) at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:69) at org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:279) at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:277) at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:277) at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:328) at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:186) at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:326) at org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:277) at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:277) at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:277) at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:328) at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:186) at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:326) at org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:277) at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:277) at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:277) at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:328) at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:186) at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:326) at org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:277) at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:277) at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:277) at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:328) at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:186) at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:326) at org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:277) at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:277) at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$3.apply(TreeNode.scala:277) at org.apache.spark.sql.catalyst.trees.TreeNode$$anonfun$4.apply(TreeNode.scala:328) at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:186) at org.apache.spark.sql.catalyst.trees.TreeNode.mapChildren(TreeNode.scala:326) at org.apache.spark.sql.catalyst.trees.TreeNode.transformUp(TreeNode.scala:277) at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$transformExpressionsUp$1.apply(QueryPlan.scala:93) at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$transformExpressionsUp$1.apply(QueryPlan.scala:93) at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$1.apply(QueryPlan.scala:105) at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$1.apply(QueryPlan.scala:105) at org.apache.spark.sql.catalyst.trees.CurrentOrigin$.withOrigin(TreeNode.scala:69) at 
org.apache.spark.sql.catalyst.plans.QueryPlan.transformExpression$1(QueryPlan.scala:104) at org.apache.spark.sql.catalyst.plans.QueryPlan.org$apache$spark$sql$catalyst$plans$QueryPlan$$recursiveTransform$1(QueryPlan.scala:116) at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$org$apache$spark$sql$catalyst$plans$QueryPlan$$recursiveTransform$1$2.apply(QueryPlan.scala:121) at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234) at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234) at scala.collection.immutable.List.foreach(List.scala:392) at scala.collection.TraversableLike$class.map(TraversableLike.scala:234) at scala.collection.immutable.List.map(List.scala:296) at org.apache.spark.sql.catalyst.plans.QueryPlan.org$apache$spark$sql$catalyst$plans$QueryPlan$$recursiveTransform$1(QueryPlan.scala:121) at org.apache.spark.sql.catalyst.plans.QueryPlan$$anonfun$2.apply(QueryPlan.scala:126) at org.apache.spark.sql.catalyst.trees.TreeNode.mapProductIterator(TreeNode.scala:186) at org.apache.spark.sql.catalyst.plans.QueryPlan.mapExpressions(QueryPlan.scala:126) at org.apache.spark.sql.catalyst.plans.QueryPlan.transformExpressionsUp(QueryPlan.scala:93) at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1.apply(CheckAnalysis.scala:108) at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$$anonfun$checkAnalysis$1.apply(CheckAnalysis.scala:86) at org.apache.spark.sql.catalyst.trees.TreeNode.foreachUp(TreeNode.scala:126) at org.apache.spark.sql.catalyst.analysis.CheckAnalysis$class.checkAnalysis(CheckAnalysis.scala:86) at org.apache.spark.sql.catalyst.analysis.Analyzer.checkAnalysis(Analyzer.scala:95) at org.apache.spark.sql.catalyst.analysis.Analyzer$$anonfun$executeAndCheck$1.apply(Analyzer.scala:108) at org.apache.spark.sql.catalyst.analysis.Analyzer$$anonfun$executeAndCheck$1.apply(Analyzer.scala:105) at org.apache.spark.sql.catalyst.plans.logical.AnalysisHelper$.markInAnalyzer(AnalysisHelper.scala:201) at org.apache.spark.sql.catalyst.analysis.Analyzer.executeAndCheck(Analyzer.scala:105) at org.apache.spark.sql.execution.QueryExecution.analyzed$lzycompute(QueryExecution.scala:58) at org.apache.spark.sql.execution.QueryExecution.analyzed(QueryExecution.scala:56) at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:48) at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:78) at org.apache.spark.sql.Dataset.org$apache$spark$sql$Dataset$$withPlan(Dataset.scala:3412) at org.apache.spark.sql.Dataset.select(Dataset.scala:1340) at org.apache.spark.sql.Dataset.withColumns(Dataset.scala:2258) at org.apache.spark.sql.Dataset.withColumn(Dataset.scala:2225) at wazi.waye$.main(waye.scala:23) at wazi.waye.main(waye.scala) 进程已结束,退出代码1
### Resolving the Spark SQL "cannot resolve column `date`" error

#### 1. Why the error occurs

The message `cannot resolve 'date' given input columns` means that the current DataFrame's schema contains no column named `date`. Typical causes:

- The actual column names in the CSV file do not match the names used in the code.
- The `header` option was not set when loading the CSV, so Spark assigned default names (`_c0`, `_c1`, ...) instead of taking column names from the first row.
- The data file itself is malformed, for example a field is missing or the format is not what the code expects.

---

#### 2. Verify the input schema

Before anything else, print the raw DataFrame's schema and first few records to confirm the data looks as expected:

```scala
// Assumes `spark` is an active SparkSession.
val rawData = spark.read.option("header", "true").option("inferSchema", "true").csv("D:/kucun.txt")
rawData.printSchema() // check whether a column named `date` exists
rawData.show(5)       // inspect the first five rows
```

If `date` does not appear in the schema, align the code with the actual column names in the source file. (The error message above already shows that the input columns are 货号, 规格, 日期, 出库, so the date column is really named 日期.)

---

#### 3. Harden the date-conversion logic

Suppose the checks above confirm that a `date` column exists but the job still fails: some rows may contain empty or otherwise invalid values. Two ways to make the pipeline more robust:

##### (1) Filter out invalid records

Use `filter` to exclude rows that would make parsing fail:

```scala
import org.apache.spark.sql.functions._

val filteredData = rawData.filter(col("date").isNotNull && col("date") =!= "")
```

##### (2) Parse dates in failure-tolerant mode

Use the `try_to_date` SQL function instead of the standard one, so that malformed values are skipped rather than aborting the whole job:

```scala
val processedData = filteredData.withColumn(
  "parsedDate",
  expr("try_to_date(date, 'yyyy-MM-dd')")
).na.drop(Seq("parsedDate")) // drop rows whose date could not be parsed
```

This tolerates a range of dirty inputs and then cleans out the rows that still failed to convert.

---

#### 4. The full, updated pipeline

Putting the pieces together:

```scala
package wazi

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.functions._

object DateParsingFix {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("ResolveCannotParseDateIssue")
      .master("local[*]")
      .getOrCreate()

    try {
      // Step 1: load raw data from the CSV file.
      val rawData = spark.read
        .option("header", "true")
        .option("inferSchema", "true")
        .csv("D:/kucun.txt")

      // Debugging step: print schema and sample rows to verify the structure.
      rawData.printSchema()
      rawData.show(5)

      // Step 2: filter out null or empty values in the target column.
      val filteredData = rawData.filter(col("date").isNotNull && col("date") =!= "")

      // Step 3: attempt date parsing in failure-tolerant mode.
      val processedData = filteredData.withColumn(
        "parsedDate",
        expr("try_to_date(date, 'yyyy-MM-dd')")
      ).na.drop(Seq("parsedDate")) // drop any remaining bad records

      // Optional debugging output after processing.
      processedData.printSchema()
      processedData.show(5)

      // Save the results back to storage if needed...
    } catch {
      case e: Exception => e.printStackTrace()
    } finally {
      spark.stop()
    }
  }
}
```

This script resolves the original error and adds guards that keep similar data-quality problems from causing wider damage later.

---

### Summary

For the "cannot resolve 'date'" class of errors, the diagnosis and fixes above boil down to three points:

1. **Inspect the source data first** — print the schema and sample rows to locate the root cause quickly.
2. **Use fault-tolerant built-ins** — prefer conversion functions that degrade gracefully on dirty input.
3. **Build a robust pipeline** — add filtering and null handling so the job keeps running despite bad records.

Hope this helps!
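A version note worth appending (not in the original answer): the question's classpath shows Spark 2.4.6, and the `try_*` family of SQL functions only arrived in Spark 3.x, so `expr("try_to_date(...)")` would itself fail to resolve there; verify that your distribution actually provides `try_to_date` before relying on it. On Spark 2.4, plain `to_date` already returns null for values it cannot parse, so an equivalent, version-safe sketch (the path and column name follow the question's CSV; `parsedDate` is a name chosen here) is:

```scala
import org.apache.spark.sql.functions._

// Assumes `spark` is an active SparkSession. The date column in the
// question's schema is actually named 日期, not `date`.
val raw = spark.read.option("header", "true").csv("D:/kucun.txt")

val withDate = raw
  .withColumn("parsedDate", to_date(col("日期"), "yyyy-MM-dd")) // unparseable values become null
  .na.drop(Seq("parsedDate"))                                   // drop rows that failed to parse
```

Dropping the nulls afterwards gives the same effect as the failure-tolerant parse, and referencing the real column name 日期 removes the AnalysisException at its root.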