from pyspark.context import SparkContext
from pyspark.sql.session import SparkSession

sc = SparkContext('local')
spark = SparkSession(sc)
df = spark.read.csv('aaa.csv')
Calling spark.read.csv raises: NameError: name 'spark' is not defined