I want to wrap this part in a function in PySpark:
from pyspark.sql import functions as F

df = (df.withColumn("January", F.lit(None).cast('double'))
        .withColumn("February", F.lit(None).cast('double'))
        .withColumn("March", F.lit(None).cast('double'))
        .withColumn("April", F.lit(None).cast('double'))
        .withColumn("May", F.lit(None).cast('double'))
        .withColumn("June", F.lit(None).cast('double'))
        .withColumn("July", F.lit(None).cast('double'))
        .withColumn("August", F.lit(None).cast('double'))
        .withColumn("September", F.lit(None).cast('double'))
        .withColumn("October", F.lit(None).cast('double'))
        .withColumn("November", F.lit(None).cast('double'))
        .withColumn("December", F.lit(None).cast('double')))
You can use `withColumns` instead of the chain of `withColumn` calls.
Documentation is here: https://spark.apache.org/docs/latest/api/python/reference/pyspark.sql/api/pyspark.sql.DataFrame.withColumns.html