[ https://issues.apache.org/jira/browse/SPARK-26379?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Apache Spark reassigned SPARK-26379:
------------------------------------

    Assignee: Apache Spark  (was: Jungtaek Lim)

> Fix issue on adding current_timestamp/current_date to streaming query
> ---------------------------------------------------------------------
>
>                 Key: SPARK-26379
>                 URL: https://issues.apache.org/jira/browse/SPARK-26379
>             Project: Spark
>          Issue Type: Bug
>          Components: Structured Streaming
>    Affects Versions: 2.3.0, 2.3.1, 2.3.2, 2.4.0, 3.0.0
>            Reporter: Kailash Gupta
>            Assignee: Apache Spark
>            Priority: Major
>             Fix For: 2.4.1, 3.0.0
>
> While using withColumn to add a column to a structured streaming Dataset, I am getting the following exception:
> {noformat}
> org.apache.spark.sql.catalyst.analysis.UnresolvedException: Invalid call to dataType on unresolved object, tree: 'timestamp
> {noformat}
> The following is sample code reproducing the issue:
> {code:java}
> import org.apache.spark.sql.Dataset;
> import org.apache.spark.sql.Row;
> import org.apache.spark.sql.SparkSession;
> import org.apache.spark.sql.functions;
> import org.apache.spark.sql.streaming.StreamingQuery;
> import org.apache.spark.sql.streaming.Trigger;
> import org.apache.spark.sql.types.DataTypes;
> import org.apache.spark.sql.types.Metadata;
> import org.apache.spark.sql.types.StructField;
> import org.apache.spark.sql.types.StructType;
>
> final String path = "path_to_input_directory";
> final StructType schema = new StructType(new StructField[] {
>     new StructField("word", DataTypes.StringType, false, Metadata.empty()),
>     new StructField("count", DataTypes.IntegerType, false, Metadata.empty()) });
>
> SparkSession sparkSession = SparkSession.builder()
>     .appName("StructuredStreamingIssue")
>     .master("local")
>     .getOrCreate();
>
> Dataset<Row> words = sparkSession.readStream()
>     .option("sep", ",")
>     .schema(schema)
>     .csv(path);
>
> Dataset<Row> wordsWithTimestamp = words.withColumn("timestamp", functions.current_timestamp());
> // wordsWithTimestamp.explain(true);
>
> StreamingQuery query = wordsWithTimestamp.writeStream()
>     .outputMode("update")
>     .option("truncate", "false")
>     .format("console")
>     .trigger(Trigger.ProcessingTime("2 seconds"))
>     .start();
> query.awaitTermination();
> {code}
> The following are the contents of the file present at _path_:
> {noformat}
> a,2
> c,4
> d,2
> r,1
> t,9
> {noformat}
> This seems to work with the 2.2.0 release, but not with 2.3.0 or 2.4.0.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
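For context, the reported failure is specific to streaming analysis: the same `withColumn("timestamp", functions.current_timestamp())` call resolves normally against a batch read. The following is a minimal sketch of that batch equivalent (the path and class name are hypothetical; it assumes a directory of `word,count` CSV files matching the schema above):

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.functions;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.Metadata;
import org.apache.spark.sql.types.StructField;
import org.apache.spark.sql.types.StructType;

public class BatchTimestampCheck {
    public static void main(String[] args) {
        // Hypothetical input directory; use the same CSV data as the streaming repro.
        final String path = "path_to_input_directory";
        final StructType schema = new StructType(new StructField[] {
            new StructField("word", DataTypes.StringType, false, Metadata.empty()),
            new StructField("count", DataTypes.IntegerType, false, Metadata.empty()) });

        SparkSession spark = SparkSession.builder()
            .appName("BatchTimestampCheck")
            .master("local")
            .getOrCreate();

        // Batch read() instead of readStream(): same schema, same transformation.
        Dataset<Row> words = spark.read().option("sep", ",").schema(schema).csv(path);
        Dataset<Row> withTs = words.withColumn("timestamp", functions.current_timestamp());

        // In batch mode the new column resolves during analysis, so this prints
        // word, count, and timestamp rows instead of throwing UnresolvedException.
        withTs.show(false);
        spark.stop();
    }
}
```

If this batch version succeeds against the same input while the streaming version throws, that isolates the bug to how `CurrentTimestamp`/`CurrentDate` are substituted in the streaming query plan, which matches the fix versions (2.4.1, 3.0.0) listed above.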