Hi,

I am trying an example of Structured Streaming in Spark using the following
piece of code:

val spark = SparkSession
.builder
.appName("testingSTructuredQuery")
.master("local")
.getOrCreate()
import spark.implicits._
val userSchema = new StructType()
.add("name", "string").add("age", "integer")

val csvDF = spark
.readStream
.option("sep", ",")
.schema(userSchema) // Specify schema of the CSV files
.csv("hdfs://192.168.23.107:9000/structuredStreaming/")
csvDF.show
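
For reference, here is a minimal sketch of how I understand the query should be wired up (note: `show` cannot be called directly on a streaming Dataset, so this sketch uses `writeStream` with a console sink instead; the explicit `spark.sql.warehouse.dir` setting is an assumption on my part to sidestep the relative-path warehouse URI on Windows, not something I have verified):

```scala
import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.types.StructType

object StreamSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession
      .builder
      .appName("testingStructuredQuery")
      .master("local")
      // Assumption: an absolute file:/// URI for the warehouse dir may avoid
      // the "Relative path in absolute URI" error on Windows
      .config("spark.sql.warehouse.dir",
        "file:///E:/Scala-Eclips/workspace/spark2/spark-warehouse")
      .getOrCreate()

    val userSchema = new StructType()
      .add("name", "string")
      .add("age", "integer")

    val csvDF = spark
      .readStream
      .option("sep", ",")
      .schema(userSchema) // Specify schema of the CSV files
      .csv("hdfs://192.168.23.107:9000/structuredStreaming/")

    // A streaming Dataset cannot be shown directly; it must be started
    // with a sink via writeStream
    val query = csvDF.writeStream
      .format("console")
      .outputMode("append")
      .start()

    query.awaitTermination()
  }
}
```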

When I run this piece of code, the following exception is raised:

Exception in thread "main" java.lang.IllegalArgumentException: java.net.URISyntaxException: Relative path in absolute URI: file:E:/Scala-Eclips/workspace/spark2/spark-warehouse
at org.apache.hadoop.fs.Path.initialize(Path.java:206)
at org.apache.hadoop.fs.Path.<init>(Path.java:172)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.makeQualifiedPath(SessionCatalog.scala:114)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.createDatabase(SessionCatalog.scala:145)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.<init>(SessionCatalog.scala:89)
at org.apache.spark.sql.internal.SessionState.catalog$lzycompute(SessionState.scala:95)
at org.apache.spark.sql.internal.SessionState.catalog(SessionState.scala:95)
at org.apache.spark.sql.internal.SessionState$$anon$1.<init>(SessionState.scala:112)
at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:112)
at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:111)
at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:49)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:64)
at org.apache.spark.sql.streaming.DataStreamReader.load(DataStreamReader.scala:142)
at org.apache.spark.sql.streaming.DataStreamReader.load(DataStreamReader.scala:153)
at org.apache.spark.sql.streaming.DataStreamReader.csv(DataStreamReader.scala:251)
at com.platalytics.spark.two.test.App$.main(App.scala:22)
at com.platalytics.spark.two.test.App.main(App.scala)


Please guide me in this regard.

Thanks
