Hi,

I get an error when I do a saveAsTable, as shown below. I do have write
access to the Hive volume. Any idea why this is happening?

import org.apache.spark.sql.SaveMode

val df = testDF.toDF("id", "rec")
df.printSchema()

// Write as Parquet, partitioned by "id", appending to the table
// backed by the given path.
val options = Map("path" -> "/hive/test.db/")
df.write
  .format("parquet")
  .partitionBy("id")
  .options(options)
  .mode(SaveMode.Append)
  .saveAsTable("sessRecs")

16/02/15 19:04:41 WARN scheduler.TaskSetManager: Lost task 369.0 in stage 2.0 (): org.apache.spark.SparkException: Task failed while writing rows.
    at org.apache.spark.sql.execution.datasources.DynamicPartitionWriterContainer.writeRows(WriterContainer.scala:393)
    at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelation$$anonfun$run$1$$anonfun$apply$mcV$sp$3.apply(InsertIntoHadoopFsRelation.scala:150)
    at org.apache.spark.sql.execution.datasources.InsertIntoHadoopFsRelation$$anonfun$run$1$$anonfun$apply$mcV$sp$3.apply(InsertIntoHadoopFsRelation.scala:150)
    at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:66)
    at org.apache.spark.scheduler.Task.run(Task.scala:88)
    at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:214)
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
    at java.lang.Thread.run(Thread.java:745)
Caused by: java.lang.RuntimeException: Failed to commit task
    at org.apache.spark.sql.execution.datasources.DynamicPartitionWriterContainer.commitTask$2(WriterContainer.scala:422)
    at org.apache.spark.sql.execution.datasources.DynamicPartitionWriterContainer.writeRows(WriterContainer.scala:388)
    ... 8 more
Caused by: java.io.IOException: Error: Read-only file system(30), file: test, user name: test, ID: 12345678
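
Since the root cause is "Read-only file system(30)", here is roughly the kind of check I ran beforehand to convince myself the path is writable (a minimal sketch using the Hadoop FileSystem API; the probe file name is just an illustration, and a writable driver does not prove the executors see the same mount):

import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

// Hypothetical probe: create and then delete a small file under the
// target path to confirm this JVM can write there. This only exercises
// the driver; an executor could still resolve /hive/test.db/ to a
// different, read-only mount, which would make task commits fail.
val fs = FileSystem.get(new Configuration())
val probe = new Path("/hive/test.db/_write_probe")
val out = fs.create(probe)
out.close()
fs.delete(probe, false)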
