[ 
https://issues.apache.org/jira/browse/SPARK-16408?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15365658#comment-15365658
 ] 

zenglinxi commented on SPARK-16408:
-----------------------------------

As shown in SPARK-4687 (https://issues.apache.org/jira/browse/SPARK-4687), SparkContext.scala has two addFile functions:
{quote}
def addFile(path: String): Unit = {
  addFile(path, false)
}

def addFile(path: String, recursive: Boolean): Unit = {
  ...
}
{quote}
However, there is no configuration option to turn recursive on or off, and Spark always calls addFile(path) by default, which means recursive is always false. This is why we get the exception.
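The dispatch described above can be sketched in isolation. The FileRegistry class below is a hypothetical stand-in mirroring the two SparkContext.addFile overloads quoted above, not Spark's actual implementation; it shows why the one-argument form can never accept a directory:

```scala
import java.nio.file.{Files, Paths}

// Hypothetical stand-in for the two SparkContext.addFile overloads.
// The one-argument form hard-codes recursive = false, so a directory
// path always hits the exception branch. (Sketch only, not Spark code.)
class FileRegistry {
  def addFile(path: String): Unit =
    addFile(path, recursive = false)

  def addFile(path: String, recursive: Boolean): Unit = {
    val p = Paths.get(path)
    if (Files.isDirectory(p) && !recursive)
      throw new IllegalArgumentException(
        s"Added file $path is a directory and recursive is not turned on.")
    // ... actual registration of the file would happen here ...
  }
}
```

Because Spark SQL's ADD FILE command only ever reaches the one-argument form, the recursive = true branch is unreachable from SQL, and there is no config key to change that.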

> SparkSQL Added file get Exception: is a directory and recursive is not turned 
> on
> --------------------------------------------------------------------------------
>
>                 Key: SPARK-16408
>                 URL: https://issues.apache.org/jira/browse/SPARK-16408
>             Project: Spark
>          Issue Type: Task
>          Components: SQL
>    Affects Versions: 1.6.2
>            Reporter: zenglinxi
>
> when using Spark SQL to execute SQL like:
> {quote}
> add file hdfs://xxx/user/test;
> {quote}
> if the HDFS path (hdfs://xxx/user/test) is a directory, we will get an 
> exception like:
> {quote}
> org.apache.spark.SparkException: Added file hdfs://xxx/user/test is a 
> directory and recursive is not turned on.
>        at org.apache.spark.SparkContext.addFile(SparkContext.scala:1372)
>        at org.apache.spark.SparkContext.addFile(SparkContext.scala:1340)
>        at org.apache.spark.sql.hive.execution.AddFile.run(commands.scala:117)
>        at 
> org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult$lzycompute(commands.scala:58)
>        at 
> org.apache.spark.sql.execution.ExecutedCommand.sideEffectResult(commands.scala:56)
>        at 
> org.apache.spark.sql.execution.ExecutedCommand.doExecute(commands.scala:70)
> {quote}



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
