[ https://issues.apache.org/jira/browse/SPARK-12146?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Shivaram Venkataraman resolved SPARK-12146.
-------------------------------------------
    Resolution: Fixed
       Assignee: Yanbo Liang
  Fix Version/s: 2.0.0
                 1.6.1

Resolved by https://github.com/apache/spark/pull/10145

> SparkR jsonFile should support multiple input files
> ---------------------------------------------------
>
>                 Key: SPARK-12146
>                 URL: https://issues.apache.org/jira/browse/SPARK-12146
>             Project: Spark
>          Issue Type: Sub-task
>          Components: SparkR
>            Reporter: Yanbo Liang
>            Assignee: Yanbo Liang
>             Fix For: 1.6.1, 2.0.0
>
>
> This bug is easy to reproduce: jsonFile does not accept a character vector of paths as its argument.
> {code}
> > path <- c("/path/to/dir1","/path/to/dir2")
> > raw.terror <- jsonFile(sqlContext, path)
> 15/12/03 15:59:55 ERROR RBackendHandler: jsonFile on 1 failed
> Error in invokeJava(isStatic = FALSE, objId$id, methodName, ...) :
>   java.io.IOException: No input paths specified in job
>         at org.apache.hadoop.mapred.FileInputFormat.listStatus(FileInputFormat.java:201)
>         at org.apache.hadoop.mapred.FileInputFormat.getSplits(FileInputFormat.java:313)
>         at org.apache.spark.rdd.HadoopRDD.getPartitions(HadoopRDD.scala:207)
>         at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
>         at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
>         at scala.Option.getOrElse(Option.scala:120)
>         at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
>         at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
>         at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:239)
>         at org.apache.spark.rdd.RDD$$anonfun$partitions$2.apply(RDD.scala:237)
>         at scala.Option.getOrElse(Option.scala:120)
>         at org.apache.spark.rdd.RDD.partitions(RDD.scala:237)
>         at org.apache.spark.rdd.MapPartitionsRDD.getPartitions(MapPartitionsRDD.scala:35)
>         at org.apache.spark.rdd.RDD$$anonfun$partitions$2
> {code}

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
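With the fix referenced above, the reproduction case should succeed: jsonFile accepts a character vector of paths and reads them into one DataFrame. A minimal sketch, assuming SparkR 1.6.1 or later with an initialized sqlContext (the paths are placeholders from the report):

{code}
# Assumes a running SparkR session with sqlContext initialized.
path <- c("/path/to/dir1", "/path/to/dir2")
# jsonFile now reads all listed paths into a single DataFrame
# instead of failing with "No input paths specified in job".
df <- jsonFile(sqlContext, path)
{code}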