Re: reading files recursively using spark

2014-12-19 Thread Hafiz Mujadid
Thanks, bethesda!

But if we have a structure like this:

a/b/a.txt
a/c/c.txt
a/d/e/e.txt

then how can we handle this case?








Re: reading files recursively using spark

2014-12-19 Thread bethesda
On HDFS I created:

/one/one.txt  # contains text "one"
/one/two/two.txt  # contains text "two"

Then:  

val data = sc.textFile("/one/*")
data.collect

This returned:

Array(one, two)

So the above path designation appears to automatically recurse for you.
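
For the deeper layout in the follow-up above, an untested guess: textFile
accepts a comma-separated list of paths, so one glob per level of nesting
should cover all three files (the paths a/b/a.txt, a/c/c.txt and a/d/e/e.txt
are just the layout from that message):

// one glob per depth, joined with commas
val nested = sc.textFile("a/*/*.txt,a/*/*/*.txt")
nested.collect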







Re: reading files recursively using spark

2014-12-19 Thread madhu phatak
Hi,
You can use Hadoop's FileInputFormat API together with Spark's
newAPIHadoopFile to get recursion. For more on the topic, see
http://stackoverflow.com/questions/8114579/using-fileinputformat-addinputpaths-to-recursively-add-hdfs-path
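
A rough spark-shell sketch of that approach (untested here; assumes Spark on
Hadoop 2.x, and the root path "a" is just the layout from the earlier
message):

import org.apache.hadoop.io.{LongWritable, Text}
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat

// tell the new-API FileInputFormat to descend into sub-directories
sc.hadoopConfiguration.set(
  "mapreduce.input.fileinputformat.input.dir.recursive", "true")

// read everything under "a", keeping only the line text
// from the (offset, line) pairs TextInputFormat produces
val lines = sc.newAPIHadoopFile(
    "a",
    classOf[TextInputFormat],
    classOf[LongWritable],
    classOf[Text],
    sc.hadoopConfiguration)
  .map { case (_, text) => text.toString }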

On Fri, Dec 19, 2014 at 4:50 PM, Sean Owen  wrote:
>
> How about using the HDFS API to create a list of all the directories
> to read from, and passing them as a comma-joined string to
> sc.textFile?
>
> On Fri, Dec 19, 2014 at 11:13 AM, Hafiz Mujadid
>  wrote:
> > Hi experts!
> >
> > What is an efficient way to read all files from a directory and its
> > sub-directories using Spark? Currently I move all the files from the
> > directory and its sub-directories into a temporary directory and then
> > read them all with sc.textFile, but I would like to avoid the cost of
> > that move.
> >
> > Thanks
> >

-- 
Regards,
Madhukara Phatak
http://www.madhukaraphatak.com


Re: reading files recursively using spark

2014-12-19 Thread Sean Owen
How about using the HDFS API to create a list of all the directories
to read from, and passing them as a comma-joined string to
sc.textFile?
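
Something along these lines, as an untested sketch (here listing the files
themselves rather than the directories, and using "a" only because that is
the layout from the original question):

import org.apache.hadoop.fs.{FileSystem, Path}

val fs = FileSystem.get(sc.hadoopConfiguration)

// recursively list every file under the root, then hand the paths
// to textFile as a single comma-joined string
val it = fs.listFiles(new Path("a"), true)  // true = recurse
val files = scala.collection.mutable.ArrayBuffer[String]()
while (it.hasNext) files += it.next().getPath.toString

val data = sc.textFile(files.mkString(","))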

On Fri, Dec 19, 2014 at 11:13 AM, Hafiz Mujadid
 wrote:
> Hi experts!
>
> What is an efficient way to read all files from a directory and its
> sub-directories using Spark? Currently I move all the files from the
> directory and its sub-directories into a temporary directory and then
> read them all with sc.textFile, but I would like to avoid the cost of
> that move.
>
> Thanks
>
>
>
