Gerald,
To unsubscribe from this list, you need to send an email to
user-unsubscr...@hadoop.apache.org.
On Wed, Mar 16, 2016 at 4:39 AM, Gerald-G wrote:
>
>
/Federation.html
On Fri, Sep 25, 2015 at 12:42 AM, Ashish Kumar9 <ashis...@in.ibm.com> wrote:
> This is interesting. Can you share any blog/document that talks about
> multi-volume HDFS instances?
>
> Thanks and Regards,
> Ashish Kumar
>
>
> From: Corey Nolet <cjn
If the hardware is drastically different, I would think a multi-volume HDFS
instance would be a good idea (put like-hardware in the same volumes).
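For reference, a minimal federation setup along those lines declares multiple independent nameservices in hdfs-site.xml, each backed by its own NameNode while sharing the DataNode pool. This is only a sketch; the nameservice IDs (ns1, ns2) and hostnames below are placeholders, and ports should match your deployment:

```xml
<!-- hdfs-site.xml: two federated namespaces sharing the same DataNodes.
     Nameservice IDs and hostnames are illustrative placeholders. -->
<configuration>
  <property>
    <name>dfs.nameservices</name>
    <value>ns1,ns2</value>
  </property>
  <property>
    <name>dfs.namenode.rpc-address.ns1</name>
    <value>nn-host1:8020</value>
  </property>
  <property>
    <name>dfs.namenode.http-address.ns1</name>
    <value>nn-host1:50070</value>
  </property>
  <property>
    <name>dfs.namenode.rpc-address.ns2</name>
    <value>nn-host2:8020</value>
  </property>
  <property>
    <name>dfs.namenode.http-address.ns2</name>
    <value>nn-host2:50070</value>
  </property>
</configuration>
```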
On Mon, Sep 21, 2015 at 3:29 PM, Tushar Kapila wrote:
> Would only matter if OS specific communication was being used between
>
Agreed.
Apache user lists archive questions and answers specifically for the
purpose of helping the larger community navigate its projects. It is not a
place for classifieds and employment information.
On Sun, May 17, 2015 at 9:24 PM, Billy Watson williamrwat...@gmail.com
wrote:
Uh, it's not
Hitarth,
I don't know how much direction you are looking for with regard to the
formats of the times, but you can certainly read both files into the third
mapreduce job using the FileInputFormat by comma-separating the paths to
the files. The blocks for both files will essentially be unioned.
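A rough sketch of that driver, untested here and with placeholder paths and job name (FileInputFormat.addInputPaths does accept a comma-separated list of paths):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class MultiInputDriver {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "union-two-inputs");
    // Comma-separated paths: splits from both files feed the same job,
    // so their blocks are effectively unioned. Paths are placeholders.
    FileInputFormat.addInputPaths(job, "/data/file1,/data/file2");
    FileOutputFormat.setOutputPath(job, new Path("/data/out"));
    // Mapper/reducer classes would be set here before submission.
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```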
I'm looking @ this page: http://hadoop.apache.org/docs/stable/
Is it a typo that Hadoop 2.6.0 is based on 2.4.1?
Thanks.
send an email to user-unsubscr...@hadoop.apache.org to unsubscribe.
On Wed, Nov 26, 2014 at 3:08 PM, Li Chen ahli1...@gmail.com wrote:
Please unsubscribe me, too.
Li
On Wed, Nov 26, 2014 at 3:03 PM, Sufi Nawaz s...@eaiti.com wrote:
Please suggest how to unsubscribe from this list.
Thank
I was playing around in the Spark shell, newing up an instance of Job
that I could use to configure the InputFormat for a job. By default, the
Scala shell prints the result of every command typed. It throws an
exception when it prints the newly created instance of Job because it
looks like
)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:75)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
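If this is the issue I suspect, Hadoop's Job.toString checks that the job is in the RUNNING state and throws IllegalStateException while the job is still being defined, so it is the shell's echoing that trips it rather than the Job itself. One workaround sketch, using the Scala REPL's :silent toggle to suppress result printing (Configuration use below is illustrative):

```scala
// In the Spark (Scala) shell:
// :silent            <- toggles off REPL result printing, so the shell never
//                       calls toString on the not-yet-running Job
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.mapreduce.Job

val job = Job.getInstance(new Configuration())
// ... configure the InputFormat and other job settings here ...
// :silent            <- toggle result printing back on when done
```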
On Tue, Nov 25, 2014 at 9:39 PM, Rohith Sharma K S
rohithsharm...@huawei.com wrote:
Could you give the error message or stack trace?
*From:* Corey Nolet [mailto:cjno...@gmail.com]
*Sent