> that complex. Which step were you stuck at?
>
> bit1...@163.com
>
> From: Anil Jagtap
> Date: 2015-01-03 14:32
> To: user
> Subject: Multinode setup..
> Dear All,
>
> I'm trying to set up a multi-node cluster and I found millions of articles on
> how to con
Dear All,
Just wanted to know if there is a way to copy multiple files using hadoop
fs -put. Instead of specifying individual file names, I'd provide wildcards
and the respective files should get copied.
Thank You.
Rgds, Anil
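For what it's worth, wildcards do work here: the local shell expands the pattern before `hadoop fs -put` runs, and `-put` accepts multiple local sources followed by a single HDFS destination. The paths below are hypothetical:

```shell
# The shell expands *.csv into an explicit file list before hadoop runs;
# hadoop fs -put takes one or more local sources and one HDFS destination.
hadoop fs -put /local/data/*.csv /user/anil/input/

# copyFromLocal behaves the same way for local sources:
hadoop fs -copyFromLocal /local/data/*.csv /user/anil/input/
```

Note that if the pattern matches no files, most shells pass the literal string `*.csv` through unchanged and the command fails with a "No such file or directory" error.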
/hadoop/yarn commands on linux (which you already have configured).
>
> Cheers,
>
> Rich
>
> *Rich Haase* | Sr. Software Engineer | Pandora
> m 303.887.1146 | rha...@pandora.com
>
> From: Anil Jagtap
> Reply-To: "user@hadoop.apache.org"
> Date: Wednesday,
wrote:
>
> Am 17.12.2014 um 23:29 schrieb Anil Jagtap:
> > Dear All,
> >
> > I'm pretty new to Hadoop technology and Linux environment hence
> > struggling even to find solutions for the basic stuff.
> >
> > For now, Hortonworks Sandbox is working fine for
Yes, I can do that, but I have connected from my Mac OS terminal to Linux
using SSH. Now when I run the ls command, it shows me the list of files &
folders from Linux, not from Mac OS. I have files which I need to put onto
Hadoop directly from Mac OS. So, something like below.
From Mac OS Terminal:
[root
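One way around this, sketched below with hypothetical host names and paths: either scp the files to the Linux box first and run `hadoop fs -put` there, or stream a file over SSH, since `-put` reads from stdin when the source is `-`.

```shell
# Option 1: copy to the Linux box, then load into HDFS from there.
scp ~/data/*.csv root@sandbox-host:/tmp/
ssh root@sandbox-host 'hadoop fs -put /tmp/*.csv /user/root/input/'

# Option 2: stream one file over SSH with no intermediate copy;
# a source of "-" makes hadoop fs -put read from stdin.
cat ~/data/sales.csv | ssh root@sandbox-host \
  'hadoop fs -put - /user/root/input/sales.csv'
```

Option 2 avoids staging space on the Linux box, but needs one invocation per file since the stream maps to a single destination file.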
Dear All,
I'm pretty new to Hadoop technology and the Linux environment, hence
struggling even to find solutions for the basic stuff.
For now, Hortonworks Sandbox is working fine for me and I managed to
connect to it through SSH.
Now I have some csv files in my Mac OS folders which I want to copy onto
H
Possibly all have got their PhD degrees in Hadoop and they don’t need
the group for knowledge sharing..
I think the best way to keep everyone busy is to ask the experts to share
their experiences and use cases, and to help the newcomers…
So far I have seen many emails from Indians… why not