Hi,
I know I can start Hadoop jobs like this:
hadoop jar myJob.jar org.foo.bar.MainClass --arg1 foo --arg2 bar
Now I would like to do this from Java code. My app is running within the
cluster and is started with the whole Hadoop classpath. How can I start
the job by jar (found someth
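Since the message is cut off, here is a minimal sketch of one way to do it: shelling out to the same `hadoop jar` command you already use, from Java. The jar, class, and argument names are the ones from your command line; everything here is plain JDK, so it also works outside the cluster. Because your app is already started with the full Hadoop classpath, two alternatives exist: call `org.apache.hadoop.util.RunJar` directly (that is the class the `hadoop jar` command invokes), or build the job with the `org.apache.hadoop.mapreduce.Job` API and call `waitForCompletion(true)`.

```java
import java.io.IOException;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Sketch: launch a packaged MapReduce job from Java by invoking the
// `hadoop jar` command, exactly as on the CLI.
public class JobLauncher {

    // Build the command line: hadoop jar <jar> <mainClass> <args...>
    static List<String> buildCommand(String jar, String mainClass, String... args) {
        List<String> cmd = new ArrayList<>(Arrays.asList("hadoop", "jar", jar, mainClass));
        cmd.addAll(Arrays.asList(args));
        return cmd;
    }

    public static void main(String[] argv) {
        List<String> cmd = buildCommand("myJob.jar", "org.foo.bar.MainClass",
                "--arg1", "foo", "--arg2", "bar");
        try {
            // Inherit stdout/stderr so the job's console output stays visible.
            Process p = new ProcessBuilder(cmd).inheritIO().start();
            System.out.println("job exited with code " + p.waitFor());
        } catch (IOException | InterruptedException e) {
            // The hadoop binary is not on PATH (e.g. outside the cluster).
            System.out.println("could not launch hadoop: " + e.getMessage());
        }
    }
}
```

The shell-out variant keeps the job in a separate JVM, so a job crash cannot take your app down; the `Job` API variant gives you progress callbacks and counters in-process.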
The hadoop-daemon.sh script prints the "no $command to stop" message if it doesn't find
the pid file.
You should echo the $pid variable and check whether you have a correct pid file there.
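To make that check concrete: hadoop-daemon.sh looks for a pid file, reads the pid, and probes the process. The sketch below simulates that logic with a scratch directory and its own pid (assumption: on a real node the directory is wherever hadoop-env.sh points $HADOOP_PID_DIR, and the file is named hadoop-<user>-datanode.pid):

```shell
# Simulate hadoop-daemon.sh's "is the daemon running?" check.
PID_DIR=$(mktemp -d)                       # stand-in for $HADOOP_PID_DIR
PID_FILE="$PID_DIR/hadoop-$(whoami)-datanode.pid"

echo $$ > "$PID_FILE"                      # simulate a running daemon (our own pid)
if [ -f "$PID_FILE" ] && kill -0 "$(cat "$PID_FILE")" 2>/dev/null; then
  STATUS="running"
else
  STATUS="no datanode to stop"
fi
echo "$STATUS"

rm -f "$PID_FILE"                          # pid file gone: this is the failing case
if [ -f "$PID_FILE" ] && kill -0 "$(cat "$PID_FILE")" 2>/dev/null; then
  STATUS2="running"
else
  STATUS2="no datanode to stop"            # what the script reports in your situation
fi
echo "$STATUS2"
```

So "no datanode to stop" means either the pid file is missing (wrong $HADOOP_PID_DIR, or unwritable at start time) or it points at a dead process.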
Ulul
On 02/03/2015 13:53, Daniel Klinger wrote:
Thanks for your help. But unfortunately this didn't do the job. Here'
to stop
1. Stop the service.
2. Change the ownership of the log and pid directories back to hdfs.
3. Start the service as the hdfs user.
This will resolve the issue.
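As shell commands, the three steps above might look like this. The service name and directory paths are assumptions based on a common layout; substitute your own (these are admin commands against a live node, so there is nothing to run here):

```shell
# 1. Stop the service (init-script name is an assumption)
sudo service hadoop-hdfs-datanode stop

# 2. Re-apply ownership of the log and pid directories to hdfs (paths are assumptions)
sudo chown -R hdfs:hadoop /var/log/hadoop /var/run/hadoop

# 3. Start the daemon again as the hdfs user
sudo -u hdfs /usr/lib/hadoop/sbin/hadoop-daemon.sh start datanode
```

The point of step 2 is that a daemon started as root once can leave root-owned pid/log files behind, which the hdfs user then cannot overwrite.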
On Sun, Mar 1, 2015 at 6:40 PM, Daniel Klinger <d...@web-computing.de> wrote:
Thanks for your answer.
I put t
de is available? (dfsadmin -report, through the WUI)
We need more detail.
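The two checks mentioned look like this (the namenode hostname is a placeholder, and 50070 is the Hadoop 2.x default namenode web-UI port; both need a running cluster, so this is only for reference):

```shell
# Ask the namenode which datanodes it sees (live/dead, capacity per node)
hdfs dfsadmin -report

# Or inspect the namenode web UI
curl http://namenode-host:50070/dfshealth.html
```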
Ulul
On 28/02/2015 22:05, Daniel Klinger wrote:
Thanks, but I know how to kill a process in Linux. This didn't answer the
question of why the command says no Datanode to stop instead of stopping the
Datanode:
$HADOOP_PRE
process using kill -9
On 28 Feb 2015 09:38, "Daniel Klinger" <d...@web-computing.de> wrote:
Hello,
I have used a lot of Hadoop distributions. Now I'm trying to install a pure
Hadoop on a little "cluster" for testing (2 CentOS VMs: one Name+DataNode, one
DataNode). I followed the instructions on the documentation site:
http://hadoop.apache.org/docs/current/hadoop-project-dist/hadoop-common/Clus