Hadoop 2.6 issue

2015-04-01 Thread Anand Murali
Dear All:
I am unable to start Hadoop even after setting HADOOP_INSTALL, JAVA_HOME and
JAVA_PATH. Please find the error message below:
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh --config 
/home/anand_vihar/hadoop-2.6.0/conf
Starting namenodes on [localhost]
localhost: Error: JAVA_HOME is not set and could not be found.
cat: /home/anand_vihar/hadoop-2.6.0/conf/slaves: No such file or directory
Starting secondary namenodes [0.0.0.0]
0.0.0.0: Error: JAVA_HOME is not set and could not be found.


anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ echo $JAVA_HOME
/usr/lib/jvm/java-1.7.0-openjdk-amd64
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ echo $HADOOP_INSTALL
/home/anand_vihar/hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ echo $PATH
:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/home/anand_vihar/hadoop-2.6.0/bin:/home/anand_vihar/hadoop-2.6.0/sbin:/usr/lib/jvm/java-1.7.0-openjdk-amd64:/usr/lib/jvm/java-1.7.0-openjdk-amd64
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ 
I HAVE MADE NO CHANGES IN hadoop-env.sh and run it successfully.


Core-site.xml
<?xml version="1.0"?>
<!-- core-site.xml -->
<configuration>
    <property>
        <name>fs.default.name</name>
        <value>hdfs://localhost/</value>
    </property>
</configuration>

HDFS-site.xml
<?xml version="1.0"?>
<!-- hdfs-site.xml -->
<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
</configuration>

Mapred-site.xml
<?xml version="1.0"?>
<!-- mapred-site.xml -->
<configuration>
    <property>
        <name>mapred.job.tracker</name>
        <value>localhost:8021</value>
    </property>
</configuration>

I shall be thankful if somebody can advise.

Regards,
Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)

Re: Hadoop 2.6 issue

2015-04-01 Thread Jianfeng (Jeff) Zhang

Try to export JAVA_HOME in hadoop-env.sh
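
For example, a minimal sketch of the line (the JDK path is illustrative; use
your own, in the hadoop-env.sh inside the directory passed to --config):

export JAVA_HOME=/usr/lib/jvm/java-1.7.0-openjdk-amd64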


Best Regard,
Jeff Zhang


From: Anand Murali anand_vi...@yahoo.com
Reply-To: user@hadoop.apache.org, Anand Murali anand_vi...@yahoo.com
Date: Wednesday, April 1, 2015 at 2:28 PM
To: user@hadoop.apache.org
Subject: Hadoop 2.6 issue

Dear All:

I am unable to start Hadoop even after setting HADOOP_INSTALL, JAVA_HOME and
JAVA_PATH. Please find the error message below:

anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ start-dfs.sh --config 
/home/anand_vihar/hadoop-2.6.0/conf
Starting namenodes on [localhost]
localhost: Error: JAVA_HOME is not set and could not be found.
cat: /home/anand_vihar/hadoop-2.6.0/conf/slaves: No such file or directory
Starting secondary namenodes [0.0.0.0]
0.0.0.0: Error: JAVA_HOME is not set and could not be found.



anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ echo $JAVA_HOME
/usr/lib/jvm/java-1.7.0-openjdk-amd64
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ echo $HADOOP_INSTALL
/home/anand_vihar/hadoop-2.6.0
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$ echo $PATH
:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/usr/local/games:/home/anand_vihar/hadoop-2.6.0/bin:/home/anand_vihar/hadoop-2.6.0/sbin:/usr/lib/jvm/java-1.7.0-openjdk-amd64:/usr/lib/jvm/java-1.7.0-openjdk-amd64
anand_vihar@Latitude-E5540:~/hadoop-2.6.0$

I HAVE MADE NO CHANGES IN hadoop-env.sh and run it successfully.


Core-site.xml
<?xml version="1.0"?>
<!-- core-site.xml -->
<configuration>
    <property>
        <name>fs.default.name</name>
        <value>hdfs://localhost/</value>
    </property>
</configuration>

HDFS-site.xml
<?xml version="1.0"?>
<!-- hdfs-site.xml -->
<configuration>
    <property>
        <name>dfs.replication</name>
        <value>1</value>
    </property>
</configuration>

Mapred-site.xml
<?xml version="1.0"?>
<!-- mapred-site.xml -->
<configuration>
    <property>
        <name>mapred.job.tracker</name>
        <value>localhost:8021</value>
    </property>
</configuration>

I shall be thankful if somebody can advise.

Regards,


Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)


Re: Simple MapReduce logic using Java API

2015-04-01 Thread Ranadip Chatterjee
Eating up the IOException in the mapper looks suspicious to me. That can
silently consume the input without producing any output. Also check the map
task's sysout logs for your console print output.

As an aside, since you are not doing anything in the reduce, try setting the
number of reduces to 0. That will force the job to be map-only and make it
simpler.
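
In the driver that is a one-line change, e.g.:

job.setNumReduceTasks(0);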

Regards,
Ranadip
On 31 Mar 2015 19:23, Shahab Yunus shahab.yu...@gmail.com wrote:

 What is the reason of using the queue?
 job.getConfiguration().set("mapred.job.queue.name", "exp_dsa");

 Is your mapper or reducer even being called?

 Try adding the @Override annotation to the map/reduce methods as below:

 @Override
 public void map(Object key, Text value, Context context) throws
         IOException, InterruptedException {

 Regards,
 Shahab

 On Tue, Mar 31, 2015 at 3:26 AM, bradford li bradfor...@gmail.com wrote:

 I'm not sure why my Mapper and Reducer have no output. The logic behind
 my code is, given a file of UUIDs (new line separated), I want to use
 `globStatus` to display all the paths to all potential files that the UUID
 might be in. Open and read the file. Each file contains 1-n lines of JSON.
 The UUID is in `event_header.event_id` in the JSON.

 Right now the MapReduce job runs without errors. However, something is
 wrong because I don't have any output. I'm also not sure how to debug
 MapReduce jobs. If someone could provide me a source, that would be awesome!
 The expected output from this program should be

 UUID_1 1
 UUID_2 1
 UUID_3 1
 UUID_4 1
 ...
 ...
 UUID_n 1

 In my logic, the output file should be the UUIDs with a 1 next to them,
 because when a UUID is found a 1 is written, and if not found a 0 is written.
 They should all be 1's because I pulled the UUIDs from the source.

 My Reducer currently does not do anything; I just wanted to see if
 I could get some simple logic working. There are most likely bugs in my
 code, as I don't have an easy way to debug MapReduce jobs.

 Driver:

 public class SearchUUID {

     public static void main(String[] args) throws Exception {
         Configuration conf = new Configuration();
         Job job = Job.getInstance(conf, "UUID Search");
         job.getConfiguration().set("mapred.job.queue.name", "exp_dsa");
         job.setJarByClass(SearchUUID.class);
         job.setMapperClass(UUIDMapper.class);
         job.setReducerClass(UUIDReducer.class);
         job.setOutputKeyClass(Text.class);
         job.setOutputValueClass(Text.class);
         FileInputFormat.addInputPath(job, new Path(args[0]));
         FileOutputFormat.setOutputPath(job, new Path(args[1]));
         System.exit(job.waitForCompletion(true) ? 0 : 1);
     }
 }


 UUIDMapper:

 public class UUIDMapper extends Mapper<Object, Text, Text, Text> {
     public void map(Object key, Text value, Context context)
             throws IOException, InterruptedException {

         try {
             Text one = new Text("1");
             Text zero = new Text("0");

             FileSystem fs = FileSystem.get(new Configuration());
             FileStatus[] paths = fs.globStatus(
                     new Path("/data/path/to/file/d_20150330-1650"));
             for (FileStatus path : paths) {
                 BufferedReader br = new BufferedReader(
                         new InputStreamReader(fs.open(path.getPath())));
                 String json_string = br.readLine();
                 while (json_string != null) {
                     JsonElement jelement = new JsonParser().parse(json_string);
                     JsonObject jsonObject = jelement.getAsJsonObject();
                     jsonObject = jsonObject.getAsJsonObject("event_header");
                     jsonObject = jsonObject.getAsJsonObject("event_id");

                     if (value.toString().equals(jsonObject.getAsString())) {
                         System.out.println(value.toString() +
                                 "slkdjfksajflkjsfdkljsadfk;ljasklfjklasjfklsadl;sjdf");
                         context.write(value, one);
                     } else {
                         context.write(value, zero);
                     }

                     json_string = br.readLine();
                 }
             }
         } catch (IOException failed) {
         }
     }
 }


 Reducer:

 public class UUIDReducer extends Reducer<Text, Text, Text, Text> {

     public void reduce(Text key, Text value, Context context)
             throws IOException, InterruptedException {
         context.write(key, value);
     }
 }





Re: Simple MapReduce logic using Java API

2015-04-01 Thread Harshit Mathur
Why are you reading the files with a BufferedReader in the map function?

The problem with your code might be due to the following: the files in
/data/path/to/file/d_20150330-1650 will be stored locally and will not be
accessible to the mappers running on different nodes, and since the
IOException is eaten up in your mapper code, you are not getting the
proper stack trace.

I think you should use the distributed cache to ship the files in
/data/path/to/file/d_20150330-1650 and then use the setup method to
access the data in these files.
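
A minimal sketch of that approach (the cache link name is illustrative, not
from the original code):

// Driver: ship the file with the job; "#uuid_data" names the symlink
// created in each task's working directory.
job.addCacheFile(new URI("/data/path/to/file/d_20150330-1650#uuid_data"));

// Mapper: read it once per task in setup() instead of on every map() call.
@Override
protected void setup(Context context) throws IOException {
    BufferedReader br = new BufferedReader(new FileReader("uuid_data"));
    for (String line = br.readLine(); line != null; line = br.readLine()) {
        // parse the JSON lines here, e.g. collect event ids into a HashSet
    }
    br.close();
}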


BR,
Harshit Mathur

On Wed, Apr 1, 2015 at 11:57 AM, Ranadip Chatterjee ranadi...@gmail.com
wrote:

 Eating up the IOException in the mapper looks suspicious to me. That can
 silently consume the input without producing any output. Also check the map
 task's sysout logs for your console print output.

 As an aside, since you are not doing anything in the reduce, try setting the
 number of reduces to 0. That will force the job to be map-only and make it
 simpler.

 Regards,
 Ranadip

Re: Hadoop 2.6 issue

2015-04-01 Thread Anand Murali
Tried export in hadoop-env.sh. It does not work either.

Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)


 On Wednesday, April 1, 2015 1:03 PM, Jianfeng (Jeff) Zhang 
jzh...@hortonworks.com wrote:
   

 
Try to export JAVA_HOME in hadoop-env.sh

Best Regard,
Jeff Zhang

Re: Hadoop 2.6 issue

2015-04-01 Thread Anand Murali
I continue to get the same error. I export
JAVA_HOME=/home/anand_vihar/jdk1.0.7_u75 (in hadoop-env.sh).

When I echo $JAVA_HOME it shows me the above path, but when I run $java
-version, it gives me the openjdk version.
start-dfs.sh ... errors out saying JAVA_HOME is not set, but echo shows
JAVA_HOME. Strange!!

Regards,
Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)


 On Wednesday, April 1, 2015 2:22 PM, Anand Murali anand_vi...@yahoo.com 
wrote:
   

 Ok thanks. Shall do

Sent from my iPhone
On 01-Apr-2015, at 2:19 pm, Ram Kumar ramkumar.bash...@gmail.com wrote:


Anand,

Try Oracle JDK instead of Open JDK.

Regards,
Ramkumar Bashyam


Re: Hadoop 2.6 issue

2015-04-01 Thread Ram Kumar
Anand,

Try Oracle JDK instead of Open JDK.

Regards,
Ramkumar Bashyam

On Wed, Apr 1, 2015 at 1:25 PM, Anand Murali anand_vi...@yahoo.com wrote:

 Tried export in hadoop-env.sh. Does not work either

 Anand Murali
 11/7, 'Anand Vihar', Kandasamy St, Mylapore
 Chennai - 600 004, India
 Ph: (044)- 28474593/ 43526162 (voicemail)








Re: Run my own application master on a specific node in a YARN cluster

2015-04-01 Thread Drake민영근
Very interesting, BTW. So you are trying to launch the app-master with a YARN
container, but your own node-manager without a YARN container. Am I right?

Drake 민영근 Ph.D
kt NexR

On Wed, Apr 1, 2015 at 3:38 PM, Dongwon Kim eastcirc...@postech.ac.kr
wrote:

 Thanks for your input but I need to launch my own node manager
 (different from the Yarn NM) running on each node.
 (which is not explained in the original question)

 If I were to launch just a single master with a well-known address,
 ZooKeeper would be a great solution!
 Thanks.

 Dongwon Kim




Re: Hadoop 2.6 issue

2015-04-01 Thread Anand Murali
Ok thanks. Shall do

Sent from my iPhone

 On 01-Apr-2015, at 2:19 pm, Ram Kumar ramkumar.bash...@gmail.com wrote:
 
 Anand,
 
 Try Oracle JDK instead of Open JDK.
 
 Regards,
 Ramkumar Bashyam
 


Re: Hadoop 2.6 issue

2015-04-01 Thread Ravindra Kumar Naik
Hi,

If you are using Ubuntu then add these lines to /etc/environment
JAVA_HOME=<actual path to jdk>
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:$JAVA_HOME/bin

Please put the actual path to JDK in the first line.
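
For instance, an illustrative sketch with an assumed JDK location (note that
/etc/environment is read as plain key=value, so the bin directory is written
out rather than referencing $JAVA_HOME):

JAVA_HOME="/home/anand_vihar/jdk1.7.0_75"
PATH="/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:/home/anand_vihar/jdk1.7.0_75/bin"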

Regards,
Ravindra


On Wed, Apr 1, 2015 at 5:50 PM, roland.depratti roland.depra...@cox.net
wrote:

 Anand,

 Sorry about that, I was assuming Redhat/Centos.

 For Ubuntu, try sudo update-alternatives --config java.



 Sent from my Verizon Wireless 4G LTE smartphone






Re: Run my own application master on a specific node in a YARN cluster

2015-04-01 Thread Dongwon Kim
Thanks for your input but I need to launch my own node manager
(different from the Yarn NM) running on each node.
(which is not explained in the original question)

If I were to launch just a single master with a well-known address,
ZooKeeper would be a great solution!
Thanks.

Dongwon Kim

2015-03-31 10:47 GMT+09:00 Drake민영근 drake@nexr.com:
 Hi,

 In these circumstances, there is no easy way to do that. Maybe use a
 workaround: how about using ZooKeeper for shared storage? The app master
 creates a predefined ZooKeeper node when starting, with the current machine's IP, and
 clients always look up that ZooKeeper node for the app master's location.

 Thanks.


 Drake 민영근 Ph.D
 kt NexR

 On Mon, Mar 30, 2015 at 11:04 AM, Dongwon Kim eastcirc...@postech.ac.kr
 wrote:

 Hello,

 First of all, I'm using Hadoop-2.6.0. I want to launch my own app
 master on a specific node in a YARN cluster in order to open a server
 on a predetermined IP address and port. To that end, I wrote a driver
 program in which I created a ResourceRequest object and called
 setResourceName method to set a hostname, and attached it to an
 ApplicationSubmissionContext object by calling the
 setAMContainerResourceRequest method.

 I tried several times but couldn't launch the app master on a specific
 node. After searching code, I found that RMAppAttemptImpl invalidates
 what I've set in ResourceRequest as follows:

 // Currently, following fields are all hard code,
 // TODO: change these fields when we want to support
 // priority/resource-name/relax-locality specification for AM
 containers
 // allocation.
 appAttempt.amReq.setNumContainers(1);
 appAttempt.amReq.setPriority(AM_CONTAINER_PRIORITY);
 appAttempt.amReq.setResourceName(ResourceRequest.ANY);
 appAttempt.amReq.setRelaxLocality(true);

 Is there another way to launch a container for an application master
 on a specific node in Hadoop-2.6.0?

 Thanks.

 Dongwon Kim




Re: Hadoop 2.6 issue

2015-04-01 Thread roland.depratti


Anand,
Sorry about that, I was assuming Redhat/Centos.
For Ubuntu, try sudo update-alternatives --config java.
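
If the Oracle JDK was only unpacked and never registered, it can be added as
an alternative first; an illustrative sketch, with an assumed install path:

sudo update-alternatives --install /usr/bin/java java /home/anand_vihar/jdk1.7.0_75/bin/java 2000
sudo update-alternatives --config java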


Sent from my Verizon Wireless 4G LTE smartphone

 Original message 
From: Anand Murali anand_vi...@yahoo.com 
Date: 04/01/2015  7:22 AM  (GMT-05:00) 
To: user@hadoop.apache.org 
Subject: Re: Hadoop 2.6 issue 

Dear Mr.Roland:
The alternatives command errors out. I have the extracted version of the Oracle
JDK7. However, I am ignorant regarding its installation on Ubuntu. Can you point
me to installation material so that I can look up and try?
Thanks
Regards,

Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)



Re: Hadoop 2.6 issue

2015-04-01 Thread Anand Murali
Dear Mr.Roland:
The alternatives command errors out. I have the extracted version of the Oracle
JDK7. However, I am ignorant regarding its installation on Ubuntu. Can you point
me to installation material so that I can look up and try?
Thanks
Regards,

Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)



RE: Hadoop 2.6 issue

2015-04-01 Thread Roland DePratti
Anand,

My guess is that your alternatives setup isn't complete.

At a prompt, as su, run the command 'alternatives --config java'. Make sure
that the oracle version is listed and is marked as the active one.

If it is not, go through the steps to make sure it is.

- rd

From: Anand Murali [mailto:anand_vi...@yahoo.com]
Sent: Wednesday, April 01, 2015 5:42 AM
To: user@hadoop.apache.org
Subject: Re: Hadoop 2.6 issue

I continue to get the same error. I export
JAVA_HOME=/home/anand_vihar/jdk1.0.7_u75 (in hadoop-env.sh).

When I echo $JAVA_HOME it shows me the above path, but when I run $java
-version, it gives me the openjdk version.

start-dfs.sh ... errors out saying JAVA_HOME is not set, but echo shows
JAVA_HOME. Strange!!

Regards,

Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)



Re: Run my own application master on a specific node in a YARN cluster

2015-04-01 Thread Dongwon Kim
I'm doing precisely the opposite.
My own node manager (MY_NM) is an AM in YARN and, therefore, each
MY_NM is expected to run inside a YARN container.
What I am trying to do is to execute the AM (MY_NM) on each slave.

For that reason, I need to launch an AM on a specific node, but
Hadoop-2.6.0 ignores what I describe in a ResourceRequest object.
From what I've understood, RMAppManager in the ResourceManager
creates an RMAppImpl object for each application and an RMAppAttempt
object for each application attempt.
When the state of RMAppAttempt changes from SUBMITTED to SCHEDULED,
RMAppAttempt invalidates the information necessary to launch a container
for the AM, such as the number of containers, priority, resource name
(=hostname), and whether to relax locality, as in the hardcoded block
quoted in my original question above, and then asks the YarnScheduler to
allocate a container.
I conclude that an application master cannot be launched on a
specific node in Hadoop-2.6.0.
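
(For reference, a rough sketch of such a driver-side request with the Hadoop
2.6 API; the hostname and sizes below are placeholders, and appContext stands
for the ApplicationSubmissionContext being prepared:)

ResourceRequest amReq = ResourceRequest.newInstance(
        Priority.newInstance(0),        // overridden to AM_CONTAINER_PRIORITY
        "node1.example.com",            // overridden to ResourceRequest.ANY
        Resource.newInstance(1024, 1),  // AM container memory (MB) and vcores
        1,                              // number of containers
        false);                         // relaxLocality: overridden to true
appContext.setAMContainerResourceRequest(amReq);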

As a workaround, I tried to execute an AM directly on a specific node
from the command line, without submitting it to YARN, and to register the
AM with the RM.
It however fails (of course :P) because the AM is not given the
delegation token necessary to communicate with the RPC server in the RM.

Anyway I need to find another way of doing that or I have to alter the
design of my framework.
Thanks.

- Dongwon Kim

2015-04-01 17:50 GMT+09:00 Drake민영근 drake@nexr.com:
 Very interesting, BTW. So you are trying to launch the app-master with a YARN
 container, but your own node-manager without a YARN container. Am I right?

 Drake 민영근 Ph.D
 kt NexR





get started with Hadoop

2015-04-01 Thread Adam GRARE
Hello,

I need your advice to start using Hadoop!

I created an AWS account and set up Elastic MapReduce to test the Amazon
solution.

But I need to know the best way to start using Hadoop.

Thanks,

Adam


Re: Hadoop 2.6 issue

2015-04-01 Thread Ravindra Kumar Naik
I meant /etc/environment. It should be present if you are using Ubuntu.

Regards,
Ravindra

On Wed, Apr 1, 2015 at 6:39 PM, Anand Murali anand_vi...@yahoo.com wrote:

 Mr. Ravindra

 I don't find any /etc/environment. Can you be more specific please? I have
 done whatever you are saying in a user-created batch program and run it,
 followed by running hadoop-env.sh, and it still does not work.

 Thanks

 Anand Murali
 11/7, 'Anand Vihar', Kandasamy St, Mylapore
 Chennai - 600 004, India
 Ph: (044)- 28474593/ 43526162 (voicemail)











Re: Hadoop 2.6 issue

2015-04-01 Thread Anand Murali
Mr. Ravindra:
I am using Ubuntu 14. Can you please provide the full path? I am logged in as
root and it is not found in /etc. In any case, what you have suggested I have
tried by creating a batch file, and it does not work in my installation.
Thanks

Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)


 On Wednesday, April 1, 2015 6:50 PM, Ravindra Kumar Naik 
ravin.i...@gmail.com wrote:
   

 I meant /etc/environment. It should be present if you are using Ubuntu.

Regards,
Ravindra


Re: Hadoop 2.6 issue

2015-04-01 Thread Anand Murali
Mr. Ravindra 

I don't find any /etc/environment. Can you be more specific please? I have done
whatever you are saying in a user-created batch program and run it, followed by
running hadoop-env.sh, and it still does not work.
Thanks

Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)


 On Wednesday, April 1, 2015 6:10 PM, Ravindra Kumar Naik 
ravin.i...@gmail.com wrote:
   

 Hi,

If you are using Ubuntu then add these lines to /etc/environment 
JAVA_HOME=actual path to jdk
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:$JAVA_HOME/bin

Please put the actual path to JDK in the first line.

Regards,
Ravindra



Re: Hadoop 2.6 issue

2015-04-01 Thread Anand Murali
Mr. Roland:
This is what I get. How do I now get the Oracle JDK to be identified?

anand_vihar@Latitude-E5540:~$ sudo update-alternatives --config java
[sudo] password for anand_vihar: 
There is only one alternative in link group java (providing /usr/bin/java): 
/usr/lib/jvm/java-7-openjdk-amd64/jre/bin/java
Nothing to configure.

Thanks

Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)


 On Wednesday, April 1, 2015 6:10 PM, Ravindra Kumar Naik 
ravin.i...@gmail.com wrote:
   

 Hi,

If you are using Ubuntu then add these lines to /etc/environment 
JAVA_HOME=actual path to jdk
PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/usr/games:$JAVA_HOME/bin

Please put the actual path to JDK in the first line.

Regards,
Ravindra



Re: Hadoop 2.6 issue

2015-04-01 Thread Anand Murali
Mr. Ravindra:
This is visible, however I am unable to modify it, even though I have admin 
privileges. I am new to the Linux environment. Shall be glad if you would 
advise. However, as I told you earlier, I have created a batch program which 
contains the JAVA_HOME setting, HADOOP_INSTALL setting and PATH setting. I have 
run this file but I am still unable to start the daemons. I am following Tom 
White's Hadoop: The Definitive Guide instructions on how to install Hadoop.
'hadoop version' works, and I am able to format the namenode, but I fail to 
start the daemons.
Reply most welcome.
Thanks
Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)


 On Wednesday, April 1, 2015 7:04 PM, Ravindra Kumar Naik 
ravin.i...@gmail.com wrote:
   

 Are you sure that it's not there? Could you please check the output of this 
command:

ls /etc/env*



On Wed, Apr 1, 2015 at 6:55 PM, Anand Murali anand_vi...@yahoo.com wrote:

Mr. Ravindra:
I am using Ubuntu 14. Can you please provide the full path? I am logged in as 
root and it is not found in /etc. In any case, I have tried what you suggested 
by creating a batch file, and it does not work in my installation.
Thanks


Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)


 On Wednesday, April 1, 2015 6:50 PM, Ravindra Kumar Naik 
ravin.i...@gmail.com wrote:
   

 I meant /etc/environment. It should be present if you are using Ubuntu.

Regards,
Ravindra

On Wed, Apr 1, 2015 at 6:39 PM, Anand Murali anand_vi...@yahoo.com wrote:

Mr. Ravindra:

I don't find any /etc/environment. Can you be more specific please? I have done 
whatever you said in a user-created batch program and run it, followed by 
running hadoop-env.sh, and it still does not work.
Thanks
Anand Murali
11/7, 'Anand Vihar', Kandasamy St, Mylapore
Chennai - 600 004, India
Ph: (044)- 28474593/ 43526162 (voicemail)



can't set partition class to the configuration

2015-04-01 Thread xeonmailinglist-gmail

Hi,

I have created a Mapper class [3] that filters out key-value pairs that 
go to a specific partition. When I set the partitioner class in my code 
[1], I get the error in [2] and I don't understand why this is 
happening. Any help to fix this?


[1]

Configuration conf = cj.getConfiguration();
cj.setPartitionerClass(MyFilterMapper.class);

[2]

The method setPartitionerClass(Class<? extends Partitioner>) in the type Job 
is not applicable for the arguments (Class<JobExecution.MyFilterMapper>)

[3]

public static class MyFilterMapper
        extends Mapper<Object, Text, Text, IntWritable> {

    private Text word = new Text();
    private IntWritable rvalue = new IntWritable();

    public static final String REDUCE_TASK_REEXECUTE =
            "mapreduce.reduce.task.reexecute";
    public static final int NULL_REDUCE_TASK = -1;

    private Class<? extends Partitioner<?, ?>> partitionerClass;
    private org.apache.hadoop.mapreduce.Partitioner<Object, Text>
            partitionerInstance;

    public void map(Object key, Text value, Context context
                    ) throws IOException, InterruptedException {
        Configuration conf = context.getConfiguration();
        partitionerInstance = new MyHashPartitioner<Object, Text>();

        int[] task_reexecute = conf.getInts(REDUCE_TASK_REEXECUTE);
        int nr_reduce_tasks = conf.getInt("mapreduce.job.reduces", 0);
        System.out.println("Tasks reexecute: " + task_reexecute +
                " NRREDUCETASKS: " + nr_reduce_tasks);
        StringTokenizer itr = new StringTokenizer(value.toString());
        while (itr.hasMoreTokens()) {
            String wword = itr.nextToken();
            Integer rrvalue = Integer.valueOf(itr.nextToken());
            int partition = partitionerInstance.getPartition(wword, value,
                    nr_reduce_tasks);

            if (contains(partition, task_reexecute)) {
                System.out.println("Partition Consumed: " + partition +
                        " - key: " + key.toString() + " word: " + wword +
                        " value - " + value.toString());
                System.out.println("Partition Consumed: " + partition +
                        " - word: " + wword + " value - "); // + rrvalue);

                word.set(wword);
                rvalue.set(rrvalue);
                context.write(word, rvalue);
            }
        }
    }

    public boolean contains(int partition, int[] set) {
        for (int i = 0; i < set.length; i++) {
            if (partition == set[i])
                return true;
        }

        return false;
    }
}

--
--
Thanks,



Re: can't set partition class to the configuration

2015-04-01 Thread Ted Yu
The error message is very clear: a class which extends Partitioner is
expected.
Maybe you meant to specify MyHashPartitioner ?

Cheers
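
For later readers, a minimal sketch of a class that Job.setPartitionerClass() 
would accept. It assumes Text/IntWritable map output types, as in the code 
above; the name ExamplePartitioner and the hash scheme are illustrative only, 
not the poster's actual MyHashPartitioner.

import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Partitioner;

// setPartitionerClass() requires a Partitioner subclass; a Mapper
// subclass such as MyFilterMapper above does not qualify, hence [2].
public class ExamplePartitioner extends Partitioner<Text, IntWritable> {
    @Override
    public int getPartition(Text key, IntWritable value, int numPartitions) {
        // Mask off the sign bit so the modulo result is never negative.
        return (key.hashCode() & Integer.MAX_VALUE) % numPartitions;
    }
}

// Call site, matching snippet [1] above:
// cj.setPartitionerClass(ExamplePartitioner.class);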


Re: can't set partition class to the configuration

2015-04-01 Thread Shahab Yunus
See these for more details on how to write your own custom Partitioner (even
if a bit outdated, they still give you the basic idea of what you need to
do).
http://hadooptutorial.wikispaces.com/Custom+partitioner
https://developer.yahoo.com/hadoop/tutorial/module5.html#partitioning

Regards,
Shahab

On Wed, Apr 1, 2015 at 11:03 AM, Shahab Yunus shahab.yu...@gmail.com
wrote:

 As the error tells you, you cannot use a class as a Partitioner if it does
 not satisfy the interface requirements of the partitioning mechanism. You
 need to set as the Partitioner a class which extends or implements the
 Partitioner contract.

 Regards,
 Shahab



Re: can't set partition class to the configuration

2015-04-01 Thread Shahab Yunus
As the error tells you, you cannot use a class as a Partitioner if it does
not satisfy the interface requirements of the partitioning mechanism. You
need to set as the Partitioner a class which extends or implements the
Partitioner contract.

Regards,
Shahab



Invalid token issue in yarn

2015-04-01 Thread Jeff Zhang
Sometimes my job will get the following error. What may be the reason for
this? And is there any property that I can use to prevent it?

Looks like someone got the same error.
http://mail-archives.apache.org/mod_mbox/hadoop-hdfs-user/201502.mbox/%3c54e64f97.7070...@ulul.org%3E

2015-04-01 17:52:27,120 WARN [AMRM Heartbeater thread] ipc.Client:
Exception encountered while connecting to the server :
org.apache.hadoop.ipc.RemoteException(org.apache.hadoop.security.token.SecretManager$InvalidToken):
appattempt_1427881755072_0009_01 not found in AMRMTokenSecretManager.

2015-04-01 17:52:27,123 ERROR [AMRM Heartbeater thread]
impl.AMRMClientAsyncImpl: Exception on heartbeat

org.apache.hadoop.security.token.SecretManager$InvalidToken:
appattempt_1427881755072_0009_01 not found in AMRMTokenSecretManager.

at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)

at
sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)

at
sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)

at java.lang.reflect.Constructor.newInstance(Constructor.java:526)

at org.apache.hadoop.yarn.ipc.RPCUtil.instantiateException(RPCUtil.java:53)

at
org.apache.hadoop.yarn.ipc.RPCUtil.unwrapAndThrowException(RPCUtil.java:104)

at
org.apache.hadoop.yarn.api.impl.pb.client.ApplicationMasterProtocolPBClientImpl.allocate(ApplicationMasterProtocolPBClientImpl.java:79)

at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)

at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)

at java.lang.reflect.Method.invoke(Method.java:606)

at
org.apache.hadoop.io.retry.RetryInvocationHandler.invokeMethod(RetryInvocationHandler.java:187)

at
org.apache.hadoop.io.retry.RetryInvocationHandler.invoke(RetryInvocationHandler.java:102)

at com.sun.proxy.$Proxy29.allocate(Unknown Source)

at
org.apache.hadoop.yarn.client.api.impl.AMRMClientImpl.allocate(AMRMClientImpl.java:278)

at
org.apache.hadoop.yarn.client.api.async.impl.AMRMClientAsyncImpl$HeartbeatThread.run(AMRMClientAsyncImpl.java:224)

-- 
Best Regards

Jeff Zhang


Fwd: HMS error

2015-04-01 Thread Kumar Jayapal
Hello All,

Did anyone get this error before? I am working on a database migration
task from PostgreSQL to MySQL.

 Here is what I did.

I took the dumps using pg_dump from PostgreSQL and converted them to
MySQL using a PHP script.

 I don't see any error in creating the tables in the MySQL db. I created
the database and user, and granted the permissions to user hive. When I
changed the configuration and restarted the HMS service it errors
out. Please see the error log below.

2015-04-01 16:40:34,190 INFO
org.apache.hadoop.hive.metastore.HiveMetaStore: Starting hive
metastore on port 9083
2015-04-01 16:40:35,038 INFO
org.apache.hadoop.security.UserGroupInformation: Login successful for
user hive/hmscdh01094p001.corp.costco@systems.costco.com using
keytab file hive.keytab
2015-04-01 16:40:35,055 INFO
org.apache.hadoop.hive.metastore.HiveMetaStore: 0: Opening raw store
with implemenation class:org.apache.hadoop.hive.metastore.ObjectStore
2015-04-01 16:40:35,116 INFO
org.apache.hadoop.hive.metastore.ObjectStore: ObjectStore, initialize
called
2015-04-01 16:40:35,526 INFO DataNucleus.Persistence: Property
hive.metastore.integral.jdo.pushdown unknown - will be ignored
2015-04-01 16:40:35,526 INFO DataNucleus.Persistence: Property
datanucleus.cache.level2 unknown - will be ignored
2015-04-01 16:40:36,261 INFO
org.apache.hadoop.hive.metastore.ObjectStore: Setting MetaStore object
pin classes with
hive.metastore.cache.pinobjtypes=Table,StorageDescriptor,SerDeInfo,Partition,Database,Type,FieldSchema,Order
2015-04-01 16:40:37,713 INFO DataNucleus.Datastore: The class
org.apache.hadoop.hive.metastore.model.MFieldSchema is tagged as
embedded-only so does not have its own datastore table.
2015-04-01 16:40:37,714 INFO DataNucleus.Datastore: The class
org.apache.hadoop.hive.metastore.model.MOrder is tagged as
embedded-only so does not have its own datastore table.
2015-04-01 16:40:37,864 INFO DataNucleus.Datastore: The class
org.apache.hadoop.hive.metastore.model.MFieldSchema is tagged as
embedded-only so does not have its own datastore table.
2015-04-01 16:40:37,864 INFO DataNucleus.Datastore: The class
org.apache.hadoop.hive.metastore.model.MOrder is tagged as
embedded-only so does not have its own datastore table.
2015-04-01 16:40:38,160 INFO DataNucleus.Query: Reading in results for
query org.datanucleus.store.rdbms.query.SQLQuery@0 since the
connection used is closing
2015-04-01 16:40:38,168 INFO
org.apache.hadoop.hive.metastore.ObjectStore: Initialized ObjectStore
2015-04-01 16:40:38,365 WARN
org.apache.hadoop.hive.metastore.ObjectStore: Metastore version was
0.13.0 hive.metastore.schema.verification is not enabled so recording
the new schema version 0.13.0
2015-04-01 16:40:38,408 ERROR
org.apache.hadoop.hive.metastore.HiveMetaStore:
java.lang.IllegalArgumentException: No enum constant
org.apache.hadoop.hive.metastore.api.PrincipalType.N
at java.lang.Enum.valueOf(Enum.java:236)
at 
org.apache.hadoop.hive.metastore.api.PrincipalType.valueOf(PrincipalType.java:14)
at 
org.apache.hadoop.hive.metastore.ObjectStore.getDatabase(ObjectStore.java:520)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at 
org.apache.hadoop.hive.metastore.RawStoreProxy.invoke(RawStoreProxy.java:108)
at com.sun.proxy.$Proxy8.getDatabase(Unknown Source)
at 
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB_core(HiveMetaStore.java:511)
at 
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.createDefaultDB(HiveMetaStore.java:532)
at 
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:406)
at 
org.apache.hadoop.hive.metastore.HiveMetaStore$HMSHandler.init(HiveMetaStore.java:365)
at 
org.apache.hadoop.hive.metastore.HiveMetaStore.startMetaStore(HiveMetaStore.java:5165)
at 
org.apache.hadoop.hive.metastore.HiveMetaStore.main(HiveMetaStore.java:5093)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:212)

2015-04-01 16:40:38,409 ERROR
org.apache.hadoop.hive.metastore.HiveMetaStore: Metastore Thrift
Server threw an exception...
java.lang.IllegalArgumentException: No enum constant
org.apache.hadoop.hive.metastore.api.PrincipalType.N
at java.lang.Enum.valueOf(Enum.java:236)
at 
org.apache.hadoop.hive.metastore.api.PrincipalType.valueOf(PrincipalType.java:14)
at 

Re: Hadoop 2.6 issue

2015-04-01 Thread Raviprasad N Pentakota
Please unsubscribe me from this list.

Regards,
Ravi Prasad Pentakota
India Software Lab, IBM Software Group
Phone: +9180-43328520  Mobile: 919620959477
e-mail:rapen...@in.ibm.com





From:   Kumar Jayapal kjayapa...@gmail.com
To: user@hadoop.apache.org
Cc: Anand Murali anand_vi...@yahoo.com
Date:   04/02/2015 07:50 AM
Subject:Re: Hadoop 2.6 issue



$ which java

make sure the paths are valid for your installation (change if using the
32-bit version):
/usr/lib/jvm/java-6-openjdk-amd64/jre/bin/java

/usr/lib/jvm/java-6-openjdk-amd64/bin/javac


Set up update-alternatives:


sudo update-alternatives --install /usr/bin/java java \
    /usr/lib/jvm/java-6-openjdk-amd64/jre/bin/java 1
sudo update-alternatives --install /usr/bin/javac javac \
    /usr/lib/jvm/java-6-openjdk-amd64/bin/javac 1

sudo update-alternatives --set java \
    /usr/lib/jvm/java-6-openjdk-amd64/jre/bin/java
sudo update-alternatives --set javac \
    /usr/lib/jvm/java-6-openjdk-amd64/bin/javac


Alternatively, make sure the correct version is checked for both Java and
compiler:


sudo update-alternatives --config java
sudo update-alternatives --config javac


List the installed Java alternatives with:


sudo update-alternatives --list java
sudo update-alternatives --list javac


Question about log files

2015-04-01 Thread 煜 韦
Hi there,
If log files are deleted without restarting the service, it seems the later 
log output is lost, for example on the namenode and datanode.
Why can't the log files be re-created when they are deleted, by mistake or on 
purpose, while the cluster is running?

Thanks,
Jared
  

How to append the contents to an output file

2015-04-01 Thread Raghavendra Chandra
Dear Team,

I am trying to append contents to a reducer output file using multiple
outputs.

My requirement is to write the reducer output to multiple folders, with the
data appended to the existing content.

I have used a custom output format, by extending the TextOutputFormat class,
and I am able to write the data into multiple folders. The issue I am facing
is that it overwrites the data in the files, whereas I would rather have it
append the data to the output files.

Please let me know how to handle this situation.

Thanks and regards,

Raghav Chandra


Re: Hadoop 2.6 issue

2015-04-01 Thread Anand Murali
Ok. Many thanks, shall try.

Sent from my iPhone


Re: Hadoop 2.6 issue

2015-04-01 Thread Ravindra Kumar Naik
Hi,

Creating a batch program will not have the same effect. If you put the
variables in /etc/environment then they will be available to all users on the
operating system. HDFS doesn't run with root privileges.
You need to open the file with sudo or with root privileges to
modify it.
e.g. if you are using the vi editor then it's just 'sudo vim /etc/environment'
(similarly, if you are using other editors) and add the environment variables
there.

