Please make sure the hostname-to-IP-address mapping is present in the
/etc/hosts file on both nodes of the Hadoop cluster.
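As a sketch, the mapping could look like the following. Here I am assuming
192.168.1.22 is the machine amlan-laptop (based on the error message below);
the second node's name and address are made-up placeholders:

```
# /etc/hosts on BOTH nodes -- same entries, so every node resolves
# the NameNode host to the same address
192.168.1.22    amlan-laptop.local    amlan-laptop
192.168.1.23    slave-node.local      slave-node    # placeholder for your second node
```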

One more thing: I assume the secondary namenode is also running alongside the
namenode, but you may have forgotten to mention it.
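For the "Wrong FS" error quoted below, the usual fix is to make
fs.default.name use the same authority (hostname vs. IP) everywhere. A
minimal sketch of conf/core-site.xml, assuming your NameNode listens on
amlan-laptop.local:54310 as the "expected:" part of the error suggests:

```
<!-- conf/core-site.xml on every node, including the machine
     running the Hive CLI -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <!-- use the same hostname everywhere; mixing the IP form
         (hdfs://192.168.1.22:54310) with the hostname form is what
         triggers the "Wrong FS" IllegalArgumentException -->
    <value>hdfs://amlan-laptop.local:54310</value>
  </property>
</configuration>
```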

Thanks,
MIS

On Mon, Feb 21, 2011 at 12:47 PM, Amlan Mandal <[email protected]> wrote:

> Thanks Mafish.
> Can you please point me to which config needs to be set correctly?
>
> Amlan
>
>
> On Mon, Feb 21, 2011 at 12:45 PM, Mafish Liu <[email protected]> wrote:
>
>> It seems you did not configure your HDFS properly.
>>
>> "Caused by: java.lang.IllegalArgumentException: Wrong FS:
>> hdfs://192.168.1.22:54310/tmp/hive-hadoop/hive_2011-02-21_12-09-42_678_6107747797061030113,
>> expected: hdfs://amlan-laptop.local:54310 "
>>
>>
>>
>> 2011/2/21 Amlan Mandal <[email protected]>:
>> > To give more context, my multinode Hadoop cluster is working fine;
>> > fs.default.name and mapred.job.tracker are set correctly.
>> > I can submit jobs to the cluster and see the output. (One node is
>> > running the namenode, datanode, jobtracker and tasktracker; the
>> > other is running a tasktracker and datanode.)
>> >
>> > On Mon, Feb 21, 2011 at 12:24 PM, Amlan Mandal <[email protected]>
>> wrote:
>> >>
>> >> Earlier I had Hive running on a single-node Hadoop cluster, which was
>> >> working fine. Now I have made it a two-node cluster. When I run Hive
>> >> from the CLI, I get the following error:
>> >>
>> >>
>> >> java.lang.RuntimeException: Error while making MR scratch directory -
>> >> check filesystem config (null)
>> >>     at org.apache.hadoop.hive.ql.Context.getMRScratchDir(Context.java:216)
>> >>     at org.apache.hadoop.hive.ql.Context.getMRTmpFileURI(Context.java:292)
>> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.getMetaData(SemanticAnalyzer.java:825)
>> >>     at org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.analyzeInternal(SemanticAnalyzer.java:6093)
>> >>     at org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer.analyze(BaseSemanticAnalyzer.java:125)
>> >>     at org.apache.hadoop.hive.ql.Driver.compile(Driver.java:304)
>> >>     at org.apache.hadoop.hive.ql.Driver.run(Driver.java:379)
>> >>     at org.apache.hadoop.hive.cli.CliDriver.processCmd(CliDriver.java:138)
>> >>     at org.apache.hadoop.hive.cli.CliDriver.processLine(CliDriver.java:197)
>> >>     at org.apache.hadoop.hive.cli.CliDriver.main(CliDriver.java:302)
>> >>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>> >>     at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:39)
>> >>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> >>     at java.lang.reflect.Method.invoke(Method.java:597)
>> >>     at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
>> >> Caused by: java.lang.IllegalArgumentException: Wrong FS:
>> >> hdfs://192.168.1.22:54310/tmp/hive-hadoop/hive_2011-02-21_12-09-42_678_6107747797061030113,
>> >> expected: hdfs://amlan-laptop.local:54310
>> >> ...
>> >>
>> >>
>> >> I guess I need to change some config variable for Hive; can somebody
>> >> please help me out?
>> >>
>> >>
>> >> Amlan
>> >
>> >
>>
>
>
