Hi Tom: which log will have info about why a process was killed?
Sent from my iPad
On Oct 28, 2011, at 11:41 PM, Tom Melendez wrote:
> Hi Jay,
>
> Are you able to look at the logs or the web interface? Can you find
> out why it's getting killed?
>
> Also, can you verify that these ports are open and a process is
> connected to them (maybe with netstat)?
Touché my friend... if only I could :)
On Fri, Oct 28, 2011 at 9:16 PM, JAX wrote:
> Yup Brutal :-|
> but you never regret fixing a bug ... Unlike ---
>
> Sent from my iPad
>
> On Oct 28, 2011, at 11:43 PM, Alex Gauthier wrote:
>
> > Brutal Friday night. Coding < pussy.
Yup Brutal :-|
but you never regret fixing a bug ... Unlike ---
Sent from my iPad
On Oct 28, 2011, at 11:43 PM, Alex Gauthier wrote:
> Brutal Friday night. Coding < pussy.
>
> :)
>
> On Fri, Oct 28, 2011 at 8:43 PM, Alex Gauthier wrote:
>
>>
>>
>> On Fri, Oct 28, 2011 at 8:41 PM, Tom Melendez wrote:
Brutal Friday night. Coding < pussy.
:)
On Fri, Oct 28, 2011 at 8:43 PM, Alex Gauthier wrote:
>
>
> On Fri, Oct 28, 2011 at 8:41 PM, Tom Melendez wrote:
>
>> Hi Jay,
>>
>> Are you able to look at the logs or the web interface? Can you find
>> out why it's getting killed?
>>
>> Also, can you verify that these ports are open and a process is
>> connected to them (maybe with netstat)?
On Fri, Oct 28, 2011 at 8:41 PM, Tom Melendez wrote:
> Hi Jay,
>
> Are you able to look at the logs or the web interface? Can you find
> out why it's getting killed?
>
> Also, can you verify that these ports are open and a process is
> connected to them (maybe with netstat)?
>
> http://www.cloudera.com/blog/2009/08/hadoop-default-ports-quick-reference/
Hi Jay,
Are you able to look at the logs or the web interface? Can you find
out why it's getting killed?
Also, can you verify that these ports are open and a process is
connected to them (maybe with netstat)?
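For example (flags vary a bit by distro; the IP below is illustrative):
$> sudo netstat -tlnp | grep -E ':8020|:50070'
$> telnet 172.16.xxx.xxx 8020
You should see a java process listening on each port, and telnet should
connect rather than get refused.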
http://www.cloudera.com/blog/2009/08/hadoop-default-ports-quick-reference/
Thanks,
T
Thanks Tom: That's interesting.
First, I tried, and it complained that the input directory didn't exist, so I ran
$> hadoop fs -mkdir /user/cloudera/input
Then, I tried to do this:
$> hadoop jar /usr/lib/hadoop-0.20/hadoop-examples.jar grep input output2 'dfs[a-z.]+'
And it seemed to start w…
Hi Jay,
Some questions for you:
- Does the hadoop client itself work from that same machine?
- Are you actually able to run the hadoop example jar (in other words,
your setup is valid otherwise)?
- Is port 8020 actually available? (you can telnet or nc to it?)
- What does jps show on the namenode?
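For reference, on a healthy pseudo-distributed CDH/0.20 VM, jps usually
shows something like this (the PIDs are illustrative):
$> jps
4825 NameNode
4903 DataNode
4988 SecondaryNameNode
5071 JobTracker
5150 TaskTracker
5234 Jps
If NameNode is missing from that list, a refused connection on 8020 would
be expected.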
Hi guys: Made more progress debugging my hadoop connection, but still
haven't got it working. It looks like my VM (cloudera hadoop) won't
let me in. I find that there is no issue connecting to the name node - that
is, using hftp and 50070,
via standard HFTP as in here:
//This metho…
The hdfs scheme should work, but you will have to change the port. To find
the correct port, look for the fs.default.name prop in core-site.xml;
the namenode UI should also state the port.
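For reference, the relevant core-site.xml entry typically looks something
like this (the host and port here are illustrative):
<property>
  <name>fs.default.name</name>
  <value>hdfs://172.16.xxx.xxx:8020</value>
</property>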
--
Arpit
On Oct 27, 2011, at 10:52 PM, Jay Vyas wrote:
> I found a way to connect to hadoop via hftp, and it works fine, (read only)
Jay,
Using the hdfs:// scheme is the right way, as you have determined. However…
A few things you need to ensure while using the Java FileSystem API to
do your HDFS tasks:
- Connect to NameNode's RPC port, not the web port. Default RPC port
is usually 8020, but your fs.default.name config will tell you the exact one.
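A minimal sketch of the same FileSystem call against the RPC port (the
host, port, and path below are illustrative; take the real URI from your
fs.default.name):

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsConnectTest {
    public static void main(String[] args) throws Exception {
        // hdfs:// scheme + RPC port (8020 by default), not hftp:// + 50070.
        String uri = "hdfs://172.16.xxx.xxx:8020/";
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(URI.create(uri), conf);
        // Unlike hftp, the hdfs scheme also permits writes:
        fs.mkdirs(new Path("/user/cloudera/hdfs-test"));
        System.out.println("connected, root exists: "
                + fs.exists(new Path("/")));
    }
}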
I found a way to connect to hadoop via hftp, and it works fine (read only):

import java.net.URI;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

String uri = "hftp://172.16.xxx.xxx:50070/";
System.out.println("uri: " + uri);
Configuration conf = new Configuration();
FileSystem fs = FileSystem.get(URI.create(uri), conf);
fs.printStatistics();