the following:
apt-get install cmake
Reading package lists... Done
Building dependency tree
Reading state information... Done
cmake is already the newest version.
0 upgraded, 0 newly installed, 0 to remove and 0 not upgraded.
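Since cmake being installed does not by itself guarantee the native build will work, a quick sanity check of the whole toolchain can help. A minimal sketch (check_tool is a hypothetical helper; the tool list is the usual native-build set, not from this thread):

```shell
# Hypothetical helper: report whether a tool is on PATH.
check_tool() {
  if command -v "$1" >/dev/null 2>&1; then
    echo "$1: found"
  else
    echo "$1: MISSING"
  fi
}

# Tools the Hadoop native build typically relies on.
for tool in cmake gcc g++ make; do
  check_tool "$tool"
done
```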
regards
On 28 Apr, 2014, at 7:43 pm, Silvina Caíno Lores silvi.ca
Are you sure that CMake is installed?
Best,
Silvina
On 28 April 2014 13:05, ascot.m...@gmail.com ascot.m...@gmail.com wrote:
Hi,
I am trying to install Hadoop 2.4.0 from source, I got the following
error, please help!!
Can anyone share the apache-maven-3.1.1/conf/settings.xml setting?
Hi!
I've faced the same issue a couple of times and I found nothing in the logs
that led me to the source of the error. However, I've found out that smart
container and block configuration can prevent these issues.
First of all, check the RM logs to find any problematic container, since the
same task
You can kill this application from the RM web UI or the
Yarn Command Line (type yarn application -help to find out the commands).
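For reference, the YARN CLI operations mentioned above could look like the sketch below; they need a live cluster, so they are wrapped in functions here, and the application id format is application_<clusterTimestamp>_<sequence>:

```shell
# List running applications with their state and progress.
list_apps() {
  yarn application -list
}

# Kill a stuck application by id,
# e.g. kill_app application_1386668372725_0001
kill_app() {
  yarn application -kill "$1"
}
```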
Thanks
Xuan Gong
On Mon, Mar 10, 2014 at 4:16 AM, Silvina Caíno Lores
silvi.ca...@gmail.com wrote:
Hi all,
I've been noticing lately that sometimes my Hadoop jobs do not report
progress in the terminal. They seem stuck at the Running
job: job_ message; however, the YarnChild processes are running and
executing properly.
I know that my job didn't fail, but it's very inconvenient not being
Hi there,
I've been working with pipes for some months and I've finally managed to
get it working as I wanted with some legacy code I had. However, I had
many issues regarding not only my implementation (it had to be adapted in
several ways to fit pipes, which is very restrictive) but pipes
You can check Amazon Elastic MapReduce, which comes preconfigured on EC2
but you need to pay a little for it, or make your own custom installation on
EC2 (beware that EC2 instances come with nothing but really basic shell
tools, so it may take a while to get it running).
Amazon's free tier
I've been dealing with a similar situation and I haven't found any
solution other than launching two independent jobs (with a script or
whatever you like), letting the output of the first be the input of the
second. If you find any other option please let me know.
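A hypothetical sketch of that two-job approach from a script; the jar names, class names and HDFS paths below are placeholders, not from this thread:

```shell
# Chain two MapReduce jobs: job 1's output directory is job 2's input.
run_two_stage_job() {
  set -e                                    # stop if the first job fails
  inter=/user/me/intermediate               # placeholder intermediate path
  hadoop jar first.jar FirstJob   /user/me/input "$inter"
  hadoop jar second.jar SecondJob "$inter"  /user/me/output
}
# Usage on a cluster: run_two_stage_job
```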
Regards
On 12 February 2014
'protoc --version' did not return a version
Are you sure that you have Protocol Buffers installed?
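A quick way to check is the sketch below; Hadoop 2.x source builds expect protoc 2.5.0 on PATH (the requirements are listed in BUILDING.txt in the source tree):

```shell
# Report the Protocol Buffers compiler version, or say it is missing.
protoc_check() {
  if command -v protoc >/dev/null 2>&1; then
    protoc --version                 # prints something like "libprotoc 2.5.0"
  else
    echo "protoc not found on PATH"
  fi
}
protoc_check
```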
On 17 January 2014 11:29, Nirmal Kumar nirmal.ku...@impetus.co.in wrote:
Hi All,
I am trying to build Hadoop 2.2.0 On Windows 7 64-bit env.
Can you let me know what else is needed for
Found out that building my code with Maven (along with the Pipes examples)
worked. Any clues why?
Thanks,
Silvina
On 19 December 2013 13:16, Silvina Caíno Lores silvi.ca...@gmail.com wrote:
And it is caused by this exception, as I've found out:
2013-12-19 13:14:28,237 ERROR [pipe-uplink
(WritableUtils.java:308)
at org.apache.hadoop.io.WritableUtils.readVInt(WritableUtils.java:329)
at org.apache.hadoop.mapred.pipes.BinaryProtocol$UplinkReaderThread.run(BinaryProtocol.java:125)
On 18 December 2013 10:52, Silvina Caíno Lores silvi.ca...@gmail.com wrote:
I forgot to mention
Silvina Caíno Lores silvi.ca...@gmail.com
I'm having similar problems with pipes, mostly because of issues with the
native shared libraries that leave the job stuck either at 0%-0% or before
launch (because the resource manager gets stuck as well and crashes).
I found that out by looking at the stderr logs by the way.
Let us know if you
on your console after you submit a job, e.g.:
13/12/10 10:41:21 INFO mapreduce.Job: The url to track the job:
http://compute-7-2:8088/proxy/application_1386668372725_0001/
2013/12/10 Silvina Caíno Lores silvi.ca...@gmail.com
Thank you! I realized that, despite having exported the variables
I will study Adam's suggestion
On 11 December 2013 10:01, Silvina Caíno Lores silvi.ca...@gmail.com wrote:
Actually now it seems to be running (or at least attempting to run) but I
get further errors:
hadoop jar
~/hadoop-2.2.0-maven/hadoop-dist/target/hadoop-3.0.0-SNAPSHOT/share/hadoop
Thanks a lot Adam for your suggestion!
To prevent future disasters, could you recommend a configuration guide or
give some hints on proper resource management?
Thank you once more!
On 11 December 2013 10:32, Silvina Caíno Lores silvi.ca...@gmail.com wrote:
OK that was indeed a classpath issue
-env.sh and yarn-env.sh as well to update
JAVA_HOME, HADOOP_CONF_DIR, HADOOP_YARN_USER and YARN_CONF_DIR.
Once these variables were set, I was able to run the example successfully.
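For the record, the exports described above could look like the following (the paths are installation-specific placeholders; they go in hadoop-env.sh and yarn-env.sh):

```shell
# Placeholder paths -- adjust to your JDK and Hadoop install locations.
export JAVA_HOME=/usr/lib/jvm/java-7-openjdk-amd64
export HADOOP_CONF_DIR=/opt/hadoop-2.2.0/etc/hadoop
export HADOOP_YARN_USER=yarn
export YARN_CONF_DIR=$HADOOP_CONF_DIR
```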
On Mon, Dec 9, 2013 at 11:37 PM, Silvina Caíno Lores
silvi.ca...@gmail.com wrote:
Hi everyone,
I'm having
Hi,
You can check the userlogs directory where the job and attempt logs are
stored. For each attempt you should have a stderr, stdout and syslog file.
The first two hold the program output for each stream (useful for debugging
purposes), while the last contains execution details provided by the
framework.
Hi,
First of all, thanks a lot for your help.
I was told that the node in which I was trying to compile had some
library/compiler inconsistencies, hence the native build failure. I used
an updated node and it seemed to build correctly.
Regards,
Silvina
On 5 December 2013 15:25, java8964
Hi again,
I've tried to build using JDK 1.6.0_38 and I'm still getting the same
exception:
~/hadoop-2.2.0-maven$ java -version
java version "1.6.0_38-ea"
Java(TM) SE Runtime Environment (build 1.6.0_38-ea-b04)
Java HotSpot(TM) 64-Bit Server VM (build 20.13-b02, mixed mode)
--
[ERROR] Failed