Hi Eli,
First of all, thank you for your support and your time!
I tried to use giraph-dist-1.2.0-hadoop2-bin.tar.gz with hadoop-2.3.0, but I
always got the same error. I read all the previous posts but didn't find a
solution. I found this post about my error,
https://blog.woopi.org/wordpress/?p=137, but I didn't find any occurrence of
a "google protocol buffer jar with version 2.4.1" in my environment. Maybe
giraph-dist-1.2.0-bin.tar.gz was mistakenly built with a 2.4.1 dependency in
place of 2.5.0?! I also tried to download giraph-dist-1.2.0-hadoop2-src.zip and
build it, but the build fails on giraph-core. I used "mvn -Phadoop_2
-DskipTests clean install". Could you please give me the right mvn
command or the right mvn configuration to use?
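
To rule out a wrong protobuf dependency on my side, this is roughly the check I
had in mind from the unpacked source tree (just a sketch; I'm assuming the
hadoop_2 profile is the right one for this dist):

    # show which com.google.protobuf version the Giraph build actually resolves
    mvn -Phadoop_2 dependency:tree -Dincludes=com.google.protobuf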

So I tried to use giraph-dist-1.2.0-bin.tar.gz with hadoop-1.2.1, and this
time it works properly: I succeeded in running Giraph on an EC2 cluster with
this input graph http://snap.stanford.edu/data/p2p-Gnutella08.html
(to start with, 1 namenode and 2 datanodes on the free micro tier, Ubuntu
16.04). I didn't change anything in my Giraph parameters! Anyway, I'd prefer
to use giraph-1.2 with hadoop-2.

I found this
https://stackoverflow.com/questions/33728752/building-giraph-with-hadoop
and also the documentation below from https://github.com/apache/giraph, but
I'm still a bit confused. In order to build giraph-1.2 with hadoop-2.5.1, do I
have to use "mvn -Phadoop_2 clean package -DskipTests" or
"mvn -Phadoop_yarn -Dhadoop.version=2.5.1 clean package -DskipTests"? Why do
both builds fail on giraph-core? Is giraph-1.2 compatible with hadoop-2.5.1
without using YARN? How can hadoop-2.5.1 be configured without using YARN?
(I put a small sketch of what I mean after the quoted documentation below.)
Thanks

- Apache Hadoop 2 (latest version: 2.5.1)

  This is the latest version of Hadoop 2 (supporting YARN in addition
  to MapReduce) Giraph could use. You may tell maven to use this version
  with "mvn -Phadoop_2 <goals>".
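
For that last question, my current guess (and it is only my assumption, please
correct me if it's wrong) is that whether MapReduce jobs go through YARN is
decided in mapred-site.xml, roughly like this:

    <!-- mapred-site.xml: "yarn" submits jobs to YARN,
         "local" runs them in a single JVM without YARN -->
    <property>
      <name>mapreduce.framework.name</name>
      <value>yarn</value>
    </property>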





2018-08-01 12:11 GMT+02:00 Eli Reisman <apache.mail...@gmail.com>:

> Hi Francesco,
>
> I think in the mailing list you might find some tips for running on hadoop
> 2.5.x, but IIRC hadoop 2.3.x plus the appropriate Giraph build works. Again,
> given the error msg I'd also carefully review your command line options and
> config file, just in case.
>
> Also, for context: as far as I know no one is actively maintaining Giraph's
> YARN compatibility for newer versions right now, so taking a look at the
> mailing list backscroll will probably yield the best info on which
> combinations work best.
>
> On Mon, Jul 30, 2018, 11:52 AM Francesco Sclano <
> sclano.france...@gmail.com> wrote:
>
>> Hi Eli,
>> ok, thanks for your suggestion. I'll try it and let you know. Just to be
>> sure before trying an earlier version of yarn/hadoop2: I'm using
>> giraph-dist-1.2.0-hadoop2-bin.tar.gz. Does this Giraph version officially
>> support hadoop-2.5.1 with YARN? If not, how do I have to configure
>> hadoop-2.5.1? Thanks
>>
>> 2018-07-29 9:47 GMT+02:00 Eli Reisman <apache.mail...@gmail.com>:
>>
>>> Hey Francesco, sorry, I'm on my phone and have not been running Giraph on
>>> YARN for some time, but if I had to guess this looks like a version
>>> compatibility issue between the version of YARN that Giraph is built
>>> against and what you ran on EC2, or perhaps between the protobuf dep in
>>> the Giraph build and the YARN cluster? Either way I'd try an earlier
>>> version of yarn/hadoop2 for EC2 and your Giraph build, as Giraph hasn't
>>> been keeping up with all the YARN-side changes.
>>>
>>> An alternate theory: I have seen startup argument and config errors show
>>> up with ambiguous traces similar to that. I'm not able to review your conf
>>> file from my phone, but you might review configs and args against the
>>> Giraph and YARN code directly to ensure all those are good, since the docs
>>> might be outdated on some of those.
>>>
>>> Good luck!
>>>
>>> On Sat, Jul 28, 2018, 9:42 PM Francesco Sclano <
>>> sclano.france...@gmail.com> wrote:
>>>
>>>> Hi,
>>>> I'm using giraph-1.2 for my master's thesis in computer science. I
>>>> implemented in Giraph the computation of 4-profiles as in
>>>> eelenberg.github.io/Elenberg4profileWWW16.pdf
>>>> I successfully configured and ran giraph-1.2 and hadoop-2.5.1 in
>>>> pseudo-distributed mode on my local PC with this configuration
>>>> <https://we.tl/dB0PmGhUPX>.
>>>>
>>>> In pseudo-distributed mode I launch my Giraph program with the following
>>>> parameters:
>>>>
>>>> giraph MY_JAR.jar SUPERSTEP0_CLASS \
>>>>   -ca giraph.master.observers=MY_OBSERVER_CLASS \
>>>>   -mc MY_MASTER_CLASS \
>>>>   -eif MY_CUSTOM_INPUT_FORMAT -eip input.txt \
>>>>   -vof org.apache.giraph.io.formats.IdWithValueTextOutputFormat \
>>>>   -op output -w 1 \
>>>>   -ca giraph.SplitMasterWorker=false \
>>>>   -ca io.edge.reverse.duplicator=true
>>>>
>>>> Then I tried to use Amazon EC2 with a simple cluster of 3 nodes with
>>>> this other configuration <https://we.tl/Gbv9q0nQ7h>, but I get the
>>>> error below. I also re-tried pseudo-distributed mode on Amazon EC2 with
>>>> only 1 node and I get the same error, but on my local PC it works!
>>>> I'm using Ubuntu 16.04 both locally and on EC2.
>>>>
>>>> Obviously I tried a Hadoop example on my EC2 cluster and it works.
>>>> Please help me, because I'm blocked on my master's thesis by this error!
>>>> I read many forums about this error, and I checked many times that I
>>>> have only protobuf-java-2.5.0.jar inside hadoop-2.5.2 and giraph-1.2,
>>>> both on EC2 and on my local PC.
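>>>>
>>>> (The check I mean is just a rough sketch like the following, assuming
>>>> HADOOP_HOME and GIRAPH_HOME point at my installs:
>>>>
>>>>   find $HADOOP_HOME $GIRAPH_HOME -name 'protobuf-java-*.jar'
>>>>
>>>> and it only ever reports protobuf-java-2.5.0.jar.)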
>>>>
>>>>
>>>>
>>>> FATAL [main] org.apache.hadoop.mapreduce.v2.app.MRAppMaster: Error starting MRAppMaster
>>>> java.lang.VerifyError: class org.apache.hadoop.yarn.proto.YarnProtos$ApplicationIdProto overrides final method getUnknownFields.()Lcom/google/protobuf/UnknownFieldSet;
>>>>         at java.lang.ClassLoader.defineClass1(Native Method)
>>>>         at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
>>>>         at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>>>>         at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
>>>>         at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
>>>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
>>>>         at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
>>>>         at java.security.AccessController.doPrivileged(Native Method)
>>>>         at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
>>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>>>         at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:349)
>>>>         at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>>>         at java.lang.Class.getDeclaredConstructors0(Native Method)
>>>>         at java.lang.Class.privateGetDeclaredConstructors(Class.java:2671)
>>>>         at java.lang.Class.getConstructor0(Class.java:3075)
>>>>         at java.lang.Class.getConstructor(Class.java:1825)
>>>>         at org.apache.hadoop.yarn.factories.impl.pb.RecordFactoryPBImpl.newRecordInstance(RecordFactoryPBImpl.java:62)
>>>>         at org.apache.hadoop.yarn.util.Records.newRecord(Records.java:36)
>>>>         at org.apache.hadoop.yarn.api.records.ApplicationId.newInstance(ApplicationId.java:49)
>>>>         at org.apache.hadoop.yarn.util.ConverterUtils.toApplicationAttemptId(ConverterUtils.java:137)
>>>>         at org.apache.hadoop.yarn.util.ConverterUtils.toContainerId(ConverterUtils.java:177)
>>>>         at org.apache.hadoop.mapreduce.v2.app.MRAppMaster.main(MRAppMaster.java:1391)
>>>>
>>>> INFO [main] org.apache.hadoop.util.ExitUtil: Exiting with status 1
>>>>
>>>>
>>>> The links to the configuration files work until 4 August 2018, but I can
>>>> re-send them.
>>>>
>>>> Many Thanks
>>>>
>>>> Best Regards
>>>>
>>>> --
>>>> Francesco Sclano
>>>>
>>>
>>
>>
>> --
>> Francesco Sclano
>>
>


-- 
Francesco Sclano
