Re: Couldn't instantiate

2014-07-08 Thread Carmen Manzulli
Hi John,
yes, I've tried to get the examples to run, but the job failed; this was my command:

bin/hadoop jar
/usr/local/giraph/giraph-core/target/giraph-1.1.0-SNAPSHOT-for-hadoop-1.2.1-jar-with-dependencies.jar
org.apache.giraph.GiraphRunner
org.apache.giraph.benchmark.ShortestPathsComputation -vif
org.apache.giraph.io.formats.JsonLongDoubleFloatDoubleVertexInputFormat
-vip /user/hduser/Documento -vof
org.apache.giraph.io.formats.IdWithValueTextOutputFormat -op
/user/hduser/outShortest -w 1

and this was the result

14/07/08 10:46:37 INFO utils.ConfigurationUtils: No edge input format
specified. Ensure your InputFormat does not require one.
14/07/08 10:46:37 INFO utils.ConfigurationUtils: No edge output format
specified. Ensure your OutputFormat does not require one.
Exception in thread "main" java.lang.IllegalArgumentException:
checkClassTypes: edge value types not assignable, computation - class
org.apache.hadoop.io.DoubleWritable, VertexInputFormat - class
org.apache.hadoop.io.DoubleWritable
at
org.apache.giraph.job.GiraphConfigurationValidator.checkAssignable(GiraphConfigurationValidator.java:381)
at
org.apache.giraph.job.GiraphConfigurationValidator.verifyVertexInputFormatGenericTypes(GiraphConfigurationValidator.java:230)
at
org.apache.giraph.job.GiraphConfigurationValidator.validateConfiguration(GiraphConfigurationValidator.java:141)
at
org.apache.giraph.utils.ConfigurationUtils.parseArgs(ConfigurationUtils.java:214)
at org.apache.giraph.GiraphRunner.run(GiraphRunner.java:74)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:79)
at org.apache.giraph.GiraphRunner.main(GiraphRunner.java:124)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.hadoop.util.RunJar.main(RunJar.java:160)

Maybe I have some configuration problem, or there is something I haven't
understood (a lot, XD!). Could you give me any suggestions?
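
For reference, the IllegalArgumentException above is Giraph's configuration validator checking that the computation's four generic type parameters (vertex id, vertex value, edge value, message value) are assignable from the types declared by the vertex input format. A minimal, purely illustrative skeleton whose types line up with JsonLongDoubleFloatDoubleVertexInputFormat (LongWritable ids, DoubleWritable vertex values, FloatWritable edge values) might look like the sketch below; the class name and the empty compute() body are assumptions, not code from this thread. If memory serves, org.apache.giraph.examples.SimpleShortestPathsComputation (rather than the benchmark class used in the command above) declares exactly these parameters, which is why the quickstart pairs it with this input format.

import org.apache.giraph.graph.BasicComputation;
import org.apache.giraph.graph.Vertex;
import org.apache.hadoop.io.DoubleWritable;
import org.apache.hadoop.io.FloatWritable;
import org.apache.hadoop.io.LongWritable;

// Hypothetical skeleton: the four type parameters below (id, vertex value,
// edge value, message value) match what JsonLongDoubleFloatDoubleVertexInputFormat
// produces, so the validator's checkClassTypes pass would succeed.
public class TypeMatchedComputation
    extends BasicComputation<LongWritable, DoubleWritable, FloatWritable, DoubleWritable> {

  @Override
  public void compute(Vertex<LongWritable, DoubleWritable, FloatWritable> vertex,
      Iterable<DoubleWritable> messages) {
    // No real logic here; the skeleton only exists to show the types lining up.
    vertex.voteToHalt();
  }
}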



Re: Couldn't instantiate

2014-07-08 Thread Carmen Manzulli
OK John, the problem with the ShortestPathsComputation example was the
mapred.map.max.attempts property... now it runs well.




Re: Couldn't instantiate

2014-07-05 Thread John Yost
Hi Carmen,

I think the ChildError is being thrown by Hadoop but the root cause is this
NPE within the BspServiceMaster.   Unfortunately, I have not seen this
exception situation and am not sure why you are getting the NPE.  It may be
a data issue.  Quick question--have you gotten the examples like
SimpleShortestPath to run?  Please confirm, thanks.

--John


On Wed, Jul 2, 2014 at 9:53 AM, Carmen Manzulli carmenmanzu...@gmail.com
wrote:

 I've read on the web that the Child Error could mean this:
 Possible reason: the memory allocated for the task trackers (the sum of the
 mapred.*.child.java.opts settings in mapred-site.xml) is more than the node's
 actual memory.



Re: Couldn't instantiate

2014-07-02 Thread soozandjohnyost
Hi Carmen,

Glad that one problem is fixed, and I can take a look at this one as well.

--John 

Sent from my iPhone

 On Jul 2, 2014, at 6:50 AM, Carmen Manzulli carmenmanzu...@gmail.com wrote:
 
 
 OK, I've done what you told me... but now I've got this problem:
 java.lang.Throwable: Child Error
   at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
 Caused by: java.io.IOException: Task process exit with nonzero status of 1.
   at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)
 
 
 this is my Computation code:
 import org.apache.giraph.GiraphRunner;
 import org.apache.giraph.graph.BasicComputation;
 import org.apache.giraph.graph.Vertex;
 import org.apache.giraph.edge.Edge;

 import org.apache.hadoop.io.Text;
 import org.apache.hadoop.io.NullWritable;
 import org.apache.hadoop.util.ToolRunner;


 public class SimpleSelectionComputation extends
         BasicComputation<Text, NullWritable, Text, NullWritable> {

     @Override
     public void compute(Vertex<Text, NullWritable, Text> vertex,
             Iterable<NullWritable> messages) {

         Text source = new Text("http://dbpedia.org/resource/1040s");

         if (getSuperstep() == 0) {
             // Compare Text values with equals(), not ==.
             if (vertex.getId().equals(source)) {
                 System.out.println("the subject " + vertex.getId()
                         + " has the following predicates and objects:");
                 for (Edge<Text, Text> e : vertex.getEdges()) {
                     System.out.println(e.getValue() + "\t" + e.getTargetVertexId());
                 }
             }
             vertex.voteToHalt();
         }
     }

     public static void main(String[] args) throws Exception {
         System.exit(ToolRunner.run(new GiraphRunner(), args));
     }
 }


Re: Couldn't instantiate

2014-07-02 Thread John Yost
Hi Carmen,

Please post more of the exception stack trace, not enough here for me to
figure anything out. :)

Thanks

--John






Re: Couldn't instantiate

2014-07-02 Thread Carmen Manzulli
OK, of course :)!

java.lang.Throwable: Child Error
at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:271)
Caused by: java.io.IOException: Task process exit with nonzero status of 1.
at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:258)

and from the command line:


/../lib/commons-beanutils-1.7.0.jar:/usr/local/hadoop/libexec/../lib/commons-beanutils-core-1.8.0.jar:/usr/local/hadoop/libexec/../lib/commons-cli-1.2.jar:/usr/local/hadoop/libexec/../lib/commons-codec-1.4.jar:/usr/local/hadoop/libexec/../lib/commons-collections-3.2.1.jar:/usr/local/hadoop/libexec/../lib/commons-configuration-1.6.jar:/usr/local/hadoop/libexec/../lib/commons-daemon-1.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-digester-1.8.jar:/usr/local/hadoop/libexec/../lib/commons-el-1.0.jar:/usr/local/hadoop/libexec/../lib/commons-httpclient-3.0.1.jar:/usr/local/hadoop/libexec/../lib/commons-io-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-lang-2.4.jar:/usr/local/hadoop/libexec/../lib/commons-logging-1.1.1.jar:/usr/local/hadoop/libexec/../lib/commons-logging-api-1.0.4.jar:/usr/local/hadoop/libexec/../lib/commons-math-2.1.jar:/usr/local/hadoop/libexec/../lib/commons-net-3.1.jar:/usr/local/hadoop/libexec/../lib/core-3.1.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-capacity-scheduler-1.2.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-fairscheduler-1.2.1.jar:/usr/local/hadoop/libexec/../lib/hadoop-thriftfs-1.2.1.jar:/usr/local/hadoop/libexec/../lib/hsqldb-1.8.0.10.jar:/usr/local/hadoop/libexec/../lib/jackson-core-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jackson-mapper-asl-1.8.8.jar:/usr/local/hadoop/libexec/../lib/jasper-compiler-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jasper-runtime-5.5.12.jar:/usr/local/hadoop/libexec/../lib/jdeb-0.8.jar:/usr/local/hadoop/libexec/../lib/jersey-core-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-json-1.8.jar:/usr/local/hadoop/libexec/../lib/jersey-server-1.8.jar:/usr/local/hadoop/libexec/../lib/jets3t-0.6.1.jar:/usr/local/hadoop/libexec/../lib/jetty-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jetty-util-6.1.26.jar:/usr/local/hadoop/libexec/../lib/jsch-0.1.42.jar:/usr/local/hadoop/libexec/../lib/junit-4.5.jar:/usr/local/hadoop/libexec/../lib/kfs-0.2.2.jar:/usr/local/hadoop/libexec/../lib/log4j-1.2.15.jar:/usr/local/hadoop/libexec/../lib/mockito-all-1.8.5.jar:/usr/local/hadoop/libexec/../lib/oro-2.0.8.jar:/usr/local/hadoop/libexec/../lib/servlet-api-2.5-20081211.jar:/usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar:/usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar:/usr/local/hadoop/libexec/../lib/xmlenc-0.52.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-2.1.jar:/usr/local/hadoop/libexec/../lib/jsp-2.1/jsp-api-2.1.jar
2014-07-02 15:49:17,492 INFO org.apache.zookeeper.ZooKeeper: Client
environment:java.library.path=/usr/local/hadoop/libexec/../lib/native/Linux-amd64-64:/app/hadoop/tmp/mapred/local/taskTracker/hduser/jobcache/job_201407021315_0003/attempt_201407021315_0003_m_00_0/work
2014-07-02 15:49:17,492 INFO org.apache.zookeeper.ZooKeeper: Client
environment:java.io.tmpdir=/app/hadoop/tmp/mapred/local/taskTracker/hduser/jobcache/job_201407021315_0003/attempt_201407021315_0003_m_00_0/work/tmp
2014-07-02 15:49:17,492 INFO org.apache.zookeeper.ZooKeeper: Client
environment:java.compiler=NA
2014-07-02 15:49:17,492 INFO org.apache.zookeeper.ZooKeeper: Client
environment:os.name=Linux
2014-07-02 15:49:17,492 INFO org.apache.zookeeper.ZooKeeper: Client
environment:os.arch=amd64
2014-07-02 15:49:17,492 INFO org.apache.zookeeper.ZooKeeper: Client
environment:os.version=3.11.0-24-generic
2014-07-02 15:49:17,493 INFO org.apache.zookeeper.ZooKeeper: Client
environment:user.name=hduser
2014-07-02 15:49:17,493 INFO org.apache.zookeeper.ZooKeeper: Client
environment:user.home=/home/hduser
2014-07-02 15:49:17,493 INFO org.apache.zookeeper.ZooKeeper: Client
environment:user.dir=/app/hadoop/tmp/mapred/local/taskTracker/hduser/jobcache/job_201407021315_0003/attempt_201407021315_0003_m_00_0/work
2014-07-02 15:49:17,493 INFO org.apache.zookeeper.ZooKeeper:
Initiating client connection,
connectString=carmen-HP-Pavilion-Sleekbook-15:22181
sessionTimeout=6
watcher=org.apache.giraph.master.BspServiceMaster@465962c4
2014-07-02 15:49:17,509 INFO org.apache.zookeeper.ClientCnxn: Opening
socket connection to server
carmen-HP-Pavilion-Sleekbook-15/127.0.1.1:22181. Will not attempt to
authenticate using SASL (unknown error)
2014-07-02 15:49:17,509 INFO org.apache.zookeeper.ClientCnxn: Socket
connection established to
carmen-HP-Pavilion-Sleekbook-15/127.0.1.1:22181, initiating session
2014-07-02 15:49:17,515 INFO org.apache.zookeeper.ClientCnxn: Session
establishment complete on server
carmen-HP-Pavilion-Sleekbook-15/127.0.1.1:22181, sessionid =
0x146f756106b0001, negotiated timeout = 60
2014-07-02 15:49:17,516 INFO org.apache.giraph.bsp.BspService:
process: Asynchronous connection complete.
2014-07-02 15:49:17,530 INFO 

Re: Couldn't instantiate

2014-07-01 Thread Carmen Manzulli
Hi John,
yes, I've tried inserting a no-arguments constructor, but the problem seems to
be something else. This is my code, an input format with a vertex reader to
read RDF triples.

import java.io.IOException;
import java.util.ArrayList;
import java.lang.InterruptedException;

import org.apache.giraph.graph.Vertex;
import org.apache.giraph.edge.Edge;
import org.apache.giraph.edge.EdgeFactory;
import org.apache.giraph.io.VertexReader;
import org.apache.giraph.io.VertexInputFormat;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.InputSplit;
import org.apache.hadoop.mapreduce.RecordReader;
import org.apache.hadoop.mapreduce.lib.input.LineRecordReader;
import org.apache.hadoop.mapreduce.TaskAttemptContext;


public abstract class SimpleRDFVertexInputFormat extends
        VertexInputFormat<Text, NullWritable, Text> {

    public SimpleRDFVertexInputFormat() {
        super();
    }

    public VertexReader<Text, NullWritable, Text> createVertexReader(
            InputSplit split, TaskAttemptContext context) throws IOException {
        return new SimpleRDFVertexReader();
    }

    public class SimpleRDFVertexReader extends
            VertexReader<Text, NullWritable, Text> {

        private RecordReader<LongWritable, Text> lineRecordReader;
        private TaskAttemptContext context;

        @Override
        public void initialize(InputSplit inputsplit, TaskAttemptContext context)
                throws IOException, InterruptedException {
            this.setContext(context);
            lineRecordReader = new LineRecordReader();
            lineRecordReader.initialize(inputsplit, context);
        }

        @Override
        public final boolean nextVertex() throws IOException, InterruptedException {
            return lineRecordReader.nextKeyValue();
        }

        @Override
        public final Vertex<Text, NullWritable, Text> getCurrentVertex()
                throws IOException, InterruptedException {
            Text line = lineRecordReader.getCurrentValue();
            Vertex<Text, NullWritable, Text> vertex = getConf().createVertex();
            // Split the line into subject, predicate and object tokens.
            String[] elements = line.toString().split(" ");
            Text firstele = new Text(elements[0]);
            int len1 = firstele.getLength();
            // Strip the surrounding angle brackets from the subject.
            Text subject = new Text(firstele.toString().substring(1, len1 - 1));
            Text secondele = new Text(elements[1]);
            int len2 = secondele.getLength();
            // Strip the surrounding angle brackets from the predicate.
            Text predicate = new Text(secondele.toString().substring(1, len2 - 1));
            Text object = new Text(elements[2]);
            ArrayList<Edge<Text, Text>> edge = new ArrayList<Edge<Text, Text>>();
            edge.add(EdgeFactory.create(object, predicate));
            // Use NullWritable.get() rather than null for the (unused) vertex value.
            vertex.initialize(subject, NullWritable.get(), edge);
            return vertex;
        }

        @Override
        public void close() throws IOException {
            lineRecordReader.close();
        }

        @Override
        public float getProgress() throws IOException, InterruptedException {
            return lineRecordReader.getProgress();
        }

        public TaskAttemptContext getContext() {
            return context;
        }

        public void setContext(TaskAttemptContext context) {
            this.context = context;
        }
    }
}
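
Since John suspected a data issue earlier in the thread, it is worth spelling out the line format getCurrentVertex() above silently assumes: whitespace-separated, N-Triples-style tokens where the first two are wrapped in angle brackets and the third is kept verbatim. The input file itself is never shown, so the sample line below is purely hypothetical:

public class LineFormatExample {
  public static void main(String[] args) {
    // Hypothetical input line in the shape the reader expects.
    String line = "<http://example.org/subject> <http://example.org/predicate> <http://example.org/object>";
    String[] elements = line.split(" ");
    String subject = elements[0].substring(1, elements[0].length() - 1);   // angle brackets stripped
    String predicate = elements[1].substring(1, elements[1].length() - 1); // angle brackets stripped
    String object = elements[2];                                           // kept as-is
    System.out.println(subject + " --[" + predicate + "]--> " + object);
  }
}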


 

Re: Couldn't instantiate

2014-07-01 Thread soozandjohnyost
Ah, okay, I see the problem.  Abstract classes cannot be instantiated in Java.  
Ensure you've implemented all of the abstract methods, remove the abstract 
keyword from your class definition, and you should be ready to roll.

-John

Sent from my iPhone


Re: Couldn't instantiate

2014-07-01 Thread Carmen Manzulli
OK!!! So, in this way, I also have to implement the checkInputSpecs and
getSplits methods, right??
Thank you very much :)
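
Carmen's reading is right: in the Giraph 1.1 snapshot API a concrete VertexInputFormat has to provide checkInputSpecs and getSplits in addition to createVertexReader, otherwise the class stays abstract and reflection cannot instantiate it. The sketch below only illustrates where those overrides go; the class name, the delegation of split computation to Hadoop's TextInputFormat, and the stubbed-out reader are all assumptions, and the exact signatures should be checked against the GiraphInputFormat base class in the version being built.

import java.io.IOException;
import java.util.List;

import org.apache.giraph.io.VertexInputFormat;
import org.apache.giraph.io.VertexReader;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.InputSplit;
import org.apache.hadoop.mapreduce.JobContext;
import org.apache.hadoop.mapreduce.TaskAttemptContext;
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat;

// Illustrative sketch only, not the thread's final code.
public class ConcreteRDFVertexInputFormat
    extends VertexInputFormat<Text, NullWritable, Text> {

  @Override
  public void checkInputSpecs(Configuration conf) {
    // Nothing extra to validate in this sketch.
  }

  @Override
  public List<InputSplit> getSplits(JobContext context, int minSplitCountHint)
      throws IOException, InterruptedException {
    // Assumption: line-oriented text input, so Hadoop's TextInputFormat can
    // compute the splits from the input paths configured on the job.
    return new TextInputFormat().getSplits(context);
  }

  @Override
  public VertexReader<Text, NullWritable, Text> createVertexReader(
      InputSplit split, TaskAttemptContext context) throws IOException {
    // A real implementation would return the SimpleRDFVertexReader from the
    // message above; it is omitted here to keep the sketch self-contained.
    throw new UnsupportedOperationException("reader omitted in this sketch");
  }
}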




Re: Couldn't instantiate

2014-06-30 Thread John Yost
Hi Carmen,

Question--did you only define an arguments constructor?  If so, I think you
are getting this because you did not define a no-arguments constructor with
public visibility.  If this is not the case, I recommend posting your
source code and I will be happy to help.

--John
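
A minimal, self-contained illustration of the failure mode John is describing (class names invented for the example): the stack trace below bottoms out in Class.newInstance(), which is what Giraph's ReflectionUtils ends up calling, and that call throws InstantiationException when the target class is abstract or has no usable no-argument constructor.

public class NewInstanceDemo {

  // Cannot be instantiated reflectively: the class is abstract.
  public abstract static class AbstractFormat { }

  // Instantiable: concrete, with a public no-argument constructor.
  public static class ConcreteFormat {
    public ConcreteFormat() { }
  }

  public static void main(String[] args) throws Exception {
    System.out.println(ConcreteFormat.class.newInstance()); // works
    AbstractFormat.class.newInstance();                     // throws java.lang.InstantiationException
  }
}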


On Mon, Jun 30, 2014 at 9:38 AM, Carmen Manzulli carmenmanzu...@gmail.com
wrote:

 Hi,

 I'm trying to run a SelectionComputation with my own code for a
 VertexInputFormat, but the Giraph job starts to work and then fails with:




 java.lang.IllegalStateException: run: Caught an unrecoverable exception 
 newInstance: Couldn't instantiate sisinflab.SimpleRDFVertexInputFormat
   at org.apache.giraph.graph.GraphMapper.run(GraphMapper.java:101)
   at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:764)
   at org.apache.hadoop.mapred.MapTask.run(MapTask.java:364)
   at org.apache.hadoop.mapred.Child$4.run(Child.java:255)
   at java.security.AccessController.doPrivileged(Native Method)
   at javax.security.auth.Subject.doAs(Subject.java:415)
   at 
 org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1190)
   at org.apache.hadoop.mapred.Child.main(Child.java:249)
 Caused by: java.lang.IllegalStateException: newInstance: Couldn't instantiate 
 sisinflab.SimpleRDFVertexInputFormat
   at 
 org.apache.giraph.utils.ReflectionUtils.newInstance(ReflectionUtils.java:105)
   at 
 org.apache.giraph.conf.ImmutableClassesGiraphConfiguration.createVertexInputFormat(ImmutableClassesGiraphConfiguration.java:235)
   at 
 org.apache.giraph.conf.ImmutableClassesGiraphConfiguration.createWrappedVertexInputFormat(ImmutableClassesGiraphConfiguration.java:246)
   at 
 org.apache.giraph.graph.GraphTaskManager.checkInput(GraphTaskManager.java:171)
   at 
 org.apache.giraph.graph.GraphTaskManager.setup(GraphTaskManager.java:207)
   at org.apache.giraph.graph.GraphMapper.setup(GraphMapper.java:59)
   at org.apache.giraph.graph.GraphMapper.run(GraphMapper.java:89)
   ... 7 more
 Caused by: java.lang.InstantiationException
   at 
 sun.reflect.InstantiationExceptionConstructorAccessorImpl.newInstance(InstantiationExceptionConstructorAccessorImpl.java:48)
   at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
   at java.lang.Class.newInstance(Class.java:374)
   at 
 org.apache.giraph.utils.ReflectionUtils.newInstance(ReflectionUtils.java:103)
   ... 13 more


 What does it mean? Where is the problem?

 Who can help me?

 Carmen