Hi, I tried to run the WikipediaPageRankStandalone program from the examples directory of Spark, but I am getting the following error. Please help out.
hduser@vm4:~/spark-test/spark-0.7.2$ ./run spark.bagel.examples.WikipediaPageRankStandalone pagerank_data.txt 2 1 spark://vm4:7077 true
13/09/04 13:00:53 WARN spark.Utils: Your hostname, vm4 resolves to a loopback address: 127.0.1.1; using 192.168.0.50 instead (on interface eth0)
13/09/04 13:00:53 WARN spark.Utils: Set SPARK_LOCAL_IP if you need to bind to another address
13/09/04 13:00:54 INFO slf4j.Slf4jEventHandler: Slf4jEventHandler started
13/09/04 13:00:55 INFO spark.SparkEnv: Registering BlockManagerMaster
13/09/04 13:00:55 INFO storage.MemoryStore: MemoryStore started with capacity 326.7 MB.
13/09/04 13:00:55 INFO storage.DiskStore: Created local directory at /tmp/spark-local-20130904130055-5c43
13/09/04 13:00:55 INFO network.ConnectionManager: Bound socket to port 42840 with id = ConnectionManagerId(vm4,42840)
13/09/04 13:00:55 INFO storage.BlockManagerMaster: Trying to register BlockManager
13/09/04 13:00:55 INFO storage.BlockManagerMaster: Registered BlockManager
13/09/04 13:00:55 INFO server.Server: jetty-7.6.8.v20121106
13/09/04 13:00:55 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:44114
13/09/04 13:00:55 INFO broadcast.HttpBroadcast: Broadcast server started at http://192.168.0.50:44114
13/09/04 13:00:55 INFO spark.SparkEnv: Registering MapOutputTracker
13/09/04 13:00:55 INFO spark.HttpFileServer: HTTP File server directory is /tmp/spark-980d9d9a-8451-4803-8835-04fa9d29bb65
13/09/04 13:00:55 INFO server.Server: jetty-7.6.8.v20121106
13/09/04 13:00:55 INFO server.AbstractConnector: Started SocketConnector@0.0.0.0:47072
13/09/04 13:00:55 INFO io.IoWorker: IoWorker thread 'spray-io-worker-0' started
13/09/04 13:00:56 INFO server.HttpServer: akka://spark/user/BlockManagerHTTPServer started on /0.0.0.0:55091
13/09/04 13:00:56 INFO storage.BlockManagerUI: Started BlockManager web UI at http://vm4:55091
13/09/04 13:00:56 INFO client.Client$ClientActor: Connecting to master spark://vm4:7077
13/09/04 13:00:56 INFO cluster.SparkDeploySchedulerBackend: Connected to Spark cluster with app ID app-20130904130056-0012
13/09/04 13:00:56 INFO client.Client$ClientActor: Executor added: app-20130904130056-0012/0 on worker-20130903170743-vm4-37060 (vm4) with 1 cores
13/09/04 13:00:56 INFO cluster.SparkDeploySchedulerBackend: Granted executor ID app-20130904130056-0012/0 on host vm4 with 1 cores, 512.0 MB RAM
13/09/04 13:00:57 INFO client.Client$ClientActor: Executor updated: app-20130904130056-0012/0 is now RUNNING
13/09/04 13:00:58 WARN util.NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
13/09/04 13:00:59 INFO storage.MemoryStore: ensureFreeSpace(123002) called with curMem=0, maxMem=342526525
13/09/04 13:00:59 INFO storage.MemoryStore: Block broadcast_0 stored as values to memory (estimated size 120.1 KB, free 326.5 MB)
Exception in thread "main" scala.MatchError: Configuration: core-default.xml, core-site.xml, mapred-default.xml, mapred-site.xml, hdfs-default.xml, hdfs-site.xml (of class spark.SerializableWritable)
	at spark.bagel.examples.WPRSerializationStream.writeObject(WikipediaPageRankStandalone.scala:146)
	at spark.broadcast.HttpBroadcast$.write(HttpBroadcast.scala:115)
	at spark.broadcast.HttpBroadcast.<init>(HttpBroadcast.scala:28)
	at spark.broadcast.HttpBroadcastFactory.newBroadcast(HttpBroadcast.scala:54)
	at spark.broadcast.HttpBroadcastFactory.newBroadcast(HttpBroadcast.scala:50)
	at spark.broadcast.BroadcastManager.newBroadcast(Broadcast.scala:50)
	at spark.SparkContext.broadcast(SparkContext.scala:440)
	at spark.rdd.HadoopRDD.<init>(HadoopRDD.scala:50)
	at spark.SparkContext.hadoopFile(SparkContext.scala:264)
	at spark.SparkContext.textFile(SparkContext.scala:235)
	at spark.bagel.examples.WikipediaPageRankStandalone$.main(WikipediaPageRankStandalone.scala:33)
	at spark.bagel.examples.WikipediaPageRankStandalone.main(WikipediaPageRankStandalone.scala)
hduser@vm4:~/spark-test/spark-0.7.2$
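For context, if it helps anyone diagnose this: the top frame, `WPRSerializationStream.writeObject`, is the example's custom serializer, and `scala.MatchError` is what Scala throws when a `match` expression receives a value that none of its cases cover. Here the serializer was handed a `spark.SerializableWritable` (wrapping a Hadoop `Configuration`) that it apparently has no case for. A minimal, hypothetical sketch of how that kind of error arises (this is not the Spark source, just an illustration):

```scala
// A match that only covers the types the author expected.
def writeObject(obj: Any): Unit = obj match {
  case s: String => println(s"string: $s")
  case i: Int    => println(s"int: $i")
  // No case for any other type, and no catch-all `case _ =>` ...
}

writeObject("ok")        // fine: matched by the String case
writeObject(Seq(1, 2))   // throws scala.MatchError at runtime
```

So the failure looks like it happens before the PageRank logic runs at all, while `SparkContext.textFile` is broadcasting the Hadoop configuration through this serializer.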
