Re: Fails to run simple Spark (Hello World) scala program

2014-09-23 Thread Moshe Beeri

Fails to run simple Spark (Hello World) scala program

2014-09-20 Thread Moshe Beeri
import org.apache.spark.{SparkConf, SparkContext}

object Nizoz {

  def connect(): Unit = {
    val conf = new SparkConf().setAppName("nizoz").setMaster("master")
    val spark = new SparkContext(conf)
    val lines =
      spark.textFile("file:///home/moshe/store/frameworks/spark-1.1.0-bin-hadoop1/README.md")
    val lineLengths = lines.map(s => s.length)
    val totalLength = lineLengths.reduce((a, b) => a + b)
    println("totalLength=" + totalLength)
  }

  def main(args: Array[String]) {
    println(scala.tools.nsc.Properties.versionString)
    try {
      //Nizoz.connect
      val logFile =
        "/home/moshe/store/frameworks/spark-1.1.0-bin-hadoop1/README.md" // Should be some file on your system
      val conf = new SparkConf().setAppName("Simple Application").setMaster("spark://master:7077")
      val sc = new SparkContext(conf)
      val logData = sc.textFile(logFile, 2).cache()
      val numAs = logData.filter(line => line.contains("a")).count()
      val numBs = logData.filter(line => line.contains("b")).count()
      println("Lines with a: %s, Lines with b: %s".format(numAs, numBs))
    } catch {
      case e => {
        println(e.getCause())
        println("stack:")
        e.printStackTrace()
      }
    }
  }
}
Runs with Scala 2.10.4.
The problem is this vague exception:

	at com.example.scamel.Nizoz.main(Nizoz.scala)
Caused by: java.lang.RuntimeException: java.lang.reflect.InvocationTargetException
	at org.apache.hadoop.util.ReflectionUtils.newInstance(ReflectionUtils.java:131)
	at org.apache.hadoop.security.Groups.<init>(Groups.java:64)
	at org.apache.hadoop.security.Groups.getUserToGroupsMappingService(Groups.java:240)
	...
Caused by: java.lang.reflect.InvocationTargetException
	at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
	at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
	at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
	...
	... 10 more
Caused by: java.lang.UnsatisfiedLinkError: org.apache.hadoop.security.JniBasedUnixGroupsMapping.anchorNative()V
	at org.apache.hadoop.security.JniBasedUnixGroupsMapping.anchorNative(Native Method)
	at org.apache.hadoop.security.JniBasedUnixGroupsMapping.<clinit>(JniBasedUnixGroupsMapping.java:49)

I have Hadoop 1.2.1 running on Ubuntu 14.04, and the Scala console runs as
expected.

What am I doing wrong?
Any ideas are welcome.





--
View this message in context: 
http://apache-spark-user-list.1001560.n3.nabble.com/Fails-to-run-simple-Spark-Hello-World-scala-program-tp14718.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.

-
To unsubscribe, e-mail: user-unsubscr...@spark.apache.org
For additional commands, e-mail: user-h...@spark.apache.org



Re: Fails to run simple Spark (Hello World) scala program

2014-09-20 Thread Manu Suryavansh
Hi Moshe,

Spark needs a Hadoop 2.x/YARN cluster; otherwise you can run it in standalone
mode without Hadoop.

Manu
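The simplest way to try the program with no cluster at all (an illustrative
sketch, not code from the thread; `local[*]` is a standard Spark master URL
that runs everything in-process) would be:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Hypothetical minimal app: with a local master there is no need for a
// Hadoop/YARN cluster or a spark://... standalone master at all.
object LocalHello {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("LocalHello")
      .setMaster("local[*]") // run on all local cores, in this JVM
    val sc = new SparkContext(conf)
    val lines = sc.textFile("README.md") // any local text file
    println("line count = " + lines.count())
    sc.stop()
  }
}
```

This needs only spark-core on the classpath, so it sidesteps cluster and
native-library issues while developing.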






-- 
Manu Suryavansh


Re: Fails to run simple Spark (Hello World) scala program

2014-09-20 Thread Moshe Beeri
Thanks Manu,

I just saw that I had included the Hadoop 2.x client in my pom.xml; removing
it solved the problem.
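For reference, the kind of pom.xml entry that causes this clash (a sketch;
the exact artifact and version from the original pom.xml are not shown in the
thread, so both are assumptions) would look like:

```xml
<!-- Hypothetical offending dependency: a Hadoop 2.x client pulled in
     alongside spark-1.1.0-bin-hadoop1, which is built against Hadoop 1.x.
     The mismatched native bindings then fail with UnsatisfiedLinkError.
     Removing this entry (or aligning the Hadoop versions) fixes it. -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-client</artifactId>
  <version>2.4.0</version> <!-- version is illustrative only -->
</dependency>
```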

Thanks for your help.






Re: Fails to run simple Spark (Hello World) scala program

2014-09-20 Thread Moshe Beeri
Hi Manu/All,

Now I am facing another strange error (not unusual when getting to know a new,
complex framework). I ran ./sbin/start-all.sh (my machine is named after John
Nash) and got the connection "Connecting to master spark://nash:7077".
Running on my local machine yields:
java.lang.ClassNotFoundException: com.example.scamel.Nizoz$$anonfun$3

But the class com.example.scamel.Nizoz (in fact a Scala object) is the very
one I am debugging.

  def main(args: Array[String]) {
    println(scala.tools.nsc.Properties.versionString)
    try {
      //Nizoz.connect
      val logFile =
        "/home/moshe/store/frameworks/spark-1.1.0-bin-hadoop1/README.md" // Should be some file on your system
      val conf = new SparkConf().setAppName("spark town").setMaster("spark://nash:7077") //spark://master:7077
      val sc = new SparkContext(conf)
      val logData = sc.textFile(logFile, 2).cache()
      val numAs = logData.filter(line => line.contains("a")).count() // <- here is where the exception is thrown

Any help is welcome.
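A common cause of this ClassNotFoundException against a standalone master is
that the application jar (which contains the compiled anonymous-function
classes such as Nizoz$$anonfun$3) is never shipped to the executors. A hedged
sketch of the usual remedy (`setJars` is a real SparkConf method; the jar
path below is an assumption for illustration):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Sketch: tell the driver which jar(s) to distribute to the workers so the
// executors can load the application's classes, including closures.
object NizozWithJars {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("spark town")
      .setMaster("spark://nash:7077")
      .setJars(Seq("target/nizoz-0.1.jar")) // jar built from this project (assumed path)
    val sc = new SparkContext(conf)
    // ... run jobs as before ...
    sc.stop()
  }
}
```

Alternatively, packaging the application and launching it with spark-submit
achieves the same effect, since spark-submit distributes the jar for you.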





Many thanks,
Moshe Beeri.
054-3133943
Email moshe.be...@gmail.com | linkedin http://www.linkedin.com/in/mobee


