Problems running examples in IDEA

2014-08-24 Thread Ron Gonzalez

Hi,
  After getting the code base to compile, I tried running some of the
Scala examples.

  They all fail because they can't find classes like SparkConf.
  If I change the .iml files to convert the dependency scope from PROVIDED to
COMPILE, I am able to run them. It's simple to do with the following in
the root directory of the Spark code base: find . -name '*.iml' | xargs
sed -i.bak 's/PROVIDED/COMPILE/g'.
  Is this expected? I'd really rather not modify the .iml files, since
they were generated from the pom.xml files, so if you have some tips
on doing this better, that would be great...


Thanks,
Ron




Re: Problems running examples in IDEA

2014-08-24 Thread Ron Gonzalez
Oh OK. So running locally is left to each developer's own setup then, right? I am
indeed changing the code to set the master to local[*], but I'm still getting
NoClassDefFoundErrors.


If that's the case, then I think I'm OK...

Thanks,
Ron


On 08/24/2014 04:21 PM, Sean Owen wrote:

The examples aren't runnable quite like this. They're intended to be
submitted to a cluster with spark-submit, which would among other things
provide Spark at runtime.

I think you might get them to run this way if you set master to
local[*] and indeed made a run profile that also included Spark on
the classpath.
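
For illustration, a minimal sketch of what such a local run amounts to (the
object name, app name, and the trivial job are just placeholders; the essential
parts are the local[*] master and having Spark on the run configuration's
classpath):

    import org.apache.spark.{SparkConf, SparkContext}

    object LocalRunSketch {
      def main(args: Array[String]): Unit = {
        // local[*] runs Spark in-process on all available cores,
        // so no cluster or spark-submit is needed for a quick test.
        val conf = new SparkConf().setAppName("local-example").setMaster("local[*]")
        val sc = new SparkContext(conf)
        println(sc.parallelize(1 to 100).count())
        sc.stop()
      }
    }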

You shouldn't modify the .iml files anyway. You can change the Maven
pom.xml files if you need to modify a dependency scope.
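
For example, if the scope change really did belong in the build, it would be
made on the dependency in the relevant pom.xml rather than in the generated
.iml files (the artifact id and version shown here are illustrative):

    <dependency>
      <groupId>org.apache.spark</groupId>
      <artifactId>spark-core_2.10</artifactId>
      <version>${project.version}</version>
      <!-- changing provided to compile pulls Spark onto the run classpath -->
      <scope>compile</scope>
    </dependency>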

On Mon, Aug 25, 2014 at 12:12 AM, Ron Gonzalez
zlgonza...@yahoo.com.invalid wrote:

Hi,
   After getting the code base to compile, I tried running some of the scala
examples.
   They all fail since it can't find classes like SparkConf.
   If I change the iml file to convert provided scope from PROVIDED to
COMPILE, I am able to run them. It's simple by doing the following in the
root directory of the spark code base: find . -name *.iml | xargs sed
-i.bak 's/PROVIDED/COMPILE/g'.
   Is this expected? I'd really rather not modify the iml files since they
were sourced from the pom xml files, so if you guys have some tips on doing
this better, that would be great...

Thanks,
Ron








Re: Spark Avro Generation

2014-08-11 Thread Ron Gonzalez
If you don't want to build the entire thing, you can also run

mvn generate-sources in external/flume-sink

Thanks,
Ron

Sent from my iPhone

 On Aug 11, 2014, at 8:32 AM, Hari Shreedharan hshreedha...@cloudera.com 
 wrote:
 
 Just running sbt compile or assembly should generate the sources.
 
 On Monday, August 11, 2014, Devl Devel devl.developm...@gmail.com wrote:
 
 Hi
 
 So far I've been managing to build Spark from source, but since a change in
 spark-streaming-flume I have no idea how to generate the classes (e.g.
 SparkFlumeProtocol) from the Avro schema.
 
 I have used sbt to run avro:generate (from the top-level Spark dir), but it
 produces nothing - it just says:
 
 avro:generate
 [success] Total time: 0 s, completed Aug 11, 2014 12:26:49 PM.
 
 Can someone please send me their build.sbt or just tell me how to build
 Spark so that all the Avro files get generated as well?
 
 Sorry for the noob question, but I really have tried my best on this one!
 Cheers
 




1.1.0-SNAPSHOT possible regression

2014-08-08 Thread Ron Gonzalez
Hi,
I have a running Spark app against the released version of 1.0.1. I recently
decided to try to upgrade to the trunk version. Interestingly enough, after
building the 1.1.0-SNAPSHOT assembly, swapping it in as the assembly for my app
caused errors. In particular, it seems Kryo serialization isn't taking effect.
Switching back to the 1.0.1 assembly gets it working again.

Any thoughts? Is this a known issue?

Thanks,
Ron

at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1180)
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:347)
at scala.collection.mutable.HashMap$$anonfun$writeObject$1.apply(HashMap.scala:137)
at scala.collection.mutable.HashMap$$anonfun$writeObject$1.apply(HashMap.scala:135)
at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
at scala.collection.mutable.HashTable$class.serializeTo(HashTable.scala:124)
at scala.collection.mutable.HashMap.serializeTo(HashMap.scala:39)
at scala.collection.mutable.HashMap.writeObject(HashMap.scala:135)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at java.io.ObjectStreamClass.invokeWriteObject(ObjectStreamClass.java:988)
at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1495)
at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1431)
at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1177)
at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:347)
at org.apache.spark.util.Utils$.serialize(Utils.scala:64)
at org.apache.spark.broadcast.TorrentBroadcast$.blockifyObject(TorrentBroadcast.scala:232)
at org.apache.spark.broadcast.TorrentBroadcast.sendBroadcast(TorrentBroadcast.scala:85)
at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:66)
at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:36)
at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:29)
at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:62)
at org.apache.spark.SparkContext.broadcast(SparkContext.scala:809)

Re: 1.1.0-SNAPSHOT possible regression

2014-08-08 Thread Ron Gonzalez
Oops, the exception is below.
It works in local mode, which makes sense since TorrentBroadcast only performs
the actual broadcast when !isLocal. It really seems as if the Kryo wrapper
didn't kick in for some reason. Do we have a unit test for Kryo serialization
that I can give a try?
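
As a quick ad-hoc sanity check of the serializer itself (a minimal sketch, not
an existing Spark test suite), something like this round trip could be run by
hand:

    import org.apache.spark.SparkConf
    import org.apache.spark.serializer.KryoSerializer

    object KryoRoundTripCheck {
      def main(args: Array[String]): Unit = {
        // Build a Kryo serializer instance directly, outside of any SparkContext.
        val ser = new KryoSerializer(new SparkConf()).newInstance()
        val original = Map("a" -> 1, "b" -> 2)
        // Serialize to a ByteBuffer and read it back.
        val restored = ser.deserialize[Map[String, Int]](ser.serialize(original))
        assert(restored == original)
        println("Kryo round trip OK: " + restored)
      }
    }
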
Thanks,
Ron

Exception in thread "Driver" java.lang.reflect.InvocationTargetException
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:180)
Caused by: java.io.NotSerializableException: org.apache.avro.generic.GenericData$Record
- custom writeObject data (class scala.collection.mutable.HashMap)
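
For cross-checking the configuration, this is roughly how Kryo is normally
wired up for an app that ships Avro GenericData.Record values (a minimal
sketch; the registrator class is illustrative, and Avro records sometimes
still need a custom Kryo serializer on top of plain registration):

    import com.esotericsoftware.kryo.Kryo
    import org.apache.avro.generic.GenericData
    import org.apache.spark.SparkConf
    import org.apache.spark.serializer.KryoRegistrator

    // Illustrative registrator that tells Kryo about the Avro record class.
    class AvroKryoRegistrator extends KryoRegistrator {
      override def registerClasses(kryo: Kryo): Unit = {
        kryo.register(classOf[GenericData.Record])
      }
    }

    object KryoConfSketch {
      // Switch the app to Kryo and point it at the registrator above.
      def conf(): SparkConf = new SparkConf()
        .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
        .set("spark.kryo.registrator", classOf[AvroKryoRegistrator].getName)
    }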


On Friday, August 8, 2014 10:16 AM, Reynold Xin r...@databricks.com wrote:
 


Looks like you didn't actually paste the exception message. Do you mind
doing that?




On Fri, Aug 8, 2014 at 10:14 AM, Reynold Xin r...@databricks.com wrote:

 Pasting a better formatted trace:



 at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1180)
 at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:347)
 at scala.collection.mutable.HashMap$$anonfun$writeObject$1.apply(HashMap.scala:137)
 at scala.collection.mutable.HashMap$$anonfun$writeObject$1.apply(HashMap.scala:135)
 at scala.collection.mutable.HashTable$class.foreachEntry(HashTable.scala:226)
 at scala.collection.mutable.HashMap.foreachEntry(HashMap.scala:39)
 at scala.collection.mutable.HashTable$class.serializeTo(HashTable.scala:124)
 at scala.collection.mutable.HashMap.serializeTo(HashMap.scala:39)
 at scala.collection.mutable.HashMap.writeObject(HashMap.scala:135)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
 at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:606)
 at java.io.ObjectStreamClass.invokeWriteObject(ObjectStreamClass.java:988)
 at java.io.ObjectOutputStream.writeSerialData(ObjectOutputStream.java:1495)
 at java.io.ObjectOutputStream.writeOrdinaryObject(ObjectOutputStream.java:1431)
 at java.io.ObjectOutputStream.writeObject0(ObjectOutputStream.java:1177)
 at java.io.ObjectOutputStream.writeObject(ObjectOutputStream.java:347)
 at org.apache.spark.util.Utils$.serialize(Utils.scala:64)
 at org.apache.spark.broadcast.TorrentBroadcast$.blockifyObject(TorrentBroadcast.scala:232)
 at org.apache.spark.broadcast.TorrentBroadcast.sendBroadcast(TorrentBroadcast.scala:85)
 at org.apache.spark.broadcast.TorrentBroadcast.<init>(TorrentBroadcast.scala:66)
 at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:36)
 at org.apache.spark.broadcast.TorrentBroadcastFactory.newBroadcast(TorrentBroadcastFactory.scala:29)
 at org.apache.spark.broadcast.BroadcastManager.newBroadcast(BroadcastManager.scala:62)
 at org.apache.spark.SparkContext.broadcast(SparkContext.scala:809)


 On Fri, Aug 8, 2014 at 10:12 AM, Ron Gonzalez 
 zlgonza...@yahoo.com.invalid wrote:

 Hi,
 I have a running spark app against the released version of 1.0.1. I
 recently decided to try and upgrade to the trunk version. Interestingly
 enough, after building the 1.1.0-SNAPSHOT assembly, replacing it as my
 assembly in my app caused errors. In particular, it seems Kryo
 serialization isn't taking. Replacing it with 1.0.1 automatically gets it
 working again.

 Any thoughts? Is this a known issue?

 Thanks,
 Ron


Re: Building Spark in Eclipse Kepler

2014-08-07 Thread Ron Gonzalez
So I opened it as a Maven project (using the top-level pom.xml file), but
rebuilding the project results in all sorts of errors about unresolved
dependencies.

Thanks,
Ron


On Wednesday, August 6, 2014 11:15 PM, Sean Owen so...@cloudera.com wrote:
 


(Don't use gen-idea, just open it directly as a Maven project in IntelliJ.)


On Thu, Aug 7, 2014 at 4:53 AM, Ron Gonzalez
zlgonza...@yahoo.com.invalid wrote:
 So I downloaded community edition of IntelliJ, and ran sbt/sbt gen-idea.
 I then imported the pom.xml file.
 I'm still getting all sorts of errors from IntelliJ about unresolved 
 dependencies.
 Any suggestions?

 Thanks,
 Ron


 On Wednesday, August 6, 2014 12:29 PM, Ron Gonzalez 
 zlgonza...@yahoo.com.INVALID wrote:



 Hi,
   I'm trying to get the apache spark trunk compiling in my Eclipse, but I 
can't seem to get it going. In particular, I've tried sbt/sbt eclipse, but it 
doesn't seem to create the eclipse pieces for yarn and other projects. Doing 
mvn eclipse:eclipse on yarn seems to fail as well as sbt/sbt eclipse just for 
yarn fails. Is there some documentation available for eclipse? I've gone 
through the ones on the site, but to no avail.
   Any tips?

 Thanks,
 Ron


Building Spark in Eclipse Kepler

2014-08-06 Thread Ron Gonzalez
Hi,
  I'm trying to get the Apache Spark trunk compiling in my Eclipse, but I can't
seem to get it going. In particular, I've tried sbt/sbt eclipse, but it doesn't
seem to create the Eclipse project files for yarn and the other modules. Doing
mvn eclipse:eclipse on yarn seems to fail, as does sbt/sbt eclipse just for
yarn. Is there some documentation available for Eclipse? I've gone through the
docs on the site, but to no avail.
  Any tips?

Thanks,
Ron

Re: Building Spark in Eclipse Kepler

2014-08-06 Thread Ron Gonzalez
OK, I'll give it a little more time, and if I can't get it going, I'll switch. I
am indeed a little disappointed in the Scala IDE plugin for Eclipse, so I think
switching to IntelliJ might be my best bet.

Thanks,
Ron

Sent from my iPad

 On Aug 6, 2014, at 1:43 PM, Sean Owen so...@cloudera.com wrote:
 
 I think your best bet by far is to consume the Maven build as-is from
 within Eclipse. I wouldn't try to export a project config from the
 build as there is plenty to get lost in translation.
 
 Certainly this works well with IntelliJ, and by the by, if you have a
 choice, I would strongly recommend IntelliJ over Eclipse for working
 with Maven and Scala.
 
 On Wed, Aug 6, 2014 at 8:29 PM, Ron Gonzalez
 zlgonza...@yahoo.com.invalid wrote:
 Hi,
  I'm trying to get the apache spark trunk compiling in my Eclipse, but I 
 can't seem to get it going. In particular, I've tried sbt/sbt eclipse, but 
 it doesn't seem to create the eclipse pieces for yarn and other projects. 
 Doing mvn eclipse:eclipse on yarn seems to fail as well as sbt/sbt eclipse 
 just for yarn fails. Is there some documentation available for eclipse? I've 
 gone through the ones on the site, but to no avail.
  Any tips?
 
 Thanks,
 Ron




Re: Building Spark in Eclipse Kepler

2014-08-06 Thread Ron Gonzalez
So I downloaded the Community Edition of IntelliJ and ran sbt/sbt gen-idea.
I then imported the pom.xml file.
I'm still getting all sorts of errors from IntelliJ about unresolved
dependencies.
Any suggestions?

Thanks,
Ron


On Wednesday, August 6, 2014 12:29 PM, Ron Gonzalez 
zlgonza...@yahoo.com.INVALID wrote:
 


Hi,
  I'm trying to get the apache spark trunk compiling in my Eclipse, but I can't 
seem to get it going. In particular, I've tried sbt/sbt eclipse, but it doesn't 
seem to create the eclipse pieces for yarn and other projects. Doing mvn 
eclipse:eclipse on yarn seems to fail as well as sbt/sbt eclipse just for yarn 
fails. Is there some documentation available for eclipse? I've gone through the 
ones on the site, but to no avail.
  Any tips?

Thanks,
Ron