unsubscribe

2015-10-20 Thread Pete Zybrick




> On Oct 20, 2015, at 5:31 PM, ravi.gawai  wrote:
> 
> You can use a map function.
> 
> Here is a Java example:
> 
> final JavaRDD<Product> rdd1 = sc.textFile("filepath").map(line -> {
>     // logic for line-to-Product conversion goes here (see the fuller sketch below)
> });
> 
> The Product class might have 5 attributes, like you said:
> 
> class Product {
>     String str1;
>     int i1;
>     String str2;
>     int i2;
>     String str3;
>     // with getters and setters
> }
> Now you can convert this Product RDD to an RDD of another custom class, say:
> 
> class SubProduct { String str1; int i1; /* with getters and setters */ }
> 
> final JavaRDD<SubProduct> rdd2 = rdd1.map(product -> {
>     final SubProduct subProduct = new SubProduct();
> 
>     // copy the desired Product attributes to subProduct
>     return subProduct;
> });
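> 
> Putting it together, a rough end-to-end sketch (assuming the input lines are
> comma-separated with the five fields in order, and that Product/SubProduct have
> the usual getters and setters; adjust the parsing to your actual format):
> 
> final JavaRDD<Product> products = sc.textFile("filepath").map(line -> {
>     final String[] parts = line.split(",");
>     final Product p = new Product();
>     p.setStr1(parts[0]);
>     p.setI1(Integer.parseInt(parts[1]));
>     p.setStr2(parts[2]);
>     p.setI2(Integer.parseInt(parts[3]));
>     p.setStr3(parts[4]);
>     return p;
> });
> 
> final JavaRDD<SubProduct> subProducts = products.map(p -> {
>     final SubProduct sp = new SubProduct();
>     sp.setStr1(p.getStr1());
>     sp.setI1(p.getI1());
>     return sp;
> });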
> 
> 
> 
> 
> 
> 




Re: Local spark jars not being detected

2015-06-20 Thread Pete Zybrick
It looks like you are using parentheses instead of curly braces around scala.version.
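
Something like this should work, assuming the scala.version property resolves to
the Scala binary version used in the artifact name (2.11 here) and spark.version
to 1.3.0:

<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_${scala.version}</artifactId>
    <version>${spark.version}</version>
    <scope>provided</scope>
</dependency>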



 On Jun 20, 2015, at 8:38 AM, Ritesh Kumar Singh 
 riteshoneinamill...@gmail.com wrote:
 
 Hi,
 
 I'm using the IntelliJ IDE for my Spark project.
 I've compiled Spark 1.3.0 for Scala 2.11.4, and here's one of the compiled
 jars installed in my m2 folder:
 
 ~/.m2/repository/org/apache/spark/spark-core_2.11/1.3.0/spark-core_2.11-1.3.0.jar
 
 But when I add this dependency in my pom file for the project:
 
 <dependency>
   <groupId>org.apache.spark</groupId>
   <artifactId>spark-core_$(scala.version)</artifactId>
   <version>${spark.version}</version>
   <scope>provided</scope>
 </dependency>
 
 I'm getting "Dependency org.apache.spark:spark-core_$(scala.version):1.3.0 not found."
 Why is this happening and what's the workaround?




Re: spark 1.2 ec2 launch script hang

2015-01-26 Thread Pete Zybrick
Try using an absolute path to the pem file.
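
For example (the key pair name, pem path, and cluster name below are placeholders):

  ./spark-ec2 --key-pair=my-keypair \
    --identity-file=/home/ec2-user/my-keypair.pem \
    --region=us-east-1 launch my-cluster

-k and -i are the short forms of --key-pair and --identity-file.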



 On Jan 26, 2015, at 8:57 PM, ey-chih chow eyc...@hotmail.com wrote:
 
 Hi,
 
 I used the spark-ec2 script from Spark 1.2 to launch a cluster.  I have
 modified the script according to
 
 https://github.com/grzegorz-dubicki/spark/commit/5dd8458d2ab9753aae939b3bb33be953e2c13a70
 
 But the script was still hung at the following message:
 
 Waiting for cluster to enter 'ssh-ready' state.
 
 Is there anything else I should do to make it succeed?  Thanks.
 
 
 Ey-Chih Chow
 
 
 
 
 
