Re: [Spark Streaming] Design Patterns forEachRDD

2015-10-22 Thread Nipun Arora
Hi Sandip,

Thanks for your response.

I am not sure this is the same thing, though. What I am looking for is a way to
open a connection to an external service from each partition, as shown in the
example from the programming guide.
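Concretely, what I am after is a Java equivalent of that snippet, roughly like the
sketch below. I am assuming a JavaDStream<String> just for illustration, and
Connection, createNewConnection() and send() are placeholders for the actual client
code, exactly as they are in the Scala example. I am not sure this is the idiomatic
way to write it:

// (uses org.apache.spark.api.java.function.Function / VoidFunction and java.util.Iterator)
dstream.foreachRDD(new Function<JavaRDD<String>, Void>() {
  @Override
  public Void call(JavaRDD<String> rdd) throws Exception {
    rdd.foreachPartition(new VoidFunction<Iterator<String>>() {
      @Override
      public void call(Iterator<String> partitionOfRecords) throws Exception {
        // create the connection on the worker, once per partition,
        // so it never has to be serialized from the driver
        Connection connection = createNewConnection();
        while (partitionOfRecords.hasNext()) {
          connection.send(partitionOfRecords.next());
        }
        connection.close();
      }
    });
    // in the Spark 1.x Java API, foreachRDD takes a Function<JavaRDD<T>, Void>
    return null;
  }
});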

@All - Can anyone else let me know if they have a better solution?

Thanks
Nipun

On Wed, Oct 21, 2015 at 2:07 PM, Sandip Mehta wrote:

> Does this help?
>
> final JavaHBaseContext hbaseContext = new JavaHBaseContext(javaSparkContext, conf);
> // CustomerModel / CustomerWithPromotion below stand in for the actual element types
> customerModels.foreachRDD(new Function<JavaRDD<CustomerModel>, Void>() {
>   private static final long serialVersionUID = 1L;
>
>   @Override
>   public Void call(JavaRDD<CustomerModel> currentRDD) throws Exception {
>     JavaRDD<CustomerWithPromotion> customerWithPromotion =
>         hbaseContext.mapPartition(currentRDD, new PromotionLookupFunction());
>     customerWithPromotion.persist(StorageLevel.MEMORY_AND_DISK_SER());
>     // pass a VoidFunction<Iterator<CustomerWithPromotion>> that writes each partition out
>     customerWithPromotion.foreachPartition();
>     return null;
>   }
> });
>
>
> On 21-Oct-2015, at 10:55 AM, Nipun Arora wrote:
>
> Hi All,
>
> Can anyone provide a design pattern, in Java, for the following code from the
> Spark Streaming Programming Guide? I have exactly the same use case, and for
> some reason a Java version of the pattern is missing.
>
> Scala version taken from:
> http://spark.apache.org/docs/latest/streaming-programming-guide.html#design-patterns-for-using-foreachrdd
>
> dstream.foreachRDD { rdd =>
>   rdd.foreachPartition { partitionOfRecords =>
>     val connection = createNewConnection()
>     partitionOfRecords.foreach(record => connection.send(record))
>     connection.close()
>   }
> }
>
>
> I have searched for it and haven't really found a solution. This seems to be
> an important piece of information, especially for people who have to ship
> their code in Java because of company constraints (like me) :)
>
> I'd really appreciate any help
>
> Thanks
> Nipun
>
>
>


Re: [Spark Streaming] Design Patterns forEachRDD

2015-10-21 Thread Sandip Mehta
Does this help?

final JavaHBaseContext hbaseContext = new JavaHBaseContext(javaSparkContext, conf);
// CustomerModel / CustomerWithPromotion below stand in for the actual element types
customerModels.foreachRDD(new Function<JavaRDD<CustomerModel>, Void>() {
  private static final long serialVersionUID = 1L;

  @Override
  public Void call(JavaRDD<CustomerModel> currentRDD) throws Exception {
    JavaRDD<CustomerWithPromotion> customerWithPromotion =
        hbaseContext.mapPartition(currentRDD, new PromotionLookupFunction());
    customerWithPromotion.persist(StorageLevel.MEMORY_AND_DISK_SER());
    // pass a VoidFunction<Iterator<CustomerWithPromotion>> that writes each partition out
    customerWithPromotion.foreachPartition();
    return null;
  }
});


> On 21-Oct-2015, at 10:55 AM, Nipun Arora wrote:
> 
> Hi All,
> 
> Can anyone provide a design pattern, in Java, for the following code from the
> Spark Streaming Programming Guide? I have exactly the same use case, and for
> some reason a Java version of the pattern is missing.
> 
> Scala version taken from:
> http://spark.apache.org/docs/latest/streaming-programming-guide.html#design-patterns-for-using-foreachrdd
> 
> 
> dstream.foreachRDD { rdd =>
>   rdd.foreachPartition { partitionOfRecords =>
>     val connection = createNewConnection()
>     partitionOfRecords.foreach(record => connection.send(record))
>     connection.close()
>   }
> }
> 
> I have searched for it and haven't really found a solution. This seems to be
> an important piece of information, especially for people who have to ship
> their code in Java because of company constraints (like me) :)
> 
> I'd really appreciate any help
> 
> Thanks
> Nipun



[Spark Streaming] Design Patterns forEachRDD

2015-10-21 Thread Nipun Arora
Hi All,

Can anyone provide a design pattern, in Java, for the following code from the
Spark Streaming Programming Guide? I have exactly the same use case, and for some
reason a Java version of the pattern is missing.

Scala version taken from:
http://spark.apache.org/docs/latest/streaming-programming-guide.html#design-patterns-for-using-foreachrdd

dstream.foreachRDD { rdd =>
  rdd.foreachPartition { partitionOfRecords =>
    val connection = createNewConnection()
    partitionOfRecords.foreach(record => connection.send(record))
    connection.close()
  }
}
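
The same section of the guide also suggests reusing connections across batches by
keeping a static, lazily created pool of connections instead of opening a new one
per partition. My rough guess at a Java version of that variant is below; I am
assuming a JavaDStream<String>, and ConnectionPool is just a placeholder for
whatever pool implementation one would actually use:

dstream.foreachRDD(new Function<JavaRDD<String>, Void>() {
  @Override
  public Void call(JavaRDD<String> rdd) throws Exception {
    rdd.foreachPartition(new VoidFunction<Iterator<String>>() {
      @Override
      public void call(Iterator<String> partitionOfRecords) throws Exception {
        // ConnectionPool is a placeholder for a static, lazily initialized pool
        Connection connection = ConnectionPool.getConnection();
        while (partitionOfRecords.hasNext()) {
          connection.send(partitionOfRecords.next());
        }
        // return the connection for reuse by later partitions and batches
        ConnectionPool.returnConnection(connection);
      }
    });
    return null;
  }
});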


I have searched for it and haven't really found a solution. This seems to be an
important piece of information, especially for people who have to ship their code
in Java because of company constraints (like me) :)

I'd really appreciate any help

Thanks
Nipun