[jira] [Created] (SPARK-27206) Using slice method with streaming api's Interval on DStream

2019-03-19 Thread Aarthi (JIRA)
Aarthi created SPARK-27206:
--

 Summary: Using slice method with streaming api's Interval on 
DStream
 Key: SPARK-27206
 URL: https://issues.apache.org/jira/browse/SPARK-27206
 Project: Spark
  Issue Type: Question
  Components: DStreams
Affects Versions: 2.4.0
 Environment: Linux, standalone spark
Reporter: Aarthi


Hi. I need to slice a DStream that receives data from a custom receiver 
(implemented with Spark's Receiver). There are two options to do this:

1. slice(fromTime: 
[Time|http://spark.apache.org/docs/2.3.1/api/scala/org/apache/spark/streaming/Time.html],
 toTime: 
[Time|http://spark.apache.org/docs/2.3.1/api/scala/org/apache/spark/streaming/Time.html])
2. slice(interval: Interval)

Although the second option is a public method, the Interval class is private. 
Can you please help me understand how to use this API with 
org.apache.spark.streaming.Interval?

Thanks, Aarthi
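For reference, a minimal sketch of the workaround in Scala: since Interval is private to the streaming package, but an Interval is just a (fromTime, toTime) pair, the public slice(fromTime, toTime) overload covers the same ground. The `ssc`, `stream`, and batch-duration values below are hypothetical placeholders, not from the original report.

```scala
import org.apache.spark.streaming.{Seconds, Time}

// Assumes an existing DStream `stream` built from the custom receiver via
// ssc.receiverStream(...). The private Interval-based slice() internally
// delegates to this public two-argument form, so nothing is lost.
val now = System.currentTimeMillis()
val batchMs = 10000L                    // assumed 10-second batch duration
val to = Time(now - (now % batchMs))    // align to a batch boundary
val from = to - Seconds(30)             // e.g. the last three batches
val rdds = stream.slice(from, to)       // Seq[RDD[_]] covering [from, to]
```

Note that slice() only returns RDDs for batches the DStream still remembers; if the window reaches further back, raise the remember duration on the context first.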

 



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)

-
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org



[jira] [Updated] (SPARK-26213) Custom Receiver for Structured streaming

2018-11-29 Thread Aarthi (JIRA)


 [ 
https://issues.apache.org/jira/browse/SPARK-26213?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
 ]

Aarthi updated SPARK-26213:
---
Component/s: (was: Spark Core)
 Structured Streaming

> Custom Receiver for Structured streaming
> 
>
> Key: SPARK-26213
> URL: https://issues.apache.org/jira/browse/SPARK-26213
> Project: Spark
>  Issue Type: New Feature
>  Components: Structured Streaming
>Affects Versions: 2.4.0
>Reporter: Aarthi
>Priority: Major
>
> Hi,
> I have implemented a custom receiver for an https/json data source by 
> implementing the Receiver abstract class as described in the documentation 
> here: [https://spark.apache.org/docs/latest//streaming-custom-receivers.html]
> This approach works on the Spark streaming context, where the custom receiver 
> class is passed to receiverStream. However, I would like to implement the 
> same for Structured Streaming, as each of the DStreams has a complex 
> structure and needs to be joined with the others based on complex rules. 
> ([https://stackoverflow.com/questions/53449599/join-two-spark-dstreams-with-complex-nested-structure])
>  Structured Streaming uses the SparkSession object, which takes in a 
> DataStreamReader, which is a final class. Please advise on how to implement 
> the custom receiver for Structured Streaming. 
> Thanks,






[jira] [Created] (SPARK-26213) Custom Receiver for Structured streaming

2018-11-29 Thread Aarthi (JIRA)
Aarthi created SPARK-26213:
--

 Summary: Custom Receiver for Structured streaming
 Key: SPARK-26213
 URL: https://issues.apache.org/jira/browse/SPARK-26213
 Project: Spark
  Issue Type: New Feature
  Components: Spark Core
Affects Versions: 2.4.0
Reporter: Aarthi


Hi,

I have implemented a custom receiver for an https/json data source by 
implementing the Receiver abstract class as described in the documentation here: 
[https://spark.apache.org/docs/latest//streaming-custom-receivers.html]

This approach works on the Spark streaming context, where the custom receiver 
class is passed to receiverStream. However, I would like to implement the same 
for Structured Streaming, as each of the DStreams has a complex structure and 
needs to be joined with the others based on complex rules. 
([https://stackoverflow.com/questions/53449599/join-two-spark-dstreams-with-complex-nested-structure])

Structured Streaming uses the SparkSession object, which takes in a 
DataStreamReader, which is a final class. Please advise on how to implement the 
custom receiver for Structured Streaming. 

Thanks,
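For context, one route that existed in Spark 2.4 (hedged: it relies on APIs that are internal or evolving, so it can break across versions) is to register a source provider so that spark.readStream.format(...) can construct a custom streaming Source, instead of going through Receiver/DStream at all. A skeletal Scala sketch, with the provider name and schema purely hypothetical:

```scala
import org.apache.spark.sql.SQLContext
import org.apache.spark.sql.execution.streaming.Source // internal API
import org.apache.spark.sql.sources.{DataSourceRegister, StreamSourceProvider}
import org.apache.spark.sql.types.{StringType, StructField, StructType}

// Registered via META-INF/services, this makes
// spark.readStream.format("httpjson").load() resolve to this provider.
class HttpJsonSourceProvider extends DataSourceRegister with StreamSourceProvider {
  override def shortName(): String = "httpjson" // hypothetical format name

  private val defaultSchema = StructType(Seq(StructField("value", StringType)))

  override def sourceSchema(
      sqlContext: SQLContext,
      schema: Option[StructType],
      providerName: String,
      parameters: Map[String, String]): (String, StructType) =
    (shortName(), schema.getOrElse(defaultSchema))

  override def createSource(
      sqlContext: SQLContext,
      metadataPath: String,
      schema: Option[StructType],
      providerName: String,
      parameters: Map[String, String]): Source = {
    // A real implementation would return a Source that polls the
    // https/json endpoint, tracks offsets, and builds DataFrames per
    // micro-batch; omitted here.
    ???
  }
}
```

The resulting streaming DataFrames can then be joined with Structured Streaming's stream-stream join support, which is the usual answer to the complex-join requirement described above.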


