Re: Join two Spark Streaming

2016-06-07 Thread vinay453
I am working on windowed DStreams where each DStream contains RDDs with the following keys: (a,b,c), (b,c,d), (c,d,e), (d,e,f). I want to get only the unique keys across all DStreams: a,b,c,d,e,f. How can I do this in PySpark Streaming?
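One way to sketch the answer: union the RDDs in the window and deduplicate. Below is a plain-Python illustration of that logic (lists stand in for the RDDs, so no Spark cluster is needed); in PySpark Streaming the same idea would typically be expressed with something like `windowed.transform(lambda rdd: rdd.distinct())` — the names here are illustrative, not from the original post.

```python
# Plain-Python sketch of "unique keys across all RDDs in a window".
# Each inner list stands in for one RDD's keys in the windowed DStream.
batches = [
    ["a", "b", "c"],
    ["b", "c", "d"],
    ["c", "d", "e"],
    ["d", "e", "f"],
]

# Union all batches in the window, then deduplicate -- the set plays
# the role of RDD.distinct() applied after a union of the RDDs.
unique_keys = sorted(set().union(*batches))
print(unique_keys)  # ['a', 'b', 'c', 'd', 'e', 'f']
```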

Window Operation on Dstream Fails

2016-05-30 Thread vinay453
Hello, I am using Spark 1.6.0 and trying to run a window operation on DStreams:

Window_TwoMin = 4*60*1000
Slide_OneMin = 2*60*1000
census = ssc.textFileStream("./census_stream/")
            .filter(lambda a: a.startswith('-') == False)
            .map(lambda b: b.split("\t"))
            .map(lambda c:

Re: Model characterization

2014-11-04 Thread vinay453
Got it from a friend:

println(model.weights)
println(model.intercept)

-- View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Model-characterization-tp17985p18106.html Sent from the Apache Spark User List mailing list archive at Nabble.com.
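For readers unfamiliar with what those two values are: a fitted linear model in Spark MLlib exposes its coefficient vector (`model.weights`) and its bias term (`model.intercept`). The plain-Python sketch below fits a 1-D least-squares line to made-up data (the data and the closed-form solution are illustrative, not from the post) and prints the analogous weight and intercept.

```python
# Minimal analogue of println(model.weights) / println(model.intercept):
# fit y = w*x + b by closed-form simple linear regression in plain Python.
xs = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]   # generated from y = 2*x + 1 exactly

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n

# Closed-form slope and intercept for ordinary least squares.
weight = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
         sum((x - mean_x) ** 2 for x in xs)
intercept = mean_y - weight * mean_x

print(weight)     # 2.0  (the "model.weights" analogue)
print(intercept)  # 1.0  (the "model.intercept" analogue)
```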