Job with spark

2015-06-17 Thread Sergio Jiménez Barrio
I am a telecommunications engineering student and this year I worked with Spark. It is a field I like, and I want to know whether there are jobs in this area. Thanks for everything. Regards

Fwd: Re: How to keep a SQLContext instance alive in a spark streaming application's life cycle?

2015-06-10 Thread Sergio Jiménez Barrio
Note: CC'ing user@spark.apache.org. First, check whether the RDD is empty: messages.foreachRDD { rdd => if (!rdd.isEmpty) { } } Then you can obtain a SQLContext instance: val sqlContext = SQLContextSingleton.getInstance(rdd.sparkContext)
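For reference, the `SQLContextSingleton` used above can be a lazily initialized holder, along the lines of the pattern shown in the Spark 1.x Streaming programming guide (a sketch; assumes Spark 1.x APIs):

```scala
// A lazily instantiated singleton SQLContext, so the same instance survives
// across batches (and across driver restarts from a checkpoint).
import org.apache.spark.SparkContext
import org.apache.spark.sql.SQLContext

object SQLContextSingleton {
  @transient private var instance: SQLContext = _

  def getInstance(sparkContext: SparkContext): SQLContext = {
    if (instance == null) {
      instance = new SQLContext(sparkContext)
    }
    instance
  }
}
```

Because the instance is rebuilt on demand from the RDD's SparkContext, it remains valid for the whole lifetime of the streaming application.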

Spark streaming closes with Cassandra Conector

2015-05-09 Thread Sergio Jiménez Barrio
I am trying to save some data to Cassandra in a Spark Streaming app: Messages.foreachRDD { . . . CassandraRDD.saveToCassandra("test","test") } When I run it, the app closes when I receive data, or it cannot connect to Cassandra. Any ideas? Thanks -- Atte. Sergio Jiménez
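A minimal sketch of how a DStream is usually written to Cassandra with the DataStax spark-cassandra-connector (1.x-era API); the `Reading` case class and the `test`/`test` keyspace and table names are illustrative assumptions, and `saveToCassandra` is called on the RDD inside `foreachRDD` rather than on a separate `CassandraRDD`:

```scala
// Hedged sketch: saving each non-empty batch of a DStream to Cassandra.
// Requires the spark-cassandra-connector on the classpath and
// spark.cassandra.connection.host set in the SparkConf.
import com.datastax.spark.connector._
import org.apache.spark.streaming.dstream.DStream

case class Reading(id: String, value: Double) // hypothetical row type

def saveStream(messages: DStream[Reading]): Unit = {
  messages.foreachRDD { rdd =>
    if (!rdd.isEmpty) {
      // saveToCassandra(keyspace, table) comes from the connector's implicits
      rdd.saveToCassandra("test", "test")
    }
  }
}
```

If the app dies as soon as data arrives, a common cause is a connector/Spark version mismatch or an unreachable `spark.cassandra.connection.host`, so the driver log around the first batch is the place to look.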

How update counter in cassandra

2015-05-06 Thread Sergio Jiménez Barrio
I have a counter column family in Cassandra. I want to update these counters from a Spark Streaming application. How can I update Cassandra counters with Spark? Thanks.
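Cassandra counters can only be modified with a CQL `UPDATE ... SET c = c + ?`, so one option is to issue those updates per partition with the Java driver. The keyspace, table, and column names below are hypothetical; this is a sketch, not the connector's dedicated API:

```scala
// Hedged sketch: incrementing Cassandra counter columns from an RDD of
// (key, delta) pairs, opening one session per partition.
import com.datastax.driver.core.Cluster
import org.apache.spark.rdd.RDD

def incrementCounters(rdd: RDD[(String, Long)], host: String): Unit = {
  rdd.foreachPartition { rows =>
    val cluster = Cluster.builder().addContactPoint(host).build()
    val session = cluster.connect("test") // hypothetical keyspace
    val stmt = session.prepare(
      "UPDATE counters SET hits = hits + ? WHERE key = ?")
    rows.foreach { case (key, delta) =>
      session.execute(stmt.bind(java.lang.Long.valueOf(delta), key))
    }
    session.close()
    cluster.close()
  }
}
```

Opening the session inside `foreachPartition` avoids trying to serialize the driver connection from the Spark driver to the executors.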

AJAX with Apache Spark

2015-05-04 Thread Sergio Jiménez Barrio
Hi, I am trying to create a dashboard for an Apache Spark job. I need to run Spark Streaming 24/7 and, when an AJAX request arrives, answer it with the current state of the job. I have created the client and the Spark program. I tried creating the response service with Play, but this runs the program

Re: Convert DStream[Long] to Long

2015-04-25 Thread Sergio Jiménez Barrio
; > Thanks > Best Regards > > On Fri, Apr 24, 2015 at 11:20 PM, Sergio Jiménez Barrio < > drarse.a...@gmail.com> wrote: > >> Hi, >> >> I need compare the count of messages recived if is 0 or not, but >> messages.count() return a DStrea

Re: Convert DStream[Long] to Long

2015-04-24 Thread Sergio Jiménez Barrio
But if I use messages.count().print this shows a single number :/ 2015-04-24 20:22 GMT+02:00 Sean Owen : > It's not a Long. It's an infinite stream of Longs. > > On Fri, Apr 24, 2015 at 2:20 PM, Sergio Jiménez Barrio > wrote: > > It isn't the sum. This

Re: Convert DStream[Long] to Long

2015-04-24 Thread Sergio Jiménez Barrio
data so far but may have data in the future. > That's why I say you can count records received to date. > > On Fri, Apr 24, 2015 at 1:57 PM, Sergio Jiménez Barrio > wrote: > > My problem is that I need know if I have a DStream with data. If in this > > second I didn'

Convert DStream[Long] to Long

2015-04-24 Thread Sergio Jiménez Barrio
Hi, I need to check whether the count of received messages is 0 or not, but messages.count() returns a DStream[Long]. I tried this solution: val cuenta = messages.count().foreachRDD{ rdd => rdd.first() } But th
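As the replies in this thread point out, a `DStream[Long]` cannot be collapsed into a single `Long`, because a DStream is an unbounded sequence of RDDs. A sketch of the usual per-batch check instead:

```scala
// Sketch: inspect each micro-batch inside foreachRDD rather than trying to
// extract one Long from the whole (infinite) DStream.
import org.apache.spark.streaming.dstream.DStream

def handleBatches(messages: DStream[String]): Unit = {
  messages.foreachRDD { rdd =>
    if (rdd.isEmpty) {
      println("no data in this batch")
    } else {
      println(s"received ${rdd.count()} records in this batch")
    }
  }
}
```

`rdd.isEmpty` is cheaper than `rdd.count() == 0` because it stops as soon as it finds one element.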

Re: Convert DStream to DataFrame

2015-04-24 Thread Sergio Jiménez Barrio
Spark Documentation Thanks for everything! 2015-04-23 10:29 GMT+02:00 Sergio Jiménez Barrio : > Thank you very much, Tathagata! > > > On Wednesday, April 22, 2015, Tathagata Das > wrote: > >> Aaah, that. That is probably a limitation of the SQLContext (cc'ing Yin >

Re: Convert DStream to DataFrame

2015-04-23 Thread Sergio Jiménez Barrio
Thank you very much, Tathagata! On Wednesday, April 22, 2015, Tathagata Das wrote: > Aaah, that. That is probably a limitation of the SQLContext (cc'ing Yin > for more information). > > > On Wed, Apr 22, 2015 at 7:07 AM, Sergio Jiménez Barrio < > drars

Re: Convert DStream to DataFrame

2015-04-22 Thread Sergio Jiménez Barrio
Sorry, this is the error: [error] /home/sergio/Escritorio/hello/streaming.scala:77: Implementation restriction: case classes cannot have more than 22 parameters. 2015-04-22 16:06 GMT+02:00 Sergio Jiménez Barrio : > I tried the solution from the guide, but I exceeded the size limit of the case class &g
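The 22-parameter restriction comes from Scala 2.10's case classes, not from Spark itself. A common workaround is to skip the case class and build the schema programmatically with `StructType` and `Row` (a sketch; the column list and string-typed fields are simplifying assumptions):

```scala
// Sketch: creating a DataFrame for wide rows (> 22 columns) without a case
// class, using an explicit StructType schema.
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.{DataFrame, Row, SQLContext}
import org.apache.spark.sql.types.{StringType, StructField, StructType}

def toDataFrame(sqlContext: SQLContext,
                rdd: RDD[Array[String]],
                columnNames: Seq[String]): DataFrame = {
  val schema = StructType(
    columnNames.map(name => StructField(name, StringType, nullable = true)))
  val rows = rdd.map(values => Row.fromSeq(values))
  sqlContext.createDataFrame(rows, schema)
}
```

Since the schema is a plain value, it can carry as many fields as needed.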

Re: Convert DStream to DataFrame

2015-04-22 Thread Sergio Jiménez Barrio
; What about sqlcontext.createDataframe(rdd)? >> On 22 Apr 2015 23:04, "Sergio Jiménez Barrio" >> wrote: >> >>> Hi, >>> >>> I am using Kafka with Apache Stream to send JSON to Apache Spark: >>> >>> val messages = KafkaUtils.c

Convert DStream to DataFrame

2015-04-22 Thread Sergio Jiménez Barrio
Hi, I am using Kafka with Spark Streaming to send JSON to Apache Spark: val messages = KafkaUtils.createDirectStream[String, String, StringDecoder, StringDecoder](ssc, kafkaParams, topicsSet) Now, I want to parse the created DStream into a DataFrame, but I don't know if Spark 1.3 has some easy way for t
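In Spark 1.3 the conversion has to happen per batch, inside `foreachRDD`, where `SQLContext.jsonRDD` can infer a schema from the JSON strings. A sketch, assuming the Kafka message values carry the JSON payload:

```scala
// Sketch: turning each batch of a Kafka (key, value) DStream into a
// DataFrame with Spark 1.3's SQLContext.jsonRDD.
import org.apache.spark.sql.SQLContext
import org.apache.spark.streaming.dstream.DStream

def process(messages: DStream[(String, String)]): Unit = {
  messages.foreachRDD { rdd =>
    if (!rdd.isEmpty) {
      val sqlContext = new SQLContext(rdd.sparkContext)
      val df = sqlContext.jsonRDD(rdd.map(_._2)) // values hold the JSON
      df.printSchema()
    }
  }
}
```

Note that schema inference runs on every batch here; for a stable, known schema it is cheaper to pass an explicit `StructType` to `jsonRDD`.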

Re: From DataFrame to LabeledPoint

2015-04-07 Thread Sergio Jiménez Barrio
the user list. > > On Mon, Apr 6, 2015 at 6:53 AM, Sergio Jiménez Barrio < > drarse.a...@gmail.com> wrote: > >> Hi!, >> >> I had tried your solution, and I saw that the first row is null. Is this >> important? Can I work with null rows? Some rows have
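On the null-row question: MLlib's `LabeledPoint` cannot hold nulls, so rows containing them are usually dropped before the conversion. A hedged sketch, assuming the label is in the first column and all remaining columns are numeric features:

```scala
// Sketch: drop null-containing rows, then map each DataFrame row to a
// LabeledPoint. Column positions are illustrative assumptions.
import org.apache.spark.mllib.linalg.Vectors
import org.apache.spark.mllib.regression.LabeledPoint
import org.apache.spark.rdd.RDD
import org.apache.spark.sql.DataFrame

def toLabeledPoints(df: DataFrame): RDD[LabeledPoint] = {
  df.na.drop() // discard any row with a null value
    .map { row =>
      val label = row.getDouble(0)
      val features = (1 until row.length).map(row.getDouble).toArray
      LabeledPoint(label, Vectors.dense(features))
    }
}
```

`DataFrame.na.drop()` removes every row with at least one null, which is the simplest answer to "can I work with null rows?": filter them out first, or impute them with `na.fill` if the data loss matters.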