>> …hope this helps.
>>
>> Thanks
>> saurabh.
>>
>> *Saurabh Jha*
>> Intl. Exchange Student
>> School of Computing Engineering
>> Nanyang Technological University,
>> Singapore
>> Web: http://profile.saurabhjha.in
>> Mob: +65 946631
This is a bit crazy :)
I suppose you would have to run Java code on the GPU!
I heard there are some funny projects to do that...
Pascal
On Fri, Apr 11, 2014 at 2:38 PM, Jaonary Rabarisoa wrote:
> Hi all,
>
> I'm just wondering if hybrid GPU/CPU computation is something that is
> feasible with Spark…
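One pattern sometimes used for hybrid CPU/GPU work in Spark is to batch each
partition and hand it to native GPU code through JNI. A minimal sketch, where
NativeGpu and multiplyOnGpu are hypothetical stand-ins rather than a real
library:

    import org.apache.spark.rdd.RDD

    // Hypothetical JNI facade; a real one would load a native GPU library.
    object NativeGpu {
      def multiplyOnGpu(xs: Array[Float]): Array[Float] =
        xs.map(_ * 2.0f) // CPU stand-in for the actual GPU kernel
    }

    // Batch each partition into an array, run it on the "GPU", and
    // return the results to Spark as an iterator.
    def gpuMap(rdd: RDD[Float]): RDD[Float] =
      rdd.mapPartitions { iter =>
        NativeGpu.multiplyOnGpu(iter.toArray).iterator
      }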
>> Date: 03/27/2014 6:08 AM (GMT-05:00)
>> To: user@spark.apache.org
>> Subject: Re: Announcing Spark SQL
>>
>> nope (what I said :-P)
>>
>>
>> On Thu, Mar 27, 2014 at 11:05 AM, Pascal Voitot Dev <
>> pascal.voitot@gmail.com> wrote:
>>
>
>
> On Thu, Mar 27, 2014 at 11:05 AM, Pascal Voitot Dev <
> pascal.voitot@gmail.com> wrote:
>
>>
>> On Thu, Mar 27, 2014 at 10:22 AM, andy petrella
>> wrote:
>>
>>> I just mean queries sent at runtime ^^, like for any…
…interesting.
>
>
OK, that's what I thought! But for these runtime queries, is a macro useful
for you?
>
>
> On Thu, Mar 27, 2014 at 10:15 AM, Pascal Voitot Dev <
> pascal.voitot@gmail.com> wrote:
>
>>
>> On 27 Mar 2014 09:47, "andy petrella" wrote:
…use case envisioned with this Spark SQL.
>
I'm not sure I see what you call "ad-hoc queries"... Any sample?
> Again, only my 0.2c (OK, I divided by 10 after writing my thoughts ^^)
>
> Andy
>
> On Thu, Mar 27, 2014 at 9:16 AM, Pascal Voitot Dev <
> pascal.voitot@gmail.com> wrote:
Hi,
Quite interesting!
Suggestion: why not go even fancier & parse SQL queries at compile time
with a macro? ;)
Pascal
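A minimal sketch of what such a compile-time check could look like with a
Scala 2 blackbox macro; the sql helper and its naive SELECT check are
hypothetical, standing in for a real SQL parser:

    import scala.language.experimental.macros
    import scala.reflect.macros.blackbox

    object SqlMacro {
      // Validates a SQL string literal while the caller is being compiled.
      def sql(query: String): String = macro sqlImpl

      def sqlImpl(c: blackbox.Context)(query: c.Expr[String]): c.Expr[String] = {
        import c.universe._
        query.tree match {
          case Literal(Constant(q: String)) =>
            // Naive check standing in for a real SQL grammar.
            if (!q.trim.toUpperCase.startsWith("SELECT"))
              c.abort(c.enclosingPosition, s"Invalid SQL: $q")
            query
          case _ =>
            c.abort(c.enclosingPosition, "sql(...) requires a string literal")
        }
      }
    }

    // SqlMacro.sql("SELECT * FROM people") compiles;
    // SqlMacro.sql("SELEKT * FROM people") is rejected at compile time.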
On Wed, Mar 26, 2014 at 10:58 PM, Michael Armbrust
wrote:
> Hey Everyone,
>
> This already went out to the dev list, but I wanted to put a pointer here
> as well to a new feature…
…in a DStream into
several RDDs, but not according to a time window; I don't see any trivial way
to do it...
>
>
> On Thu, Mar 20, 2014 at 11:50 AM, Pascal Voitot Dev <
> pascal.voitot@gmail.com> wrote:
>
>> Actually it's quite simple...
>>
>> …
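One non-time-based way to split a stream is by predicate, with one filter per
branch. A sketch, assuming a hypothetical integer-valued input DStream:

    import org.apache.spark.streaming.dstream.DStream

    // Each branch is its own DStream; each batch's RDD is filtered twice,
    // once per predicate.
    def splitByParity(stream: DStream[Int]): (DStream[Int], DStream[Int]) = {
      val evens = stream.filter(_ % 2 == 0)
      val odds  = stream.filter(_ % 2 != 0)
      (evens, odds)
    }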
…I advise you to test different approaches.
Maybe other people more skilled than me will have better answers?
> @Pascal, yes your answer resolves my question partially, but the other
> part of the question (which I've clarified in the paragraph above) still remains.
>
> Thanks…
If I may add my contribution to this discussion, assuming I understand your
question correctly...
A DStream is a discretized stream: it discretizes the data stream over windows
of time (according to the project code and the paper I've read). So when you
write:
JavaStreamingContext stcObj = new JavaStreamingContext(…
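The Scala equivalent makes the discretization parameter explicit: the batch
duration passed to the context is the window over which incoming data is
grouped into one RDD. A sketch with an assumed 1-second interval:

    import org.apache.spark.SparkConf
    import org.apache.spark.streaming.{Seconds, StreamingContext}

    val conf = new SparkConf().setAppName("dstream-example").setMaster("local[2]")
    // Every 1-second batch of received data becomes one RDD in each DStream.
    val ssc = new StreamingContext(conf, Seconds(1))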
Hi,
I tried a few things on that in my last blog post:
http://mandubian.com/2014/03/10/zpark-ml-nio-3/
(the last part of a triptych about Spark & scalaz-stream)
I built a collaborative filtering model and then used it on each RDD of the
DStream using a transform { rdd => model.predict(rdd)... }.
It works…
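A sketch of that pattern with MLlib's ALS; the ratings RDD and the pairs
DStream of (user, product) IDs are assumed inputs, not code from the post:

    import org.apache.spark.mllib.recommendation.{ALS, MatrixFactorizationModel, Rating}
    import org.apache.spark.rdd.RDD
    import org.apache.spark.streaming.dstream.DStream

    // Train once on historical ratings, then score every micro-batch.
    def scoreStream(ratings: RDD[Rating],
                    pairs: DStream[(Int, Int)]): DStream[Rating] = {
      val model: MatrixFactorizationModel =
        ALS.train(ratings, 10, 10, 0.01) // rank, iterations, lambda
      pairs.transform(rdd => model.predict(rdd))
    }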
Hi,
I wrote this new article after studying more deeply how to adapt scalaz-stream
to Spark DStreams.
I re-explain a few Spark (& scalaz-stream) concepts (in my "own" words) in
it, and I went further using the new scalaz-stream NIO API, which is quite
interesting IMHO.
The result is a long blog triptych starting…
On Wed, Mar 12, 2014 at 3:06 PM, andy petrella wrote:
> Folks,
>
> I just want to point something out...
> I didn't have time yet to sort it out and to think enough to give a valuable,
> strict explanation of… even though, intuitively, I feel they are a lot…
> ===> need Spark people or time to move forward…