It appears that when a custom partitioner is applied in a groupBy
operation, it is not propagated through subsequent non-shuffle operations.
Is this intentional? Is there any way to carry custom partitioning through
maps?
I've uploaded a gist that exhibits the behavior.
https://gist.github.com/Bri
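A minimal sketch of the behavior being asked about (assuming the standard Spark core API; the gist link above is truncated, so this is not the poster's code): `map()` discards the partitioner because it is free to rewrite keys, while `mapValues()` and `mapPartitions(..., preservesPartitioning = true)` carry it through.

```scala
import org.apache.spark.{HashPartitioner, SparkContext}

object PartitionerDemo {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext("local[2]", "partitioner-demo")
    val pairs = sc.parallelize(1 to 100)
      .map(i => (i % 10, i))
      .partitionBy(new HashPartitioner(4))

    // map() may change keys, so Spark drops the partitioner:
    println(pairs.map { case (k, v) => (k, v + 1) }.partitioner)  // None

    // mapValues() cannot change keys, so the partitioner survives:
    println(pairs.mapValues(_ + 1).partitioner)  // Some(HashPartitioner)

    // mapPartitions can opt in when you guarantee keys are unchanged:
    val kept = pairs.mapPartitions(
      _.map { case (k, v) => (k, v * 2) },
      preservesPartitioning = true)
    println(kept.partitioner)  // Some(HashPartitioner)

    sc.stop()
  }
}
```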
You need to add the plugin to your plugins.sbt file, not your build.sbt
file. Also, I don't see a 0.13.9 version on GitHub; 0.14.2 is current.
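For reference, the typical layout is below (using the 0.14.2 version mentioned above; the group ID is sbt-assembly's standard one):

```scala
// project/plugins.sbt -- sbt plugins are declared here, not in build.sbt
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.14.2")
```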
On Thu, Feb 18, 2016 at 9:50 PM Arko Provo Mukherjee <
arkoprovomukher...@gmail.com> wrote:
> Hello,
>
> I am trying to use sbt assembly to generate a fa
> SPARK-6847
>
> FYI
>
> On Wed, Jan 20, 2016 at 7:55 AM, Brian London
> wrote:
>
>> I'm running a streaming job that has two calls to updateStateByKey. When
>> run in standalone mode both calls to updateStateByKey behave as expected.
>> When run on a cluste
I'm running a streaming job that has two calls to updateStateByKey. When
run in standalone mode both calls to updateStateByKey behave as expected.
When run on a cluster, however, it appears that the first call is not being
checkpointed as shown in this DAG image:
http://i.imgur.com/zmQ8O2z.png
T
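A hedged sketch of the pattern in question (stream sources and update logic are illustrative, not the poster's job): `updateStateByKey` requires a checkpoint directory on the context, and each stateful stream is checkpointed independently, which is what the DAG image is meant to show.

```scala
import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

object TwoStatefulStreams {
  // Pure update function: add new counts to the running total.
  def updateCount(newValues: Seq[Int], state: Option[Int]): Option[Int] =
    Some(newValues.sum + state.getOrElse(0))

  def main(args: Array[String]): Unit = {
    val ssc = new StreamingContext(
      new SparkConf().setAppName("two-states"), Seconds(10))
    ssc.checkpoint("/tmp/streaming-checkpoints")  // required by updateStateByKey

    val words = ssc.socketTextStream("localhost", 9999)
      .flatMap(_.split("\\s+"))
      .map((_, 1))

    // First stateful call: running count per word.
    val counts = words.updateStateByKey(updateCount _)
    // Second stateful call: running count per word length.
    val lengths = words.map { case (w, n) => (w.length, n) }
      .updateStateByKey(updateCount _)

    counts.print()
    lengths.print()
    ssc.start()
    ssc.awaitTermination()
  }
}
```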
Since you're running in standalone mode, can you try it using Spark 1.5.1
please?
On Thu, Dec 31, 2015 at 9:09 AM Steve Loughran
wrote:
>
> > On 30 Dec 2015, at 19:31, KOSTIANTYN Kudriavtsev <
> kudryavtsev.konstan...@gmail.com> wrote:
> >
> > Hi Jerry,
> >
> > I want to run different jobs on dif
RDD has a method keyBy[K](f: T => K) that acts as an alias for map(x =>
(f(x), x)) and is useful for generating pair RDDs. Is there a reason this
method doesn't exist on DStream? It's a fairly heavily used method and
allows clearer code than the more verbose map.
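In the meantime the alias is easy to express yourself; a sketch (the helper names here are made up, only RDD.keyBy is the real API):

```scala
import org.apache.spark.streaming.dstream.DStream

object KeyByHelpers {
  // Reuse the existing RDD.keyBy via transform:
  def keyByStream[T, K](stream: DStream[T])(f: T => K): DStream[(K, T)] =
    stream.transform(rdd => rdd.keyBy(f))

  // Equivalently, spell out the alias directly on the DStream:
  def keyByStream2[T, K](stream: DStream[T])(f: T => K): DStream[(K, T)] =
    stream.map(x => (f(x), x))
}
```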
> —
> Sent from Mailbox <https://www.dropbox.com/mailbox>
>
>
> On Fri, Dec 11, 2015 at 5:38 PM, Brian London
> wrote:
>
>> That's good news. I've got a PR in to bump the SDK version to 1.10.40 and
>> the KCL to 1.6.1, which I'm running tests on locally
>
>>> Burak
>>>
>>> On Thu, Dec 10, 2015 at 8:09 PM, Nick Pentreath <
>>> nick.pentre...@gmail.com> wrote:
>>>
>>>> Yup also works for me on master branch as I've been testing DynamoDB
>>>> Streams integration. In fact wo
orks with the 1.6.0 branch?
>
> Thanks,
> Burak
>
> On Thu, Dec 10, 2015 at 11:45 AM, Brian London
> wrote:
>
>> Nick's symptoms sound identical to mine. I should mention that I just
>> pulled the latest version from github and it seems to be working there.
sn't appear to be working under 1.5.2.
> >
> > UI for 1.5.2:
> >
> > Inline image 1
> >
> > UI for 1.5.1:
> >
> > Inline image 2
> >
> > On Thu, Dec 10, 2015 at 5:50 PM, Brian London <brianmlon...@gmail.com> wrote:
Has anyone managed to run the Kinesis demo in Spark 1.5.2? The Kinesis ASL
that ships with 1.5.2 appears to not work for me although 1.5.1 is fine. I
spent some time with Amazon earlier in the week and the only thing we could
do to make it work is to change the version to 1.5.1. Can someone pleas
On my local system (8 core MBP) the Kinesis ASL example isn't working out
of the box on a fresh build (Spark 1.5.2). I can see records going into
the kinesis stream but the receiver is returning empty DStreams. The
behavior is similar to an issue that's been discussed previously:
http://stackove
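For anyone trying to reproduce this, the ASL example boils down to the following sketch against the 1.5.x Kinesis API (the app, stream, and region names are placeholders; note the app name doubles as the KCL's DynamoDB checkpoint table, and a stale table is one known cause of receivers that silently return empty batches):

```scala
import com.amazonaws.services.kinesis.clientlibrary.lib.worker.InitialPositionInStream
import org.apache.spark.SparkConf
import org.apache.spark.storage.StorageLevel
import org.apache.spark.streaming.{Seconds, StreamingContext}
import org.apache.spark.streaming.kinesis.KinesisUtils

object KinesisCheck {
  def main(args: Array[String]): Unit = {
    val ssc = new StreamingContext(
      new SparkConf().setAppName("kinesis-check"), Seconds(2))

    // Receives raw record payloads as Array[Byte].
    val stream = KinesisUtils.createStream(
      ssc,
      "kinesis-check-app",                          // KCL app / DynamoDB table name
      "my-stream",                                  // Kinesis stream name
      "https://kinesis.us-east-1.amazonaws.com",    // endpoint
      "us-east-1",                                  // region
      InitialPositionInStream.LATEST,
      Seconds(2),                                   // KCL checkpoint interval
      StorageLevel.MEMORY_AND_DISK_2)

    stream.map(bytes => new String(bytes, "UTF-8")).print()
    ssc.start()
    ssc.awaitTermination()
  }
}
```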