> operation.
>
> On Fri, 13 May 2016 at 00:20 Ufuk Celebi <u...@apache.org> wrote:
>
>> On Thu, May 12, 2016 at 10:44 PM, Andrew Whitaker
>> <andrew.whita...@braintreepayments.com> wrote:
>> > From what I've observed, most of the time when Flink can't successfully
>> > restore a checkpoint ...
"Flink can't successfully restore a checkpoint" should be "Flink can't
successfully restore a savepoint".
On Thu, May 12, 2016 at 3:44 PM, Andrew Whitaker <
andrew.whita...@braintreepayments.com> wrote:
> Hi,
>
> I was recently experimenting with savepoint
Flink accepts the program with the savepoint from the first version of the
program), and if this is a bug?
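For reference, the savepoint workflow in question can be sketched with the
CLI (job IDs and paths below are placeholders; the commands follow the
Flink 1.x documentation):

```sh
# Trigger a savepoint for a running job (prints the savepoint path).
./bin/flink savepoint <jobID>

# Resume the (possibly modified) program from that savepoint with -s.
# Whether Flink accepts the new program depends on its operators still
# matching the state recorded in the savepoint.
./bin/flink run -s <savepointPath> your-program.jar
```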
Thanks,
--
Andrew Whitaker | andrew.whita...@braintreepayments.com
Maybe I was too quick with linking to the JIRA.
>
> As for an example: you can look at the streaming WindowJoin example. The
> sample data uses an Iterator (ThrottledIterator).
> Note that the iterator implementation used is part of Flink and also
> implements Serializable.
>
> On 07.
...Serializable. In fact, java.util.Iterator doesn't implement
Serializable.
I can't seem to find any examples of `FromIteratorFunction` being used in
the Flink codebase. Am I using it wrong?
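One guess (a sketch, not a confirmed diagnosis): since FromIteratorFunction
has to serialize the iterator it wraps when the job is submitted, the
iterator class itself must implement Serializable in addition to
java.util.Iterator, much like Flink's own ThrottledIterator does. A minimal
self-contained example (the class name is made up):

```java
import java.io.Serializable;
import java.util.Iterator;

// An unbounded iterator that is itself Serializable, so it can be
// shipped inside a Flink source function such as FromIteratorFunction.
class InfiniteCountIterator implements Iterator<Long>, Serializable {
    private static final long serialVersionUID = 1L;
    private long next = 0L;

    @Override
    public boolean hasNext() {
        return true; // unbounded source
    }

    @Override
    public Long next() {
        return next++;
    }

    public static void main(String[] args) {
        InfiniteCountIterator it = new InfiniteCountIterator();
        System.out.println(it.next()); // 0
        System.out.println(it.next()); // 1
    }
}
```

A plain anonymous inner class or a lambda-backed iterator would not work
here unless it, too, is Serializable.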
Thanks!
--
Andrew Whitaker | andrew.whita...@braintreepayments.com
>>>>> org.apache.flink.streaming.api.scala.StreamExecutionEnvironment.addSource(StreamExecutionEnvironment.scala:498)
>>>>>
>>>>> The line that causes that exception is just adding
>>>>> a FlinkKafkaConsumer08 source.
>>>>
trying to understand this behavior and if there's a way I can work
around it.
Thanks!
--
Andrew Whitaker | andrew.whita...@braintreepayments.com
> ...command line arguments (call
> ./bin/jobmanager.sh start cluster). That should solve it.
>
> Best,
> Stephan
>
>
> On Wed, Jan 20, 2016 at 6:23 PM, Andrew Whitaker <
> andrew.whita...@braintreepayments.com> wrote:
>
>> Hi,
>>
>> I'm getting the following ...
Are there changes from the last few
days that could be causing this?
Thanks,
Andrew Whitaker | andrew.whita...@braintreepayments.com
1. ... read
Avro-serialized objects from a Kafka stream, would I have to write
something to do this, or is this functionality already built somewhere?
2. Is there an analogous InputFormat in Flink's Scala API? If not, what's
the recommended way to work with Avro objects in Scala using Flink?
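For question 1, the sort of thing I have in mind is a DeserializationSchema
along these lines (an untested sketch; MyRecord stands for a hypothetical
Avro-generated specific record class, and the package names are from the
1.x-era Flink and Avro APIs):

```java
import java.io.IOException;

import org.apache.avro.io.BinaryDecoder;
import org.apache.avro.io.DecoderFactory;
import org.apache.avro.specific.SpecificDatumReader;
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.api.java.typeutils.TypeExtractor;
import org.apache.flink.streaming.util.serialization.DeserializationSchema;

// Deserializes Avro-encoded Kafka messages into a specific record type.
class AvroRecordSchema implements DeserializationSchema<MyRecord> {
    // transient: the Avro reader itself is not Serializable,
    // so it is created lazily on the task managers.
    private transient SpecificDatumReader<MyRecord> reader;

    @Override
    public MyRecord deserialize(byte[] message) throws IOException {
        if (reader == null) {
            reader = new SpecificDatumReader<>(MyRecord.class);
        }
        BinaryDecoder decoder = DecoderFactory.get().binaryDecoder(message, null);
        return reader.read(null, decoder);
    }

    @Override
    public boolean isEndOfStream(MyRecord nextElement) {
        return false; // keep consuming the Kafka stream
    }

    @Override
    public TypeInformation<MyRecord> getProducedType() {
        return TypeExtractor.getForClass(MyRecord.class);
    }
}
```

The schema would then be passed to the FlinkKafkaConsumer08 constructor in
place of something like SimpleStringSchema.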
Thanks,
--
Andrew Whitaker | andrew.whita...@braintreepayments.com