Hi Gna,
thanks for sharing the good news and opening the JIRA!
Cheers, Fabian
2016-03-22 23:30 GMT+01:00 Sourigna Phetsarath :
> Ufuk & Fabian,
>
> FYI, I was about to extend the FileInputFormat and extend the
> createInputSplits
> to handle multiple Path - there
Ufuk & Fabian,
FYI, I went ahead and extended FileInputFormat, overriding createInputSplits
to handle multiple Paths. This brought an improvement: reduced resource
usage and increased performance of the job.
Also added this ticket: https://issues.apache.org/jira/browse/FLINK-3655
-Gna
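For readers coming across this thread: the idea can be sketched in plain Java. The class and method names below are illustrative stand-ins, not Flink's actual FileInputFormat API.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Hypothetical, simplified sketch (not Flink's actual classes): the idea
// behind a createInputSplits that handles multiple Paths is to enumerate
// every configured path and emit splits for all of them in one pass.
class MultiPathSplitter {

    // Minimal stand-in for a file input split: just the path it covers.
    static final class Split {
        final String path;
        Split(String path) { this.path = path; }
    }

    // One split per path; a real implementation would also break large
    // files into block-sized ranges.
    static List<Split> createInputSplits(List<String> paths) {
        List<Split> splits = new ArrayList<>();
        for (String p : paths) {
            splits.add(new Split(p));
        }
        return splits;
    }

    public static void main(String[] args) {
        List<Split> splits = createInputSplits(Arrays.asList("/data/a", "/data/b"));
        System.out.println(splits.size()); // prints 2
    }
}
```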
On
Hi David,
Here's an example of something similar to what you're talking about:
https://github.com/jgrier/FilteringExample
Have a look at the TweetImpressionFilteringJob.
-Jamie
On Tue, Mar 22, 2016 at 2:24 PM, David Brelloch wrote:
> Konstantin,
>
> Not a problem. Thanks
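The pattern in the example linked above can be illustrated with a self-contained plain-Java sketch: a filter whose accepted keys can be changed at runtime by a second "control" input. The names below are hypothetical, not taken from the linked repository.

```java
import java.util.HashSet;
import java.util.List;
import java.util.Set;
import java.util.stream.Collectors;

// Stand-alone sketch of a dynamically updatable filter: one input carries
// data, the other carries updates to the set of accepted keys.
class DynamicFilter {
    private final Set<String> acceptedKeys = new HashSet<>();

    // Control path: replace the set of keys we keep.
    void updateKeys(Set<String> keys) {
        acceptedKeys.clear();
        acceptedKeys.addAll(keys);
    }

    // Data path: keep only records whose key is currently accepted.
    List<String> filter(List<String> records) {
        return records.stream()
                .filter(acceptedKeys::contains)
                .collect(Collectors.toList());
    }
}
```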
Hi,
I am converting a Storm topology to a Flink-Storm topology using the flink-storm
dependency. When I run my code, FlinkTopologyBuilder eventually calls the
createTopology method in TopologyBuilder and throws an error at the following
highlighted line:
public StormTopology createTopology()
The JDBC formats don't make any assumptions about which DB backend is used.
A JDBC FLOAT is generally returned as a double, since that was the
recommended mapping I found when I wrote the formats.
Is the INT returned as a double as well?
Note: The (runtime) output type is in no way connected
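The mapping described above can be sketched as a small lookup over `java.sql.Types` codes. This is an illustration of the idea, not necessarily the exact mapping Flink's JDBC format uses.

```java
import java.sql.Types;

// Sketch of a SQL-to-Java type mapping like the one described above:
// FLOAT comes back as Double, while INTEGER stays Integer.
class JdbcTypeMapping {
    static Class<?> javaTypeFor(int sqlType) {
        switch (sqlType) {
            case Types.FLOAT:   // JDBC spec recommends Double for SQL FLOAT
            case Types.DOUBLE:
                return Double.class;
            case Types.INTEGER: // INT maps to Integer, not Double
                return Integer.class;
            case Types.VARCHAR:
                return String.class;
            default:
                return Object.class;
        }
    }
}
```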
Hi Robert!
Thank you! :)
David
On Tue, Mar 22, 2016 at 7:59 AM, Robert Metzger wrote:
> Hey David,
>
> FLINK-3602 has been merged to master.
>
> On Fri, Mar 11, 2016 at 5:11 PM, David Kim <
> david@braintreepayments.com> wrote:
>
>> Thanks Stephan! :)
>>
>> On Thu,
Sorry, I was not clear:
I meant that the initial DataSet is changing, not the ds. :)
> Am 22.03.2016 um 15:28 schrieb Till Rohrmann :
>
> From the code extract I cannot tell what could be wrong because the code
> looks ok. If ds changes, then your normalization result
From the code extract I cannot tell what could be wrong because the code
looks ok. If ds changes, then your normalization result should change as
well, I would assume.
On Tue, Mar 22, 2016 at 3:15 PM, Lydia Ickler
wrote:
> Hi Till,
>
> maybe it is doing so because I
Hi Lydia,
I tried to reproduce your problem but couldn't. Could it be that you have a
non-deterministic operation somewhere in your program, or that you read the
data from a source with varying data? Maybe you could send us a complete,
compilable program that reproduces your problem.
Cheers,
Hi all,
I have a question.
If I have a DataSet of tuples, ds, and I want to
normalize all values (at position 2) in it by the maximum of the DataSet
(ds.aggregate(Aggregations.MAX, 2)),
how do I tackle that?
If I use the cross operator my result changes every
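The normalization being asked about can be sketched in plain Java: divide every value by the global maximum. In Flink this would correspond to combining the MAX aggregate with the full DataSet (e.g. via cross or a broadcast set).

```java
import java.util.List;
import java.util.stream.Collectors;

// Toy version of normalize-by-max: find the global maximum once, then
// scale every value by it. With a deterministic input this result is
// stable across runs.
class NormalizeByMax {
    static List<Double> normalize(List<Double> values) {
        double max = values.stream()
                .mapToDouble(Double::doubleValue)
                .max()
                .orElse(1.0); // empty input: nothing to scale
        return values.stream().map(v -> v / max).collect(Collectors.toList());
    }
}
```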
Hey David,
FLINK-3602 has been merged to master.
On Fri, Mar 11, 2016 at 5:11 PM, David Kim
wrote:
> Thanks Stephan! :)
>
> On Thu, Mar 10, 2016 at 11:06 AM, Stephan Ewen wrote:
>
>> The following issue should track that.
>>
val aggregatedStream = stream.apply(
  (w: Window,
   values: scala.Iterable[(List[String], Long, Int)],
   out: Collector[Aggregation]) => {
    import scala.collection.JavaConversions._
    val agg = Aggregation(values.toList.map {
      case (pages, _, ct) => (ct, pages)
    })
Hello,
we are trying to set up our system for remote debugging through IntelliJ.
Flink is running in a long-running YARN session. We are launching Flink's
CliFrontend with the following parameters:
> run -m **::48252
/Users//Projects/flink/build-target/examples/batch/WordCount.jar
The error
by using DataStream#writeAsCsv(String path, WriteMode writeMode)
On 22.03.2016 12:18, subash basnet wrote:
Hello all,
I am trying to write the streaming data to a file and update it
recurrently as new data streams in. I get the following exception about
being unable to override the file:
Caused by:
Hi subash,
You can pass a WriteMode as the second parameter of the write* methods. For example:
```
DataStream<…> myStream = …;
myStream.writeAsCsv("path of output", FileSystem.WriteMode.OVERWRITE);
```
I hope this helps.
Regards,
Chiwan Park
> On Mar 22, 2016, at 8:18 PM, subash basnet
Hi,
I have some thoughts about evictors as well, but I haven't written them down
yet. The basic idea about them is this:
class Evictor<T, W extends Window> {
    Predicate<T> getPredicate(Iterable<StreamRecord<T>> elements, int size, W window);
}
class Predicate<T> {
    boolean evict(StreamRecord<T> element);
}
The evictor
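A self-contained toy version of this two-interface design can make the intent concrete. Types are simplified here (the record wrapper is reduced to a plain value), and the count-based evictor is an illustrative example, not part of the proposal above.

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Sketch of the proposed split: the Evictor inspects the whole pane once
// and hands back a Predicate that decides per element.
class EvictorSketch {

    interface Predicate<T> {
        boolean evict(T element);
    }

    interface Evictor<T> {
        Predicate<T> getPredicate(Iterable<T> elements, int size);
    }

    // Example: evict everything but the last `keep` elements.
    static Evictor<Integer> countEvictor(int keep) {
        return (elements, size) -> {
            int toEvict = Math.max(0, size - keep);
            int[] seen = {0};
            // Evict the first `toEvict` elements seen, keep the rest.
            return element -> seen[0]++ < toEvict;
        };
    }

    static List<Integer> apply(Evictor<Integer> evictor, List<Integer> pane) {
        Predicate<Integer> p = evictor.getPredicate(pane, pane.size());
        List<Integer> kept = new ArrayList<>();
        for (Integer e : pane) {
            if (!p.evict(e)) kept.add(e);
        }
        return kept;
    }

    public static void main(String[] args) {
        System.out.println(apply(countEvictor(2), Arrays.asList(1, 2, 3, 4))); // prints [3, 4]
    }
}
```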