Hi everyone,
as some of you may already know, I have worked on a patch for OpenJDK
that introduces a compiler option for javac which makes it possible to
compile and run programs with Java 8 lambda expressions in Flink
without loss of type information and without inconvenience for the user.
I have published my patch t
Amazing!
> Am 17.12.2014 um 22:32 schrieb Ufuk Celebi :
>
>
>> On 17 Dec 2014, at 22:12, Flavio Pompermaier wrote:
>>
>> Congratulations!!
>>> On Dec 17, 2014 10:07 PM, "Fabian Hueske" wrote:
>>>
>>> Great news! :-)
>>>
>>> 2014-12-17 21:54 GMT+01:00 Robert Metzger :
Thank you fo
atches.
>
> That's why I would like to merge it into 0.8.1 and 0.9, rather than 0.8.0
> That gives the code a bit more testing/use and us a bit more time to
> thoroughly think the implications through.
>
>> On Tue, Dec 16, 2014 at 5:30 PM, Timo Walther wrote:
>>
Sorry, I totally forgot to check the streaming API. Does it use the
getXXXReturnTypes() methods of the TypeExtractor?
> Am 16.12.2014 um 17:23 schrieb Robert Metzger :
>
> I've worked on adding Kryo support to Flink today. I would really like to
> include this into the release. Many users were co
Hi all,
I'm working on a small project for university and I have some questions
about how to implement it. Maybe you could give me some hints.
I have a directory that contains around 1 million HTML files. Basically,
I just want to read each file entirely into a String and parse it with
JSo
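For the reading part, plain NIO is enough before handing each document to an HTML parser; here is a minimal sketch (the `.html` suffix filter and the helper class name are assumptions, not from the original mail):

```java
import java.io.IOException;
import java.io.UncheckedIOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class HtmlReader {

    // Read every .html file under `dir` entirely into a String.
    static List<String> readAll(Path dir) throws IOException {
        try (Stream<Path> paths = Files.walk(dir)) {
            return paths
                    .filter(p -> p.toString().endsWith(".html"))
                    .map(p -> {
                        try {
                            // whole file as one String (UTF-8 by default)
                            return Files.readString(p);
                        } catch (IOException e) {
                            throw new UncheckedIOException(e);
                        }
                    })
                    .collect(Collectors.toList());
        }
    }
}
```

For a corpus of around a million files it would be better to parse each document inside the stream pipeline instead of collecting everything into one list, so that only one file's content is held in memory at a time.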
+1
Great idea!
On 08.12.2014 23:43, Kostas Tzoumas wrote:
+1
On Mon, Dec 8, 2014 at 11:26 PM, Paris Carbone wrote:
+1
It will certainly bring all teams closer
On 08 Dec 2014, at 23:17, Fabian Hueske wrote:
+1
2014-12-08 23:05 GMT+01:00 Ufuk Celebi :
+1 great idea
On Monday, December
Timo Walther created FLINK-1308:
---
Summary: Flink logo in documentation does not link to website home
Key: FLINK-1308
URL: https://issues.apache.org/jira/browse/FLINK-1308
Project: Flink
Issue
> * Stephan Ewen
> * Gyula Fora
> * Alan Gates
> * Fabian Hueske
> * Vasia Kalavri
> * Aljoscha Krettek
> * Robert Metzger
> * Till Rohrmann
Hey,
the new website looks very precise and clean! I agree with Vasia, I
would remove "in clusters" and generalize it to "Fast reliable parallel
data processing.".
Regards,
Timo
On 27.11.2014 20:41, Vasiliki Kalavri wrote:
Hey!
It looks great :))
If I would only change one thing, I woul
Timo Walther created FLINK-1245:
---
Summary: Introduce TypeHints for Java API operators
Key: FLINK-1245
URL: https://issues.apache.org/jira/browse/FLINK-1245
Project: Flink
Issue Type
use Java 8 lambdas inside the IDE with collection
execution, and on the cluster with the properly compiled code from Maven.
Stephan
On Mon, Nov 3, 2014 at 12:23 PM, Timo Walther wrote:
Hey,
I have made a small prototype for a map-operator
env.fromElements(1, 2, 3)
.map((i) -> new Tuple2()).
Hi all,
as some of you may already know, Flink's Java 8 lambdas currently only
work if the job has been compiled with the Eclipse JDT 4.4 compiler,
because this compiler saves important type information in the class files.
Unfortunately, this feature was not officially supported by the Eclipse
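Whether a compiler wrote the generic signature into the class file is visible via reflection: a lambda compiled by a stock javac exposes only the raw interface, while an anonymous class keeps the parameterized type. A small, self-contained check (plain Java, no Flink involved):

```java
import java.lang.reflect.Type;
import java.util.function.Function;

public class SignatureCheck {
    public static void main(String[] args) {
        Function<Integer, String> lambda = i -> "v" + i;
        Function<Integer, String> anon = new Function<Integer, String>() {
            @Override
            public String apply(Integer i) { return "v" + i; }
        };
        // Lambda: typically only the raw interface survives,
        // e.g. "interface java.util.function.Function"
        for (Type t : lambda.getClass().getGenericInterfaces()) {
            System.out.println(t);
        }
        // Anonymous class: the parameterized type is preserved,
        // "java.util.function.Function<java.lang.Integer, java.lang.String>"
        for (Type t : anon.getClass().getGenericInterfaces()) {
            System.out.println(t);
        }
    }
}
```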
Timo Walther created FLINK-1229:
---
Summary: Synchronize WebClient arguments with command line
arguments
Key: FLINK-1229
URL: https://issues.apache.org/jira/browse/FLINK-1229
Project: Flink
, Timo Walther wrote:
Hi Vasiliki,
your error is a very common problem we see in connection with types. The
problem is that Java performs type erasure, which means that
return vertices.map(new ApplyMapperToVertex(mapper));
becomes
return vertices.map(new ApplyMapperToVertex(mapper));
Therefore we don
Timo Walther created FLINK-1225:
---
Summary: Quickstart does not work
Key: FLINK-1225
URL: https://issues.apache.org/jira/browse/FLINK-1225
Project: Flink
Issue Type: Bug
Reporter
lization, which makes it complicated and error-prone...
On Wed, Oct 29, 2014 at 4:26 PM, Timo Walther wrote:
What do you think about something like:
env.fromElements(1, 2, 3)
    .flatMap((Integer i, Collector o) -> o.collect(i))
    .returns("Integer")
    .print();
This looks to me like th
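As a rough illustration of such a fluent hint, here is a minimal sketch in plain Java; the `Op` class and its fields are hypothetical stand-ins, not Flink API:

```java
import java.util.function.Function;

// Hypothetical operator wrapper (not Flink API): the compiled lambda
// carries no reifiable return type, so returns(...) records it explicitly.
class Op<I, O> {
    final Function<I, O> fn;
    Class<O> returnType;

    Op(Function<I, O> fn) {
        this.fn = fn;
    }

    // Fluent hint: store the declared return type and keep the chain going.
    Op<I, O> returns(Class<O> type) {
        this.returnType = type;
        return this;
    }

    public static void main(String[] args) {
        Op<Integer, String> op = new Op<Integer, String>(i -> "v" + i)
                .returns(String.class);
        System.out.println(op.returnType.getSimpleName()); // String
    }
}
```

The fluent style keeps the hint next to the lambda it describes, which is the readability argument made in the thread.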
Hi Vasiliki,
your error is a very common problem we see in connection with types. The
problem is that Java performs type erasure, which means that
return vertices.map(new ApplyMapperToVertex(mapper));
becomes
return vertices.map(new ApplyMapperToVertex(mapper));
Therefore we don't have the types. B
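The point about erasure is that a parameterized and a raw instantiation compile to the very same runtime class, which is why the two quoted lines end up looking identical. A self-contained demonstration (plain Java, not Flink code), including the anonymous-subclass trick that does preserve the type argument:

```java
import java.lang.reflect.Type;
import java.util.ArrayList;
import java.util.List;

public class ErasureDemo {
    @SuppressWarnings({"rawtypes", "unchecked"})
    public static void main(String[] args) {
        // Both lists have the exact same runtime class:
        // the type argument String is erased at compile time.
        List<String> typed = new ArrayList<String>();
        List raw = new ArrayList();
        System.out.println(typed.getClass() == raw.getClass()); // true

        // Workaround: an anonymous subclass stores the type argument
        // in its class-file signature, so reflection can recover it.
        List<String> anon = new ArrayList<String>() {};
        Type sup = anon.getClass().getGenericSuperclass();
        System.out.println(sup); // java.util.ArrayList<java.lang.String>
    }
}
```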
ion?
Something like
DataSet.map(hint( (x) -> x.toString() , String.class));
If we go for option (1), I would suggest calling the methods just "from"
and overloading them for String, Class, and TypeInformation.
Stephan
On Tue, Oct 28, 2014 at 3:27 PM, Timo Walther wrote:
Hi all,
c
Timo Walther created FLINK-1197:
---
Summary: Special page on website that describes issues that occur
together with types
Key: FLINK-1197
URL: https://issues.apache.org/jira/browse/FLINK-1197
Project
Hi all,
until now the Eclipse JDT compiler was the only compiler that included
generic signatures for lambda expressions in class files, which is
necessary to use them in a type-safe way in Flink. Unfortunately, this
"feature" was considered a "bug" and was removed with Eclipse 4.4.1.
This i
Timo Walther created FLINK-1135:
---
Summary: Blog post with topic "Accessing Data Stored in Hive with
Flink"
Key: FLINK-1135
URL: https://issues.apache.org/jira/browse/FLINK-1135
Proj
Timo Walther created FLINK-1098:
---
Summary: flatArray() operator that converts arrays to elements
Key: FLINK-1098
URL: https://issues.apache.org/jira/browse/FLINK-1098
Project: Flink
Issue Type
Hey everyone,
I want to get the maximum performance out of my small 2-node cluster. At
the moment my execution plan has a "parallelism" of "1" at each operator.
Which "-p XX" argument should I pass to the job: the number of nodes,
the number of CPUs, or the number of slots?
Thanks and regards,
Timo
Timo Walther created FLINK-970:
--
Summary: Implement a first(n) operator
Key: FLINK-970
URL: https://issues.apache.org/jira/browse/FLINK-970
Project: Flink
Issue Type: New Feature
I already tried that, but it resulted in the same exception. Where should
the ContextClassLoader be set when I deserialize an InputFormat on another
node in the cluster?
On 18.06.2014 14:24, Stephan Ewen wrote:
You can always do "Thread.currentThread().getContextClassLoader()"
Meanwhile, I don't think that it is a bug, because the getStubWrapper()
method in TaskConfig uses the classloader. userCodeClass already
contains the loaded class, so there is no need for the classloader:
@Override
public T getUserCodeObject(Class superClass, ClassLoader cl) {
return Instan
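A common pattern for the question above is to set the context ClassLoader around the deserialization call and restore the previous one afterwards; `runWith` below is a hypothetical helper, not Flink API:

```java
import java.util.function.Supplier;

public class WithContextClassLoader {

    // Run `action` with `loader` installed as the thread's context
    // ClassLoader, restoring the previous loader afterwards.
    static <T> T runWith(ClassLoader loader, Supplier<T> action) {
        Thread current = Thread.currentThread();
        ClassLoader previous = current.getContextClassLoader();
        current.setContextClassLoader(loader);
        try {
            return action.get();
        } finally {
            // always restore, even if the action throws
            current.setContextClassLoader(previous);
        }
    }

    public static void main(String[] args) {
        // stand-in for the user-code ClassLoader shipped with the job
        ClassLoader userCodeLoader = WithContextClassLoader.class.getClassLoader();
        ClassLoader seen = runWith(userCodeLoader,
                () -> Thread.currentThread().getContextClassLoader());
        System.out.println(seen == userCodeLoader); // true
    }
}
```

The try/finally is the important part: forgetting to restore the previous loader can leak a user-code ClassLoader into unrelated tasks running on the same thread.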
Hey everyone,
I'm still trying to finish the Hadoop Compatibility PR
(https://github.com/stratosphere/stratosphere/pull/777); however, I
always get a ClassNotFoundException for my HCatalog InputFormat on the
cluster.
While searching for potential ClassLoader bugs, I found the following
lin
[
https://issues.apache.org/jira/browse/FLINK-845?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14029292#comment-14029292
]
Timo Walther commented on FLINK-845:
Can be closed. Merged with FLINK-868.
>