[...]ore, I guess that there is something wrong with your job. What happens in PigStorage().load?

Cheers,
Till

On Thu, Jul 16, 2015 at 4:35 PM, Philipp Goetze <philipp.goe...@tu-ilmenau.de> wrote:
Hey Till,

I think my previous mail was intercepted or something similar. However, you can find my reply below. I already tried a simpler job which just does an env.fromElements... but still the same stack.

How do you normally submit jobs (jars) from within the code?

Best Regards,
Philipp
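For illustration only (this is not code from the thread): one way to submit a job from Scala code in the Flink releases of that era is to create a remote execution environment that ships the packaged job jar to the JobManager. The object name, host, port, and jar path below are placeholders.

    import org.apache.flink.api.scala._

    object SubmitFromCode {
      def main(args: Array[String]): Unit = {
        // Connect to a running JobManager and ship the packaged job jar with the program.
        val env = ExecutionEnvironment.createRemoteEnvironment(
          "jobmanager-host", 6123, "/path/to/my-flink-job.jar")

        // A trivial job, similar to the env.fromElements test mentioned above.
        val doubled = env.fromElements(1, 2, 3, 4).map(_ * 2)

        // Since Flink 0.9, print() triggers execution and prints the result on the client.
        doubled.print()
      }
    }

If a variant like this works while the project's own submission path does not, the problem is more likely in how that path assembles the JobGraph than in the cluster setup.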
---- Forwarded Message ----
Subject: Re: Submitting jobs from within Scala code
Date: Thu, 16 Jul 2015 14:31:01 +0200
From: Philipp Goetze
To: user@flink.apache.org
Hey,
from the JobManager I do not get any more hints:
13:36:06,674 DEBUG [...]
Hi Philipp,

it seems that Stephan was right and that your JobGraph is somehow corrupted. You can see from the JobSubmissionException that the JobGraph contains a vertex whose InvokableClassName is null. Furthermore, even the ID and the vertex name are null. This is a strong indicator, t[...]
Hey Till,

here is the console output now with log4j:

0   [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] INFO  org.apache.flink.client.program.Client - Starting program in interactive mode
121 [pool-7-thread-1-ScalaTest-running-FlinkCompileIt] DEBUG org.apache.flink.api.scala.ClosureCleaner$ [...]
Could you also look into the JobManager logs? You may be submitting a corrupt JobGraph...

On Thu, Jul 16, 2015 at 11:45 AM, Till Rohrmann wrote:
When you run your program from the IDE, then you can specify a log4j.properties file. There you can configure where and what to log. It should be enough to place the log4j.properties file in the resource folder of your project. An example properties file could look like:

log4j.rootLogger=INFO, tes[...]
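The properties example is cut off in the archive. A typical console-logging log4j.properties along these lines could look as follows; the appender name "testlogger" and the layout pattern are assumptions, not the original attachment:

    log4j.rootLogger=INFO, testlogger
    log4j.appender.testlogger=org.apache.log4j.ConsoleAppender
    log4j.appender.testlogger.target=System.err
    log4j.appender.testlogger.layout=org.apache.log4j.PatternLayout
    log4j.appender.testlogger.layout.ConversionPattern=%-4r [%t] %-5p %c %x - %m%n

Setting the root logger to DEBUG instead of INFO would give the more verbose output asked about earlier in the thread.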
Hi Till,

the problem is that this is the only output :( Or is it possible to get a more verbose log output?

Maybe it is important to note that both Flink and our project are built with Scala 2.11.

Best Regards,
Philipp
On 16.07.2015 11:12, Till Rohrmann wrote:
Hi Philipp,

could you post the complete log output? This might help to get to the bottom of the problem.

Cheers,
Till

On Thu, Jul 16, 2015 at 11:01 AM, Philipp Goetze <philipp.goe...@tu-ilmenau.de> wrote:
Hi community,

in our project we try to submit built Flink programs to the JobManager from within Scala code. The test program is executed correctly when submitted via the wrapper script "bin/flink run ..." and also with the webclient. But when executed from within the Scala code nothing seems to happen.
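For context (again, not code from the thread): a self-contained test program that runs under bin/flink run, the webclient, and the IDE alike usually obtains its environment via getExecutionEnvironment, which picks the right execution context automatically. The object name and data below are made up:

    import org.apache.flink.api.scala._

    object SimpleTestJob {
      def main(args: Array[String]): Unit = {
        // Local in the IDE, cluster when the jar is submitted via
        // bin/flink run or the webclient.
        val env = ExecutionEnvironment.getExecutionEnvironment

        val counts = env.fromElements("a", "b", "a")
          .map(word => (word, 1))
          .groupBy(0)
          .sum(1)

        // Triggers execution (Flink 0.9+) and prints the result on the client.
        counts.print()
      }
    }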