Hello,

Thank you for the advice.
I ran that simulation and it produced the correct results. However, there is
still a problem that I have raised before.

My program reads an image file, performs face detection (using the OpenCV
library), and writes the result back to disk. I did not take OpenCV from the
Maven Central repository; instead I installed the jar locally into the .m2
repository (with the mvn install command), updated the pom.xml, and wrote the
rest of the program against it. In the IDE the program runs correctly.
However, when I try to run it from the CLI or the web frontend, the fat jar
produced by *mvn clean install -Pbuild-jar* fails on the local Flink node with
a Java linking error (from which I concluded that the fat jar cannot resolve
the path to my native OpenCV library). I therefore built the fat jar manually
and uploaded it through the web frontend; after submitting the job it executes
correctly and gives me the desired results, but with an exception ("cannot
show the plan"):

org.apache.flink.client.program.ProgramInvocationException: The program plan could not be fetched - the program aborted pre-maturely.
System.out: (none)

        at org.apache.flink.client.program.OptimizerPlanEnvironment.getOptimizedPlan(OptimizerPlanEnvironment.java:105)
        at org.apache.flink.client.program.Client.getOptimizedPlan(Client.java:215)
        at org.apache.flink.runtime.webmonitor.handlers.JarActionHandler.getJobGraphAndClassLoader(JarActionHandler.java:95)
        at org.apache.flink.runtime.webmonitor.handlers.JarPlanHandler.handleRequest(JarPlanHandler.java:42)
        at org.apache.flink.runtime.webmonitor.RuntimeMonitorHandler.respondAsLeader(RuntimeMonitorHandler.java:135)
        at org.apache.flink.runtime.webmonitor.RuntimeMonitorHandler.channelRead0(RuntimeMonitorHandler.java:112)
        at org.apache.flink.runtime.webmonitor.RuntimeMonitorHandler.channelRead0(RuntimeMonitorHandler.java:60)
        at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324)
        at io.netty.handler.codec.http.router.Handler.routed(Handler.java:62)
        at io.netty.handler.codec.http.router.DualAbstractHandler.channelRead0(DualAbstractHandler.java:57)
        at io.netty.handler.codec.http.router.DualAbstractHandler.channelRead0(DualAbstractHandler.java:20)
        at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324)
        at org.apache.flink.runtime.webmonitor.HttpRequestHandler.channelRead0(HttpRequestHandler.java:104)
        at org.apache.flink.runtime.webmonitor.HttpRequestHandler.channelRead0(HttpRequestHandler.java:65)
        at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324)
        at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:242)
        at io.netty.channel.CombinedChannelDuplexHandler.channelRead(CombinedChannelDuplexHandler.java:147)
        at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:339)
        at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:324)
        at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:847)
        at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131)
        at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468)
        at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382)
        at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354)
        at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
        at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137)
        at java.lang.Thread.run(Thread.java:745)
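
For reference, this is roughly how the OpenCV dependency was installed into the
local repository and referenced from the pom (the coordinates below are only
illustrative, not my exact group/artifact/version):

    mvn install:install-file -Dfile=opencv-310.jar -DgroupId=org.opencv \
        -DartifactId=opencv -Dversion=3.1.0 -Dpackaging=jar

    <dependency>
       <groupId>org.opencv</groupId>
       <artifactId>opencv</artifactId>
       <version>3.1.0</version>
    </dependency>

As far as I understand, this jar contains only the Java bindings; the native
library that backs them is a separate .so file outside the repository.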

From the CLI there is no exception. So one question remains: is there a way to
link the native library files so that the fat jar created by
mvn clean install -Pbuild-jar runs without the linking error? (My guess is
that the manually built jar incorporates something that causes the premature
abort in the web interface.)
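
To make the question concrete, the native library is loaded in the job roughly
like this (a simplified sketch; the class name, cascade file path, and types
are illustrative, not my exact code):

    import org.apache.flink.api.common.functions.RichMapFunction;
    import org.apache.flink.configuration.Configuration;
    import org.opencv.core.Core;
    import org.opencv.core.Mat;
    import org.opencv.core.MatOfRect;
    import org.opencv.imgcodecs.Imgcodecs;
    import org.opencv.objdetect.CascadeClassifier;

    // Illustrative sketch: counts the faces in each image path it receives.
    public class DetectFaces extends RichMapFunction<String, Integer> {

        private transient CascadeClassifier classifier;

        @Override
        public void open(Configuration parameters) {
            // The OpenCV Java classes travel inside the fat jar, but this call
            // needs the native libopencv_java .so on the java.library.path of
            // the JVM running the task; it cannot be loaded from inside the jar.
            System.loadLibrary(Core.NATIVE_LIBRARY_NAME);
            classifier = new CascadeClassifier("/path/to/haarcascade_frontalface_alt.xml");
        }

        @Override
        public Integer map(String imagePath) {
            Mat image = Imgcodecs.imread(imagePath);
            MatOfRect faces = new MatOfRect();
            classifier.detectMultiScale(image, faces);
            return faces.toArray().length;
        }
    }

As far as I understand, loading the library in open() (rather than in a static
block of the main class) at least keeps the linking out of the client JVM that
only builds the plan, but the .so still has to be findable on every Flink node.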

Sorry for the long mail, and thanks in advance.

Warm Regards,

Debaditya


On Thu, Jun 2, 2016 at 5:53 PM, Debaditya Roy <roydca...@gmail.com> wrote:

> Yeah, I will do that as a last resort. But the strange fact is that in the
> IDE it works fine, although I am not entirely aware of the mechanics of how
> Flink is simulated in the IDE.
>
> On Thu, 2 Jun 2016 17:35 Kostas Kloudas, <k.klou...@data-artisans.com>
> wrote:
>
>> The reason seems to be that somehow an older version of Flink is still in
>> your dependencies.
>>
>> As you can see here:
>> https://issues.apache.org/jira/browse/FLINK-3306
>> at 0.10 the method you cannot find had 2 arguments and after this commit
>> it has 3.
>> This is the reason why you cannot find the method.
>>
>> It can be that you forgot to clean some directory or forgot to rebuild
>> after updating versions.
>> What you could do is delete and clean everything and
>> reinstall/re-download/rebuild them.
>>
>> I hope this is not too much of a hassle.
>>
>> Kostas
>>
>> On Jun 2, 2016, at 4:36 PM, Debaditya Roy <roydca...@gmail.com> wrote:
>>
>> Yeah I think so. Here is a part of the log file:
>>
>> 2016-06-02 18:37:27,376 INFO  org.apache.flink.runtime.jobmanager.JobManager  - --------------------------------------------------------------------------------
>> 2016-06-02 18:37:27,377 INFO  org.apache.flink.runtime.jobmanager.JobManager  -  Starting JobManager (Version: 1.0.3, Rev:f3a6b5f, Date:06.05.2016 @ 12:58:02 UTC)
>> 2016-06-02 18:37:27,377 INFO  org.apache.flink.runtime.jobmanager.JobManager  -  Current user: royd1990
>> 2016-06-02 18:37:27,377 INFO  org.apache.flink.runtime.jobmanager.JobManager  -  JVM: Java HotSpot(TM) 64-Bit Server VM - Oracle Corporation - 1.8/25.74-b02
>> 2016-06-02 18:37:27,377 INFO  org.apache.flink.runtime.jobmanager.JobManager  -  Maximum heap size: 736 MiBytes
>> 2016-06-02 18:37:27,377 INFO  org.apache.flink.runtime.jobmanager.JobManager  -  JAVA_HOME: /usr/lib/jvm/java-8-oracle/jre
>> 2016-06-02 18:37:27,385 INFO  org.apache.flink.runtime.jobmanager.JobManager  -  Hadoop version: 1.2.1
>> 2016-06-02 18:37:27,385 INFO  org.apache.flink.runtime.jobmanager.JobManager  -  JVM Options:
>> 2016-06-02 18:37:27,385 INFO  org.apache.flink.runtime.jobmanager.JobManager  -     -Xms768m
>> 2016-06-02 18:37:27,385 INFO  org.apache.flink.runtime.jobmanager.JobManager  -     -Xmx768m
>> 2016-06-02 18:37:27,385 INFO  org.apache.flink.runtime.jobmanager.JobManager  -     -Dlog.file=/home/royd1990/Downloads/flink-1.0.3/log/flink-royd1990-jobmanager-0-royd1990-HP-ENVY-Notebook.log
>>
>> On Thu, Jun 2, 2016 at 4:32 PM, Kostas Kloudas <
>> k.klou...@data-artisans.com> wrote:
>>
>>> From the Flink logs, can you see that the version is the correct one
>>> (1.0.3)?
>>>
>>> On Jun 2, 2016, at 4:19 PM, Debaditya Roy <roydca...@gmail.com> wrote:
>>>
>>> Hello,
>>>
>>> I updated the artifact Id and regenerated the jar, but it is giving the
>>> same error. It is somehow trying to access the method which is not there in
>>> the updated version.
>>>
>>> Warm Regards,
>>> Debaditya
>>>
>>> On Thu, Jun 2, 2016 at 4:10 PM, Robert Metzger <rmetz...@apache.org>
>>> wrote:
>>>
>>>> The correct artifact id is flink-yarn-tests_2.10.
>>>>
>>>> On Thu, Jun 2, 2016 at 3:54 PM, Debaditya Roy <roydca...@gmail.com>
>>>> wrote:
>>>>
>>>>> Hi Kostas,
>>>>>
>>>>> Doing this throws an error in the pom.
>>>>>
>>>>> <dependency>
>>>>>    <groupId>org.apache.flink</groupId>
>>>>>    <artifactId>flink-yarn-tests</artifactId>
>>>>>    <version>${flink.version}</version>
>>>>> </dependency>
>>>>>
>>>>> 'org.apache.flink:flink-yarn-tests:1.0.3' not found.
>>>>>
>>>>> Warm Regards,
>>>>> Debaditya
>>>>>
>>>>> On Thu, Jun 2, 2016 at 3:32 PM, Kostas Kloudas <
>>>>> k.klou...@data-artisans.com> wrote:
>>>>>
>>>>>> Could you replace 0.10-SNAPSHOT with ${flink.version} in the pom?
>>>>>>
>>>>>> Thanks,
>>>>>> Kostas
>>>>>>
>>>>>> On Jun 2, 2016, at 3:13 PM, Debaditya Roy <roydca...@gmail.com>
>>>>>> wrote:
>>>>>>
>>>>>> 0.10-SNAPSHOT
>>>>>>
>>>>>>
>>>>>>
>>>>>
>>>>
>>>
>>>
>>
>>
