Re: Exception while submitting jobs through Yarn

2018-06-20 Thread Till Rohrmann
Great to hear. Please open a PR for the improvements you like to contribute.

Cheers,
Till

On Wed, Jun 20, 2018 at 4:56 PM Garvit Sharma  wrote:

> So, finally, I have got this working. The issue was caused by a poorly
> behaved library that was pulling in Xerces 2.6 :).
>
> In this process, I found a few things missing from the docs and would
> like to contribute the same.
>
> I really appreciate the support provided.
>
> Thanks,
>
> On Tue, 19 Jun 2018 at 4:05 PM, Ted Yu  wrote:
>
>> Since you're using a vendor's distro, I would suggest asking on their
>> user forum.
>>
>> Cheers
>>
>>  Original message 
>> From: Garvit Sharma 
>> Date: 6/19/18 3:34 AM (GMT-08:00)
>> To: trohrm...@apache.org
>> Cc: Amit Jain , Chesnay Schepler ,
>> Ted Yu , user@flink.apache.org
>> Subject: Re: Exception while submitting jobs through Yarn
>>
>> Any help on this?
>>
>> On Mon, Jun 18, 2018 at 11:31 PM Garvit Sharma 
>> wrote:
>>
>>> Yes, it is.
>>>
>>> On Mon, Jun 18, 2018 at 7:54 PM Till Rohrmann 
>>> wrote:
>>>
>>>> Is `/usr/hdp/2.6.3.0-235/hadoop/client/xercesImpl.jar` a link to `
>>>> /usr/hdp/2.6.3.0-235/hadoop/client/xercesImpl-2.9.1.jar`?
>>>>
>>>> On Mon, Jun 18, 2018 at 4:02 PM Garvit Sharma 
>>>> wrote:
>>>>
>>>>> I don't think I can access core-default as it comes with Hadoop jar
>>>>>
>>>>> On Mon, 18 Jun 2018 at 7:30 PM, Till Rohrmann 
>>>>> wrote:
>>>>>
>>>>>> Hmm, could you check whether core-default.xml contains any suspicious
>>>>>> entries? Apparently xerces:2.9.1 cannot read it.
>>>>>>
>>>>>> On Mon, Jun 18, 2018 at 3:40 PM Garvit Sharma 
>>>>>> wrote:
>>>>>>
>>>>>>> Hi,
>>>>>>>
>>>>>>> After putting the following log in my code, I can see that the
>>>>>>> Xerces version is - Xerces version : Xerces-J 2.9.1
>>>>>>>
>>>>>>> log.info("Xerces version : {}", 
>>>>>>> org.apache.xerces.impl.Version.getVersion());
>>>>>>>
>>>>>>> Also, the following is the output of the $ locate xerces command on
>>>>>>> the server -
>>>>>>>
>>>>>>>
>>>>>>> /usr/hdp/2.6.1.0-129/falcon/client/lib/xercesImpl-2.10.0.jar
>>>>>>>
>>>>>>> /usr/hdp/2.6.1.0-129/hadoop/client/xercesImpl-2.9.1.jar
>>>>>>>
>>>>>>> /usr/hdp/2.6.1.0-129/hadoop/client/xercesImpl.jar
>>>>>>>
>>>>>>> /usr/hdp/2.6.1.0-129/hadoop-hdfs/lib/xercesImpl-2.9.1.jar
>>>>>>>
>>>>>>> /usr/hdp/2.6.1.0-129/hbase/lib/xercesImpl-2.9.1.jar
>>>>>>>
>>>>>>>
>>>>>>> /usr/hdp/2.6.1.0-129/hive-hcatalog/share/webhcat/svr/lib/xercesImpl-2.9.1.jar
>>>>>>>
>>>>>>> /usr/hdp/2.6.1.0-129/livy/jars/xercesImpl-2.9.1.jar
>>>>>>>
>>>>>>> /usr/hdp/2.6.1.0-129/livy2/jars/xercesImpl-2.9.1.jar
>>>>>>>
>>>>>>> /usr/hdp/2.6.1.0-129/oozie/lib/xercesImpl-2.10.0.jar
>>>>>>>
>>>>>>> /usr/hdp/2.6.1.0-129/oozie/libserver/xercesImpl-2.10.0.jar
>>>>>>>
>>>>>>> /usr/hdp/2.6.1.0-129/oozie/libtools/xercesImpl-2.10.0.jar
>>>>>>>
>>>>>>> /usr/hdp/2.6.1.0-129/slider/lib/xercesImpl-2.9.1.jar
>>>>>>>
>>>>>>> /usr/hdp/2.6.1.0-129/spark2/jars/xercesImpl-2.9.1.jar
>>>>>>>
>>>>>>>
>>>>>>> /usr/hdp/2.6.1.0-129/storm/contrib/storm-autocreds/xercesImpl-2.9.1.jar
>>>>>>>
>>>>>>> /usr/hdp/2.6.1.0-129/zookeeper/lib/xercesMinimal-1.9.6.2.jar
>>>>>>>
>>>>>>> /usr/hdp/2.6.3.0-235/falcon/client/lib/xercesImpl-2.10.0.jar
>>>>>>>
>>>>>>> /usr/hdp/2.6.3.0-235/hadoop/client/xercesImpl-2.9.1.jar
>>>>>>>
>>>>>>> /usr/hdp/2.6.3.0-235/hadoop/client/xercesImpl.jar
>>>>>>>
>>>>>>> /usr/hdp/2.6.3.0-235/hadoop-hdfs/lib/xercesImpl-2.9.1.jar
>>>>>>>
>>>>>>> /usr/hdp/2.6.3.0-235/hbase/lib/xercesImpl-2.9.1.jar
>>>>>>>
>>>>>>>
>>>>>>> /usr/hdp/2.6.3.0-235/hive-hcatalog/share/webhcat/svr/lib/xercesImpl-2.9.1.jar
>>>>>>>
>>>>>>> /usr/hdp/2.6.3.0-235/livy/jars/xercesImpl-2.9.1.jar
>>>>>>>
>>>>>>> /usr/hdp/2.6.3.0-235/livy2/jars/xercesImpl-2.9.1.jar
>>>>>>>
>>>>>>> /usr/hdp/2.6.3.0-235/oozie/lib/xercesImpl-2.10.0.jar
>>>>>>>
>>>>>>>


Re: Exception while submitting jobs through Yarn

2018-06-20 Thread Garvit Sharma
So, finally, I have got this working. The issue was caused by a poorly
behaved library that was pulling in Xerces 2.6 :).

In this process, I found a few things missing from the docs and would like
to contribute the same.

I really appreciate the support provided.

Thanks,

On Tue, 19 Jun 2018 at 4:05 PM, Ted Yu  wrote:

> Since you're using a vendor's distro, I would suggest asking on their user
> forum.
>
> Cheers
>
>  Original message 
> From: Garvit Sharma 
> Date: 6/19/18 3:34 AM (GMT-08:00)
> To: trohrm...@apache.org
> Cc: Amit Jain , Chesnay Schepler ,
> Ted Yu , user@flink.apache.org
> Subject: Re: Exception while submitting jobs through Yarn
>
> Any help on this?
>
> On Mon, Jun 18, 2018 at 11:31 PM Garvit Sharma 
> wrote:
>
>> Yes, it is.
>>
>> On Mon, Jun 18, 2018 at 7:54 PM Till Rohrmann 
>> wrote:
>>
>>> Is `/usr/hdp/2.6.3.0-235/hadoop/client/xercesImpl.jar` a link to `
>>> /usr/hdp/2.6.3.0-235/hadoop/client/xercesImpl-2.9.1.jar`?
>>>
>>> On Mon, Jun 18, 2018 at 4:02 PM Garvit Sharma 
>>> wrote:
>>>
>>>> I don't think I can access core-default as it comes with Hadoop jar
>>>>
>>>> On Mon, 18 Jun 2018 at 7:30 PM, Till Rohrmann 
>>>> wrote:
>>>>
>>>>> Hmm, could you check whether core-default.xml contains any suspicious
>>>>> entries? Apparently xerces:2.9.1 cannot read it.
>>>>>
>>>>> On Mon, Jun 18, 2018 at 3:40 PM Garvit Sharma 
>>>>> wrote:
>>>>>
>>>>>> Hi,
>>>>>>
>>>>>> After putting the following log in my code, I can see that the Xerces
>>>>>> version is - Xerces version : Xerces-J 2.9.1
>>>>>>
>>>>>> log.info("Xerces version : {}", 
>>>>>> org.apache.xerces.impl.Version.getVersion());
>>>>>>
>>>>>> Also, the following is the output of the $ locate xerces command on
>>>>>> the server -
>>>>>>
>>>>>>
>>>>>> /usr/hdp/2.6.1.0-129/falcon/client/lib/xercesImpl-2.10.0.jar
>>>>>>
>>>>>> /usr/hdp/2.6.1.0-129/hadoop/client/xercesImpl-2.9.1.jar
>>>>>>
>>>>>> /usr/hdp/2.6.1.0-129/hadoop/client/xercesImpl.jar
>>>>>>
>>>>>> /usr/hdp/2.6.1.0-129/hadoop-hdfs/lib/xercesImpl-2.9.1.jar
>>>>>>
>>>>>> /usr/hdp/2.6.1.0-129/hbase/lib/xercesImpl-2.9.1.jar
>>>>>>
>>>>>>
>>>>>> /usr/hdp/2.6.1.0-129/hive-hcatalog/share/webhcat/svr/lib/xercesImpl-2.9.1.jar
>>>>>>
>>>>>> /usr/hdp/2.6.1.0-129/livy/jars/xercesImpl-2.9.1.jar
>>>>>>
>>>>>> /usr/hdp/2.6.1.0-129/livy2/jars/xercesImpl-2.9.1.jar
>>>>>>
>>>>>> /usr/hdp/2.6.1.0-129/oozie/lib/xercesImpl-2.10.0.jar
>>>>>>
>>>>>> /usr/hdp/2.6.1.0-129/oozie/libserver/xercesImpl-2.10.0.jar
>>>>>>
>>>>>> /usr/hdp/2.6.1.0-129/oozie/libtools/xercesImpl-2.10.0.jar
>>>>>>
>>>>>> /usr/hdp/2.6.1.0-129/slider/lib/xercesImpl-2.9.1.jar
>>>>>>
>>>>>> /usr/hdp/2.6.1.0-129/spark2/jars/xercesImpl-2.9.1.jar
>>>>>>
>>>>>>
>>>>>> /usr/hdp/2.6.1.0-129/storm/contrib/storm-autocreds/xercesImpl-2.9.1.jar
>>>>>>
>>>>>> /usr/hdp/2.6.1.0-129/zookeeper/lib/xercesMinimal-1.9.6.2.jar
>>>>>>
>>>>>> /usr/hdp/2.6.3.0-235/falcon/client/lib/xercesImpl-2.10.0.jar
>>>>>>
>>>>>> /usr/hdp/2.6.3.0-235/hadoop/client/xercesImpl-2.9.1.jar
>>>>>>
>>>>>> /usr/hdp/2.6.3.0-235/hadoop/client/xercesImpl.jar
>>>>>>
>>>>>> /usr/hdp/2.6.3.0-235/hadoop-hdfs/lib/xercesImpl-2.9.1.jar
>>>>>>
>>>>>> /usr/hdp/2.6.3.0-235/hbase/lib/xercesImpl-2.9.1.jar
>>>>>>
>>>>>>
>>>>>> /usr/hdp/2.6.3.0-235/hive-hcatalog/share/webhcat/svr/lib/xercesImpl-2.9.1.jar
>>>>>>
>>>>>> /usr/hdp/2.6.3.0-235/livy/jars/xercesImpl-2.9.1.jar
>>>>>>
>>>>>> /usr/hdp/2.6.3.0-235/livy2/jars/xercesImpl-2.9.1.jar
>>>>>>
>>>>>> /usr/hdp/2.6.3.0-235/oozie/lib/xercesImpl-2.10.0.jar
>>>>>>
>>>>>>


Re: Exception while submitting jobs through Yarn

2018-06-19 Thread Ted Yu
Since you're using a vendor's distro, I would suggest asking on their user 
forum.
Cheers
 Original message 
From: Garvit Sharma
Date: 6/19/18 3:34 AM (GMT-08:00)
To: trohrm...@apache.org
Cc: Amit Jain, Chesnay Schepler, Ted Yu, user@flink.apache.org
Subject: Re: Exception while submitting jobs through Yarn
Any help on this?
On Mon, Jun 18, 2018 at 11:31 PM Garvit Sharma  wrote:
Yes, it is. 

On Mon, Jun 18, 2018 at 7:54 PM Till Rohrmann  wrote:
Is `/usr/hdp/2.6.3.0-235/hadoop/client/xercesImpl.jar` a link to 
`/usr/hdp/2.6.3.0-235/hadoop/client/xercesImpl-2.9.1.jar`?
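Questions like this can be settled from code as well as with `ls -l`. Below is a minimal JDK-only sketch; the class and method names are mine (not from the thread), and it demonstrates on a temporary directory, since the HDP paths only exist on the cluster:

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class SymlinkCheck {
    // Returns the resolved target if the path is a symbolic link,
    // otherwise returns the path unchanged.
    static Path resolveIfLink(Path p) throws IOException {
        if (Files.isSymbolicLink(p)) {
            // readSymbolicLink may return a relative target;
            // resolve it against the link's parent directory.
            Path target = Files.readSymbolicLink(p);
            return p.getParent() == null
                    ? target
                    : p.getParent().resolve(target).normalize();
        }
        return p;
    }

    public static void main(String[] args) throws IOException {
        // Demo on a temp dir; on the cluster one would pass e.g.
        // Paths.get("/usr/hdp/2.6.3.0-235/hadoop/client/xercesImpl.jar").
        Path dir = Files.createTempDirectory("linkdemo");
        Path real = Files.createFile(dir.resolve("xercesImpl-2.9.1.jar"));
        Path link = Files.createSymbolicLink(dir.resolve("xercesImpl.jar"),
                real.getFileName());
        System.out.println(link + " -> " + resolveIfLink(link));
    }
}
```

On the cluster, `ls -l` or `readlink -f` on the jar path answers the same question without any code.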
On Mon, Jun 18, 2018 at 4:02 PM Garvit Sharma  wrote:
I don't think I can access core-default.xml, as it comes inside the Hadoop jar
On Mon, 18 Jun 2018 at 7:30 PM, Till Rohrmann  wrote:
Hmm, could you check whether core-default.xml contains any suspicious entries? 
Apparently xerces:2.9.1 cannot read it.
On Mon, Jun 18, 2018 at 3:40 PM Garvit Sharma  wrote:
Hi,
After putting the following log in my code, I can see that the Xerces version
is: Xerces-J 2.9.1

log.info("Xerces version : {}", org.apache.xerces.impl.Version.getVersion());

Also, the following is the output of the $ locate xerces command on the server -

/usr/hdp/2.6.1.0-129/falcon/client/lib/xercesImpl-2.10.0.jar
/usr/hdp/2.6.1.0-129/hadoop/client/xercesImpl-2.9.1.jar
/usr/hdp/2.6.1.0-129/hadoop/client/xercesImpl.jar
/usr/hdp/2.6.1.0-129/hadoop-hdfs/lib/xercesImpl-2.9.1.jar
/usr/hdp/2.6.1.0-129/hbase/lib/xercesImpl-2.9.1.jar
/usr/hdp/2.6.1.0-129/hive-hcatalog/share/webhcat/svr/lib/xercesImpl-2.9.1.jar
/usr/hdp/2.6.1.0-129/livy/jars/xercesImpl-2.9.1.jar
/usr/hdp/2.6.1.0-129/livy2/jars/xercesImpl-2.9.1.jar
/usr/hdp/2.6.1.0-129/oozie/lib/xercesImpl-2.10.0.jar
/usr/hdp/2.6.1.0-129/oozie/libserver/xercesImpl-2.10.0.jar
/usr/hdp/2.6.1.0-129/oozie/libtools/xercesImpl-2.10.0.jar
/usr/hdp/2.6.1.0-129/slider/lib/xercesImpl-2.9.1.jar
/usr/hdp/2.6.1.0-129/spark2/jars/xercesImpl-2.9.1.jar
/usr/hdp/2.6.1.0-129/storm/contrib/storm-autocreds/xercesImpl-2.9.1.jar
/usr/hdp/2.6.1.0-129/zookeeper/lib/xercesMinimal-1.9.6.2.jar
/usr/hdp/2.6.3.0-235/falcon/client/lib/xercesImpl-2.10.0.jar
/usr/hdp/2.6.3.0-235/hadoop/client/xercesImpl-2.9.1.jar
/usr/hdp/2.6.3.0-235/hadoop/client/xercesImpl.jar
/usr/hdp/2.6.3.0-235/hadoop-hdfs/lib/xercesImpl-2.9.1.jar
/usr/hdp/2.6.3.0-235/hbase/lib/xercesImpl-2.9.1.jar
/usr/hdp/2.6.3.0-235/hive-hcatalog/share/webhcat/svr/lib/xercesImpl-2.9.1.jar
/usr/hdp/2.6.3.0-235/livy/jars/xercesImpl-2.9.1.jar
/usr/hdp/2.6.3.0-235/livy2/jars/xercesImpl-2.9.1.jar
/usr/hdp/2.6.3.0-235/oozie/lib/xercesImpl-2.10.0.jar
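With this many xercesImpl jars installed, the logged version string alone does not reveal which jar the class was actually loaded from. A small sketch (the class and method names are mine, not from the thread) that asks the classloader directly:

```java
import java.security.CodeSource;

public class WhichJar {
    // Returns the jar (or class directory) a class was loaded from,
    // or "bootstrap/JDK" when the class comes from the JVM itself.
    static String locationOf(Class<?> c) {
        CodeSource src = c.getProtectionDomain().getCodeSource();
        return (src == null || src.getLocation() == null)
                ? "bootstrap/JDK"
                : src.getLocation().toString();
    }

    public static void main(String[] args) {
        // On the cluster one would log e.g.
        //   locationOf(Class.forName("org.apache.xerces.impl.Version"))
        // which should print one of the paths from the locate output above.
        System.out.println(locationOf(WhichJar.class));
        System.out.println(locationOf(String.class)); // JDK core class
    }
}
```

Logging this next to Version.getVersion() pins down exactly which of the competing jars won on the container's classpath.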


Re: Exception while submitting jobs through Yarn

2018-06-19 Thread Garvit Sharma
2018-06-17 19:01:31,663 INFO
>>>>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -  
>>>>>>>>>> JVM:
>>>>>>>>>> OpenJDK 64-Bit Server VM - Oracle Corporation - 1.8/25.171-b10
>>>>>>>>>> 2018-06-17 19:01:31,663 INFO
>>>>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -  
>>>>>>>>>> Maximum
>>>>>>>>>> heap size: 6647 MiBytes
>>>>>>>>>> 2018-06-17 19:01:31,663 INFO
>>>>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>>>>>>  JAVA_HOME: /usr/lib/jvm/java-openjdk
>>>>>>>>>> 2018-06-17 19:01:31,664 INFO
>>>>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -  
>>>>>>>>>> Hadoop
>>>>>>>>>> version: 2.8.3
>>>>>>>>>> 2018-06-17 19:01:31,664 INFO
>>>>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -  JVM
>>>>>>>>>> Options:
>>>>>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>>>>>> -Xms6936m
>>>>>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>>>>>> -Xmx6936m
>>>>>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>>>>>> -XX:MaxDirectMemorySize=4072m
>>>>>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>>>>>> -Dlog.file=/var/log/hadoop-yarn/containers/application_1528342246614_0002/container_1528342246614_0002_01_282649/taskmanager.log
>>>>>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>>>>>> -Dlogback.configurationFile=file:./logback.xml
>>>>>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>>>>>> -Dlog4j.configuration=file:./log4j.properties
>>>>>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -  
>>>>>>>>>> Program
>>>>>>>>>> Arguments:
>>>>>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>>>>>> --configDir
>>>>>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  - 
>>>>>>>>>> .
>>>>>>>>>> *2018-06-17 19:01:31,666 INFO
>>>>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>>>>>>  Classpath:
>>>>>>>>>> lib/flink-dist_2.11-1.5.0.jar:lib/flink-python_2.11-1.5.0.jar:lib/flink-shaded-hadoop2-uber-1.5.0.jar:lib/flink-shaded-include-yarn-0.9.1.jar:lib/guava-18.0.jar:lib/log4j-1.2.17.jar:lib/slf4j-log4j12-1.7.7.jar:log4j.properties:logback.xml:flink.jar:flink-conf.yaml::/etc/hadoop/conf:/usr/lib/hadoop/hadoop-common-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-archive-logs.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-archives-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-archive-logs-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-azure-datalake-2.8.3-amzn-0.jar.*
>>>>>>>>>>
>>>>>>>>>> --
>>>>>>>>>> Thanks,
>>>>>>>>>> Amit
>>>>>>>>>>
>>>>>>>>>> On Mon, Jun 18, 2018 at 2:00 PM, Garvit Sharma <
>>>>>>>>>> garvit...@gmail.com> wrote:
>>>>>>>>>>
>>>>>>>>>>> Hi,
>>>>>>>>>>>

Re: Exception while submitting jobs through Yarn

2018-06-18 Thread Garvit Sharma
  -  
>>>>>>>>> Hadoop
>>>>>>>>> version: 2.8.3
>>>>>>>>> 2018-06-17 19:01:31,664 INFO
>>>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -  JVM
>>>>>>>>> Options:
>>>>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>>>>> -Xms6936m
>>>>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>>>>> -Xmx6936m
>>>>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>>>>> -XX:MaxDirectMemorySize=4072m
>>>>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>>>>> -Dlog.file=/var/log/hadoop-yarn/containers/application_1528342246614_0002/container_1528342246614_0002_01_282649/taskmanager.log
>>>>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>>>>> -Dlogback.configurationFile=file:./logback.xml
>>>>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>>>>> -Dlog4j.configuration=file:./log4j.properties
>>>>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -  
>>>>>>>>> Program
>>>>>>>>> Arguments:
>>>>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>>>>> --configDir
>>>>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  - .
>>>>>>>>> *2018-06-17 19:01:31,666 INFO
>>>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>>>>>  Classpath:
>>>>>>>>> lib/flink-dist_2.11-1.5.0.jar:lib/flink-python_2.11-1.5.0.jar:lib/flink-shaded-hadoop2-uber-1.5.0.jar:lib/flink-shaded-include-yarn-0.9.1.jar:lib/guava-18.0.jar:lib/log4j-1.2.17.jar:lib/slf4j-log4j12-1.7.7.jar:log4j.properties:logback.xml:flink.jar:flink-conf.yaml::/etc/hadoop/conf:/usr/lib/hadoop/hadoop-common-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-archive-logs.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-archives-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-archive-logs-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-azure-datalake-2.8.3-amzn-0.jar.*
>>>>>>>>>
>>>>>>>>> --
>>>>>>>>> Thanks,
>>>>>>>>> Amit
>>>>>>>>>
>>>>>>>>> On Mon, Jun 18, 2018 at 2:00 PM, Garvit Sharma <
>>>>>>>>> garvit...@gmail.com> wrote:
>>>>>>>>>
>>>>>>>>>> Hi,
>>>>>>>>>>
>>>>>>>>>> Please refer to my previous mail for complete logs.
>>>>>>>>>>
>>>>>>>>>> Thanks,
>>>>>>>>>>
>>>>>>>>>> On Mon, Jun 18, 2018 at 1:17 PM Till Rohrmann <
>>>>>>>>>> trohrm...@apache.org> wrote:
>>>>>>>>>>
>>>>>>>>>>> Could you also please share the complete log file with us.
>>>>>>>>>>>
>>>>>>>>>>> Cheers,
>>>>>>>>>>> Till
>>>>>>>>>>>
>>>>>>>>>>> On Sat, Jun 16, 2018 at 5:22 PM Ted Yu 
>>>>>>>>>>> wrote:
>>>>>>>>>>>
>>>>>>>>>>>> The error for core-default.xml is interesting.
>>>>>>>>>>>>
>>>>>>>>>>>> Flink doesn't have this file. Probably it came with Yarn.

Re: Exception while submitting jobs through Yarn

2018-06-18 Thread Till Rohrmann
  -
>>>>>>>> -XX:MaxDirectMemorySize=4072m
>>>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>>>> -Dlog.file=/var/log/hadoop-yarn/containers/application_1528342246614_0002/container_1528342246614_0002_01_282649/taskmanager.log
>>>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>>>> -Dlogback.configurationFile=file:./logback.xml
>>>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>>>> -Dlog4j.configuration=file:./log4j.properties
>>>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -  
>>>>>>>> Program
>>>>>>>> Arguments:
>>>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>>>> --configDir
>>>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  - .
>>>>>>>> *2018-06-17 19:01:31,666 INFO
>>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>>>>  Classpath:
>>>>>>>> lib/flink-dist_2.11-1.5.0.jar:lib/flink-python_2.11-1.5.0.jar:lib/flink-shaded-hadoop2-uber-1.5.0.jar:lib/flink-shaded-include-yarn-0.9.1.jar:lib/guava-18.0.jar:lib/log4j-1.2.17.jar:lib/slf4j-log4j12-1.7.7.jar:log4j.properties:logback.xml:flink.jar:flink-conf.yaml::/etc/hadoop/conf:/usr/lib/hadoop/hadoop-common-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-archive-logs.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-archives-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-archive-logs-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-azure-datalake-2.8.3-amzn-0.jar.*
>>>>>>>>
>>>>>>>> --
>>>>>>>> Thanks,
>>>>>>>> Amit
>>>>>>>>
>>>>>>>> On Mon, Jun 18, 2018 at 2:00 PM, Garvit Sharma >>>>>>> > wrote:
>>>>>>>>
>>>>>>>>> Hi,
>>>>>>>>>
>>>>>>>>> Please refer to my previous mail for complete logs.
>>>>>>>>>
>>>>>>>>> Thanks,
>>>>>>>>>
>>>>>>>>> On Mon, Jun 18, 2018 at 1:17 PM Till Rohrmann <
>>>>>>>>> trohrm...@apache.org> wrote:
>>>>>>>>>
>>>>>>>>>> Could you also please share the complete log file with us.
>>>>>>>>>>
>>>>>>>>>> Cheers,
>>>>>>>>>> Till
>>>>>>>>>>
>>>>>>>>>> On Sat, Jun 16, 2018 at 5:22 PM Ted Yu 
>>>>>>>>>> wrote:
>>>>>>>>>>
>>>>>>>>>>> The error for core-default.xml is interesting.
>>>>>>>>>>>
>>>>>>>>>>> Flink doesn't have this file. Probably it came with Yarn. Please
>>>>>>>>>>> check the hadoop version Flink was built with versus the hadoop 
>>>>>>>>>>> version in
>>>>>>>>>>> your cluster.
>>>>>>>>>>>
>>>>>>>>>>> Thanks
>>>>>>>>>>>
>>>>>>>>>>>  Original message 
>>>>>>>>>>> From: Garvit Sharma 
>>>>>>>>>>> Date: 6/16/18 7:23 AM (GMT-08:00)
>>>>>>>>>>> To: trohrm...@apache.org
>>>>>>>>>>> Cc: Chesnay Schepler , user@flink.apache.org
>>>>>>>>>>> Subject: Re: Exception while submitting jobs through Yarn
>>>>>>>>>>>
>>>>>>>>>>> I am not able to figure out, got stuck badly in this since last
>>>>>>>>>>> 1 week. Any little help would be appreciated.
>>>>>>>>>>>
>>>>>>>>

Re: Exception while submitting jobs through Yarn

2018-06-18 Thread Garvit Sharma
 at 4:50 PM Till Rohrmann 
>>>> wrote:
>>>>
>>>>> Which Hadoop version have you installed? It looks as if Flink has been
>>>>> build with Hadoop 2.7 but I see /usr/hdp/2.6.3.0-235 in the class path. If
>>>>> you want to run Flink on Hadoop 2.6, then try to use the Hadoop free Flink
>>>>> binaries or the one built for Hadoop 2.6.
>>>>>
>>>>> Cheers,
>>>>> Till
>>>>>
>>>>> On Mon, Jun 18, 2018 at 10:48 AM Garvit Sharma 
>>>>> wrote:
>>>>>
>>>>>> Ok, I have attached the log file.
>>>>>>
>>>>>> Please check and let me know.
>>>>>>
>>>>>> Thanks,
>>>>>>
>>>>>> On Mon, Jun 18, 2018 at 2:07 PM Amit Jain  wrote:
>>>>>>
>>>>>>> Hi Garvit,
>>>>>>>
>>>>>>> I think Till is interested to know about classpath details present
>>>>>>> at the start of JM and TM logs e.g. following logs provide classpath
>>>>>>> details used by TM in our case.
>>>>>>>
>>>>>>> 2018-06-17 19:01:30,656 INFO
>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>>> 
>>>>>>> 2018-06-17 19:01:30,658 INFO
>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -  
>>>>>>> Starting
>>>>>>> YARN TaskExecutor runner (Version: 1.5.0, Rev:c61b108, Date:24.05.2018 @
>>>>>>> 14:54:44 UTC)
>>>>>>> 2018-06-17 19:01:30,659 INFO
>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -  OS
>>>>>>> current user: yarn
>>>>>>> 2018-06-17 19:01:31,662 INFO
>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -  
>>>>>>> Current
>>>>>>> Hadoop/Kerberos user: hadoop
>>>>>>> 2018-06-17 19:01:31,663 INFO
>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -  JVM:
>>>>>>> OpenJDK 64-Bit Server VM - Oracle Corporation - 1.8/25.171-b10
>>>>>>> 2018-06-17 19:01:31,663 INFO
>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -  
>>>>>>> Maximum
>>>>>>> heap size: 6647 MiBytes
>>>>>>> 2018-06-17 19:01:31,663 INFO
>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>>>  JAVA_HOME: /usr/lib/jvm/java-openjdk
>>>>>>> 2018-06-17 19:01:31,664 INFO
>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -  Hadoop
>>>>>>> version: 2.8.3
>>>>>>> 2018-06-17 19:01:31,664 INFO
>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -  JVM
>>>>>>> Options:
>>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>>> -Xms6936m
>>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>>> -Xmx6936m
>>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>>> -XX:MaxDirectMemorySize=4072m
>>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>>> -Dlog.file=/var/log/hadoop-yarn/containers/application_1528342246614_0002/container_1528342246614_0002_01_282649/taskmanager.log
>>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>>> -Dlogback.configurationFile=file:./logback.xml
>>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>>> -Dlog4j.configuration=file:./log4j.properties
>>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -  
>>>>>>> Program
>>>>>>> Arguments:
>>>>>>> 2018-06-17 19:

Re: Exception while submitting jobs through Yarn

2018-06-18 Thread Till Rohrmann
>>>>> Thanks,
>>>>>
>>>>> On Mon, Jun 18, 2018 at 2:07 PM Amit Jain  wrote:
>>>>>
>>>>>> Hi Garvit,
>>>>>>
>>>>>> I think Till is interested to know about classpath details present at
>>>>>> the start of JM and TM logs e.g. following logs provide classpath details
>>>>>> used by TM in our case.
>>>>>>
>>>>>> 2018-06-17 19:01:30,656 INFO
>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>> 
>>>>>> 2018-06-17 19:01:30,658 INFO
>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -  
>>>>>> Starting
>>>>>> YARN TaskExecutor runner (Version: 1.5.0, Rev:c61b108, Date:24.05.2018 @
>>>>>> 14:54:44 UTC)
>>>>>> 2018-06-17 19:01:30,659 INFO
>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -  OS
>>>>>> current user: yarn
>>>>>> 2018-06-17 19:01:31,662 INFO
>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -  Current
>>>>>> Hadoop/Kerberos user: hadoop
>>>>>> 2018-06-17 19:01:31,663 INFO
>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -  JVM:
>>>>>> OpenJDK 64-Bit Server VM - Oracle Corporation - 1.8/25.171-b10
>>>>>> 2018-06-17 19:01:31,663 INFO
>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -  Maximum
>>>>>> heap size: 6647 MiBytes
>>>>>> 2018-06-17 19:01:31,663 INFO
>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>>  JAVA_HOME: /usr/lib/jvm/java-openjdk
>>>>>> 2018-06-17 19:01:31,664 INFO
>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -  Hadoop
>>>>>> version: 2.8.3
>>>>>> 2018-06-17 19:01:31,664 INFO
>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -  JVM
>>>>>> Options:
>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>> -Xms6936m
>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>> -Xmx6936m
>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>> -XX:MaxDirectMemorySize=4072m
>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>> -Dlog.file=/var/log/hadoop-yarn/containers/application_1528342246614_0002/container_1528342246614_0002_01_282649/taskmanager.log
>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>> -Dlogback.configurationFile=file:./logback.xml
>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>> -Dlog4j.configuration=file:./log4j.properties
>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -  Program
>>>>>> Arguments:
>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>> --configDir
>>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  - .
>>>>>> *2018-06-17 19:01:31,666 INFO
>>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>>  Classpath:
>>>>>> lib/flink-dist_2.11-1.5.0.jar:lib/flink-python_2.11-1.5.0.jar:lib/flink-shaded-hadoop2-uber-1.5.0.jar:lib/flink-shaded-include-yarn-0.9.1.jar:lib/guava-18.0.jar:lib/log4j-1.2.17.jar:lib/slf4j-log4j12-1.7.7.jar:log4j.properties:logback.xml:flink.jar:flink-conf.yaml::/etc/hadoop/conf:/usr/lib/hadoop/hadoop-common-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-archive-logs.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-archives-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-archive-logs-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-az

Re: Exception while submitting jobs through Yarn

2018-06-18 Thread Garvit Sharma
rting
>>>>> YARN TaskExecutor runner (Version: 1.5.0, Rev:c61b108, Date:24.05.2018 @
>>>>> 14:54:44 UTC)
>>>>> 2018-06-17 19:01:30,659 INFO
>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -  OS
>>>>> current user: yarn
>>>>> 2018-06-17 19:01:31,662 INFO
>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -  Current
>>>>> Hadoop/Kerberos user: hadoop
>>>>> 2018-06-17 19:01:31,663 INFO
>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -  JVM:
>>>>> OpenJDK 64-Bit Server VM - Oracle Corporation - 1.8/25.171-b10
>>>>> 2018-06-17 19:01:31,663 INFO
>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -  Maximum
>>>>> heap size: 6647 MiBytes
>>>>> 2018-06-17 19:01:31,663 INFO
>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>  JAVA_HOME: /usr/lib/jvm/java-openjdk
>>>>> 2018-06-17 19:01:31,664 INFO
>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -  Hadoop
>>>>> version: 2.8.3
>>>>> 2018-06-17 19:01:31,664 INFO
>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -  JVM
>>>>> Options:
>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>> -Xms6936m
>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>> -Xmx6936m
>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>> -XX:MaxDirectMemorySize=4072m
>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>> -Dlog.file=/var/log/hadoop-yarn/containers/application_1528342246614_0002/container_1528342246614_0002_01_282649/taskmanager.log
>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>> -Dlogback.configurationFile=file:./logback.xml
>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>> -Dlog4j.configuration=file:./log4j.properties
>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -  Program
>>>>> Arguments:
>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>> --configDir
>>>>> 2018-06-17 19:01:31,665 INFO
>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  - .
>>>>> *2018-06-17 19:01:31,666 INFO
>>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>>  Classpath:
>>>>> lib/flink-dist_2.11-1.5.0.jar:lib/flink-python_2.11-1.5.0.jar:lib/flink-shaded-hadoop2-uber-1.5.0.jar:lib/flink-shaded-include-yarn-0.9.1.jar:lib/guava-18.0.jar:lib/log4j-1.2.17.jar:lib/slf4j-log4j12-1.7.7.jar:log4j.properties:logback.xml:flink.jar:flink-conf.yaml::/etc/hadoop/conf:/usr/lib/hadoop/hadoop-common-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-archive-logs.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-archives-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-archive-logs-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-azure-datalake-2.8.3-amzn-0.jar.*
>>>>>
>>>>> --
>>>>> Thanks,
>>>>> Amit
>>>>>
>>>>> On Mon, Jun 18, 2018 at 2:00 PM, Garvit Sharma 
>>>>> wrote:
>>>>>
>>>>>> Hi,
>>>>>>
>>>>>> Please refer to my previous mail for complete logs.
>>>>>>
>>>>>> Thanks,
>>>>>>
>>>>>> On Mon, Jun 18, 2018 at 1:17 PM Till Rohrmann 
>>>>>> wrote:
>>>>>>
>>>>>>> Could you also please share the complete log file with us.
>>>>>>>
>>>>>>> Cheers,
>>>>>>> Till
>>>>>>>
>>>>>>> On Sat, Jun 16, 2018 at 5:22 PM Ted Yu  wrote:
>>>>>>>
>>>>>>>> The error for core-default.xml is interesting.
>>>>

Re: Exception while submitting jobs through Yarn

2018-06-18 Thread Till Rohrmann
>>>> 2018-06-17 19:01:31,665 INFO
>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>> --configDir
>>>> 2018-06-17 19:01:31,665 INFO
>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  - .
>>>> *2018-06-17 19:01:31,666 INFO
>>>>  org.apache.flink.yarn.YarnTaskExecutorRunner  -
>>>>  Classpath:
>>>> lib/flink-dist_2.11-1.5.0.jar:lib/flink-python_2.11-1.5.0.jar:lib/flink-shaded-hadoop2-uber-1.5.0.jar:lib/flink-shaded-include-yarn-0.9.1.jar:lib/guava-18.0.jar:lib/log4j-1.2.17.jar:lib/slf4j-log4j12-1.7.7.jar:log4j.properties:logback.xml:flink.jar:flink-conf.yaml::/etc/hadoop/conf:/usr/lib/hadoop/hadoop-common-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-archive-logs.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-archives-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-archive-logs-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-azure-datalake-2.8.3-amzn-0.jar.....*
>>>>
>>>> --
>>>> Thanks,
>>>> Amit
>>>>
>>>> On Mon, Jun 18, 2018 at 2:00 PM, Garvit Sharma 
>>>> wrote:
>>>>
>>>>> Hi,
>>>>>
>>>>> Please refer to my previous mail for complete logs.
>>>>>
>>>>> Thanks,
>>>>>
>>>>> On Mon, Jun 18, 2018 at 1:17 PM Till Rohrmann 
>>>>> wrote:
>>>>>
>>>>>> Could you also please share the complete log file with us.
>>>>>>
>>>>>> Cheers,
>>>>>> Till
>>>>>>
>>>>>> On Sat, Jun 16, 2018 at 5:22 PM Ted Yu  wrote:
>>>>>>
>>>>>>> The error for core-default.xml is interesting.
>>>>>>>
>>>>>>> Flink doesn't have this file. Probably it came with Yarn. Please
>>>>>>> check the hadoop version Flink was built with versus the hadoop version 
>>>>>>> in
>>>>>>> your cluster.
>>>>>>>
>>>>>>> Thanks
>>>>>>>
>>>>>>>  Original message 
>>>>>>> From: Garvit Sharma 
>>>>>>> Date: 6/16/18 7:23 AM (GMT-08:00)
>>>>>>> To: trohrm...@apache.org
>>>>>>> Cc: Chesnay Schepler , user@flink.apache.org
>>>>>>> Subject: Re: Exception while submitting jobs through Yarn
>>>>>>>
>>>>>>> I am not able to figure out, got stuck badly in this since last 1
>>>>>>> week. Any little help would be appreciated.
>>>>>>>
>>>>>>>
>>>>>>> 2018-06-16 19:25:10,523 DEBUG
>>>>>>> org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  -
>>>>>>> Parallelism set: 1 for 8
>>>>>>>
>>>>>>> 2018-06-16 19:25:10,578 DEBUG
>>>>>>> org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  -
>>>>>>> Parallelism set: 1 for 1
>>>>>>>
>>>>>>> 2018-06-16 19:25:10,588 DEBUG
>>>>>>> org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  -
>>>>>>> CONNECTED: KeyGroupStreamPartitioner - 1 -> 8
>>>>>>>
>>>>>>> 2018-06-16 19:25:10,591 DEBUG
>>>>>>> org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  -
>>>>>>> Parallelism set: 1 for 5
>>>>>>>
>>>>>>> 2018-06-16 19:25:10,597 DEBUG
>>>>>>> org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  -
>>>>>>> CONNECTED: KeyGroupStreamPartitioner - 5 -> 8
>>>>>>>
>>>>>>> 2018-06-16 19:25:10,618 FATAL org.apache.hadoop.conf.Configuration
>>>>>>> - error parsing conf core-default.xml
>>>>>>>
>>>>>>> javax.xml.parsers.ParserConfigurationException: Feature '
>>>>>>> http://apache.org/xml/features/xinclude' is not recognized.
>>>>>>>
>>>>>>> at
>>>>>>> org.apache.xerces.jaxp.DocumentBuilderFactoryImpl.newDocumentBuilder(Unknown
>>>>>>> Source)
>>>>>>>
>>>>>>> at
>>>>>>> org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2482)

Re: Exception while submitting jobs through Yarn

2018-06-18 Thread Garvit Sharma

Re: Exception while submitting jobs through Yarn

2018-06-18 Thread Till Rohrmann
Which Hadoop version have you installed? It looks as if Flink has been
built with Hadoop 2.7, but I see /usr/hdp/2.6.3.0-235 on the classpath. If
you want to run Flink on Hadoop 2.6, try the Hadoop-free Flink binaries
or the ones built for Hadoop 2.6.

Cheers,
Till
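The version mismatch Till describes usually shows up as the wrong XML parser winning on the classpath. As a quick sanity check, a minimal probe (class name hypothetical; plain JDK, no Flink or Hadoop dependencies) can report which `DocumentBuilderFactory` implementation is actually picked up, and from which jar:

```java
import javax.xml.parsers.DocumentBuilderFactory;
import java.security.CodeSource;

public class ParserProbe {
    public static void main(String[] args) {
        // Which DocumentBuilderFactory implementation wins on this classpath?
        DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
        Class<?> impl = factory.getClass();
        System.out.println("Factory implementation: " + impl.getName());

        // A null code source means the class comes from the JDK runtime itself;
        // otherwise this prints the jar that shadows it (e.g. an old xercesImpl).
        CodeSource src = impl.getProtectionDomain().getCodeSource();
        System.out.println("Loaded from: "
                + (src == null ? "JDK runtime" : src.getLocation()));
    }
}
```

Run it with the same launcher and classpath as the Flink client; if it reports an `org.apache.xerces` class loaded from a jar under `lib/` or `/usr/hdp/...`, that jar is shadowing the JDK's parser.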


Re: Exception while submitting jobs through Yarn

2018-06-18 Thread Garvit Sharma
Ok, I have attached the log file.

Please check and let me know.

Thanks,


Re: Exception while submitting jobs through Yarn

2018-06-18 Thread Amit Jain
Hi Garvit,

I think Till is interested to know about classpath details present at the
start of JM and TM logs e.g. following logs provide classpath details used
by TM in our case.

2018-06-17 19:01:30,656 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
 -

2018-06-17 19:01:30,658 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
 -  Starting YARN TaskExecutor runner (Version: 1.5.0,
Rev:c61b108, Date:24.05.2018 @ 14:54:44 UTC)
2018-06-17 19:01:30,659 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
 -  OS current user: yarn
2018-06-17 19:01:31,662 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
 -  Current Hadoop/Kerberos user: hadoop
2018-06-17 19:01:31,663 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
 -  JVM: OpenJDK 64-Bit Server VM - Oracle Corporation -
1.8/25.171-b10
2018-06-17 19:01:31,663 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
 -  Maximum heap size: 6647 MiBytes
2018-06-17 19:01:31,663 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
 -  JAVA_HOME: /usr/lib/jvm/java-openjdk
2018-06-17 19:01:31,664 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
 -  Hadoop version: 2.8.3
2018-06-17 19:01:31,664 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
 -  JVM Options:
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
 - -Xms6936m
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
 - -Xmx6936m
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
 - -XX:MaxDirectMemorySize=4072m
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
 -
-Dlog.file=/var/log/hadoop-yarn/containers/application_1528342246614_0002/container_1528342246614_0002_01_282649/taskmanager.log
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
 - -Dlogback.configurationFile=file:./logback.xml
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
 - -Dlog4j.configuration=file:./log4j.properties
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
 -  Program Arguments:
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
 - --configDir
2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
 - .
*2018-06-17 19:01:31,666 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
 -  Classpath:
lib/flink-dist_2.11-1.5.0.jar:lib/flink-python_2.11-1.5.0.jar:lib/flink-shaded-hadoop2-uber-1.5.0.jar:lib/flink-shaded-include-yarn-0.9.1.jar:lib/guava-18.0.jar:lib/log4j-1.2.17.jar:lib/slf4j-log4j12-1.7.7.jar:log4j.properties:logback.xml:flink.jar:flink-conf.yaml::/etc/hadoop/conf:/usr/lib/hadoop/hadoop-common-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-archive-logs.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-archives-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-archive-logs-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-azure-datalake-2.8.3-amzn-0.jar.*

--
Thanks,
Amit
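If grabbing the full JM/TM logs is awkward, the effective classpath can also be dumped from a throwaway class (name hypothetical) run with the same launcher, which makes a stray Xerces jar easy to spot:

```java
import java.io.File;

public class ClasspathDump {
    public static void main(String[] args) {
        // Print each classpath entry on its own line so bundled parser jars
        // stand out instead of hiding in one long colon-separated string.
        String classpath = System.getProperty("java.class.path");
        for (String entry : classpath.split(File.pathSeparator)) {
            // Flag anything that looks like a Xerces jar.
            String marker = entry.toLowerCase().contains("xerces")
                    ? "   <-- check this jar" : "";
            System.out.println(entry + marker);
        }
    }
}
```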


Re: Exception while submitting jobs through Yarn

2018-06-18 Thread Garvit Sharma
Hi,

Please refer to my previous mail for complete logs.

Thanks,


Re: Exception while submitting jobs through Yarn

2018-06-18 Thread Till Rohrmann
Could you also please share the complete log file with us.

Cheers,
Till


Re: Exception while submitting jobs through Yarn

2018-06-16 Thread Ted Yu
The error for core-default.xml is interesting.
Flink doesn't ship this file; it probably came with YARN. Please check the
Hadoop version Flink was built with against the Hadoop version in your cluster.
Thanks
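The feature named in the parse error (`http://apache.org/xml/features/xinclude`) can also be probed directly, independent of Hadoop. This sketch (class name hypothetical) asks whichever parser is first on the classpath whether it recognizes the feature — older Xerces releases, like the library-bundled 2.6 that turned out to be the culprit here, do not, which is exactly what makes `Configuration` choke on core-default.xml:

```java
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.ParserConfigurationException;

public class XIncludeCheck {
    public static void main(String[] args) {
        DocumentBuilderFactory factory = DocumentBuilderFactory.newInstance();
        try {
            // Hadoop's Configuration needs XInclude support when it parses
            // core-default.xml; a parser that predates the feature throws
            // here with the same message seen in the log.
            factory.setFeature("http://apache.org/xml/features/xinclude", true);
            System.out.println("XInclude feature recognized by "
                    + factory.getClass().getName());
        } catch (ParserConfigurationException e) {
            System.out.println("XInclude feature NOT recognized: "
                    + e.getMessage());
        }
    }
}
```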
 Original message From: Garvit Sharma  
Date: 6/16/18  7:23 AM  (GMT-08:00) To: trohrm...@apache.org Cc: Chesnay 
Schepler , user@flink.apache.org Subject: Re: Exception 
while submitting jobs through Yarn 



Re: Exception while submitting jobs through Yarn

2018-06-16 Thread Garvit Sharma
I am not able to figure this out; I have been stuck on it for the last
week. Any help would be appreciated.


2018-06-16 19:25:10,523 DEBUG
org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  -
Parallelism set: 1 for 8

2018-06-16 19:25:10,578 DEBUG
org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  -
Parallelism set: 1 for 1

2018-06-16 19:25:10,588 DEBUG
org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  -
CONNECTED: KeyGroupStreamPartitioner - 1 -> 8

2018-06-16 19:25:10,591 DEBUG
org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  -
Parallelism set: 1 for 5

2018-06-16 19:25:10,597 DEBUG
org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  -
CONNECTED: KeyGroupStreamPartitioner - 5 -> 8

2018-06-16 19:25:10,618 FATAL org.apache.hadoop.conf.Configuration
- error parsing conf core-default.xml

javax.xml.parsers.ParserConfigurationException: Feature '
http://apache.org/xml/features/xinclude' is not recognized.

at
org.apache.xerces.jaxp.DocumentBuilderFactoryImpl.newDocumentBuilder(Unknown
Source)

at
org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2482)

at
org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2444)

at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2361)

at org.apache.hadoop.conf.Configuration.get(Configuration.java:1188)

at
org.apache.hadoop.yarn.factory.providers.RecordFactoryProvider.getRecordFactory(RecordFactoryProvider.java:49)

at org.apache.hadoop.yarn.util.Records.<clinit>(Records.java:32)

at
org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getQueueInfoRequest(YarnClientImpl.java:495)

at
org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getAllQueues(YarnClientImpl.java:525)

at
org.apache.flink.yarn.AbstractYarnClusterDescriptor.checkYarnQueues(AbstractYarnClusterDescriptor.java:658)

at
org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:486)

at
org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)

at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)

at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)

at
org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)

at
org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)

at java.security.AccessController.doPrivileged(Native Method)

at javax.security.auth.Subject.doAs(Subject.java:422)

at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1692)

at
org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)

at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)

2018-06-16 19:25:10,620 WARN
org.apache.flink.yarn.AbstractYarnClusterDescriptor
  - Error while getting queue information from YARN: null

2018-06-16 19:25:10,621 DEBUG
org.apache.flink.yarn.AbstractYarnClusterDescriptor   - Error
details

java.lang.ExceptionInInitializerError

at
org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getQueueInfoRequest(YarnClientImpl.java:495)

at
org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getAllQueues(YarnClientImpl.java:525)

at
org.apache.flink.yarn.AbstractYarnClusterDescriptor.checkYarnQueues(AbstractYarnClusterDescriptor.java:658)

at
org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:486)

at
org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)

at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)

at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)

at
org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)

at
org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)

at java.security.AccessController.doPrivileged(Native Method)

at javax.security.auth.Subject.doAs(Subject.java:422)

at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1692)

at
org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)

at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)

Caused by: java.lang.RuntimeException:
javax.xml.parsers.ParserConfigurationException: Feature '
http://apache.org/xml/features/xinclude' is not recognized.

at
org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2600)

at
org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2444)

at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2361)

at org.apache.hadoop.conf.Configuration.get(Configuration.java:1188)

at
org.apache.hadoop.yarn.factory.providers.RecordFactoryProvider.getRecordFactory(RecordFactoryProvider.java:49)

at org.apache.hadoop.yarn.util.Records.<clinit>(Records.java:32)

... 14 more

Caused by: javax.xml.parsers.ParserConfigurationException: 
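The parse failure above usually means an incompatible Xerces implementation is on the client classpath. A sketch of how one might locate it, assuming the job is a Maven project and using the HDP layout mentioned in this thread:

```shell
# Show which dependency of the job project pulls in xerces
# (run in the job's Maven project directory):
mvn dependency:tree -Dincludes=xerces

# On the cluster node, list any Xerces jars shipped with the
# Hadoop distribution (/usr/hdp is the HDP path from this thread):
find /usr/hdp -name 'xercesImpl*.jar' 2>/dev/null
```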

Re: Exception while submitting jobs through Yarn

2018-06-15 Thread Till Rohrmann
Hmm could you maybe share the client logs with us.

Cheers,
Till

On Fri, Jun 15, 2018 at 4:54 PM Garvit Sharma  wrote:

> Yes, I did.
>
> On Fri, Jun 15, 2018 at 6:17 PM Till Rohrmann 
> wrote:
>
>> Hi Garvit,
>>
>> have you exported the HADOOP_CLASSPATH as described in the release notes
>> [1]?
>>
>> [1]
>> https://ci.apache.org/projects/flink/flink-docs-release-1.5/release-notes/flink-1.5.html#hadoop-classpath-discovery
>>
>> Cheers,
>> Till
>>
>> On Fri, Jun 15, 2018 at 2:22 PM Garvit Sharma 
>> wrote:
>>
>>> Does someone have any idea how to get rid of the above parse exception
>>> while submitting a Flink job to YARN?
>>>
>>> I have already searched the internet but could not find a solution.
>>>
>>> Please help.
>>>
>>> On Fri, Jun 15, 2018 at 9:15 AM Garvit Sharma 
>>> wrote:
>>>
 Thanks Chesnay. Now it is connecting to the ResourceManager, but I am
 getting the exception below:

 2018-06-15 09:12:44,812 FATAL org.apache.hadoop.conf.Configuration
   - error parsing conf core-default.xml

 javax.xml.parsers.ParserConfigurationException: Feature '
 http://apache.org/xml/features/xinclude' is not recognized.

 at
 org.apache.xerces.jaxp.DocumentBuilderFactoryImpl.newDocumentBuilder(Unknown
 Source)

 at
 org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2737)

 at
 org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2696)

 at
 org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2579)

 at org.apache.hadoop.conf.Configuration.get(Configuration.java:1350)

 at
 org.apache.hadoop.yarn.factory.providers.RecordFactoryProvider.getRecordFactory(RecordFactoryProvider.java:49)

 at org.apache.hadoop.yarn.util.Records.<clinit>(Records.java:32)

 at
 org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getQueueInfoRequest(YarnClientImpl.java:584)

 at
 org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getAllQueues(YarnClientImpl.java:614)

 at
 org.apache.flink.yarn.AbstractYarnClusterDescriptor.checkYarnQueues(AbstractYarnClusterDescriptor.java:658)

 at
 org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:486)

 at
 org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)

 at
 org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)

 at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)

 at
 org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)

 at
 org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)

 at java.security.AccessController.doPrivileged(Native Method)

 at javax.security.auth.Subject.doAs(Subject.java:422)

 at
 org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)

 at
 org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)

 at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)

 2018-06-15 09:12:44,825 WARN  
 org.apache.flink.yarn.AbstractYarnClusterDescriptor
   - Error while getting queue information from YARN: null

 java.lang.NoClassDefFoundError: Could not initialize class
 org.apache.hadoop.yarn.util.Records

 at
 org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createApplication(YarnClientImpl.java:230)

 at
 org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:497)

 at
 org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)

 at
 org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)

 at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)

 at
 org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)

 at
 org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)

 at java.security.AccessController.doPrivileged(Native Method)

 at javax.security.auth.Subject.doAs(Subject.java:422)

 at
 org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)

 at
 org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)

 at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)

 Please help.

 Thanks,


 On Thu, Jun 14, 2018 at 1:28 PM Chesnay Schepler 
 wrote:

> My gut feeling is that these classes must be present in jars in the
> /lib directory. I don't think you can supply these with the submitted jar.
> For a simple test, put your jar into the /lib folder before s

Re: Exception while submitting jobs through Yarn

2018-06-15 Thread Garvit Sharma
Yes, I did.

On Fri, Jun 15, 2018 at 6:17 PM Till Rohrmann  wrote:

> Hi Garvit,
>
> have you exported the HADOOP_CLASSPATH as described in the release notes
> [1]?
>
> [1]
> https://ci.apache.org/projects/flink/flink-docs-release-1.5/release-notes/flink-1.5.html#hadoop-classpath-discovery
>
> Cheers,
> Till
>
> On Fri, Jun 15, 2018 at 2:22 PM Garvit Sharma  wrote:
>
>> Does someone have any idea how to get rid of the above parse exception
>> while submitting a Flink job to YARN?
>>
>> I have already searched the internet but could not find a solution.
>>
>> Please help.
>>
>> On Fri, Jun 15, 2018 at 9:15 AM Garvit Sharma 
>> wrote:
>>
>>> Thanks Chesnay. Now it is connecting to the ResourceManager, but I am
>>> getting the exception below:
>>>
>>> 2018-06-15 09:12:44,812 FATAL org.apache.hadoop.conf.Configuration
>>> - error parsing conf core-default.xml
>>>
>>> javax.xml.parsers.ParserConfigurationException: Feature '
>>> http://apache.org/xml/features/xinclude' is not recognized.
>>>
>>> at
>>> org.apache.xerces.jaxp.DocumentBuilderFactoryImpl.newDocumentBuilder(Unknown
>>> Source)
>>>
>>> at
>>> org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2737)
>>>
>>> at
>>> org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2696)
>>>
>>> at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2579)
>>>
>>> at org.apache.hadoop.conf.Configuration.get(Configuration.java:1350)
>>>
>>> at
>>> org.apache.hadoop.yarn.factory.providers.RecordFactoryProvider.getRecordFactory(RecordFactoryProvider.java:49)
>>>
>>> at org.apache.hadoop.yarn.util.Records.<clinit>(Records.java:32)
>>>
>>> at
>>> org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getQueueInfoRequest(YarnClientImpl.java:584)
>>>
>>> at
>>> org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getAllQueues(YarnClientImpl.java:614)
>>>
>>> at
>>> org.apache.flink.yarn.AbstractYarnClusterDescriptor.checkYarnQueues(AbstractYarnClusterDescriptor.java:658)
>>>
>>> at
>>> org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:486)
>>>
>>> at
>>> org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)
>>>
>>> at
>>> org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)
>>>
>>> at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
>>>
>>> at
>>> org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)
>>>
>>> at
>>> org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)
>>>
>>> at java.security.AccessController.doPrivileged(Native Method)
>>>
>>> at javax.security.auth.Subject.doAs(Subject.java:422)
>>>
>>> at
>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
>>>
>>> at
>>> org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
>>>
>>> at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)
>>>
>>> 2018-06-15 09:12:44,825 WARN  
>>> org.apache.flink.yarn.AbstractYarnClusterDescriptor
>>>   - Error while getting queue information from YARN: null
>>>
>>> java.lang.NoClassDefFoundError: Could not initialize class
>>> org.apache.hadoop.yarn.util.Records
>>>
>>> at
>>> org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createApplication(YarnClientImpl.java:230)
>>>
>>> at
>>> org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:497)
>>>
>>> at
>>> org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)
>>>
>>> at
>>> org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)
>>>
>>> at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
>>>
>>> at
>>> org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)
>>>
>>> at
>>> org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)
>>>
>>> at java.security.AccessController.doPrivileged(Native Method)
>>>
>>> at javax.security.auth.Subject.doAs(Subject.java:422)
>>>
>>> at
>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
>>>
>>> at
>>> org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
>>>
>>> at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)
>>>
>>> Please help.
>>>
>>> Thanks,
>>>
>>>
>>> On Thu, Jun 14, 2018 at 1:28 PM Chesnay Schepler 
>>> wrote:
>>>
 My gut feeling is that these classes must be present in jars in the
 /lib directory. I don't think you can supply these with the submitted jar.
 For a simple test, put your jar into the /lib folder before submitting
 it.

 On 14.06.2018 06:56, Garvit Sharma wrote:

 Can someone please tell me why I am facing this?

 On Wed, Jun 13, 2018 at 10:33 PM Garvit Sharma 
 wrote:

> Hi,
>
> I am using *flink-1.5.0-bin-hadoop27-scala_2.11 *to su

Re: Exception while submitting jobs through Yarn

2018-06-15 Thread Till Rohrmann
Hi Garvit,

have you exported the HADOOP_CLASSPATH as described in the release notes
[1]?

[1]
https://ci.apache.org/projects/flink/flink-docs-release-1.5/release-notes/flink-1.5.html#hadoop-classpath-discovery

Cheers,
Till

On Fri, Jun 15, 2018 at 2:22 PM Garvit Sharma  wrote:

> Does someone have any idea how to get rid of the above parse exception
> while submitting a Flink job to YARN?
>
> I have already searched the internet but could not find a solution.
>
> Please help.
>
> On Fri, Jun 15, 2018 at 9:15 AM Garvit Sharma  wrote:
>
>> Thanks Chesnay. Now it is connecting to the ResourceManager, but I am
>> getting the exception below:
>>
>> 2018-06-15 09:12:44,812 FATAL org.apache.hadoop.conf.Configuration
>> - error parsing conf core-default.xml
>>
>> javax.xml.parsers.ParserConfigurationException: Feature '
>> http://apache.org/xml/features/xinclude' is not recognized.
>>
>> at
>> org.apache.xerces.jaxp.DocumentBuilderFactoryImpl.newDocumentBuilder(Unknown
>> Source)
>>
>> at
>> org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2737)
>>
>> at
>> org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2696)
>>
>> at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2579)
>>
>> at org.apache.hadoop.conf.Configuration.get(Configuration.java:1350)
>>
>> at
>> org.apache.hadoop.yarn.factory.providers.RecordFactoryProvider.getRecordFactory(RecordFactoryProvider.java:49)
>>
>> at org.apache.hadoop.yarn.util.Records.<clinit>(Records.java:32)
>>
>> at
>> org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getQueueInfoRequest(YarnClientImpl.java:584)
>>
>> at
>> org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getAllQueues(YarnClientImpl.java:614)
>>
>> at
>> org.apache.flink.yarn.AbstractYarnClusterDescriptor.checkYarnQueues(AbstractYarnClusterDescriptor.java:658)
>>
>> at
>> org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:486)
>>
>> at
>> org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)
>>
>> at
>> org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)
>>
>> at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
>>
>> at
>> org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)
>>
>> at
>> org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)
>>
>> at java.security.AccessController.doPrivileged(Native Method)
>>
>> at javax.security.auth.Subject.doAs(Subject.java:422)
>>
>> at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
>>
>> at
>> org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
>>
>> at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)
>>
>> 2018-06-15 09:12:44,825 WARN  
>> org.apache.flink.yarn.AbstractYarnClusterDescriptor
>>   - Error while getting queue information from YARN: null
>>
>> java.lang.NoClassDefFoundError: Could not initialize class
>> org.apache.hadoop.yarn.util.Records
>>
>> at
>> org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createApplication(YarnClientImpl.java:230)
>>
>> at
>> org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:497)
>>
>> at
>> org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)
>>
>> at
>> org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)
>>
>> at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
>>
>> at
>> org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)
>>
>> at
>> org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)
>>
>> at java.security.AccessController.doPrivileged(Native Method)
>>
>> at javax.security.auth.Subject.doAs(Subject.java:422)
>>
>> at
>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
>>
>> at
>> org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
>>
>> at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)
>>
>> Please help.
>>
>> Thanks,
>>
>>
>> On Thu, Jun 14, 2018 at 1:28 PM Chesnay Schepler 
>> wrote:
>>
>>> My gut feeling is that these classes must be present in jars in the /lib
>>> directory. I don't think you can supply these with the submitted jar.
>>> For a simple test, put your jar into the /lib folder before submitting
>>> it.
>>>
>>> On 14.06.2018 06:56, Garvit Sharma wrote:
>>>
>>> Can someone please tell me why I am facing this?
>>>
>>> On Wed, Jun 13, 2018 at 10:33 PM Garvit Sharma 
>>> wrote:
>>>
 Hi,

 I am using *flink-1.5.0-bin-hadoop27-scala_2.11 *to submit jobs
 through Yarn, but I am getting the below exception :

 java.lang.NoClassDefFoundError:
 com/sun/jersey/core/util/FeaturesAndProperties

 at java.lang.ClassLoader.defineClass1(Native Method)
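The classpath export described in the release notes linked above can be sketched as follows. Paths are examples; the run flags mirror the submit command quoted elsewhere in this thread:

```shell
# Flink 1.5 no longer bundles Hadoop on the client classpath by default,
# so the Hadoop jars must be discovered via HADOOP_CLASSPATH.
export HADOOP_CONF_DIR=/etc/hadoop/conf
export HADOOP_CLASSPATH=$(hadoop classpath)

bin/flink run -m yarn-cluster -yd -yn 2 -ys 20 -yjm 10240 -ytm 10240 \
  -yst -ynm test -yqu default -p 20 test.jar
```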

Re: Exception while submitting jobs through Yarn

2018-06-15 Thread Garvit Sharma
Does someone have any idea how to get rid of the above parse exception
while submitting a Flink job to YARN?

I have already searched the internet but could not find a solution.

Please help.

On Fri, Jun 15, 2018 at 9:15 AM Garvit Sharma  wrote:

> Thanks Chesnay. Now it is connecting to the ResourceManager, but I am
> getting the exception below:
>
> 2018-06-15 09:12:44,812 FATAL org.apache.hadoop.conf.Configuration
>   - error parsing conf core-default.xml
>
> javax.xml.parsers.ParserConfigurationException: Feature '
> http://apache.org/xml/features/xinclude' is not recognized.
>
> at
> org.apache.xerces.jaxp.DocumentBuilderFactoryImpl.newDocumentBuilder(Unknown
> Source)
>
> at
> org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2737)
>
> at
> org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2696)
>
> at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2579)
>
> at org.apache.hadoop.conf.Configuration.get(Configuration.java:1350)
>
> at
> org.apache.hadoop.yarn.factory.providers.RecordFactoryProvider.getRecordFactory(RecordFactoryProvider.java:49)
>
> at org.apache.hadoop.yarn.util.Records.<clinit>(Records.java:32)
>
> at
> org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getQueueInfoRequest(YarnClientImpl.java:584)
>
> at
> org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getAllQueues(YarnClientImpl.java:614)
>
> at
> org.apache.flink.yarn.AbstractYarnClusterDescriptor.checkYarnQueues(AbstractYarnClusterDescriptor.java:658)
>
> at
> org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:486)
>
> at
> org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)
>
> at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)
>
> at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
>
> at
> org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)
>
> at
> org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)
>
> at java.security.AccessController.doPrivileged(Native Method)
>
> at javax.security.auth.Subject.doAs(Subject.java:422)
>
> at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
>
> at
> org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
>
> at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)
>
> 2018-06-15 09:12:44,825 WARN  
> org.apache.flink.yarn.AbstractYarnClusterDescriptor
>   - Error while getting queue information from YARN: null
>
> java.lang.NoClassDefFoundError: Could not initialize class
> org.apache.hadoop.yarn.util.Records
>
> at
> org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createApplication(YarnClientImpl.java:230)
>
> at
> org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:497)
>
> at
> org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)
>
> at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)
>
> at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
>
> at
> org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)
>
> at
> org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)
>
> at java.security.AccessController.doPrivileged(Native Method)
>
> at javax.security.auth.Subject.doAs(Subject.java:422)
>
> at
> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
>
> at
> org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
>
> at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)
>
> Please help.
>
> Thanks,
>
>
> On Thu, Jun 14, 2018 at 1:28 PM Chesnay Schepler 
> wrote:
>
>> My gut feeling is that these classes must be present in jars in the /lib
>> directory. I don't think you can supply these with the submitted jar.
>> For a simple test, put your jar into the /lib folder before submitting it.
>>
>> On 14.06.2018 06:56, Garvit Sharma wrote:
>>
>> Can someone please tell me why I am facing this?
>>
>> On Wed, Jun 13, 2018 at 10:33 PM Garvit Sharma 
>> wrote:
>>
>>> Hi,
>>>
>>> I am using *flink-1.5.0-bin-hadoop27-scala_2.11 *to submit jobs through
>>> Yarn, but I am getting the below exception :
>>>
>>> java.lang.NoClassDefFoundError:
>>> com/sun/jersey/core/util/FeaturesAndProperties
>>>
>>> at java.lang.ClassLoader.defineClass1(Native Method)
>>>
>>> at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
>>>
>>> at
>>> java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>>>
>>> at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
>>>
>>> at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
>>>
>>> at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
>>>
>>> at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
>>>
>>> at jav

Re: Exception while submitting jobs through Yarn

2018-06-14 Thread Garvit Sharma
Thanks Chesnay. Now it is connecting to the ResourceManager, but I am
getting the exception below:

2018-06-15 09:12:44,812 FATAL org.apache.hadoop.conf.Configuration
- error parsing conf core-default.xml

javax.xml.parsers.ParserConfigurationException: Feature '
http://apache.org/xml/features/xinclude' is not recognized.

at
org.apache.xerces.jaxp.DocumentBuilderFactoryImpl.newDocumentBuilder(Unknown
Source)

at
org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2737)

at
org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2696)

at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2579)

at org.apache.hadoop.conf.Configuration.get(Configuration.java:1350)

at
org.apache.hadoop.yarn.factory.providers.RecordFactoryProvider.getRecordFactory(RecordFactoryProvider.java:49)

at org.apache.hadoop.yarn.util.Records.<clinit>(Records.java:32)

at
org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getQueueInfoRequest(YarnClientImpl.java:584)

at
org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getAllQueues(YarnClientImpl.java:614)

at
org.apache.flink.yarn.AbstractYarnClusterDescriptor.checkYarnQueues(AbstractYarnClusterDescriptor.java:658)

at
org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:486)

at
org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)

at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)

at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)

at
org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)

at
org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)

at java.security.AccessController.doPrivileged(Native Method)

at javax.security.auth.Subject.doAs(Subject.java:422)

at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)

at
org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)

at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)

2018-06-15 09:12:44,825 WARN
org.apache.flink.yarn.AbstractYarnClusterDescriptor
  - Error while getting queue information from YARN: null

java.lang.NoClassDefFoundError: Could not initialize class
org.apache.hadoop.yarn.util.Records

at
org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createApplication(YarnClientImpl.java:230)

at
org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:497)

at
org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)

at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)

at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)

at
org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)

at
org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)

at java.security.AccessController.doPrivileged(Native Method)

at javax.security.auth.Subject.doAs(Subject.java:422)

at
org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)

at
org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)

at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)

Please help.

Thanks,


On Thu, Jun 14, 2018 at 1:28 PM Chesnay Schepler  wrote:

> My gut feeling is that these classes must be present in jars in the /lib
> directory. I don't think you can supply these with the submitted jar.
> For a simple test, put your jar into the /lib folder before submitting it.
>
> On 14.06.2018 06:56, Garvit Sharma wrote:
>
> Can someone please tell me why I am facing this?
>
> On Wed, Jun 13, 2018 at 10:33 PM Garvit Sharma 
> wrote:
>
>> Hi,
>>
>> I am using *flink-1.5.0-bin-hadoop27-scala_2.11 *to submit jobs through
>> Yarn, but I am getting the below exception :
>>
>> java.lang.NoClassDefFoundError:
>> com/sun/jersey/core/util/FeaturesAndProperties
>>
>> at java.lang.ClassLoader.defineClass1(Native Method)
>>
>> at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
>>
>> at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>>
>> at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
>>
>> at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
>>
>> at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
>>
>> at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
>>
>> at java.security.AccessController.doPrivileged(Native Method)
>>
>> at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
>>
>> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>
>> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
>>
>> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>
>> at
>> org.apache.hadoop.yarn.client.api.TimelineClient.createTimelineClient(TimelineClient.java:55)
>>
>> at
>> org.apache.hadoop.yarn.client

Re: Exception while submitting jobs through Yarn

2018-06-14 Thread Chesnay Schepler
My gut feeling is that these classes must be present in jars in the /lib 
directory. I don't think you can supply these with the submitted jar.

For a simple test, put your jar into the /lib folder before submitting it.
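A sketch of that quick test; FLINK_HOME is a placeholder for the actual installation directory:

```shell
# Copy the user jar into Flink's lib/ directory so its classes load on
# the system classpath rather than only via the submitted user jar.
FLINK_HOME=/opt/flink-1.5.0    # placeholder: your Flink install path
cp test.jar "$FLINK_HOME/lib/"
ls "$FLINK_HOME/lib" | grep test.jar   # confirm it is in place
```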

On 14.06.2018 06:56, Garvit Sharma wrote:

Can someone please tell me why I am facing this?

On Wed, Jun 13, 2018 at 10:33 PM Garvit Sharma wrote:


Hi,

I am using *flink-1.5.0-bin-hadoop27-scala_2.11 *to submit jobs
through Yarn, but I am getting the below exception :

java.lang.NoClassDefFoundError:
com/sun/jersey/core/util/FeaturesAndProperties

at java.lang.ClassLoader.defineClass1(Native Method)

at java.lang.ClassLoader.defineClass(ClassLoader.java:763)

at
java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)

at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)

at java.net.URLClassLoader.access$100(URLClassLoader.java:73)

at java.net.URLClassLoader$1.run(URLClassLoader.java:368)

at java.net.URLClassLoader$1.run(URLClassLoader.java:362)

at java.security.AccessController.doPrivileged(Native Method)

at java.net.URLClassLoader.findClass(URLClassLoader.java:361)

at java.lang.ClassLoader.loadClass(ClassLoader.java:424)

at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)

at java.lang.ClassLoader.loadClass(ClassLoader.java:357)

at

org.apache.hadoop.yarn.client.api.TimelineClient.createTimelineClient(TimelineClient.java:55)

at

org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createTimelineClient(YarnClientImpl.java:181)

at

org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceInit(YarnClientImpl.java:168)

at
org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)

at

org.apache.flink.yarn.cli.FlinkYarnSessionCli.getClusterDescriptor(FlinkYarnSessionCli.java:966)

at

org.apache.flink.yarn.cli.FlinkYarnSessionCli.createDescriptor(FlinkYarnSessionCli.java:269)

at

org.apache.flink.yarn.cli.FlinkYarnSessionCli.createClusterDescriptor(FlinkYarnSessionCli.java:444)

at

org.apache.flink.yarn.cli.FlinkYarnSessionCli.createClusterDescriptor(FlinkYarnSessionCli.java:92)

at
org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:221)

at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)

at

org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)

at
org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)

at java.security.AccessController.doPrivileged(Native Method)

at javax.security.auth.Subject.doAs(Subject.java:422)

at

org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1754)

at

org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)

at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)

Caused by: java.lang.ClassNotFoundException:
com.sun.jersey.core.util.FeaturesAndProperties

at java.net.URLClassLoader.findClass(URLClassLoader.java:381)

at java.lang.ClassLoader.loadClass(ClassLoader.java:424)

at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)

at java.lang.ClassLoader.loadClass(ClassLoader.java:357)



Command : HADOOP_CONF_DIR=/etc/hadoop/conf bin/flink run -m
yarn-cluster -yd -yn 2 -ys 20 -yjm 10240 -ytm 10240 -yst -ynm test
-yqu default -p 20 test.jar

The class *com/sun/jersey/core/util/FeaturesAndProperties* is
already present in test.jar, so I am not sure why I am getting this
exception.
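One way to double-check that claim is to list the jar's entries (a sketch). Note that even if the class is present in the user jar, the YARN client loads it from the system classpath, which is why the /lib suggestion above can still be needed:

```shell
# List the submitted jar's contents and look for the missing class;
# entries inside a jar use '/' as the separator.
jar tf test.jar | grep 'com/sun/jersey/core/util/FeaturesAndProperties'
```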

Please check and let me know.

Thanks,
-- 


Garvit Sharma
github.com/garvitlnmiit/ 

No Body is a Scholar by birth, its only hard work and strong
determination that makes him master.



--

Garvit Sharma
github.com/garvitlnmiit/ 

No Body is a Scholar by birth, its only hard work and strong 
determination that makes him master.





Re: Exception while submitting jobs through Yarn

2018-06-13 Thread Garvit Sharma
Can someone please tell me why I am facing this?

On Wed, Jun 13, 2018 at 10:33 PM Garvit Sharma  wrote:

> Hi,
>
> I am using *flink-1.5.0-bin-hadoop27-scala_2.11 *to submit jobs through
> Yarn, but I am getting the below exception :
>
> java.lang.NoClassDefFoundError: com/sun/jersey/core/util/FeaturesAndProperties
> at java.lang.ClassLoader.defineClass1(Native Method)
> at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
> at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
> at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
> at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
> at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
> at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
> at java.security.AccessController.doPrivileged(Native Method)
> at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
> at org.apache.hadoop.yarn.client.api.TimelineClient.createTimelineClient(TimelineClient.java:55)
> at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createTimelineClient(YarnClientImpl.java:181)
> at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceInit(YarnClientImpl.java:168)
> at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
> at org.apache.flink.yarn.cli.FlinkYarnSessionCli.getClusterDescriptor(FlinkYarnSessionCli.java:966)
> at org.apache.flink.yarn.cli.FlinkYarnSessionCli.createDescriptor(FlinkYarnSessionCli.java:269)
> at org.apache.flink.yarn.cli.FlinkYarnSessionCli.createClusterDescriptor(FlinkYarnSessionCli.java:444)
> at org.apache.flink.yarn.cli.FlinkYarnSessionCli.createClusterDescriptor(FlinkYarnSessionCli.java:92)
> at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:221)
> at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
> at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)
> at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)
> at java.security.AccessController.doPrivileged(Native Method)
> at javax.security.auth.Subject.doAs(Subject.java:422)
> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1754)
> at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
> at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)
> Caused by: java.lang.ClassNotFoundException: com.sun.jersey.core.util.FeaturesAndProperties
> at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
> at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
> at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>
>
> Command: HADOOP_CONF_DIR=/etc/hadoop/conf bin/flink run -m yarn-cluster
> -yd -yn 2 -ys 20 -yjm 10240 -ytm 10240 -yst -ynm test -yqu default -p 20
> test.jar
>
> The class *com/sun/jersey/core/util/FeaturesAndProperties* is already
> present in test.jar, so I am not sure why I am getting this exception.
>
> Please check and let me know.
>
> Thanks,
> --
>
> Garvit Sharma
> github.com/garvitlnmiit/
>
> No Body is a Scholar by birth, its only hard work and strong determination
> that makes him master.
>


-- 

Garvit Sharma
github.com/garvitlnmiit/

No Body is a Scholar by birth, its only hard work and strong determination
that makes him master.


Exception while submitting jobs through Yarn

2018-06-13 Thread Garvit Sharma
Hi,

I am using *flink-1.5.0-bin-hadoop27-scala_2.11* to submit jobs through
Yarn, but I am getting the below exception:

java.lang.NoClassDefFoundError: com/sun/jersey/core/util/FeaturesAndProperties

at java.lang.ClassLoader.defineClass1(Native Method)
at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
at org.apache.hadoop.yarn.client.api.TimelineClient.createTimelineClient(TimelineClient.java:55)
at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createTimelineClient(YarnClientImpl.java:181)
at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceInit(YarnClientImpl.java:168)
at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
at org.apache.flink.yarn.cli.FlinkYarnSessionCli.getClusterDescriptor(FlinkYarnSessionCli.java:966)
at org.apache.flink.yarn.cli.FlinkYarnSessionCli.createDescriptor(FlinkYarnSessionCli.java:269)
at org.apache.flink.yarn.cli.FlinkYarnSessionCli.createClusterDescriptor(FlinkYarnSessionCli.java:444)
at org.apache.flink.yarn.cli.FlinkYarnSessionCli.createClusterDescriptor(FlinkYarnSessionCli.java:92)
at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:221)
at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)
at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subject.doAs(Subject.java:422)
at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1754)
at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)
Caused by: java.lang.ClassNotFoundException: com.sun.jersey.core.util.FeaturesAndProperties
at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
at java.lang.ClassLoader.loadClass(ClassLoader.java:357)


Command: HADOOP_CONF_DIR=/etc/hadoop/conf bin/flink run -m yarn-cluster
-yd -yn 2 -ys 20 -yjm 10240 -ytm 10240 -yst -ynm test -yqu default -p 20
test.jar

The class *com/sun/jersey/core/util/FeaturesAndProperties* is already
present in test.jar, so I am not sure why I am getting this exception.
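[Editor's note: since Flink 1.5 the Hadoop-bundled distribution no longer puts the full Hadoop/YARN classpath (which carries the jersey jars used by the YARN `TimelineClient`) on the client JVM automatically; the documented approach is to export `HADOOP_CLASSPATH` before invoking the client. A sketch, assuming a standard installation with the `hadoop` launcher on `PATH` (adjust paths for your distribution):]

```shell
# Make the Hadoop jars (including jersey) visible to the Flink client JVM.
export HADOOP_CONF_DIR=/etc/hadoop/conf
export HADOOP_CLASSPATH=$(hadoop classpath)

bin/flink run -m yarn-cluster -yd -yn 2 -ys 20 -yjm 10240 -ytm 10240 \
  -yst -ynm test -yqu default -p 20 test.jar
```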

Please check and let me know.

Thanks,
-- 

Garvit Sharma
github.com/garvitlnmiit/

No Body is a Scholar by birth, its only hard work and strong determination
that makes him master.