Ok, I have attached the log file.

Please check and let me know.

Thanks,

On Mon, Jun 18, 2018 at 2:07 PM Amit Jain <aj201...@gmail.com> wrote:

> Hi Garvit,
>
> I think Till is interested in the classpath details printed at the
> start of the JM and TM logs. For example, the following lines show the
> classpath used by the TM in our case.
>
> 2018-06-17 19:01:30,656 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
>                  -
> --------------------------------------------------------------------------------
> 2018-06-17 19:01:30,658 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
>                  -  Starting YARN TaskExecutor runner (Version: 1.5.0,
> Rev:c61b108, Date:24.05.2018 @ 14:54:44 UTC)
> 2018-06-17 19:01:30,659 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
>                  -  OS current user: yarn
> 2018-06-17 19:01:31,662 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
>                  -  Current Hadoop/Kerberos user: hadoop
> 2018-06-17 19:01:31,663 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
>                  -  JVM: OpenJDK 64-Bit Server VM - Oracle Corporation -
> 1.8/25.171-b10
> 2018-06-17 19:01:31,663 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
>                  -  Maximum heap size: 6647 MiBytes
> 2018-06-17 19:01:31,663 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
>                  -  JAVA_HOME: /usr/lib/jvm/java-openjdk
> 2018-06-17 19:01:31,664 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
>                  -  Hadoop version: 2.8.3
> 2018-06-17 19:01:31,664 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
>                  -  JVM Options:
> 2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
>                  -     -Xms6936m
> 2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
>                  -     -Xmx6936m
> 2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
>                  -     -XX:MaxDirectMemorySize=4072m
> 2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
>                  -
> -Dlog.file=/var/log/hadoop-yarn/containers/application_1528342246614_0002/container_1528342246614_0002_01_282649/taskmanager.log
> 2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
>                  -     -Dlogback.configurationFile=file:./logback.xml
> 2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
>                  -     -Dlog4j.configuration=file:./log4j.properties
> 2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
>                  -  Program Arguments:
> 2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
>                  -     --configDir
> 2018-06-17 19:01:31,665 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
>                  -     .
> 2018-06-17 19:01:31,666 INFO  org.apache.flink.yarn.YarnTaskExecutorRunner
>                  -  Classpath:
> lib/flink-dist_2.11-1.5.0.jar:lib/flink-python_2.11-1.5.0.jar:lib/flink-shaded-hadoop2-uber-1.5.0.jar:lib/flink-shaded-include-yarn-0.9.1.jar:lib/guava-18.0.jar:lib/log4j-1.2.17.jar:lib/slf4j-log4j12-1.7.7.jar:log4j.properties:logback.xml:flink.jar:flink-conf.yaml::/etc/hadoop/conf:/usr/lib/hadoop/hadoop-common-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-archive-logs.jar:/usr/lib/hadoop/hadoop-auth.jar:/usr/lib/hadoop/hadoop-archives-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-archive-logs-2.8.3-amzn-0.jar:/usr/lib/hadoop/hadoop-azure-datalake-2.8.3-amzn-0.jar.........
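
A side note on reading a classpath like the one above: the core-default.xml failure further down in this thread depends on which XML parser implementation wins on the classpath. A small standalone check (a sketch; to be meaningful it would need to be run with the same classpath as the container) shows which JAXP implementation the JVM actually picked, and which jar supplied it:

```java
import javax.xml.parsers.DocumentBuilderFactory;
import java.security.CodeSource;

public class ParserOrigin {
    // Reports which class implements JAXP's DocumentBuilderFactory and,
    // when it comes from a jar on the classpath, which jar that is.
    static String describeParser() {
        Class<?> impl = DocumentBuilderFactory.newInstance().getClass();
        CodeSource src = impl.getProtectionDomain().getCodeSource();
        String origin = (src == null || src.getLocation() == null)
                ? "JDK built-in" : src.getLocation().toString();
        return impl.getName() + " loaded from " + origin;
    }

    public static void main(String[] args) {
        System.out.println(describeParser());
    }
}
```

On a clean JDK this reports the built-in parser; if it names one of the jars above instead, that jar is shadowing the JDK's parser.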
>
> --
> Thanks,
> Amit
>
> On Mon, Jun 18, 2018 at 2:00 PM, Garvit Sharma <garvit...@gmail.com>
> wrote:
>
>> Hi,
>>
>> Please refer to my previous mail for complete logs.
>>
>> Thanks,
>>
>> On Mon, Jun 18, 2018 at 1:17 PM Till Rohrmann <trohrm...@apache.org>
>> wrote:
>>
>>> Could you also please share the complete log file with us.
>>>
>>> Cheers,
>>> Till
>>>
>>> On Sat, Jun 16, 2018 at 5:22 PM Ted Yu <yuzhih...@gmail.com> wrote:
>>>
>>>> The error for core-default.xml is interesting.
>>>>
>>>> Flink doesn't ship this file; it probably came with YARN. Please check
>>>> the Hadoop version Flink was built against versus the Hadoop version in
>>>> your cluster.
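
One way to test this theory directly is to ask the JAXP factory on the job's classpath whether it supports XInclude, which is the feature Hadoop's Configuration enables right before the parse fails with "Feature 'http://apache.org/xml/features/xinclude' is not recognized". A minimal sketch (assuming it is run with the same classpath as the failing client):

```java
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.parsers.ParserConfigurationException;

public class XIncludeCheck {
    // Returns true if the JAXP factory found on the classpath can build an
    // XInclude-aware parser; an old standalone Xerces typically cannot, and
    // fails at newDocumentBuilder() just like the stack trace in this thread.
    static boolean xincludeSupported() {
        DocumentBuilderFactory f = DocumentBuilderFactory.newInstance();
        System.out.println("Factory impl: " + f.getClass().getName());
        try {
            f.setXIncludeAware(true);
            f.newDocumentBuilder();
            return true;
        } catch (ParserConfigurationException | UnsupportedOperationException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("XInclude supported: " + xincludeSupported());
    }
}
```

A stock JDK's built-in parser supports XInclude, so a `false` here would point to a conflicting XML parser jar on the classpath rather than at Flink itself.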
>>>>
>>>> Thanks
>>>>
>>>> -------- Original message --------
>>>> From: Garvit Sharma <garvit...@gmail.com>
>>>> Date: 6/16/18 7:23 AM (GMT-08:00)
>>>> To: trohrm...@apache.org
>>>> Cc: Chesnay Schepler <ches...@apache.org>, user@flink.apache.org
>>>> Subject: Re: Exception while submitting jobs through Yarn
>>>>
>>>> I am not able to figure this out; I have been stuck on it for the past
>>>> week. Any help would be appreciated.
>>>>
>>>>
>>>> 2018-06-16 19:25:10,523 DEBUG
>>>> org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  -
>>>> Parallelism set: 1 for 8
>>>>
>>>> 2018-06-16 19:25:10,578 DEBUG
>>>> org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  -
>>>> Parallelism set: 1 for 1
>>>>
>>>> 2018-06-16 19:25:10,588 DEBUG
>>>> org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  -
>>>> CONNECTED: KeyGroupStreamPartitioner - 1 -> 8
>>>>
>>>> 2018-06-16 19:25:10,591 DEBUG
>>>> org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  -
>>>> Parallelism set: 1 for 5
>>>>
>>>> 2018-06-16 19:25:10,597 DEBUG
>>>> org.apache.flink.streaming.api.graph.StreamingJobGraphGenerator  -
>>>> CONNECTED: KeyGroupStreamPartitioner - 5 -> 8
>>>>
>>>> 2018-06-16 19:25:10,618 FATAL org.apache.hadoop.conf.Configuration
>>>>                       - error parsing conf core-default.xml
>>>>
>>>> javax.xml.parsers.ParserConfigurationException: Feature '
>>>> http://apache.org/xml/features/xinclude' is not recognized.
>>>>
>>>> at
>>>> org.apache.xerces.jaxp.DocumentBuilderFactoryImpl.newDocumentBuilder(Unknown
>>>> Source)
>>>>
>>>> at
>>>> org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2482)
>>>>
>>>> at
>>>> org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2444)
>>>>
>>>> at
>>>> org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2361)
>>>>
>>>> at org.apache.hadoop.conf.Configuration.get(Configuration.java:1188)
>>>>
>>>> at
>>>> org.apache.hadoop.yarn.factory.providers.RecordFactoryProvider.getRecordFactory(RecordFactoryProvider.java:49)
>>>>
>>>> at org.apache.hadoop.yarn.util.Records.<clinit>(Records.java:32)
>>>>
>>>> at
>>>> org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getQueueInfoRequest(YarnClientImpl.java:495)
>>>>
>>>> at
>>>> org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getAllQueues(YarnClientImpl.java:525)
>>>>
>>>> at
>>>> org.apache.flink.yarn.AbstractYarnClusterDescriptor.checkYarnQueues(AbstractYarnClusterDescriptor.java:658)
>>>>
>>>> at
>>>> org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:486)
>>>>
>>>> at
>>>> org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)
>>>>
>>>> at
>>>> org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)
>>>>
>>>> at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
>>>>
>>>> at
>>>> org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)
>>>>
>>>> at
>>>> org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)
>>>>
>>>> at java.security.AccessController.doPrivileged(Native Method)
>>>>
>>>> at javax.security.auth.Subject.doAs(Subject.java:422)
>>>>
>>>> at
>>>> org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1692)
>>>>
>>>> at
>>>> org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
>>>>
>>>> at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)
>>>>
>>>> 2018-06-16 19:25:10,620 WARN  
>>>> org.apache.flink.yarn.AbstractYarnClusterDescriptor
>>>>           - Error while getting queue information from YARN: null
>>>>
>>>> 2018-06-16 19:25:10,621 DEBUG
>>>> org.apache.flink.yarn.AbstractYarnClusterDescriptor           - Error
>>>> details
>>>>
>>>> java.lang.ExceptionInInitializerError
>>>>
>>>> at
>>>> org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getQueueInfoRequest(YarnClientImpl.java:495)
>>>>
>>>> at
>>>> org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getAllQueues(YarnClientImpl.java:525)
>>>>
>>>> at
>>>> org.apache.flink.yarn.AbstractYarnClusterDescriptor.checkYarnQueues(AbstractYarnClusterDescriptor.java:658)
>>>>
>>>> at
>>>> org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:486)
>>>>
>>>> at
>>>> org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)
>>>>
>>>> at
>>>> org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)
>>>>
>>>> at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
>>>>
>>>> at
>>>> org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)
>>>>
>>>> at
>>>> org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)
>>>>
>>>
>>
>> --
>>
>> Garvit Sharma
>> github.com/garvitlnmiit/
>>
>> Nobody is a scholar by birth; it's only hard work and strong
>> determination that makes him a master.
>>
>
>

-- 

Garvit Sharma
github.com/garvitlnmiit/

Nobody is a scholar by birth; it's only hard work and strong determination
that makes him a master.

Attachment: yarn.logs
