Hi Garvit,

Have you exported the HADOOP_CLASSPATH as described in the release notes
[1]?

[1]
https://ci.apache.org/projects/flink/flink-docs-release-1.5/release-notes/flink-1.5.html#hadoop-classpath-discovery
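
For reference, on most setups this boils down to running something along the
lines of

  export HADOOP_CLASSPATH=`hadoop classpath`

in the same shell before calling bin/flink (this assumes the hadoop binary is
on your PATH; adjust to your environment).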

Cheers,
Till

On Fri, Jun 15, 2018 at 2:22 PM Garvit Sharma <garvit...@gmail.com> wrote:

> Does anyone have an idea how to get rid of the above parse exception
> while submitting a Flink job to YARN?
>
> I have already searched the internet but could not find any solution.
>
> Please help.
>
> On Fri, Jun 15, 2018 at 9:15 AM Garvit Sharma <garvit...@gmail.com> wrote:
>
>> Thanks Chesnay. Now it is connecting to the ResourceManager, but I am
>> getting the exception below:
>>
>> 2018-06-15 09:12:44,812 FATAL org.apache.hadoop.conf.Configuration - error parsing conf core-default.xml
>> javax.xml.parsers.ParserConfigurationException: Feature 'http://apache.org/xml/features/xinclude' is not recognized.
>>   at org.apache.xerces.jaxp.DocumentBuilderFactoryImpl.newDocumentBuilder(Unknown Source)
>>   at org.apache.hadoop.conf.Configuration.loadResource(Configuration.java:2737)
>>   at org.apache.hadoop.conf.Configuration.loadResources(Configuration.java:2696)
>>   at org.apache.hadoop.conf.Configuration.getProps(Configuration.java:2579)
>>   at org.apache.hadoop.conf.Configuration.get(Configuration.java:1350)
>>   at org.apache.hadoop.yarn.factory.providers.RecordFactoryProvider.getRecordFactory(RecordFactoryProvider.java:49)
>>   at org.apache.hadoop.yarn.util.Records.<clinit>(Records.java:32)
>>   at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getQueueInfoRequest(YarnClientImpl.java:584)
>>   at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.getAllQueues(YarnClientImpl.java:614)
>>   at org.apache.flink.yarn.AbstractYarnClusterDescriptor.checkYarnQueues(AbstractYarnClusterDescriptor.java:658)
>>   at org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:486)
>>   at org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)
>>   at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)
>>   at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
>>   at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)
>>   at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)
>>   at java.security.AccessController.doPrivileged(Native Method)
>>   at javax.security.auth.Subject.doAs(Subject.java:422)
>>   at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
>>   at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
>>   at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)
>>
>> 2018-06-15 09:12:44,825 WARN  org.apache.flink.yarn.AbstractYarnClusterDescriptor - Error while getting queue information from YARN: null
>> java.lang.NoClassDefFoundError: Could not initialize class org.apache.hadoop.yarn.util.Records
>>   at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createApplication(YarnClientImpl.java:230)
>>   at org.apache.flink.yarn.AbstractYarnClusterDescriptor.deployInternal(AbstractYarnClusterDescriptor.java:497)
>>   at org.apache.flink.yarn.YarnClusterDescriptor.deployJobCluster(YarnClusterDescriptor.java:75)
>>   at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:235)
>>   at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
>>   at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)
>>   at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)
>>   at java.security.AccessController.doPrivileged(Native Method)
>>   at javax.security.auth.Subject.doAs(Subject.java:422)
>>   at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1836)
>>   at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
>>   at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)
>>
>> Please help.
>>
>> Thanks,
>>
>>
>> On Thu, Jun 14, 2018 at 1:28 PM Chesnay Schepler <ches...@apache.org>
>> wrote:
>>
>>> My gut feeling is that these classes must be present in jars in the /lib
>>> directory. I don't think you can supply these with the submitted jar.
>>> For a simple test, put your jar into the /lib folder before submitting
>>> it.
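>>>
>>> Roughly something like the following (the path is only an example; use
>>> wherever your Flink distribution is installed), then submit as before:
>>>
>>> cp test.jar /path/to/flink-1.5.0/lib/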
>>>
>>> On 14.06.2018 06:56, Garvit Sharma wrote:
>>>
>>> Can someone please tell me why I am facing this?
>>>
>>> On Wed, Jun 13, 2018 at 10:33 PM Garvit Sharma <garvit...@gmail.com>
>>> wrote:
>>>
>>>> Hi,
>>>>
>>>> I am using *flink-1.5.0-bin-hadoop27-scala_2.11* to submit jobs
>>>> through YARN, but I am getting the exception below:
>>>>
>>>> java.lang.NoClassDefFoundError: com/sun/jersey/core/util/FeaturesAndProperties
>>>>   at java.lang.ClassLoader.defineClass1(Native Method)
>>>>   at java.lang.ClassLoader.defineClass(ClassLoader.java:763)
>>>>   at java.security.SecureClassLoader.defineClass(SecureClassLoader.java:142)
>>>>   at java.net.URLClassLoader.defineClass(URLClassLoader.java:467)
>>>>   at java.net.URLClassLoader.access$100(URLClassLoader.java:73)
>>>>   at java.net.URLClassLoader$1.run(URLClassLoader.java:368)
>>>>   at java.net.URLClassLoader$1.run(URLClassLoader.java:362)
>>>>   at java.security.AccessController.doPrivileged(Native Method)
>>>>   at java.net.URLClassLoader.findClass(URLClassLoader.java:361)
>>>>   at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>>>   at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
>>>>   at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>>>   at org.apache.hadoop.yarn.client.api.TimelineClient.createTimelineClient(TimelineClient.java:55)
>>>>   at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.createTimelineClient(YarnClientImpl.java:181)
>>>>   at org.apache.hadoop.yarn.client.api.impl.YarnClientImpl.serviceInit(YarnClientImpl.java:168)
>>>>   at org.apache.hadoop.service.AbstractService.init(AbstractService.java:163)
>>>>   at org.apache.flink.yarn.cli.FlinkYarnSessionCli.getClusterDescriptor(FlinkYarnSessionCli.java:966)
>>>>   at org.apache.flink.yarn.cli.FlinkYarnSessionCli.createDescriptor(FlinkYarnSessionCli.java:269)
>>>>   at org.apache.flink.yarn.cli.FlinkYarnSessionCli.createClusterDescriptor(FlinkYarnSessionCli.java:444)
>>>>   at org.apache.flink.yarn.cli.FlinkYarnSessionCli.createClusterDescriptor(FlinkYarnSessionCli.java:92)
>>>>   at org.apache.flink.client.cli.CliFrontend.runProgram(CliFrontend.java:221)
>>>>   at org.apache.flink.client.cli.CliFrontend.run(CliFrontend.java:210)
>>>>   at org.apache.flink.client.cli.CliFrontend.parseParameters(CliFrontend.java:1020)
>>>>   at org.apache.flink.client.cli.CliFrontend.lambda$main$9(CliFrontend.java:1096)
>>>>   at java.security.AccessController.doPrivileged(Native Method)
>>>>   at javax.security.auth.Subject.doAs(Subject.java:422)
>>>>   at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1754)
>>>>   at org.apache.flink.runtime.security.HadoopSecurityContext.runSecured(HadoopSecurityContext.java:41)
>>>>   at org.apache.flink.client.cli.CliFrontend.main(CliFrontend.java:1096)
>>>> Caused by: java.lang.ClassNotFoundException: com.sun.jersey.core.util.FeaturesAndProperties
>>>>   at java.net.URLClassLoader.findClass(URLClassLoader.java:381)
>>>>   at java.lang.ClassLoader.loadClass(ClassLoader.java:424)
>>>>   at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:335)
>>>>   at java.lang.ClassLoader.loadClass(ClassLoader.java:357)
>>>>
>>>>
>>>> Command: HADOOP_CONF_DIR=/etc/hadoop/conf bin/flink run -m yarn-cluster -yd -yn 2 -ys 20 -yjm 10240 -ytm 10240 -yst -ynm test -yqu default -p 20 test.jar
>>>>
>>>> The class *com/sun/jersey/core/util/FeaturesAndProperties* is already
>>>> present in test.jar, so I am not sure why I am getting this exception.
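>>>>
>>>> (Its presence can be double-checked with something like jar tf test.jar |
>>>> grep FeaturesAndProperties, assuming test.jar is in the current directory.)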
>>>>
>>>> Please check and let me know.
>>>>
>>>> Thanks,
>>>
>>>
>>>
>>>
>>
>
>
> --
>
> Garvit Sharma
> github.com/garvitlnmiit/
>
> Nobody is a scholar by birth; it's only hard work and strong determination
> that make him a master.
>
