Hi Terence,
I'm not able to get logs for these jobs. The “yarn logs” command doesn't
return anything.
Sam
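[For reference, a minimal sketch of pulling YARN container logs. The application id is taken from the diagnostics quoted later in this thread; substitute your own. Note that "yarn logs" only returns aggregated logs, so an AM container that dies during localization (exit code -1000) may never produce any, which would explain an empty result.]

```shell
# Application id from the diagnostics below; substitute your own.
APP_ID=application_1484158548936_11154

# Aggregated logs are only available after log aggregation runs for the app.
if command -v yarn >/dev/null 2>&1; then
  yarn logs -applicationId "$APP_ID"
else
  # Without the yarn CLI, check the NodeManager's local dirs on the host
  # that ran the attempt: ${yarn.nodemanager.log-dirs}/<app_id>/<container_id>/
  echo "yarn CLI not on PATH; check NodeManager local log dirs instead"
fi
```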
> On Mar 26, 2017, at 17:32, Terence Yim <cht...@gmail.com> wrote:
>
> Hi Sam,
>
> I guess it might be related to the Hadoop conf directory missing from the
> container classpath, such that the LocationFactory constructed on the
> container side is not correct. Do you have access to the container's stdout
> file? It shows the classpath Twill uses.
>
> Terence
>
> Sent from my iPhone
>
>> On Mar 26, 2017, at 3:16 PM, Sam William <sampri...@gmail.com> wrote:
>>
>> It works with Twill-0.9.0. So far I have been able to narrow it down to one
>> commit
>>
>> 5986553 (TWILL-63) Speed up application launch time
>>
>> Let me see if I can nail it down to a particular change.
>>
>> Sam
>>
>>
>>> On Mar 25, 2017, at 13:34, Sam William <sampri...@gmail.com> wrote:
>>>
>>> Hi Terence,
>>> Our Cloudera installation is CDH 5.7, and I use Hadoop 2.3.0 packages for
>>> my fat jars.
>>>
>>> Sam
>>>> On Mar 25, 2017, at 12:31, Terence Yim <cht...@gmail.com> wrote:
>>>>
>>>> Hi,
>>>>
>>>> Haven't seen this error before. What version of Hadoop is the cluster
>>>> running? Also, it seems $HADOOP_CONF is not on the classpath, as the
>>>> FileContext is trying to use the local file system instead of the
>>>> distributed one.
>>>>
>>>> Terence
>>>>
>>>> Sent from my iPhone
>>>>
>>>>> On Mar 25, 2017, at 12:25 PM, Sam William <sampri...@gmail.com> wrote:
>>>>>
>>>>> Hi,
>>>>> I have been using Twill for some time now, and I just tried to upgrade our
>>>>> application from Twill-0.8.0 to 0.10.0. I haven't made any code changes
>>>>> besides changing the Twill version string in the build script.
>>>>> The application fails immediately and I see this on the RM UI. Any idea
>>>>> why this could be happening?
>>>>>
>>>>> Diagnostics:
>>>>> Application application_1484158548936_11154 failed 2 times due to AM
>>>>> Container for appattempt_1484158548936_11154_000002 exited with exitCode: -1000
>>>>> For more detailed output, check application tracking page<> Then, click
>>>>> on links to logs of each attempt.
>>>>> Diagnostics: No such file or directory
>>>>> ENOENT: No such file or directory
>>>>> at org.apache.hadoop.io.nativeio.NativeIO$POSIX.chmodImpl(Native Method)
>>>>> at org.apache.hadoop.io.nativeio.NativeIO$POSIX.chmod(NativeIO.java:230)
>>>>> at org.apache.hadoop.fs.RawLocalFileSystem.setPermission(RawLocalFileSystem.java:660)
>>>>> at org.apache.hadoop.fs.DelegateToFileSystem.setPermission(DelegateToFileSystem.java:206)
>>>>> at org.apache.hadoop.fs.FilterFs.setPermission(FilterFs.java:251)
>>>>> at org.apache.hadoop.fs.FileContext$10.next(FileContext.java:955)
>>>>> at org.apache.hadoop.fs.FileContext$10.next(FileContext.java:951)
>>>>> at org.apache.hadoop.fs.FSLinkResolver.resolve(FSLinkResolver.java:90)
>>>>> at org.apache.hadoop.fs.FileContext.setPermission(FileContext.java:951)
>>>>> at org.apache.hadoop.yarn.util.FSDownload$3.run(FSDownload.java:419)
>>>>> at org.apache.hadoop.yarn.util.FSDownload$3.run(FSDownload.java:417)
>>>>> at java.security.AccessController.doPrivileged(Native Method)
>>>>> at javax.security.auth.Subject.doAs(Subject.java:422)
>>>>> at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1693)
>>>>> at org.apache.hadoop.yarn.util.FSDownload.changePermissions(FSDownload.java:417)
>>>>> at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:363)
>>>>> at org.apache.hadoop.yarn.util.FSDownload.call(FSDownload.java:60)
>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>>>>> at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
>>>>> at java.util.concurrent.FutureTask.run(FutureTask.java:266)
>>>>> at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
>>>>> at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
>>>>> at java.lang.Thread.run(Thread.java:745)
>>>>> Failing this attempt. Failing the application.
>>>>>
>>>>>
>>>>> Sam
>>>
>>
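[Editor's note: the ENOENT during FSDownload and the local-filesystem behaviour in the trace are consistent with Terence's theory that the Hadoop conf directory is missing from the container classpath. A hypothetical way to check from the NodeManager host is to inspect the launch_container.sh that YARN writes under its local-dirs, which records the exact CLASSPATH the container was started with. The directory path below is a placeholder, not from this thread.]

```shell
# Placeholder path; point this at the container's directory under
# yarn.nodemanager.local-dirs on the host that ran the failed attempt.
CONTAINER_DIR=/tmp/example_container_dir

# launch_container.sh records the CLASSPATH the container was started with;
# list its entries and look for a Hadoop conf directory among them.
if [ -f "$CONTAINER_DIR/launch_container.sh" ]; then
  grep -o 'CLASSPATH=[^ ]*' "$CONTAINER_DIR/launch_container.sh" \
    | tr ':' '\n' | grep -i conf \
    || echo "no conf directory on the container classpath"
else
  echo "no launch_container.sh at $CONTAINER_DIR (adjust the path)"
fi
```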