You may try copying the file to the same location on all nodes and reading it
from that place.
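
For what it's worth: since the job is already submitted with --files, YARN should
localize abc.drl into each container's working directory, so the Kie code may be
able to pick it up through a plain relative path instead of the hdfs:// URI. A
rough, untested sketch (the class and method names are only illustrative, not
taken from the attached code):

import java.io.File;

import org.kie.api.KieServices;
import org.kie.api.builder.KieFileSystem;
import org.kie.internal.io.ResourceFactory;

public class LocalDrlExample {
    // fileName should match the name passed to --files, e.g. "abc.drl"
    public static KieFileSystem loadRules(String fileName) {
        // Resolved against the container working directory where YARN localized the file.
        File localCopy = new File(fileName);
        if (!localCopy.exists()) {
            throw new IllegalStateException("Rule file not localized: " + localCopy.getAbsolutePath());
        }
        KieServices ks = KieServices.Factory.get();
        KieFileSystem kfs = ks.newKieFileSystem();
        // FileSystemResource now gets a real local file instead of an hdfs:// URI.
        kfs.write(ResourceFactory.newFileResource(localCopy));
        return kfs;
    }
}
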
On 24 Sep 2016 00:20, "ABHISHEK" <abhi...@gmail.com> wrote:

> I have tried with the /tmp location in HDFS but it didn't work. Same error.
>
> On 23 Sep 2016 19:37, "Aditya" <aditya.calangut...@augmentiq.co.in> wrote:
>
>> Hi Abhishek,
>>
>> Try the spark-submit below:
>> spark-submit --master yarn --deploy-mode cluster \
>>   --files hdfs://abc.com:8020/tmp/abc.drl \
>>   --class com.abc.StartMain \
>>   abc-0.0.1-SNAPSHOT-jar-with-dependencies.jar abc.drl
>>
>> On Friday 23 September 2016 07:29 PM, ABHISHEK wrote:
>>
>> Thanks for your response Aditya and Steve.
>> Steve:
>> I have tried specifying both the /tmp/filename path in HDFS and a local path,
>> but it didn't work.
>> You may be right that the Kie session is configured to access files from a
>> local path.
>> I have attached the code here for your reference; if you find something
>> wrong, please help me correct it.
>>
>> Aditya:
>> I have attached the code here for reference. The --files option distributes the
>> reference file to all nodes, but the Kie session is not able to pick it up.
>>
>> Thanks,
>> Abhishek
>>
>> On Fri, Sep 23, 2016 at 2:25 PM, Steve Loughran <ste...@hortonworks.com>
>> wrote:
>>
>>>
>>> On 23 Sep 2016, at 08:33, ABHISHEK <abhi...@gmail.com> wrote:
>>>
>>>         at java.lang.Thread.run(Thread.java:745)
>>> Caused by: java.io.FileNotFoundException: hdfs:/abc.com:8020/user/abhietc/abc.drl (No such file or directory)
>>>         at java.io.FileInputStream.open(Native Method)
>>>         at java.io.FileInputStream.<init>(FileInputStream.java:146)
>>>         at org.drools.core.io.impl.FileSystemResource.getInputStream(FileSystemResource.java:123)
>>>         at org.drools.compiler.kie.builder.impl.KieFileSystemImpl.write(KieFileSystemImpl.java:58)
>>>
>>>
>>>
>>> Looks like this KieFileSystemImpl class only works with local files, so
>>> when it gets an HDFS path it tries to open it as a local file and gets confused.
>>>
>>> You may need to write to a local FS temp file and then copy it into HDFS.
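>>>
>>> If the file has to stay in HDFS, the same idea in reverse - pulling it down to a
>>> local temp file with Hadoop's FileSystem API and handing Kie that local path -
>>> might look roughly like this (untested sketch; paths and names are placeholders):
>>>
>>> import java.io.File;
>>> import java.net.URI;
>>>
>>> import org.apache.hadoop.conf.Configuration;
>>> import org.apache.hadoop.fs.FileSystem;
>>> import org.apache.hadoop.fs.Path;
>>>
>>> public class DrlStaging {
>>>     // Copies the .drl out of HDFS onto local disk so FileSystemResource
>>>     // gets a path that FileInputStream can actually open.
>>>     public static String stageLocally() throws Exception {
>>>         FileSystem fs = FileSystem.get(new URI("hdfs://abc.com:8020"), new Configuration());
>>>         File local = File.createTempFile("rules-", ".drl");
>>>         fs.copyToLocalFile(new Path("/tmp/abc.drl"), new Path(local.getAbsolutePath()));
>>>         // Give this local path (not the hdfs:// URI) to the Kie session.
>>>         return local.getAbsolutePath();
>>>     }
>>> }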
>>>
>>
>>
>>
>>
