The latest HDP sandbox, 2.3.2.

> On Dec 1, 2015, at 10:38 PM, Jeff Zhang <zjf...@gmail.com> wrote:
> 
> Which version of HDP do you use?
> 
> On Wed, Dec 2, 2015 at 11:23 AM, Will Du <will...@gmail.com> wrote:
> I have assigned a dedicated YARN queue to Spark, and the status of the Hive 
> query now goes from ACCEPTED to RUNNING. However, it still seems to run 
> forever. Everything else is in the default config of the HDP sandbox. Do I 
> need to set something else? The browser where I saw the status is Hue.
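> 
> For reference, on the Spark side the dedicated queue is selected with the 
> spark.yarn.queue property. A minimal sketch, assuming the queue is simply 
> named "spark" (a placeholder) and that the property is set either in the 
> Zeppelin Spark interpreter configuration or on a SparkConf of your own:
> 
>   // if you were building the context yourself; in Zeppelin the same
>   // property can be set in the Spark interpreter settings instead
>   import org.apache.spark.{SparkConf, SparkContext}
>   val conf = new SparkConf()
>     .setAppName("zeppelin-wordcount")      // placeholder app name
>     .set("spark.yarn.queue", "spark")      // placeholder dedicated queue name
>   val sc = new SparkContext(conf)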
> 
> Thanks,
> wd
> 
>> On Nov 30, 2015, at 12:40 AM, Rick Moritz <rah...@gmail.com> wrote:
>> 
>> To explain the previous reply: the SparkContext created by Zeppelin is 
>> persistent and independent of whether it's currently processing a paragraph 
>> or not. Therefore the Zeppelin job will claim all resources assigned to it 
>> until Zeppelin is stopped.
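>>
>> A minimal illustration of that, assuming the standard sc that the Zeppelin 
>> Spark interpreter injects into the notebook:
>>
>>   // sc is the long-lived context behind every Spark paragraph; its
>>   // executors keep holding their YARN containers even while no
>>   // paragraph is running
>>   println(sc.appName)   // the YARN application that stays in RUNNING state
>>   // the containers are only handed back to YARN once the context is shut
>>   // down, e.g. by restarting the Spark interpreter or stopping Zeppelin
>>   sc.stop()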
>> 
>> On Mon, Nov 30, 2015 at 4:20 AM, Jeff Zhang <zjf...@gmail.com> wrote:
>> I assume this is the YARN app Job Browser. How many executors did you 
>> specify for your Zeppelin YARN app? It seems your Zeppelin YARN app consumes 
>> all the resources, so it blocks other applications.
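>>
>> For reference, the executor footprint is capped by a handful of Spark 
>> properties; a minimal sketch with placeholder values sized for a small 
>> sandbox VM, equivalent to setting the same keys in the Zeppelin Spark 
>> interpreter configuration:
>>
>>   import org.apache.spark.SparkConf
>>   // placeholder values: two small executors leave YARN room for Hive jobs
>>   val conf = new SparkConf()
>>     .set("spark.executor.instances", "2")   // cap the number of executors
>>     .set("spark.executor.memory", "512m")   // memory per executor
>>     .set("spark.executor.cores", "1")       // cores per executor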
>> 
>> 
>> 
>> On Mon, Nov 30, 2015 at 11:10 AM, Will Du <will...@gmail.com> wrote:
>> Hi folks,
>> I am running a simple Scala word count from Zeppelin in the HDP sandbox. The 
>> job succeeds with the expected result, but the Zeppelin job stays in the Hue 
>> job status forever, and it seems to block my Hive job. Does anyone know why?
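>>
>> For reference, a minimal sketch of the kind of word-count paragraph I'm 
>> running, using the sc provided by the Zeppelin Spark interpreter (the input 
>> path here is a placeholder):
>>
>>   // classic word count: split each line into words and count occurrences
>>   val counts = sc.textFile("/tmp/input.txt")   // placeholder HDFS path
>>     .flatMap(line => line.split("\\s+"))
>>     .map(word => (word, 1))
>>     .reduceByKey(_ + _)
>>   counts.take(10).foreach(println)             // print a small sample
>>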
>> thanks,
>> wd
>> <PastedGraphic-1.tiff>
>> 
>> 
>> 
>> -- 
>> Best Regards
>> 
>> Jeff Zhang
>> 
> 
> 
> 
> 
> -- 
> Best Regards
> 
> Jeff Zhang
