Re: Livy: Task submission scheduling problem in yarn mode

2020-03-09 Thread adley
As I have gradually come to understand Livy, my view is that Livy only keeps the
context on the driver side; on which node each worker/executor actually runs is
controlled by YARN, and Livy cannot control this. Is that correct?

At 2020-03-05 16:01:26, "ipconfig" wrote:

Hi:
There are two tasks in my program. Task one caches data locally on the node
where the current session is running (not on HDFS). Task two, when submitted,
reads the data cached by task one, so I need to submit both task one and task
two to the same node.
Using Livy, I submit both task one and task two to the same session. Can I
ensure that these two tasks are scheduled to the same node, so that task two
can read the cached data?
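For reference, all jobs submitted to one Livy session share the same SparkContext, so a pattern that avoids depending on node placement is to cache the data as an RDD and pass it between jobs through Livy's shared-object mechanism, rather than writing to a node-local path. A minimal sketch, assuming a Livy programmatic-API session; the class names (CacheJob, ReadJob), the shared key "cachedData", and the HDFS path are illustrative:

```java
import org.apache.livy.Job;
import org.apache.livy.JobContext;
import org.apache.spark.api.java.JavaRDD;

// Job one: build and cache an RDD, then publish it under a shared name
// so later jobs in the same session can retrieve it.
class CacheJob implements Job<Long> {
    @Override
    public Long call(JobContext jc) throws Exception {
        JavaRDD<String> data =
                jc.sc().textFile("hdfs:///path/to/input").cache();
        long count = data.count();              // materialize the cache
        jc.setSharedObject("cachedData", data); // visible to later jobs
        return count;
    }
}

// Job two: retrieve the cached RDD instead of re-reading node-local files.
class ReadJob implements Job<Long> {
    @Override
    public Long call(JobContext jc) throws Exception {
        JavaRDD<String> data = jc.getSharedObject("cachedData");
        return data.count(); // served from Spark's block cache on the executors
    }
}
```

With this pattern the cached blocks live in the executors' storage, managed by Spark, so neither job needs to land on a particular YARN node.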

Can a livy client upload and run different jar packages multiple times?

2020-03-08 Thread adley
Hi:
I followed the official example, PiApp.java, and was able to submit and run
it successfully.
Then I modified the code: after client.submit, I called
client.uploadJar(newjar) again and finally submitted a job using a class
implemented in newjar (client.submit with newjar's class). The upload
succeeds and no error is reported, but it has no effect: the execution result
shows that the class implementation from the old jar is still being called.
Can Livy support repeated submission of jar packages under one client?
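One point worth noting in this scenario: uploadJar is asynchronous (it returns a Future), and it adds the jar to the classpath of the already-running remote context; classes that the remote JVM has already loaded are not replaced, so a same-named class from the first jar can keep winning. A sketch of the submission sequence, assuming the PiJob class from the official example; the Livy URL and jar paths are illustrative:

```java
import java.io.File;
import java.net.URI;
import org.apache.livy.LivyClient;
import org.apache.livy.LivyClientBuilder;

public class UploadExample {
    public static void main(String[] args) throws Exception {
        LivyClient client = new LivyClientBuilder()
                .setURI(new URI("http://livy-host:8998")) // illustrative URL
                .build();
        try {
            // uploadJar() is asynchronous; wait for completion before
            // submitting jobs that reference classes in that jar.
            client.uploadJar(new File("/path/to/old.jar")).get();
            Double pi = client.submit(new PiJob(1000)).get();
            System.out.println("Pi is roughly " + pi);

            // A second uploadJar() adds new.jar to the remote classpath,
            // but classes already loaded from old.jar are not reloaded:
            // a class with the same name keeps resolving to the old code.
            client.uploadJar(new File("/path/to/new.jar")).get();
        } finally {
            client.stop(true); // also shut down the remote context
        }
    }
}
```

If the replacement class has the same fully qualified name as one in the old jar, a new session (new client) is the reliable way to pick up the new code.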

In yarn mode, can a resident Livy context avoid the time lost to repeated resource allocation?

2020-03-05 Thread adley
Hi:
By using Livy with Spark, can I achieve an effect similar to Llama (Impala)
and save the time spent on YARN container resource allocation?
Reference link: https://cloudera.github.io/llama/
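In broad terms, a long-lived Livy session corresponds to one YARN application: the driver and executor containers are allocated once, when the session starts, and every job submitted afterwards runs in that warm context without paying YARN allocation cost again. A minimal sketch of this reuse, assuming the programmatic API; the URL and the WordCountJob class are illustrative, and the timing comment reflects the expected behavior rather than measured numbers:

```java
import java.net.URI;
import org.apache.livy.LivyClient;
import org.apache.livy.LivyClientBuilder;

public class ResidentContextExample {
    public static void main(String[] args) throws Exception {
        // Session creation is the expensive step: this is where the YARN
        // application, driver, and executors are allocated.
        LivyClient client = new LivyClientBuilder()
                .setURI(new URI("http://livy-host:8998")) // illustrative URL
                .build();
        try {
            long t0 = System.nanoTime();
            client.submit(new WordCountJob("hdfs:///data/a")).get();
            long t1 = System.nanoTime();
            // The second job reuses the already-running executors; no new
            // container allocation is expected.
            client.submit(new WordCountJob("hdfs:///data/b")).get();
            long t2 = System.nanoTime();
            System.out.printf("first: %d ms, second: %d ms%n",
                    (t1 - t0) / 1_000_000, (t2 - t1) / 1_000_000);
        } finally {
            client.stop(true);
        }
    }
}
```

Unlike Llama, this does not pool resources across applications; it simply keeps one application's containers alive between jobs, so the saving applies only within that session.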