On Sun, May 25, 2014 at 5:43 PM, qingyang li <liqingyang1...@gmail.com> wrote:
Hi Mayur, thanks for replying.
I know a Spark application should take all cores by default. My question is how to set the task number on each core.
If one slice means one task, how can I set the slice file size?
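To the slice-size question: in Spark you do not set a slice's file size directly; you set the slice (partition) count when the RDD is created, and each slice becomes one task. A minimal Scala sketch, assuming a live SparkContext `sc` on a Spark 0.9/1.0-era cluster — the HDFS path and the partition counts here are made-up placeholders:

```scala
// Sketch only: requires a running Spark cluster and a SparkContext `sc`
// (e.g. inside spark-shell). Path and counts are hypothetical.

// Ask for at least 40 input slices when reading the file; the second
// argument is `minSplits` in Spark 0.9 and `minPartitions` in Spark 1.0.
val lines = sc.textFile("hdfs://namenode:9000/data/bigtable", 40)

// Shuffle operations also accept an explicit partition (task) count:
val counts = lines
  .flatMap(_.split("\\s+"))
  .map(word => (word, 1))
  .reduceByKey(_ + _, 40)   // run the reduce side as 40 tasks
```

Since Spark runs one task per partition per wave, "tasks per core" falls out of the partition count: with 5 machines x 4 cores = 20 cores, a common rule of thumb is 2-4 partitions per core (40-80 total).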
2014-05-23 16:37 GMT+08:00 Mayur Rustagi <mayur.rust...@gmail.com>:
How many cores do you see on your Spark master (8080 port) interface, available at port 1. By default Spark …
I am using Tachyon as the storage system and Shark to query a table, which is a big table. I have 5 machines as a Spark cluster, with 4 cores on each machine.
My question is:
1. how to set the task number on each core?
2. where to see how many partitions one RDD has?
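On question 2, the partition count can be read straight off the RDD; a short Scala sketch, again assuming an existing SparkContext `sc` and a placeholder HDFS path:

```scala
// Sketch only: assumes a SparkContext `sc` (e.g. in spark-shell or Shark).
val rdd = sc.textFile("hdfs://namenode:9000/data/bigtable")

// Number of partitions of this RDD; each partition runs as one task.
println(rdd.partitions.size)

// Cluster-wide default partition count used by shuffles when none is
// given explicitly (controlled by the spark.default.parallelism setting):
println(sc.defaultParallelism)
```

The same number is visible in the application web UI: each stage's task count equals the partition count of the RDD it computes.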