> 1
You have 4 CPU cores and 34 threads (system-wide, you likely have many more,
by the way).
Think of it as having 4 espresso machines and 34 baristas. Does the fact
that you have only 4 espresso machines mean you can only have 4 baristas?
Of course not; there is plenty of work other than making espresso, like
foaming the milk, talking to customers (IO), etc. The baristas just have to
take turns using the espresso machines, which is managed by the OS.
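
For what it's worth, you can see the two numbers directly from the JVM. A
minimal sketch (plain Scala, using the standard java.lang.management API;
the object name is just a placeholder):

    import java.lang.management.ManagementFactory

    object ThreadsVsCores {
      def main(args: Array[String]): Unit = {
        // The "espresso machines": logical CPUs the OS can schedule onto.
        val cores = Runtime.getRuntime.availableProcessors()
        // The "baristas": live threads in this JVM right now.
        val threads = ManagementFactory.getThreadMXBean.getThreadCount
        println(s"cores = $cores, live JVM threads = $threads")
      }
    }

Most of those threads are asleep or waiting on IO at any given moment, so
the count can comfortably exceed the core count.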

> 2
Imagine you are making 100 cups and 10K cups of coffee, respectively. If
you have 4 espresso machines, what's the most sensible thing to do?
Probably just to use all 4 machines in both cases; the bigger order simply
keeps them busy longer.
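
Concretely, in Spark the number of concurrent task threads is set by the
master URL (e.g. local[4]), not by the input size. A rough sketch, assuming
a local-mode setup like yours (the app name and dataset sizes are just
placeholders):

    import org.apache.spark.{SparkConf, SparkContext}

    // Local mode with 4 worker threads -- the "4 espresso machines".
    val conf = new SparkConf().setAppName("thread-demo").setMaster("local[4]")
    val sc = new SparkContext(conf)

    val small = sc.parallelize(1 to 100)     // 100 cups
    val large = sc.parallelize(1 to 10000)   // 10K cups

    // Both run on at most 4 concurrent task threads; the bigger dataset
    // just keeps those threads busy longer (across more tasks).
    println(small.map(_ * 2).count())
    println(large.map(_ * 2).count())

    sc.stop()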

On Sat, Feb 7, 2015 at 10:14 AM, Deep Pradhan <pradhandeep1...@gmail.com>
wrote:

> Hi,
> I am using the YourKit tool to profile Spark jobs that are run on my
> single-node Spark cluster.
> When I see the YourKit UI Performance Charts, the thread count always
> remains at
> All threads: 34
> Daemon threads: 32
>
> Here are my questions:
>
> 1. My system can run only 4 threads simultaneously, and obviously it does
> not have 34 hardware threads. What do the 34 threads mean?
>
> 2. I tried running the same job with four different datasets, two small
> and two relatively big. But in the UI the thread count increases by two,
> irrespective of data size. Does this mean that the framework does not
> adjust the number of threads allocated to each job based on data size?
>
> Thank You
>
