--
*VJ Anand*
*Founder *
*Sankia*
vjan...@sankia.com
925-640-1340
www.sankia.com
MODULE$.apply(Tuple.class);
}
Compiler error: no suitable constructor found for RDD (SparkContext, Seq,
ClassTag)
Any thoughts or pointers?
Thanks
VJ
Hi - Is there a design document for the operations that have been
implemented in 1.4.0? If so, where can I find them?
-VJ
On Sun, Oct 11, 2015 at 7:27 PM, Cheng, Hao <hao.ch...@intel.com> wrote:
> Yes, I think the SPARK-2211 should be the right place to follow the CBO
> stuff,
reflection? Any approach to
solve this?
VJ
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/DataFrame-with-bean-class-tp24970.html
Sent from the Apache Spark User List mailing list archive at Nabble.com
leverage existing RDD?
Any pointers appreciated
Thanks
VJ
--
View this message in context:
http://apache-spark-user-list.1001560.n3.nabble.com/Building-RDD-for-a-Custom-MPP-Database-tp24934.html
Sent from the Apache Spark User List mailing list archive at Nabble.com
question is: if I subclass or extend this RDD,
can I override getPartitions and the other methods? Or is there any other
alternative? Any help or pointers much appreciated.
Thanks
VJ
Hi, please can someone advise on this?
On Wed, Sep 17, 2014 at 6:59 PM, VJ Shalish vjshal...@gmail.com wrote:
I am trying to benchmark Spark in a Hadoop cluster.
I need to design a sample Spark job to test the CPU utilization, RAM
usage, input throughput, output throughput and duration.
Similarly, you can get any metric of a process once you have the PID.
Thanks
Best Regards
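For reference, on Java 9+ the JVM can report its own PID from the standard library, with no external dependency. A minimal sketch; the class name PidProbe is just illustrative:

```java
// Minimal sketch: read the current JVM's PID via the standard library
// (ProcessHandle, available since Java 9). The PID can then be fed to
// OS tools (top, /proc/<pid>/...) to collect per-process metrics.
public class PidProbe {
    public static long currentPid() {
        return ProcessHandle.current().pid();
    }

    public static void main(String[] args) {
        System.out.println("pid: " + currentPid());
    }
}
```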
On Wed, Sep 17, 2014 at 8:59 AM, VJ Shalish vjshal...@gmail.com wrote:
Sorry for the confusion, team.
My requirement is to measure the CPU utilisation, RAM usage, network IO
and other metrics of a Spark job.
I am trying to benchmark Spark in a Hadoop cluster.
I need to design a sample Spark job to test the CPU utilization, RAM usage,
input throughput, output throughput and duration of execution in the
cluster.
I need to test the state of the cluster for:
A Spark job which uses high CPU
A Spark
Hi
I need to get the CPU utilisation, RAM usage, network IO and other metrics
from a Java program. Can anyone help me with this?
Thanks
Shalish.
Check out the SIGAR API. It
lets you get CPU, memory, network, filesystem and process based metrics.
Amit
On Sep 16, 2014, at 20:14, VJ Shalish vjshal...@gmail.com wrote:
Hi
I need to get the CPU utilisation, RAM usage, network IO and other
metrics from a Java program. Can anyone help me with this?
Thanks
Shalish.
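Besides SIGAR, the JVM's own management beans can report some of these numbers from plain Java with no extra dependency. A minimal sketch (the class name MetricsProbe is illustrative; getProcessCpuLoad needs a HotSpot/OpenJDK JVM and may return a negative value when no reading is available yet):

```java
import java.lang.management.ManagementFactory;

// Minimal sketch of JVM-local metrics via the JMX management beans.
// This only sees the JVM it runs in; cluster-wide CPU/network/disk
// numbers still need an external monitor (SIGAR, Ganglia, Spark's
// metrics sinks, etc.).
public class MetricsProbe {
    // Heap memory currently in use by this JVM, in bytes.
    public static long usedHeapBytes() {
        return ManagementFactory.getMemoryMXBean()
                .getHeapMemoryUsage().getUsed();
    }

    // Recent CPU load of this JVM process in [0.0, 1.0];
    // negative if the platform cannot provide a reading.
    public static double processCpuLoad() {
        java.lang.management.OperatingSystemMXBean os =
                ManagementFactory.getOperatingSystemMXBean();
        if (os instanceof com.sun.management.OperatingSystemMXBean) {
            return ((com.sun.management.OperatingSystemMXBean) os)
                    .getProcessCpuLoad();
        }
        return -1.0;
    }

    public static void main(String[] args) {
        System.out.println("used heap (bytes): " + usedHeapBytes());
        System.out.println("process cpu load:  " + processCpuLoad());
    }
}
```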