Subject: Re: Calculating Min and Max Values using Spark Transformations?
To: as...@wso2.com
CC: user@spark.apache.org
From: jfc...@us.ibm.com
Date: Fri, 28 Aug 2015 09:28:43 -0700
Yes, absolutely. Take a look at:
https://spark.apache.org/docs/1.4.1/mllib-statistics.html#summary-statistics
On Fri, Aug 28, 2015 at 8:39 AM, ashensw as...@wso2.com wrote:
Hi all,
I have a dataset which consists of a large number of features (columns). It is
in csv format, so I loaded it into a dataframe.
--
If you already loaded csv data into a dataframe, why not register it as a
table, and use Spark SQL to find max/min or any other aggregates?
SELECT MAX(column_name) FROM dftable_name ... seems natural.
Or RDD.max() and RDD.min() won't work for you?
Yong
... non-null and non-NA elements as well.
Burak
On Fri, Aug 28, 2015 at 10:09 AM, java8964 java8...@hotmail.com wrote:
Or RDD.max() and RDD.min() won't work for you?
Yong