: Spark SQL. Memory consumption
From: masfwo...@gmail.com
To: user@spark.apache.org
Hi.
I'm using Spark SQL 1.2. I have this query:
CREATE TABLE test_MA STORED AS PARQUET AS
SELECT
field1
,field2
,field3
,field4
,field5
,COUNT(1) AS field6
,MAX(field7)
,MIN(field8)
,SUM(field9 / 100)
,COUNT(field10)
,SUM(IF(field11 -500, 1, 0))
,MAX(field12)
,SUM(IF(field13 = 1, 1, 0))
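For context on where the memory goes: a GROUP BY with this many aggregates is typically executed with hash-based aggregation, which keeps one aggregation buffer per distinct grouping key, so memory grows with the number of groups rather than the number of rows. A minimal Python sketch of that pattern (illustrative only, not Spark's implementation; the two-field key and the sample rows are invented, and only a subset of the query's aggregates is mirrored):

```python
# Sketch (not Spark source): hash aggregation keeps one buffer per distinct
# grouping key, so memory scales with group cardinality, not row count.
# Field names follow the query; sample data is invented for illustration.

def aggregate(rows):
    # key -> [count, max_f7, min_f8, sum_f9_div_100, count_f10, sum_if_f13]
    buffers = {}
    for r in rows:
        key = (r["field1"], r["field2"])          # stand-in for field1..field5
        buf = buffers.get(key)
        if buf is None:
            buf = [0, float("-inf"), float("inf"), 0.0, 0, 0]
            buffers[key] = buf
        buf[0] += 1                               # COUNT(1)
        buf[1] = max(buf[1], r["field7"])         # MAX(field7)
        buf[2] = min(buf[2], r["field8"])         # MIN(field8)
        buf[3] += r["field9"] / 100               # SUM(field9 / 100)
        buf[4] += r["field10"] is not None        # COUNT(field10)
        buf[5] += 1 if r["field13"] == 1 else 0   # SUM(IF(field13 = 1, 1, 0))
    return buffers

rows = [
    {"field1": "a", "field2": 1, "field7": 5, "field8": 2, "field9": 300,
     "field10": None, "field13": 1},
    {"field1": "a", "field2": 1, "field7": 9, "field8": 7, "field9": 100,
     "field10": "x", "field13": 0},
    {"field1": "b", "field2": 2, "field7": 4, "field8": 3, "field9": 200,
     "field10": "y", "field13": 1},
]
result = aggregate(rows)
print(len(result))          # → 2: one buffer per distinct key
print(result[("a", 1)])     # → [2, 9, 2, 4.0, 1, 1]
```

Each extra aggregate widens every per-group buffer, so a high-cardinality key combined with this many aggregation columns can drive executor memory up quickly.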
…, but that's still ongoing.
Cheng Hao
From: Masf [mailto:masfwo...@gmail.com]
Sent: Thursday, April 2, 2015 11:47 PM
To: user@spark.apache.org
Subject: Spark SQL. Memory consumption