jeanlyn created SPARK-7885:
------------------------------

             Summary: add config to control map aggregation in spark sql
                 Key: SPARK-7885
                 URL: https://issues.apache.org/jira/browse/SPARK-7885
             Project: Spark
          Issue Type: Improvement
    Affects Versions: 1.3.1, 1.2.2, 1.2.0
            Reporter: jeanlyn
Currently, *execution.HashAggregation* adds a map-side (partial) aggregation in order to reduce the amount of shuffled data. However, we hit GC problems when this optimization kicks in, and the executors eventually crash. For example:

{noformat}
select sale_ord_id as order_id,
       coalesce(sum(sku_offer_amount), 0.0) as sku_offer_amount,
       coalesce(sum(suit_offer_amount), 0.0) as suit_offer_amount,
       coalesce(sum(flash_gp_offer_amount), 0.0) + coalesce(sum(gp_offer_amount), 0.0) as gp_offer_amount,
       coalesce(sum(flash_gp_offer_amount), 0.0) as flash_gp_offer_amount,
       coalesce(sum(full_minus_offer_amount), 0.0) as full_rebate_offer_amount,
       0.0 as telecom_point_offer_amount,
       coalesce(sum(coupon_pay_amount), 0.0) as dq_and_jq_pay_amount,
       coalesce(sum(jq_pay_amount), 0.0) + coalesce(sum(pop_shop_jq_pay_amount), 0.0) + coalesce(sum(lim_cate_jq_pay_amount), 0.0) as jq_pay_amount,
       coalesce(sum(dq_pay_amount), 0.0) + coalesce(sum(pop_shop_dq_pay_amount), 0.0) + coalesce(sum(lim_cate_dq_pay_amount), 0.0) as dq_pay_amount,
       coalesce(sum(gift_cps_pay_amount), 0.0) as gift_cps_pay_amount,
       coalesce(sum(mobile_red_packet_pay_amount), 0.0) as mobile_red_packet_pay_amount,
       coalesce(sum(acct_bal_pay_amount), 0.0) as acct_bal_pay_amount,
       coalesce(sum(jbean_pay_amount), 0.0) as jbean_pay_amount,
       coalesce(sum(sku_rebate_amount), 0.0) as sku_rebate_amount,
       coalesce(sum(yixun_point_pay_amount), 0.0) as yixun_point_pay_amount,
       coalesce(sum(sku_freight_coupon_amount), 0.0) as freight_coupon_amount
from ord_at_det_di
where ds = '2015-05-20'
group by sale_ord_id
{noformat}

This query scans two text files of 360MB each; we run 6 executors, each with 8GB of memory and 2 CPU cores. We could add a config to control (i.e., disable) map-side aggregation to avoid this.

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)
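The proposed switch could be sketched as a guard in the planner. This is a minimal sketch only, not Spark's actual code: the flag name `mapSideAggregationEnabled` is hypothetical (no such config exists yet, which is the point of this issue), and the plan nodes below are simplified stand-ins for Spark's real execution operators.

```scala
// Sketch of how a planning strategy might consult a hypothetical flag
// before emitting a map-side (partial) aggregate.
object HashAggregationSketch {
  // Hypothetical config holder; stands in for Spark's SQLConf.
  case class SQLConf(mapSideAggregationEnabled: Boolean)

  sealed trait Plan
  // Combine on the map side first, then shuffle and finalize.
  case class PartialThenFinalAggregate(groupBy: Seq[String]) extends Plan
  // Shuffle raw rows and aggregate once on the reduce side.
  case class ShuffleThenAggregate(groupBy: Seq[String]) extends Plan

  def plan(groupBy: Seq[String], conf: SQLConf): Plan =
    if (conf.mapSideAggregationEnabled)
      PartialThenFinalAggregate(groupBy) // today's behavior: hash map per task
    else
      ShuffleThenAggregate(groupBy)      // skip the in-memory hash map entirely
}
```

With the flag off, the planner would go straight to a shuffle-then-aggregate plan, trading extra shuffle traffic for bounded per-task memory, which is the better trade when the grouping key (here `sale_ord_id`) is nearly unique and map-side combining saves almost nothing.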