Authorization Support (on all operations, not only DDL) in Spark SQL

2016-04-26 Thread our...@cnsuning.com
hi rxin, will Spark SQL support authorization beyond DDL? In my use case, a Hive table was granted read access to userA and other users don't have permission to read it, but userB can read this Hive table using Spark SQL. Ricky Ou

spark-sql [1.4.0] not compatible with Hive SQL when using 'in' with date_sub or regexp_replace

2016-01-25 Thread our...@cnsuning.com
hi all, when migrating Hive SQL to Spark SQL I encountered an incompatibility problem. Please give me some suggestions. Hive table description and data format are as follows: use spark; drop table spark.test_or1; CREATE TABLE `spark.test_or1` (`statis_date` string, `lbl_nm` string) …
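The failing query is truncated in the archive, so the following is a hedged illustration only, not the poster's actual SQL. One common workaround when `in` over a computed expression behaves differently across engines is to precompute the expression on the driver and inline it as a literal. A minimal pure-Scala sketch of Hive's date_sub semantics (assumed yyyy-MM-dd string in and out), used to build the SQL text up front:

```scala
import java.time.LocalDate

// Sketch, not the poster's actual query (the archive truncates it).
// Hive's date_sub(date, n) returns a yyyy-MM-dd string; computing it up
// front lets the Spark SQL text contain only literals, sidestepping any
// engine difference in how `in (<expression>)` is evaluated.
def dateSub(date: String, days: Int): String =
  LocalDate.parse(date).minusDays(days).toString

val cutoff = dateSub("2016-01-25", 1) // "2016-01-24"
val sql =
  s"select lbl_nm from spark.test_or1 where statis_date in ('$cutoff')"
```

The same precompute-then-inline trick applies to regexp_replace when only its arguments, not its input column, vary.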

Re: Re: --driver-java-options does not support multiple JVM options?

2016-01-20 Thread our...@cnsuning.com
…X:GCTimeLimit=5,-XX:GCHeapFreeLimit=95' From: Marcelo Vanzin Date: 2016-01-21 12:09 To: our...@cnsuning.com CC: user Subject: Re: --driver-java-options not support multiple JVM configuration ? On Wed, Jan 20, 2016 at 7:38 PM, our...@cnsuning.com wrote: > --driver-java-options $sparkdriverextraJavaOpti…

--driver-java-options does not support multiple JVM options?

2016-01-20 Thread our...@cnsuning.com
hi all; --driver-java-options does not support multiple JVM options. The submit was as follows: Cores=16 sparkdriverextraJavaOptions="-XX:newsize=2096m -XX:MaxPermSize=512m -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+UseParNewGC -XX:+UseConcMarkSweepGC -XX:CMSInitiatingOccupancyFr…
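Marcelo's reply is truncated above, so as a hedged sketch (not necessarily his exact advice): multiple JVM flags must reach spark-submit as a single argument, which usually means quoting the shell variable ("$sparkdriverextraJavaOptions" — an unquoted expansion splits at spaces and spark-submit sees only the first flag), or setting the standard Spark 1.x property in spark-defaults.conf so no shell quoting is involved at all. The flag values below are the poster's, trimmed:

```properties
# conf/spark-defaults.conf — all driver JVM flags on one line, space-separated
spark.driver.extraJavaOptions  -XX:MaxPermSize=512m -XX:+PrintGCDetails -XX:+PrintGCTimeStamps -XX:+UseParNewGC -XX:+UseConcMarkSweepGC
```

With the property set here, spark-submit needs no --driver-java-options at all.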

Re: Re: spark streaming updateStateByKey state does not support types without a ClassTag, such as List?

2015-12-22 Thread our...@cnsuning.com
…ive count of the words) // wordDstream.updateStateByKey(newUpdateFunc, new HashPartitioner(ssc.sparkContext.defaultParallelism), true, initialRDD) val stateDstream = wordDstream.updateStateByKey[Seq[Int]](newUpdateFunc, new HashPartitioner(ssc.sparkContext.defaultParallelism), true, initialRDD) stateDstream.print() ssc.sta…
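The thread's resolution — spelling out the state type as Seq[Int] so the compiler can supply the ClassTag that updateStateByKey[S: ClassTag] requires — can be sketched as below. The names mirror the snippet but are reconstructed, since the archive truncates the code; the update function itself is pure, so it is shown standalone and the streaming wiring is left as a comment (it needs a live StreamingContext):

```scala
// State is a Seq[Int]. updateStateByKey[S: ClassTag] needs the state type
// written explicitly (updateStateByKey[Seq[Int]](...)) so Scala can
// materialize the ClassTag; with an inferred or generic type the implicit
// is missing and compilation fails.
val newUpdateFunc: (Seq[Int], Option[Seq[Int]]) => Option[Seq[Int]] =
  (values, state) => Some(state.getOrElse(Seq.empty[Int]) ++ values)

// Streaming wiring (sketch; requires a running StreamingContext `ssc`):
// val stateDstream = wordDstream.updateStateByKey[Seq[Int]](newUpdateFunc)

// Pure behaviour of the update function:
val step1 = newUpdateFunc(Seq(1, 2), None)   // state becomes Seq(1, 2)
val step2 = newUpdateFunc(Seq(3), step1)     // state becomes Seq(1, 2, 3)
```

The thread's 4-argument overload (partitioner, rememberPartitioner, initialRDD) works the same way once the [Seq[Int]] type argument is given.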

Re: Re: spark streaming updateStateByKey state does not support types without a ClassTag, such as List?

2015-12-22 Thread our...@cnsuning.com
…lean, org.apache.spark.rdd.RDD[(String, Seq[Int])]) val stateDstream = wordDstream.updateStateByKey[Seq[Int]](newUpdateFunc, … Ricky Ou (欧锐) Department: Suning Commerce IT HQ, Technology Support R&D Center, Big Data Center, Data Platform Development Department tel: 18551600418 email: our...@cnsuning.com From: Dean Wampler Date: 2015-12-23 00:46

Re: Re: spark streaming updateStateByKey state does not support types without a ClassTag, such as List?

2015-12-22 Thread our...@cnsuning.com
So sorry, it should be Seq, not sql. Thanks for your help. Ricky Ou (欧锐) From: Dean Wampler Date: 2015-12-23 00:46 To: our...@cnsuning.com CC: user; t...@databricks.com Subject: Re: spark streaming updateStateByKey state is nonsupport other type except ClassTag such as list? There are…

spark streaming updateStateByKey state does not support types without a ClassTag, such as List?

2015-12-21 Thread our...@cnsuning.com
spark streaming updateStateByKey state does not support Array type without a ClassTag? How to solve the problem? def updateStateByKey[S: ClassTag](updateFunc: (Seq[V], Option[S]) => Option[S]): DStream[(K, S)] = ssc.withScope { updateStateByKey(updateFunc, defaultPartitioner()) } ClassTag not…

spark sql throws java.lang.ArrayIndexOutOfBoundsException when using table.*

2015-11-29 Thread our...@cnsuning.com
hi all, a java.lang.ArrayIndexOutOfBoundsException is thrown when I use the following Spark SQL on Spark standalone or YARN. The SQL: select ta.* from bi_td.dm_price_seg_td tb join bi_sor.sor_ord_detail_tf ta on 1 = 1 where ta.sale_dt = '20140514' and ta.sale_price >= tb.pri_from a…

Re: Re: --jars option using HDFS jars has no effect in Spark standalone cluster deploy mode

2015-11-03 Thread our...@cnsuning.com
Akhil, locally, all nodes will have the same jar because the driver will be assigned to a random node; otherwise the driver log will report: no jar was found. Ricky Ou (欧锐) From: Akhil Das Date: 2015-11-02 17:59 To: our...@cnsuning.com CC: user; 494165115

--jars option using HDFS jars has no effect in Spark standalone cluster deploy mode

2015-10-27 Thread our...@cnsuning.com
hi all, when using the command: spark-submit --deploy-mode cluster --jars hdfs:///user/spark/cypher.jar --class com.suning.spark.jdbc.MysqlJdbcTest hdfs:///user/spark/MysqlJdbcTest.jar the program throws an exception that it cannot find a class in cypher.jar; the driver log shows no --jars download…
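The thread is cut off before any resolution, so the following is only a hedged workaround sketch: in standalone cluster mode, older Spark releases did not download --jars for the driver, so one common approach is to pre-place the dependency at the same path on every worker and reference it with a local: URL. The jar names and class are the poster's, reused for illustration; the /home/spark/lib path is invented:

```shell
# Assumption: cypher.jar has been copied to /home/spark/lib/ on every
# worker node first (e.g. via scp), since standalone cluster mode may not
# fetch --jars from HDFS for the driver.
spark-submit \
  --deploy-mode cluster \
  --jars local:///home/spark/lib/cypher.jar \
  --class com.suning.spark.jdbc.MysqlJdbcTest \
  hdfs:///user/spark/MysqlJdbcTest.jar
```

The local: scheme tells Spark the file already exists on each node, so nothing is shipped or downloaded.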

Re: sometimes No event logs found for application using same JavaSparkSQL example

2015-09-25 Thread our...@cnsuning.com
https://issues.apache.org/jira/browse/SPARK-10832 From: our...@cnsuning.com Sent: 2015-09-25 20:36 To: user CC: 494165115 Subject: sometimes No event logs found for application using same JavaSparkSQL example hi all, when using the JavaSparkSQL example, the code was submitted many times as…

sometimes No event logs found for application using same JavaSparkSQL example

2015-09-25 Thread our...@cnsuning.com
hi all, when using the JavaSparkSQL example, the code was submitted many times as follows: /home/spark/software/spark/bin/spark-submit --deploy-mode cluster --class org.apache.spark.examples.sql.JavaSparkSQL hdfs://SuningHadoop2/user/spark/lib/spark-examples-1.4.0-hadoop2.4.0.jar unfortunately, s…

Re: Re: Job aborted due to stage failure: java.lang.StringIndexOutOfBoundsException: String index out of range: 18

2015-08-28 Thread our...@cnsuning.com
…org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48) From: Terry Hole Date: 2015-08-28 17:22 To: our...@cnsuning.com CC: user; hao.cheng; Huang, Jie Subject: Re: Job aborted due to stage failure: java.lang.StringIndexOutOfBoundsException: String index out of range: 18 Ricky, You may need to use map…
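Terry's advice is cut off right after "use map", so the following is a hedged reconstruction, not his actual code. A StringIndexOutOfBoundsException: 18 typically comes from substring on records shorter than expected, and the usual fix is a map (or flatMap) that guards the length before slicing. The field offsets and sample lines below are invented for illustration:

```scala
// Hypothetical fixed-width parse: take characters [0,8) and [8,18) of each
// line, but only when the line is long enough — short lines are dropped
// instead of throwing StringIndexOutOfBoundsException: 18.
def parseLine(line: String): Option[(String, String)] =
  if (line.length >= 18) Some((line.substring(0, 8), line.substring(8, 18)))
  else None

val lines  = Seq("20140514ORDER00001rest", "short")
val parsed = lines.flatMap(parseLine) // the short line is skipped

// In Spark the same guard runs per record: rdd.flatMap(parseLine)
```

Returning Option from the parser keeps malformed records out of the job instead of failing the whole stage.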

Job aborted due to stage failure: java.lang.StringIndexOutOfBoundsException: String index out of range: 18

2015-08-28 Thread our...@cnsuning.com
…DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1450) at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:1411) at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:48) Ricky Ou (欧锐) Department: Suning Commerce IT HQ, Technology Support R&D Center, Big Data Center, Data Platform Development Department email: our...@cnsuning.com