Unfortunately, I think the SQLParser is not thread-safe.  I would recommend
using HiveQL.
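
If you must keep issuing concurrent queries through a single SQLContext in the
meantime, one possible workaround (a minimal sketch only, not a tested fix;
the `SerializedSql` wrapper and its names are my own, and an existing
SQLContext plus the UNIT_TEST table from the thread are assumed) is to
serialize just the parsing step by funneling all sql() calls through one lock:

```scala
// Sketch only: serialize calls to sqlContext.sql() so the (apparently
// non-thread-safe) parser is never entered by two threads at once.
// Assumes an existing org.apache.spark.sql.SQLContext and a registered
// table UNIT_TEST, as described earlier in this thread.
object SerializedSql {
  private val parserLock = new Object

  // Every thread routes its SQL text through this method. Only the
  // parse/plan step is serialized; the returned DataFrame can still be
  // evaluated concurrently afterwards.
  def sql(sqlContext: org.apache.spark.sql.SQLContext, query: String) =
    parserLock.synchronized {
      sqlContext.sql(query)
    }
}

// Usage from multiple threads:
//   val df = SerializedSql.sql(sqlContext, "select id, ext.d from UNIT_TEST")
//   df.collect()
```

This trades parser concurrency for correctness, which is usually acceptable
since parsing is cheap compared to executing the query.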

On Thu, Apr 30, 2015 at 4:07 AM, Wangfei (X) <wangf...@huawei.com> wrote:

> Actually, this is a SQL parse exception; are you sure your SQL is right?
>
> Sent from my iPhone
>
> > On Apr 30, 2015, at 18:50, "Haopu Wang" <hw...@qilinsoft.com> wrote:
> >
> > Hi, in a test on Spark SQL 1.3.0, multiple threads issue SELECT
> > statements against the same SQLContext instance, but the exception below
> > is thrown, so it looks like SQLContext is NOT thread-safe? I think this
> > is not the desired behavior.
> >
> > ======
> >
> > java.lang.RuntimeException: [1.1] failure: ``insert'' expected but identifier select found
> >
> > select id ,ext.d from UNIT_TEST
> > ^
> >         at scala.sys.package$.error(package.scala:27)
> >         at org.apache.spark.sql.catalyst.AbstractSparkSQLParser.apply(AbstractSparkSQLParser.scala:40)
> >         at org.apache.spark.sql.SQLContext$$anonfun$2.apply(SQLContext.scala:130)
> >         at org.apache.spark.sql.SQLContext$$anonfun$2.apply(SQLContext.scala:130)
> >         at org.apache.spark.sql.SparkSQLParser$$anonfun$org$apache$spark$sql$SparkSQLParser$$others$1.apply(SparkSQLParser.scala:96)
> >         at org.apache.spark.sql.SparkSQLParser$$anonfun$org$apache$spark$sql$SparkSQLParser$$others$1.apply(SparkSQLParser.scala:95)
> >         at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:136)
> >         at scala.util.parsing.combinator.Parsers$Success.map(Parsers.scala:135)
> >         at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
> >         at scala.util.parsing.combinator.Parsers$Parser$$anonfun$map$1.apply(Parsers.scala:242)
> >         at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
> >         at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
> >         at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1$$anonfun$apply$2.apply(Parsers.scala:254)
> >         at scala.util.parsing.combinator.Parsers$Failure.append(Parsers.scala:202)
> >         at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
> >         at scala.util.parsing.combinator.Parsers$Parser$$anonfun$append$1.apply(Parsers.scala:254)
> >         at scala.util.parsing.combinator.Parsers$$anon$3.apply(Parsers.scala:222)
> >         at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
> >         at scala.util.parsing.combinator.Parsers$$anon$2$$anonfun$apply$14.apply(Parsers.scala:891)
> >         at scala.util.DynamicVariable.withValue(DynamicVariable.scala:57)
> >         at scala.util.parsing.combinator.Parsers$$anon$2.apply(Parsers.scala:890)
> >         at scala.util.parsing.combinator.PackratParsers$$anon$1.apply(PackratParsers.scala:110)
> >         at org.apache.spark.sql.catalyst.AbstractSparkSQLParser.apply(AbstractSparkSQLParser.scala:38)
> >         at org.apache.spark.sql.SQLContext$$anonfun$parseSql$1.apply(SQLContext.scala:134)
> >         at org.apache.spark.sql.SQLContext$$anonfun$parseSql$1.apply(SQLContext.scala:134)
> >         at scala.Option.getOrElse(Option.scala:120)
> >         at org.apache.spark.sql.SQLContext.parseSql(SQLContext.scala:134)
> >         at org.apache.spark.sql.SQLContext.sql(SQLContext.scala:915)
> >
> > -----Original Message-----
> > From: Cheng, Hao [mailto:hao.ch...@intel.com]
> > Sent: Monday, March 02, 2015 9:05 PM
> > To: Haopu Wang; user
> > Subject: RE: Is SQLContext thread-safe?
> >
> > Yes, it is thread-safe, at least it's supposed to be.
> >
> > -----Original Message-----
> > From: Haopu Wang [mailto:hw...@qilinsoft.com]
> > Sent: Monday, March 2, 2015 4:43 PM
> > To: user
> > Subject: Is SQLContext thread-safe?
> >
> > Hi, is it safe to use the same SQLContext to run SELECT operations from
> > different threads at the same time? Thank you very much!
> >
> > ---------------------------------------------------------------------
> > To unsubscribe, e-mail: user-unsubscr...@spark.apache.org For additional
> > commands, e-mail: user-h...@spark.apache.org
> >
> >
> >
>
