Re: Passing parameters to spark SQL
Yeah, that's what I thought. In this specific case, I'm porting some scripts over from an existing RDBMS platform. I had been porting them (slowly) to in-code notation with Python or Scala; however, to expedite my efforts (and presumably theirs, since I'm not doing this forever), I went down the SQL path. The problem is the loss of type information and the possibility of SQL injection. No biggie; it just means that where parameterized queries are in play, we'll have to write them out in code rather than in SQL.

Thanks,
Aaron

On Sun, Dec 27, 2015 at 8:06 PM, Michael Armbrust <mich...@databricks.com> wrote:
> The only way to do this for SQL is through the JDBC driver.
>
> However, you can use literal values without lossy/unsafe string
> conversions by using the DataFrame API. [...]
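P.S. For the archives, a minimal sketch of what "in code rather than in SQL" looks like with the DataFrame API (the table and column names below are made up for illustration):

    import org.apache.spark.sql.functions._
    import sqlContext.implicits._  // for the $"col" syntax

    // Values keep their native types; nothing is converted to SQL text.
    val minAge  = 21
    val country = "O'Brien's"  // treated as data, never parsed as SQL

    sqlContext.table("users")  // "users" is a hypothetical table
      .filter($"age" >= lit(minAge) && $"country" === lit(country))
      .select($"id", $"name")

Since the values travel as typed literals instead of interpolated text, both the type-loss and the injection problems go away.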
Passing parameters to spark SQL
Given a SQLContext (or HiveContext), is it possible to pass parameters to a query? There are several reasons why this makes sense, including loss of data type during conversion to string, SQL injection, etc.

But currently, it appears that SQLContext.sql() only takes a single parameter, which is a string.
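To make the concern concrete, the only option today is to splice values into the query text yourself, roughly like this (table and column names are hypothetical):

    // Everything is flattened into a string, so type information is lost,
    // and a value like "O'Brien" (or something malicious) lands raw in the SQL.
    val name = "O'Brien"
    val df = sqlContext.sql(s"SELECT * FROM users WHERE name = '$name'")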
Re: Passing parameters to spark SQL
You can do it using Scala string interpolation:
http://docs.scala-lang.org/overviews/core/string-interpolation.html

On Mon, Dec 28, 2015 at 5:11 AM, Ajaxx <ajack...@pobox.com> wrote:
> Given a SQLContext (or HiveContext), is it possible to pass parameters
> to a query? [...]

--
Best Regards

Jeff Zhang
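P.S. A minimal sketch of that approach (table and column names invented for illustration):

    val minAge = 21
    // The s-interpolator splices the value into the query text.
    val df = sqlContext.sql(s"SELECT id, name FROM users WHERE age >= $minAge")

This works well for trusted numeric values, but it only builds a string, so it does not by itself address the type-loss or injection concerns raised above.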
Re: Passing parameters to spark SQL
The only way to do this for SQL is through the JDBC driver.

However, you can use literal values without lossy/unsafe string conversions by using the DataFrame API. For example, to filter:

    import org.apache.spark.sql.functions._
    df.filter($"columnName" === lit(value))

On Sun, Dec 27, 2015 at 1:11 PM, Ajaxx <ajack...@pobox.com> wrote:
> Given a SQLContext (or HiveContext), is it possible to pass parameters
> to a query? [...]
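P.S. To expand on that slightly (a sketch only; df, the column names, and the values are assumed): lit() wraps the value in a typed Column, so a timestamp stays a timestamp and a string is never parsed as SQL:

    import java.sql.Timestamp
    import org.apache.spark.sql.functions._
    import sqlContext.implicits._  // for the $"col" syntax

    val cutoff = Timestamp.valueOf("2015-12-01 00:00:00")  // stays a Timestamp
    val name   = "O'Brien"                                  // no manual escaping needed

    df.filter($"created" >= lit(cutoff) && $"name" === lit(name))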