e$.error(package.scala:27)
at org.apache.spark.sql.catalyst.AbstractSparkSQLParser.parse(AbstractSparkSQLParser.scala:36)
From: Gourav Sengupta [mailto:gourav.sengu...@gmail.com]
Sent: 15 June 2017 19:35
To: Michael Mior
Cc: Jenkins, Mark (UK Guildford); user@spark.apache.org
Subject: Re: [SparkSQL] Escaping a quote
It might be something that I am saying wrong, but sometimes it may just make
sense to see the difference between ” and ":
<”> Dec 8221, Hex 201D, Octal 20035
<"> Dec 34, Hex 22, Octal 042
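 34, Hex 22, Octal 042">
A quick way to check which of the two characters a string actually contains (a hypothetical snippet, not from the thread) is to print the code points:

```scala
// Distinguish the curly right double quotation mark from the ASCII quote
val curly = '\u201D'    // ”
val straight = '"'      // "
println(curly.toInt)    // 8221
println(straight.toInt) // 34
```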
Regards,
Gourav
On Thu, Jun 15, 2017 at 6:45 PM, Michael Mior wrote:
Assuming the parameter to your UDF should be start"end (with a quote in the
middle), then you need to insert a backslash into the query (which must also
be escaped in your Scala code). So just add two extra backslashes before the
quote inside the string:
sqlContext.sql("SELECT * FROM mytable WHERE (mycolumn BETWEEN 1 AND 2) AND (myudfsearchfor(\"start\\\"end\"))")
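To make the two layers of escaping explicit, here is a minimal sketch in plain Scala (no Spark required; `mytable`, `mycolumn`, and `myudfsearchfor` are taken from the original query):

```scala
object EscapingDemo extends App {
  // In a Scala string literal, \\ yields one backslash and \" yields one
  // quote character, so \\\" yields the two characters \" at runtime.
  val query =
    "SELECT * FROM mytable WHERE (mycolumn BETWEEN 1 AND 2) AND (myudfsearchfor(\"start\\\"end\"))"

  // This prints the string the Spark SQL parser actually receives; the
  // parser then unescapes the remaining \" back to a plain quote.
  println(query)
}
```

Printing the string shows that the UDF argument reaches the SQL parser as "start\"end", i.e. one backslash survives the Scala layer and is consumed by the SQL layer.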
Hi,
I have a query sqlContext.sql("SELECT * FROM mytable WHERE (mycolumn BETWEEN 1
AND 2) AND (myudfsearchfor(\"start\"end\"))")
How should I escape the double quote so that it successfully parses?
I know I can use single quotes, but I do not want to since I may need to search
for a single quote as well.
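For reference, the single-quote route can still handle an embedded single quote, since the SQL parser also accepts a backslash escape inside single-quoted literals (a sketch under that assumption, reusing the table and UDF from the question):

```scala
// Single-quoted SQL string: the embedded single quote is backslash-escaped
// at the SQL layer, and that backslash is written as \\ in the Scala literal.
sqlContext.sql("SELECT * FROM mytable WHERE (myudfsearchfor('start\\'end'))")
```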