Hi everybody,
When trying to upgrade from Spark 1.1.1 to Spark 1.2.x (tried both 1.2.0
and 1.2.1) I encounter a weird error that never occurred before, about
which I'd kindly ask for any possible help.
In particular, all my Spark SQL queries fail with the following exception:
Would you mind providing the query? If it's confidential, could you
please help construct a query that reproduces this issue?
Cheng
On 3/18/15 6:03 PM, Roberto Coluccio wrote:
Hi everybody,
When trying to upgrade from Spark 1.1.1 to Spark 1.2.x (tried both 1.2.0 and 1.2.1) I encounter a
I suspect that you hit this bug:
https://issues.apache.org/jira/browse/SPARK-6250. Whether it bites depends
on the actual contents of your query.
Yin has opened a PR for this; although it's not merged yet, it should be a
valid fix: https://github.com/apache/spark/pull/5078
This fix will be included in 1.3.1.
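To make the failure mode discussed here concrete, below is a minimal sketch of the kind of query shape that can trip it: a column that happens to be named timestamp is referenced unquoted, and TIMESTAMP is also a SQL type keyword. The table and column names are hypothetical, invented purely for illustration:

```sql
-- Hypothetical schema: the case class behind this table declared a field
-- named "timestamp". Referencing it unquoted can confuse the parser,
-- since TIMESTAMP doubles as a type keyword:
SELECT timestamp, column1
FROM tableM
WHERE timestamp IS NOT NULL
```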
You know, I actually have one of my columns called timestamp! I guess that
may really be causing the problem reported in the bug you linked.
On Wed, Mar 18, 2015 at 3:37 PM, Cheng Lian lian.cs@gmail.com wrote:
I suspect that you hit this bug
https://issues.apache.org/jira/browse/SPARK-6250,
Hi Cheng, thanks for your reply.
The query is something like:
SELECT * FROM (
SELECT m.column1, IF(d.columnA IS NOT NULL, d.columnA, m.column2), ..., m.columnN
FROM tableD d RIGHT OUTER JOIN tableM m ON m.column2 = d.columnA
WHERE m.column2 != "None" AND d.columnA != ""
UNION ALL
SELECT
Hey Cheng, thank you so much for your suggestion! The problem was actually
a column/field called timestamp in one of the case classes. Once I
changed its name, everything worked out fine again. Let me say it was kinda
frustrating ...
Roberto
On Wed, Mar 18, 2015 at 4:07 PM, Roberto Coluccio wrote:
Hi Roberto,
For now, if the timestamp is a top-level column (not a field in a
struct), you can use backticks to quote the column name, like `timestamp`.
Thanks,
Yin
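Yin's workaround above can be sketched with hypothetical table/column names: wrapping the name in backticks tells the parser to treat it as a plain identifier rather than a keyword.

```sql
-- Hypothetical names; the backticks quote the identifier so the
-- parser no longer reads "timestamp" as a type keyword:
SELECT `timestamp`, column1
FROM tableM
WHERE `timestamp` IS NOT NULL
```

Note this only applies while timestamp is a top-level column; for a field nested in a struct, renaming the field (as Roberto did) remains the way out until the fix lands.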
On Wed, Mar 18, 2015 at 12:10 PM, Roberto Coluccio
roberto.coluc...@gmail.com wrote:
Hey Cheng, thank you so much for your