Hi,

I would like to discuss issue SPARK-29176, to see whether it is considered a bug and, if so, to sketch out a fix.

In short, the issue is that a valid inner join with a join condition can be optimized in such a way that no condition is left, while the join type remains INNER. CheckCartesianProducts then throws an exception. The join type should be changed to CROSS when the optimizer removes the condition like this.
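
To make this concrete, here is a rough sketch of the kind of query I mean. The snippet and the trigger are my own illustration, not necessarily the exact case from the ticket, and I am not certain this precise snippet reproduces it; it only shows the shape of the problem: the join is written with a condition, so it is not a cartesian product as written, but if the optimizer drops that condition without flipping the join type, CheckCartesianProducts later rejects the plan.

    import org.apache.spark.sql.SparkSession

    object Spark29176Sketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("SPARK-29176 sketch")
          .master("local[*]")
          .getOrCreate()
        import spark.implicits._

        // Both sides are filtered to a single value; constraint propagation
        // may then make the join condition redundant and remove it.
        val left  = spark.range(10).filter($"id" === 1)
        val right = spark.range(10).filter($"id" === 1)

        // Written as a proper INNER join with a condition ...
        val joined = left.join(right, left("id") === right("id"))

        // ... but if the condition is optimized away while the join type
        // stays INNER, CheckCartesianProducts throws when the plan is run.
        joined.explain(true)
        joined.show()
      }
    }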

I understand that setting spark.sql.crossJoin.enabled makes Spark accept such a plan without throwing this exception, but I don't think this work-around should be needed for a query that is valid as written.
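
For completeness, the work-around I mean is simply:

    spark.conf.set("spark.sql.crossJoin.enabled", "true")

(or the equivalent SET statement in SQL), but as said, a query that is a valid inner join as written should not require it.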

Please let me know what you think about this issue and how I could go about fixing it. It might affect more rules than the two mentioned in the Jira ticket.
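
To illustrate the kind of fix I have in mind, here is a very rough sketch. The rule below is hypothetical and not one of the rules from the ticket; the guard (a condition already folded to a literal true) is just a stand-in for whatever reasoning lets a real rule drop the condition. The point is only the second part of the rewrite: when the condition of an INNER join is removed, the join type should be changed to CROSS in the same step.

    import org.apache.spark.sql.catalyst.expressions.Literal
    import org.apache.spark.sql.catalyst.plans.{Cross, Inner}
    import org.apache.spark.sql.catalyst.plans.logical.{Join, LogicalPlan}
    import org.apache.spark.sql.catalyst.rules.Rule
    import org.apache.spark.sql.types.BooleanType

    // Hypothetical rule for illustration: removes a join condition that has
    // already been folded to a constant true, and flips the join type to
    // Cross in the same rewrite so CheckCartesianProducts does not reject
    // the resulting plan.
    object RemoveTrivialJoinCondition extends Rule[LogicalPlan] {
      def apply(plan: LogicalPlan): LogicalPlan = plan transform {
        case j: Join if j.joinType == Inner &&
            j.condition.contains(Literal(true, BooleanType)) =>
          j.copy(joinType = Cross, condition = None)
      }
    }

In the actual rules the change would of course live wherever they decide to drop the condition; the joinType change is the part I think is missing.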

Thanks,
Enrico
