Yuandong Song created SPARK-23689:
-------------------------------------

             Summary: Regression in Spark 2.2.1/2.3.0: query fails with 
org.apache.spark.sql.catalyst.errors.package$TreeNodeException
                 Key: SPARK-23689
                 URL: https://issues.apache.org/jira/browse/SPARK-23689
             Project: Spark
          Issue Type: Bug
          Components: SQL
    Affects Versions: 2.3.0, 2.2.1
         Environment: Spark standalone cluster with one master and two 
workers (250 GB memory each).

MySQL version: 5.7.
            Reporter: Yuandong Song


{code:java}
    // Register each TPC-H table from MySQL as a temporary view.
    String[] tables = {"supplier", "nation", "partsupp", "part", "lineitem"};
    for (String table : tables) {
        System.out.println(table);
        Dataset<Row> t1 = ss.read().format("jdbc")
                .option("useSSL", false)
                .option("driver", "com.mysql.jdbc.Driver")
                .option("url", "jdbc:mysql://172.16.50.104:19100/test_tpch_1g")
                .option("user", "dbscale")
                .option("password", "abc123")
                .option("dbtable", table)
                .load();
        t1.createOrReplaceTempView(table);
        t1.show();
    }

    Properties connProp = new Properties();
    connProp.put("driver", "com.mysql.jdbc.Driver");
    connProp.put("useSSL", "false");
    connProp.put("user", "dbscale");
    connProp.put("password", "abc123");

    // TPC-H Q20 inner query: a correlated scalar subquery against lineitem.
    String sqlstr = "SELECT ps_suppkey FROM partsupp WHERE ps_partkey IN "
            + "(SELECT p_partkey FROM part WHERE p_name LIKE 'dark%') AND ps_availqty > "
            + "(SELECT 0.5 * SUM(l_quantity) FROM lineitem "
            + "WHERE l_partkey = partsupp.ps_partkey AND l_suppkey = partsupp.ps_suppkey "
            + "AND l_shipdate >= '1993-01-01' AND l_shipdate < '1994-01-01')";
    ss.sql(sqlstr).show();{code}
I am using MySQL as the data source to run some TPC-H queries on Spark.

This code runs successfully on Spark 2.2.0.

But on Spark 2.2.1 and Spark 2.3.0, 
 [it throws an exception|https://i.stack.imgur.com/zoRoo.png]

I suspect SPARK-22472 introduced this regression.

[http://spark.apache.org/releases/spark-release-2-2-1.html]

How can I resolve this in Spark 2.3.0?
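For what it's worth, the correlated scalar subquery can be rewritten as an explicit aggregate join, which may sidestep the analyzer error (untested on Spark 2.2.1/2.3.0; this is a sketch, not a confirmed fix). The rewrite is equivalent here because when no lineitem rows match, the scalar subquery yields NULL, the {{>}} predicate is false, and the partsupp row is dropped, exactly as an inner join would drop it. A quick equivalence check of the two query shapes using SQLite with made-up toy rows (table and column names from the query above; the data values are invented):

```python
# Equivalence check: correlated scalar subquery (TPC-H Q20 inner query)
# vs. an explicit aggregate join. Uses SQLite with toy data, NOT Spark --
# it only verifies that the two query shapes return the same rows.
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.executescript("""
CREATE TABLE part(p_partkey INT, p_name TEXT);
CREATE TABLE partsupp(ps_partkey INT, ps_suppkey INT, ps_availqty INT);
CREATE TABLE lineitem(l_partkey INT, l_suppkey INT, l_quantity INT, l_shipdate TEXT);
INSERT INTO part VALUES (1, 'dark red'), (2, 'light blue');
INSERT INTO partsupp VALUES (1, 10, 100), (1, 11, 5), (2, 12, 50);
INSERT INTO lineitem VALUES
  (1, 10, 40, '1993-06-01'), (1, 10, 60, '1993-07-01'),
  (1, 11, 40, '1993-06-01'), (2, 12, 10, '1993-06-01');
""")

# Original shape: correlated scalar subquery.
original = """
SELECT ps_suppkey FROM partsupp
WHERE ps_partkey IN (SELECT p_partkey FROM part WHERE p_name LIKE 'dark%')
  AND ps_availqty > (SELECT 0.5 * SUM(l_quantity) FROM lineitem
                     WHERE l_partkey = partsupp.ps_partkey
                       AND l_suppkey = partsupp.ps_suppkey
                       AND l_shipdate >= '1993-01-01'
                       AND l_shipdate < '1994-01-01')
"""

# Rewritten shape: pre-aggregate lineitem per (partkey, suppkey), then join.
rewritten = """
SELECT ps.ps_suppkey
FROM partsupp ps
JOIN (SELECT l_partkey, l_suppkey, 0.5 * SUM(l_quantity) AS half_qty
      FROM lineitem
      WHERE l_shipdate >= '1993-01-01' AND l_shipdate < '1994-01-01'
      GROUP BY l_partkey, l_suppkey) lq
  ON lq.l_partkey = ps.ps_partkey AND lq.l_suppkey = ps.ps_suppkey
WHERE ps.ps_partkey IN (SELECT p_partkey FROM part WHERE p_name LIKE 'dark%')
  AND ps.ps_availqty > lq.half_qty
"""

a = sorted(cur.execute(original).fetchall())
b = sorted(cur.execute(rewritten).fetchall())
print(a, b)  # both queries should agree
```

On Spark the rewritten form would be submitted the same way via {{ss.sql(...)}}; the join avoids the correlated-subquery code path that the exception points at.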



--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
