[ https://issues.apache.org/jira/browse/SPARK-29492?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Dongjoon Hyun updated SPARK-29492:
----------------------------------
    Affects Version/s:     (was: 2.4.0)

> SparkThriftServer can't support a jar class as table serde class when executeStatement runs in sync mode
> --------------------------------------------------------------------------------------------------
>
>                 Key: SPARK-29492
>                 URL: https://issues.apache.org/jira/browse/SPARK-29492
>             Project: Spark
>          Issue Type: Improvement
>          Components: SQL
>    Affects Versions: 3.0.0
>            Reporter: angerszhu
>            Priority: Major
>
> Add a UT in HiveThriftBinaryServerSuite:
> {code}
> test("jar in sync mode") {
>   withCLIServiceClient { client =>
>     val user = System.getProperty("user.name")
>     val sessionHandle = client.openSession(user, "")
>     val confOverlay = new java.util.HashMap[java.lang.String, java.lang.String]
>     val jarFile = HiveTestJars.getHiveHcatalogCoreJar().getCanonicalPath
>     Seq(s"ADD JAR $jarFile",
>       "CREATE TABLE smallKV(key INT, val STRING)",
>       s"LOAD DATA LOCAL INPATH '${TestData.smallKv}' OVERWRITE INTO TABLE smallKV")
>       .foreach(query => client.executeStatement(sessionHandle, query, confOverlay))
>     client.executeStatement(sessionHandle,
>       """CREATE TABLE addJar(key string)
>         |ROW FORMAT SERDE 'org.apache.hive.hcatalog.data.JsonSerDe'
>       """.stripMargin, confOverlay)
>     client.executeStatement(sessionHandle,
>       "INSERT INTO TABLE addJar SELECT 'k1' as key FROM smallKV limit 1",
>       confOverlay)
>     val operationHandle = client.executeStatement(
>       sessionHandle,
>       "SELECT key FROM addJar",
>       confOverlay)
>     // Fetch result first time
>     assertResult(1, "Fetching result first time from next row") {
>       val rows_next = client.fetchResults(
>         operationHandle,
>         FetchOrientation.FETCH_NEXT,
>         1000,
>         FetchType.QUERY_OUTPUT)
>       rows_next.numRows()
>     }
>   }
> }
> {code}
> Running this test fails with a ClassNotFound error.
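A likely mechanism behind the failure (an assumption for illustration, not confirmed in this ticket) is that a jar added via ADD JAR is registered on the session's classloader, while a statement executed in sync mode runs on the calling thread, which never installs that classloader as its context classloader; classes visible only through the session loader then cannot be resolved. The minimal Scala sketch below only illustrates that a thread's context classloader is thread-local and is not picked up by a thread that does not set it explicitly; `TaggedLoader` and `ContextClassLoaderDemo` are hypothetical names, not Spark APIs:

```scala
// Illustration only: class visibility depends on which classloader a thread
// uses. This does NOT reproduce the Spark bug; it shows the mechanism
// suspected in SPARK-29492 - a loader installed on one (async worker)
// thread is not seen by another (sync caller) thread.
object ContextClassLoaderDemo {
  // A toy classloader standing in for the session classloader that
  // holds the classes from an ADD JAR.
  class TaggedLoader(parent: ClassLoader) extends ClassLoader(parent)

  def main(args: Array[String]): Unit = {
    val sessionLoader = new TaggedLoader(getClass.getClassLoader)

    // Async-style execution: the worker thread explicitly installs the
    // session classloader before running the statement, so lookups succeed.
    val worker = new Thread(() => {
      Thread.currentThread().setContextClassLoader(sessionLoader)
      assert(Thread.currentThread().getContextClassLoader eq sessionLoader)
    })
    worker.start()
    worker.join()

    // Sync-style execution: the calling thread keeps whatever context
    // classloader it already had; the session loader was never installed
    // here, so classes registered only there would not be found.
    assert(Thread.currentThread().getContextClassLoader ne sessionLoader)
    println("sync-mode thread does not see the session classloader")
  }
}
```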
-- This message was sent by Atlassian Jira (v8.3.4#803005) --------------------------------------------------------------------- To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org For additional commands, e-mail: issues-h...@spark.apache.org