[ https://issues.apache.org/jira/browse/SPARK-6546?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Cheng Lian updated SPARK-6546:
------------------------------
    Component/s:     (was: Build)
                     SQL
    Description: 
PR [#4289|https://github.com/apache/spark/pull/4289] used Guava's {{com.google.common.io.Files}}, as seen in the [first commit of that PR|https://github.com/jeanlyn/spark/blob/3b27af36f82580c2171df965140c9a14e62fd5f0/sql/hive/src/test/scala/org/apache/spark/sql/hive/InsertIntoHiveTableSuite.scala#L22]. However, [PR #5029|https://github.com/apache/spark/pull/5029] was merged earlier and deprecated Guava's {{Files}} in favor of Spark's own {{Utils}}. The two combined caused this build failure. (There are no conflicts in the eyes of Git, but there do exist semantic conflicts.)

  (was: wrong code: {{val tmpDir = Files.createTempDir()}} -- should be {{Utils}}, not {{Files}})

Target Version/s: 1.4.0
        Summary: Build failure caused by PR #5029 together with #4289  (was: Using the wrong code that will make Spark compilation fail!!)

> Build failure caused by PR #5029 together with #4289
> ----------------------------------------------------
>
>                 Key: SPARK-6546
>                 URL: https://issues.apache.org/jira/browse/SPARK-6546
>             Project: Spark
>          Issue Type: Bug
>          Components: SQL
>    Affects Versions: 1.4.0
>            Reporter: Pei, Zhongshuai
>            Assignee: Pei, Zhongshuai
>             Fix For: 1.4.0
>
>
> PR [#4289|https://github.com/apache/spark/pull/4289] used Guava's {{com.google.common.io.Files}}, as seen in the
> [first commit of that PR|https://github.com/jeanlyn/spark/blob/3b27af36f82580c2171df965140c9a14e62fd5f0/sql/hive/src/test/scala/org/apache/spark/sql/hive/InsertIntoHiveTableSuite.scala#L22].
> However, [PR #5029|https://github.com/apache/spark/pull/5029] was merged earlier and deprecated Guava's {{Files}} in
> favor of Spark's own {{Utils}}. The two combined caused this build failure. (There are no conflicts in the eyes of
> Git, but there do exist semantic conflicts.)

--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
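
For reference, a minimal Scala sketch of the semantic conflict described above. It assumes the pre-fix test code called Guava's {{Files.createTempDir()}} (as quoted in the old description) and that the post-#5029 convention is Spark's {{Utils.createTempDir()}}; the package, object, and method names below are illustrative, not the actual SPARK-6546 patch.

{code:scala}
// Utils is private[spark], so an illustrative snippet has to live under org.apache.spark.
package org.apache.spark.sql.hive

import java.io.File

import com.google.common.io.Files   // Guava helper used in PR #4289's first commit
import org.apache.spark.util.Utils  // Spark's own helper expected after PR #5029

object TempDirSketch {
  // Pre-fix style flagged by this issue: calls Guava directly.
  def guavaTempDir(): File = Files.createTempDir()

  // Post-fix style ("should Utils" per the old description): Spark creates the
  // temp directory and tracks it for cleanup at shutdown.
  def sparkTempDir(): File = Utils.createTempDir()
}
{code}

Because the fix is a one-line call-site switch in the test suite, Git sees no textual conflict between the two PRs; the failure only appears once both are merged, which is the semantic conflict described above.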