Hi

I get warnings when I try to build Spark 1.6.0, although overall I get a
SUCCESS message for all projects. Are these warnings expected, and are they
safe to ignore?

The command I used:

mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Dscala-2.10 -Phive -Phive-thriftserver -DskipTests clean package
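(Incidentally, the "Zinc server is not available at port 3030" warning that shows up later just means Maven fell back to ordinary incremental compilation; the build still works, only slower. On a Unix-like shell, Spark 1.6 ships a Maven wrapper that downloads and starts Zinc for you — a sketch of the same build via that wrapper; it is a bash script, so it will not run as-is from the Windows command prompt:)

```shell
# Spark's bundled Maven wrapper: in Spark 1.6 it fetches Zinc and starts it
# on port 3030 before delegating to Maven (bash only, not Windows cmd).
./build/mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Dscala-2.10 \
  -Phive -Phive-thriftserver -DskipTests clean package
```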

From pom.xml:

 <scala.version>2.10.5</scala.version>
 <scala.binary.version>2.10</scala.binary.version>


An example of the warnings:


[INFO] ------------------------------------------------------------------------
[INFO] Building Spark Project Core 1.6.0
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.6.1:clean (default-clean) @ spark-core_2.10 ---
[INFO] Deleting C:\spark-1.6.0\core\target
[INFO]
[INFO] --- maven-enforcer-plugin:1.4:enforce (enforce-versions) @ spark-core_2.10 ---
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:add-source (eclipse-add-source) @ spark-core_2.10 ---
[INFO] Add Source directory: C:\spark-1.6.0\core\src\main\scala
[INFO] Add Test Source directory: C:\spark-1.6.0\core\src\test\scala
[INFO]
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @ spark-core_2.10 ---
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ spark-core_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 21 resources
[INFO] Copying 3 resources
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ spark-core_2.10 ---
[WARNING] Zinc server is not available at port 3030 - reverting to normal incremental compile
[INFO] Using incremental compilation
[INFO] Compiling 486 Scala sources and 76 Java sources to C:\spark-1.6.0\core\target\scala-2.10\classes...
[WARNING]
C:\spark-1.6.0\core\src\main\scala\org\apache\spark\storage\TachyonBlockManager.scala:104:
value TRY_CACHE in object WriteType is deprecated: see corresponding Javadoc for more information.
[WARNING]     val os = file.getOutStream(WriteType.TRY_CACHE)
[WARNING]                                          ^
[WARNING]
C:\spark-1.6.0\core\src\main\scala\org\apache\spark\storage\TachyonBlockManager.scala:118:
value TRY_CACHE in object WriteType is deprecated: see corresponding Javadoc for more information.
[WARNING]     val os = file.getOutStream(WriteType.TRY_CACHE)
[WARNING]                                          ^
[WARNING]
C:\spark-1.6.0\core\src\main\scala\org\apache\spark\SparkContext.scala:186:
constructor SparkContext in class SparkContext is deprecated: Passing in preferred locations has no effect at all, see SPARK-10921
[WARNING]     this(master, appName, null, Nil, Map())
[WARNING]     ^
[WARNING]
C:\spark-1.6.0\core\src\main\scala\org\apache\spark\SparkContext.scala:196:
constructor SparkContext in class SparkContext is deprecated: Passing in preferred locations has no effect at all, see SPARK-10921
[WARNING]     this(master, appName, sparkHome, Nil, Map())
[WARNING]     ^
[WARNING]
C:\spark-1.6.0\core\src\main\scala\org\apache\spark\SparkContext.scala:208:
constructor SparkContext in class SparkContext is deprecated: Passing in preferred locations has no effect at all, see SPARK-10921
[WARNING]     this(master, appName, sparkHome, jars, Map())
[WARNING]     ^
[WARNING]
C:\spark-1.6.0\core\src\main\scala\org\apache\spark\SparkContext.scala:871:
constructor Job in class Job is deprecated: see corresponding Javadoc for
more information.
[WARNING]     val job = new NewHadoopJob(hadoopConfiguration)
[WARNING]               ^
[WARNING]
C:\spark-1.6.0\core\src\main\scala\org\apache\spark\SparkContext.scala:920:
constructor Job in class Job is deprecated: see corresponding Javadoc for
more information.
[WARNING]     val job = new NewHadoopJob(hadoopConfiguration)
[WARNING]               ^
[WARNING]
C:\spark-1.6.0\core\src\main\scala\org\apache\spark\SparkContext.scala:1099:
constructor Job in class Job is deprecated: see corresponding Javadoc for
more information.
[WARNING]     val job = new NewHadoopJob(conf)
[WARNING]               ^
[WARNING]
C:\spark-1.6.0\core\src\main\scala\org\apache\spark\SparkContext.scala:1366:
method isDir in class FileStatus is deprecated: see corresponding Javadoc for more information.
[WARNING]       val isDir = fs.getFileStatus(hadoopPath).isDir
[WARNING]                                                ^
[WARNING]
C:\spark-1.6.0\core\src\main\scala\org\apache\spark\SparkEnv.scala:101:
value actorSystem in class SparkEnv is deprecated: Actor system is no longer supported as of 1.4.0
[WARNING]         actorSystem.shutdown()
[WARNING]         ^
[WARNING]
C:\spark-1.6.0\core\src\main\scala\org\apache\spark\SparkHadoopWriter.scala:153:
constructor TaskID in class TaskID is deprecated: see corresponding Javadoc for more information.
[WARNING]         new TaskAttemptID(new TaskID(jID.value, true, splitID), attemptID))
[WARNING]                           ^
[WARNING]
C:\spark-1.6.0\core\src\main\scala\org\apache\spark\SparkHadoopWriter.scala:174:
method makeQualified in class Path is deprecated: see corresponding Javadoc for more information.
[WARNING]     outputPath.makeQualified(fs)
[WARNING]                ^
[WARNING]
C:\spark-1.6.0\core\src\main\scala\org\apache\spark\api\java\JavaSparkContext.scala:105:
constructor SparkContext in class SparkContext is deprecated: Passing in preferred locations has no effect at all, see SPARK-10921
[WARNING]     this(new SparkContext(master, appName, sparkHome, jars.toSeq, environment.asScala, Map()))
[WARNING]          ^
[WARNING]
C:\spark-1.6.0\core\src\main\scala\org\apache\spark\deploy\SparkHadoopUtil.scala:236:
method isDir in class FileStatus is deprecated: see corresponding Javadoc for more information.
[WARNING]       val (directories, leaves) = fs.listStatus(status.getPath).partition(_.isDir)
[WARNING]          ^
[WARNING]
C:\spark-1.6.0\core\src\main\scala\org\apache\spark\deploy\SparkHadoopUtil.scala:240:
method isDir in class FileStatus is deprecated: see corresponding Javadoc for more information.
[WARNING]     if (baseStatus.isDir) recurse(baseStatus) else Seq(baseStatus)
[WARNING]                    ^
[WARNING]
C:\spark-1.6.0\core\src\main\scala\org\apache\spark\deploy\SparkHadoopUtil.scala:249:
method isDir in class FileStatus is deprecated: see corresponding Javadoc for more information.
[WARNING]       val (directories, files) = fs.listStatus(status.getPath).partition(_.isDir)
[WARNING]         ^
[WARNING]
C:\spark-1.6.0\core\src\main\scala\org\apache\spark\deploy\SparkHadoopUtil.scala:254:
method isDir in class FileStatus is deprecated: see corresponding Javadoc for more information.
[WARNING]     assert(baseStatus.isDir)
[WARNING]                       ^
[WARNING]
C:\spark-1.6.0\core\src\main\scala\org\apache\spark\deploy\history\FsHistoryProvider.scala:170:
method isDir in class FileStatus is deprecated: see corresponding Javadoc
for more information.
[WARNING]     if (!fs.getFileStatus(path).isDir) {
[WARNING]                                 ^
[WARNING]
C:\spark-1.6.0\core\src\main\scala\org\apache\spark\deploy\history\FsHistoryProvider.scala:307:
method delete in class FileSystem is deprecated: see corresponding Javadoc for more information.


Thanks!