Re: a lot of warnings when build spark 1.6.0

2016-01-20 Thread Eli Super
Thanks Sean

In the command:

mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Dscala-2.10
-Phive -Phive-thriftserver -DskipTests clean package

is the string -Phadoop-2.4 -Dhadoop.version=2.4.0 a kind of duplication?

Can I use only one setting to define the Hadoop version?

Also, I don't have Hadoop; I just build Spark locally, only with the csv package
and the Thrift server. What Hadoop version should I use to avoid warnings?

Thanks a lot!

On Thu, Jan 21, 2016 at 9:08 AM, Eli Super  wrote:

> Hi
>
> I get WARNINGS when try to build spark 1.6.0
>
> overall I get SUCCESS message on all projects
>
> command I used :
>
> mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Dscala-2.10 -Phive
> -Phive-thriftserver  -DskipTests clean package
>
> [...]

a lot of warnings when build spark 1.6.0

2016-01-20 Thread Eli Super
Hi

I get WARNINGS when I try to build Spark 1.6.0.

Overall, I get a SUCCESS message on all projects.

The command I used:

mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Dscala-2.10 -Phive
-Phive-thriftserver  -DskipTests clean package

From pom.xml:

<scala.version>2.10.5</scala.version>
<scala.binary.version>2.10</scala.binary.version>

Example of warnings:


[INFO] ------------------------------------------------------------------------
[INFO] Building Spark Project Core 1.6.0
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.6.1:clean (default-clean) @ spark-core_2.10
---
[INFO] Deleting C:\spark-1.6.0\core\target
[INFO]
[INFO] --- maven-enforcer-plugin:1.4:enforce (enforce-versions) @
spark-core_2.10 ---
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:add-source (eclipse-add-source) @
spark-core_2.10 ---
[INFO] Add Source directory: C:\spark-1.6.0\core\src\main\scala
[INFO] Add Test Source directory: C:\spark-1.6.0\core\src\test\scala
[INFO]
[INFO] --- maven-remote-resources-plugin:1.5:process (default) @
spark-core_2.10 ---
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @
spark-core_2.10 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 21 resources
[INFO] Copying 3 resources
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @
spark-core_2.10 ---
[WARNING] Zinc server is not available at port 3030 - reverting to normal
incremental compile
[INFO] Using incremental compilation
[INFO] Compiling 486 Scala sources and 76 Java sources to
C:\spark-1.6.0\core\target\scala-2.10\classes...
[WARNING]
C:\spark-1.6.0\core\src\main\scala\org\apache\spark\storage\TachyonBlockManager.scala:104:
value TRY_CACHE in object WriteType is deprecated: see corresponding
Javadoc for more information.
[WARNING] val os = file.getOutStream(WriteType.TRY_CACHE)
[WARNING]  ^
[WARNING]
C:\spark-1.6.0\core\src\main\scala\org\apache\spark\storage\TachyonBlockManager.scala:118:
value TRY_CACHE in object WriteType is deprecated: see corresponding
Javadoc for more information.
[WARNING] val os = file.getOutStream(WriteType.TRY_CACHE)
[WARNING]  ^
[WARNING]
C:\spark-1.6.0\core\src\main\scala\org\apache\spark\SparkContext.scala:186:
constructor SparkContext in class SparkContext is deprecated: Passing in
preferred locations has no effect at all, see SPARK-10921
[WARNING] this(master, appName, null, Nil, Map())
[WARNING] ^
[WARNING]
C:\spark-1.6.0\core\src\main\scala\org\apache\spark\SparkContext.scala:196:
constructor SparkContext in class SparkContext is deprecated: Passing in
preferred locations has no effect at all, see SPARK-10921
[WARNING] this(master, appName, sparkHome, Nil, Map())
[WARNING] ^
[WARNING]
C:\spark-1.6.0\core\src\main\scala\org\apache\spark\SparkContext.scala:208:
constructor SparkContext in class SparkContext is deprecated: Passing in
preferred locations has no effect at all, see SPARK-10921
[WARNING] this(master, appName, sparkHome, jars, Map())
[WARNING] ^
[WARNING]
C:\spark-1.6.0\core\src\main\scala\org\apache\spark\SparkContext.scala:871:
constructor Job in class Job is deprecated: see corresponding Javadoc for
more information.
[WARNING] val job = new NewHadoopJob(hadoopConfiguration)
[WARNING]   ^
[WARNING]
C:\spark-1.6.0\core\src\main\scala\org\apache\spark\SparkContext.scala:920:
constructor Job in class Job is deprecated: see corresponding Javadoc for
more information.
[WARNING] val job = new NewHadoopJob(hadoopConfiguration)
[WARNING]   ^
[WARNING]
C:\spark-1.6.0\core\src\main\scala\org\apache\spark\SparkContext.scala:1099:
constructor Job in class Job is deprecated: see corresponding Javadoc for
more information.
[WARNING] val job = new NewHadoopJob(conf)
[WARNING]   ^
[WARNING]
C:\spark-1.6.0\core\src\main\scala\org\apache\spark\SparkContext.scala:1366:
method isDir in class FileStatus is deprecated: see corresponding Javadoc
for more information.
[WARNING]   val isDir = fs.getFileStatus(hadoopPath).isDir
[WARNING]^
[WARNING]
C:\spark-1.6.0\core\src\main\scala\org\apache\spark\SparkEnv.scala:101:
value actorSystem in class SparkEnv is deprecated: Actor system is no
longer supported as of 1.4.0

[WARNING] actorSystem.shutdown()
[WARNING] ^
[WARNING]
C:\spark-1.6.0\core\src\main\scala\org\apache\spark\SparkHadoopWriter.scala:153:
constructor TaskID in class TaskID is deprecated: see corresponding Javadoc
for more information.
[WARNING] new TaskAttemptID(new TaskID(jID.value, true, splitID),
attemptID))
[WARNING]   ^
[WARNING]
C:\spark-1.6.0\core\src\main\scala\org\apache\spark\SparkHadoopWriter.scala:174:
method makeQualified in class Path is deprecated: see corresponding Javadoc
for more information.
[WARNING] outputPath.makeQualified(fs)
[WARNING]^
[WARNING]

Re: a lot of warnings when build spark 1.6.0

2016-01-20 Thread Sean Owen
These are just warnings. Most are unavoidable given the range of Hadoop
versions Spark supports versus the one you build against.
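For what it's worth, the hadoop-2.4 profile in Spark's pom.xml is generally
understood to supply a default hadoop.version for that Hadoop line, so passing
-Phadoop-2.4 together with -Dhadoop.version=2.4.0 is likely redundant; the -D
flag should only be needed to pin a different 2.4.x release. A minimal sketch
of the same build relying on the profile default (assuming a Spark 1.6.0
source tree; not verified against this exact checkout):

```shell
# Sketch under the assumption that the hadoop-2.4 profile defaults
# hadoop.version to 2.4.0; the explicit -Dhadoop.version flag is dropped.
mvn -Pyarn -Phadoop-2.4 -Dscala-2.10 -Phive -Phive-thriftserver -DskipTests clean package

# To check what hadoop.version the profile actually resolves to, evaluate it
# first (filtering out Maven's [INFO] log lines):
mvn help:evaluate -Dexpression=hadoop.version -Phadoop-2.4 | grep -v '^\['
```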

On Thu, Jan 21, 2016, 08:08 Eli Super  wrote:

> Hi
>
> I get WARNINGS when try to build spark 1.6.0
>
> overall I get SUCCESS message on all projects
>
> command I used :
>
> mvn -Pyarn -Phadoop-2.4 -Dhadoop.version=2.4.0 -Dscala-2.10 -Phive
> -Phive-thriftserver  -DskipTests clean package
>
> [...]