[ 
https://issues.apache.org/jira/browse/SPARK-6631?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14390114#comment-14390114
 ] 

Frank Domoney commented on SPARK-6631:
--------------------------------------

Incidentally, can you get the Debian build of Spark 1.3 to work?  mvn -Pdeb 
-DskipTests clean package

Mine fails to build.  I suspect that the Debian package might be the correct 
one for Ubuntu 14.04 and Java 8.

Caused by: org.vafer.jdeb.PackagingException: Could not create deb package
        at org.vafer.jdeb.Processor.createDeb(Processor.java:171)
        at org.vafer.jdeb.maven.DebMaker.makeDeb(DebMaker.java:244)
        ... 22 more
Caused by: org.vafer.jdeb.PackagingException: Control file descriptor keys are 
invalid [Version]. The following keys are mandatory [Package, Version, Section, 
Priority, Architecture, Maintainer, Description]. Please check your 
pom.xml/build.xml and your control file.
        at org.vafer.jdeb.Processor.createDeb(Processor.java:142)
        ... 23 more
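
For reference, the error above says jdeb found only a [Version] key in the
generated control file and requires all seven mandatory keys.  A minimal
control file covering the keys listed in the error would look something like
the sketch below (the field values are placeholders, not taken from the Spark
build; only the key names come from the error message):

        Package: spark
        Version: 1.3.0
        Section: misc
        Priority: optional
        Architecture: all
        Maintainer: Your Name <you@example.com>
        Description: Apache Spark (illustrative control file)

If the Spark pom.xml drives jdeb, these keys would normally be supplied via
the plugin configuration rather than a hand-written control file.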

> I am unable to get the Maven Build file in Example 2.13 to build anything but 
> an empty file
> -------------------------------------------------------------------------------------------
>
>                 Key: SPARK-6631
>                 URL: https://issues.apache.org/jira/browse/SPARK-6631
>             Project: Spark
>          Issue Type: Bug
>          Components: Build
>    Affects Versions: 1.3.0
>         Environment: Ubuntu 14.04
>            Reporter: Frank Domoney
>            Priority: Blocker
>
> I have downloaded and built spark 1.3.0 under Ubuntu 14.04 but have been 
> unable to get reduceByKey to work on what seems to be a valid RDD using the 
> command line.
> scala> counts.take(10)
> res17: Array[(String, Int)] = Array((Vladimir,1), (Putin,1), (has,1), 
> (said,1), (Russia,1), (will,1), (fight,1), (for,1), (an,1), (independent,1))
> scala> val counts1 = counts.reduceByKey{case (x, y) => x + y}
> scala> counts1.take(10)
> res16: Array[(String, Int)] = Array()
> I am attempting to build the Maven sequence in example 2.15 but get the 
> following results
> Building example 0.0.1
> [INFO] 
> ------------------------------------------------------------------------
> [INFO] 
> [INFO] --- maven-resources-plugin:2.3:resources (default-resources) @ 
> learning-spark-mini-example ---
> [WARNING] Using platform encoding (UTF-8 actually) to copy filtered 
> resources, i.e. build is platform dependent!
> [INFO] skip non existing resourceDirectory 
> /home/panzerfrank/Downloads/spark-1.3.0/wordcount/src/main/resources
> [INFO] 
> [INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ 
> learning-spark-mini-example ---
> [INFO] No sources to compile
> [INFO] 
> [INFO] --- maven-resources-plugin:2.3:testResources (default-testResources) @ 
> learning-spark-mini-example ---
> [WARNING] Using platform encoding (UTF-8 actually) to copy filtered 
> resources, i.e. build is platform dependent!
> [INFO] skip non existing resourceDirectory 
> /home/panzerfrank/Downloads/spark-1.3.0/wordcount/src/test/resources
> [INFO] 
> [INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ 
> learning-spark-mini-example ---
> [INFO] No sources to compile
> [INFO] 
> [INFO] --- maven-surefire-plugin:2.10:test (default-test) @ 
> learning-spark-mini-example ---
> [INFO] No tests to run.
> [INFO] Surefire report directory: 
> /home/panzerfrank/Downloads/spark-1.3.0/wordcount/target/surefire-reports
> [INFO] --- maven-jar-plugin:2.2:jar (default-jar) @ learning-spark-mini-example ---
> [WARNING] JAR will be empty - no content was marked for inclusion!
> [INFO] Building jar: 
> /home/panzerfrank/Downloads/spark-1.3.0/wordcount/target/learning-spark-mini-example-0.0.1.jar
> I am using the POM file from Example 2-13.  Java is Java 8.
> Am I doing something really stupid?
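
[Editor's note, not from the reporter: as a sanity check for the empty
reduceByKey result above, the transformation can be isolated from the file
input by running it on a freshly parallelized RDD in spark-shell.  The names
below are illustrative.]

        scala> val pairs = sc.parallelize(Seq(("a", 1), ("b", 1), ("a", 1)))
        scala> val merged = pairs.reduceByKey(_ + _)
        scala> merged.collect()
        // expect (a,2) and (b,1); ordering may vary

If this sketch returns merged counts but the file-based `counts` still comes
back empty, the problem is likely in how `counts` was constructed rather than
in reduceByKey itself.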



--
This message was sent by Atlassian JIRA
(v6.3.4#6332)

---------------------------------------------------------------------
To unsubscribe, e-mail: issues-unsubscr...@spark.apache.org
For additional commands, e-mail: issues-h...@spark.apache.org
