[ https://issues.apache.org/jira/browse/HBASE-15516?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=15208320#comment-15208320 ]
Hadoop QA commented on HBASE-15516:
-----------------------------------

(x) *-1 overall*

|| Vote || Subsystem || Runtime || Comment ||
| +1 | @author | 0m 0s | The patch does not contain any @author tags. |
| -1 | test4tests | 0m 0s | The patch doesn't appear to include any new or modified tests. Please justify why no new tests are needed for this patch. Also please list what manual steps were performed to verify this patch. |
| +1 | mvninstall | 4m 12s | master passed |
| +1 | compile | 1m 40s | master passed with JDK v1.8.0 |
| +1 | compile | 1m 49s | master passed with JDK v1.7.0_79 |
| +1 | scaladoc | 0m 48s | master passed |
| -1 | compile | 1m 3s | hbase-spark in the patch failed with JDK v1.8.0. |
| -1 | scalac | 1m 3s | hbase-spark in the patch failed with JDK v1.8.0. |
| -1 | compile | 0m 59s | hbase-spark in the patch failed with JDK v1.7.0_79. |
| -1 | scalac | 0m 59s | hbase-spark in the patch failed with JDK v1.7.0_79. |
| +1 | whitespace | 0m 0s | Patch has no whitespace issues. |
| -1 | hadoopcheck | 2m 48s | Patch causes 17 errors with Hadoop v2.4.0. |
| -1 | hadoopcheck | 5m 50s | Patch causes 17 errors with Hadoop v2.4.1. |
| -1 | hadoopcheck | 8m 48s | Patch causes 17 errors with Hadoop v2.5.0. |
| -1 | hadoopcheck | 11m 53s | Patch causes 17 errors with Hadoop v2.5.1. |
| -1 | hadoopcheck | 14m 46s | Patch causes 17 errors with Hadoop v2.5.2. |
| -1 | hadoopcheck | 17m 40s | Patch causes 17 errors with Hadoop v2.6.1. |
| -1 | hadoopcheck | 20m 41s | Patch causes 17 errors with Hadoop v2.6.2. |
| -1 | hadoopcheck | 23m 40s | Patch causes 17 errors with Hadoop v2.6.3. |
| -1 | hadoopcheck | 26m 31s | Patch causes 17 errors with Hadoop v2.7.1. |
| +1 | scaladoc | 0m 43s | the patch passed |
| +1 | scaladoc | 0m 49s | the patch passed |
| -1 | unit | 0m 57s | hbase-spark in the patch failed. |
| +1 | asflicense | 0m 9s | Patch does not generate ASF License warnings. |
| | | 39m 56s | |

|| Subsystem || Report/Notes ||
| JIRA Patch URL | https://issues.apache.org/jira/secure/attachment/12794964/HBASE-15516.patch |
| JIRA Issue | HBASE-15516 |
| Optional Tests | asflicense scalac scaladoc unit compile |
| uname | Linux asf910.gq1.ygridcore.net 3.13.0-36-lowlatency #63-Ubuntu SMP PREEMPT Wed Sep 3 21:56:12 UTC 2014 x86_64 x86_64 x86_64 GNU/Linux |
| Build tool | maven |
| Personality | /home/jenkins/jenkins-slave/workspace/PreCommit-HBASE-Build@2/component/dev-support/hbase-personality.sh |
| git revision | master / cadfb21 |
| compile | https://builds.apache.org/job/PreCommit-HBASE-Build/1155/artifact/patchprocess/patch-compile-hbase-spark-jdk1.8.0.txt |
| scalac | https://builds.apache.org/job/PreCommit-HBASE-Build/1155/artifact/patchprocess/patch-compile-hbase-spark-jdk1.8.0.txt |
| compile | https://builds.apache.org/job/PreCommit-HBASE-Build/1155/artifact/patchprocess/patch-compile-hbase-spark-jdk1.7.0_79.txt |
| scalac | https://builds.apache.org/job/PreCommit-HBASE-Build/1155/artifact/patchprocess/patch-compile-hbase-spark-jdk1.7.0_79.txt |
| unit | https://builds.apache.org/job/PreCommit-HBASE-Build/1155/artifact/patchprocess/patch-unit-hbase-spark.txt |
| Test Results | https://builds.apache.org/job/PreCommit-HBASE-Build/1155/testReport/ |
| modules | C: hbase-spark U: hbase-spark |
| Console output | https://builds.apache.org/job/PreCommit-HBASE-Build/1155/console |
| Powered by | Apache Yetus 0.2.0 http://yetus.apache.org |

This message was automatically generated.

> Add flatMap to hbaseRDD
> -----------------------
>
>                 Key: HBASE-15516
>                 URL: https://issues.apache.org/jira/browse/HBASE-15516
>             Project: HBase
>          Issue Type: Improvement
>          Components: spark
>            Reporter: MahmoudHanafy
>            Priority: Minor
>        Attachments: HBASE-15516.patch
>
>
> HBaseContext supports reading RDDs through the hbaseRDD method, which takes a map function; I think it would also be good to add a flatMap variant to hbaseRDD.
> Another improvement:
> Currently, loading an RDD is done through the hbaseRDD method with a map function, as the default way of loading RDDs. When you only want to load the RDD without transforming it, you still load the RDD and map every element to itself. So I think an hbaseRDD method without a mapping function would be a better way to load RDDs.
> Also, I can contribute to the hbase-spark module. If there are any issues related to this module, please tell me about them. A rough sketch of the two suggestions follows below.
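As a rough illustration of the two suggestions in the description, here is a minimal Scala sketch of the proposed API shape. This is not the actual hbase-spark code or API: {{RawHBaseSource}}, {{rawRDD}}, {{flatMapHBaseRDD}} and {{HBaseRDDSketch}} are hypothetical names used only to show the idea, assuming some way to obtain the raw (key, result) pairs of a table scan as an RDD.

{code:scala}
// Minimal sketch of the proposal, NOT the actual hbase-spark implementation.
// RawHBaseSource, rawRDD, flatMapHBaseRDD and HBaseRDDSketch are hypothetical
// names used only to illustrate the idea in the issue description.

import org.apache.spark.rdd.RDD
import scala.reflect.ClassTag

// Hypothetical stand-in for the part of HBaseContext that can load the raw
// scan output of a table as an RDD of (key, result) pairs.
trait RawHBaseSource[K, V] {
  def rawRDD(): RDD[(K, V)]
}

object HBaseRDDSketch {

  // Suggestion 1: a flatMap variant alongside the existing map-based hbaseRDD,
  // so each (key, result) pair can expand to zero or more output records.
  def flatMapHBaseRDD[K, V, U: ClassTag](source: RawHBaseSource[K, V])
                                        (f: ((K, V)) => TraversableOnce[U]): RDD[U] =
    source.rawRDD().flatMap(f)

  // Suggestion 2: load the RDD without any mapping function at all, instead of
  // mapping every element to itself just to get the raw pairs back.
  def hbaseRDD[K, V](source: RawHBaseSource[K, V]): RDD[(K, V)] =
    source.rawRDD()
}
{code}

The point of the sketch is only that the flatMap case delegates to {{RDD.flatMap}} on the raw scan RDD, and the no-mapper case simply returns that raw RDD, avoiding the identity map the description mentions.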