+1 (non-binding)

On 2016/03/09 4:28, Burak Yavuz wrote:
+1

On Tue, Mar 8, 2016 at 10:59 AM, Andrew Or <and...@databricks.com> wrote:

    +1

    2016-03-08 10:59 GMT-08:00 Yin Huai <yh...@databricks.com>:

        +1

        On Mon, Mar 7, 2016 at 12:39 PM, Reynold Xin <r...@databricks.com> wrote:

            +1 (binding)


            On Sun, Mar 6, 2016 at 12:08 PM, Egor Pahomov <pahomov.e...@gmail.com> wrote:

                +1

                Spark ODBC server is fine, SQL is fine.

                2016-03-03 12:09 GMT-08:00 Yin Yang <yy201...@gmail.com>:

                    Skipping docker tests, the rest are green:

                    [INFO] Spark Project External Kafka ....................... SUCCESS [01:28 min]
                    [INFO] Spark Project Examples ............................. SUCCESS [02:59 min]
                    [INFO] Spark Project External Kafka Assembly .............. SUCCESS [ 11.680 s]
                    [INFO] ------------------------------------------------------------------------
                    [INFO] BUILD SUCCESS
                    [INFO] ------------------------------------------------------------------------
                    [INFO] Total time: 02:16 h
                    [INFO] Finished at: 2016-03-03T11:17:07-08:00
                    [INFO] Final Memory: 152M/4062M
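
                    (For anyone wanting to skip those tests the same way: one option is to exclude the module from the Maven reactor. This is only a sketch; the module path external/docker-integration-tests is an assumption, not something stated in this thread, and -pl exclusion needs Maven 3.2.1+.)

                    # Sketch: same build as below, but exclude the docker integration
                    # tests module. The module path is assumed; adjust for branch-1.6.
                    build/mvn clean -Phive -Phive-thriftserver -Pyarn -Phadoop-2.6 \
                      -Dhadoop.version=2.7.0 -pl '!external/docker-integration-tests' package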

                    On Thu, Mar 3, 2016 at 8:55 AM, Yin Yang <yy201...@gmail.com> wrote:

                        When I ran the test suite using the following command:

                        build/mvn clean -Phive -Phive-thriftserver -Pyarn -Phadoop-2.6 -Dhadoop.version=2.7.0 package

                        I got a failure in Spark Project Docker Integration Tests:

                        16/03/02 17:36:46 INFO RemoteActorRefProvider$RemotingTerminator: Remote daemon shut down; proceeding with flushing remote transports.
                        *** RUN ABORTED ***
                          com.spotify.docker.client.DockerException: java.util.concurrent.ExecutionException: com.spotify.docker.client.shaded.javax.ws.rs.ProcessingException: java.io.IOException: No such file or directory
                          at com.spotify.docker.client.DefaultDockerClient.propagate(DefaultDockerClient.java:1141)
                          at com.spotify.docker.client.DefaultDockerClient.request(DefaultDockerClient.java:1082)
                          at com.spotify.docker.client.DefaultDockerClient.ping(DefaultDockerClient.java:281)
                          at org.apache.spark.sql.jdbc.DockerJDBCIntegrationSuite.beforeAll(DockerJDBCIntegrationSuite.scala:76)
                          at org.scalatest.BeforeAndAfterAll$class.beforeAll(BeforeAndAfterAll.scala:187)
                          at org.apache.spark.sql.jdbc.DockerJDBCIntegrationSuite.beforeAll(DockerJDBCIntegrationSuite.scala:58)
                          at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:253)
                          at org.apache.spark.sql.jdbc.DockerJDBCIntegrationSuite.run(DockerJDBCIntegrationSuite.scala:58)
                          at org.scalatest.Suite$class.callExecuteOnSuite$1(Suite.scala:1492)
                          at org.scalatest.Suite$$anonfun$runNestedSuites$1.apply(Suite.scala:1528)
                          ...
                          Cause: java.util.concurrent.ExecutionException: com.spotify.docker.client.shaded.javax.ws.rs.ProcessingException: java.io.IOException: No such file or directory
                          at jersey.repackaged.com.google.common.util.concurrent.AbstractFuture$Sync.getValue(AbstractFuture.java:299)
                          at jersey.repackaged.com.google.common.util.concurrent.AbstractFuture$Sync.get(AbstractFuture.java:286)
                          at jersey.repackaged.com.google.common.util.concurrent.AbstractFuture.get(AbstractFuture.java:116)
                          at com.spotify.docker.client.DefaultDockerClient.request(DefaultDockerClient.java:1080)
                          at com.spotify.docker.client.DefaultDockerClient.ping(DefaultDockerClient.java:281)
                          at org.apache.spark.sql.jdbc.DockerJDBCIntegrationSuite.beforeAll(DockerJDBCIntegrationSuite.scala:76)
                          at org.scalatest.BeforeAndAfterAll$class.beforeAll(BeforeAndAfterAll.scala:187)
                          at org.apache.spark.sql.jdbc.DockerJDBCIntegrationSuite.beforeAll(DockerJDBCIntegrationSuite.scala:58)
                          at org.scalatest.BeforeAndAfterAll$class.run(BeforeAndAfterAll.scala:253)
                          at org.apache.spark.sql.jdbc.DockerJDBCIntegrationSuite.run(DockerJDBCIntegrationSuite.scala:58)
                          ...
                          Cause: com.spotify.docker.client.shaded.javax.ws.rs.ProcessingException: java.io.IOException: No such file or directory
                          at org.glassfish.jersey.apache.connector.ApacheConnector.apply(ApacheConnector.java:481)
                          at org.glassfish.jersey.apache.connector.ApacheConnector$1.run(ApacheConnector.java:491)
                          at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:471)
                          at java.util.concurrent.FutureTask.run(FutureTask.java:262)
                          at jersey.repackaged.com.google.common.util.concurrent.MoreExecutors$DirectExecutorService.execute(MoreExecutors.java:299)
                          at java.util.concurrent.AbstractExecutorService.submit(AbstractExecutorService.java:110)
                          at jersey.repackaged.com.google.common.util.concurrent.AbstractListeningExecutorService.submit(AbstractListeningExecutorService.java:50)
                          at jersey.repackaged.com.google.common.util.concurrent.AbstractListeningExecutorService.submit(AbstractListeningExecutorService.java:37)
                          at org.glassfish.jersey.apache.connector.ApacheConnector.apply(ApacheConnector.java:487)
                          at org.glassfish.jersey.client.ClientRuntime$2.run(ClientRuntime.java:177)
                          ...
                          Cause: java.io.IOException: No such file or directory
                          at jnr.unixsocket.UnixSocketChannel.doConnect(UnixSocketChannel.java:94)

                        Has anyone seen the above?
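
                        (The final cause, java.io.IOException: No such file or directory from jnr.unixsocket.UnixSocketChannel.doConnect, usually means the Docker daemon's unix socket is not there. A quick sanity check, assuming the conventional default socket path:)

                        # Check that the Docker daemon is up and its socket exists.
                        # /var/run/docker.sock is the assumed default path.
                        ls -l /var/run/docker.sock
                        docker info
                        # If the client should be talking to a non-default endpoint:
                        echo "$DOCKER_HOST"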

                        On Wed, Mar 2, 2016 at 2:45 PM, Michael Armbrust <mich...@databricks.com> wrote:

                            Please vote on releasing the following candidate as Apache Spark version 1.6.1!

                            The vote is open until Saturday, March 5, 2016 at 20:00 UTC and passes if a majority of at least 3 +1 PMC votes are cast.

                            [ ] +1 Release this package as Apache Spark 1.6.1
                            [ ] -1 Do not release this package because ...

                            To learn more about Apache Spark, please see http://spark.apache.org/

                            The tag to be voted on is v1.6.1-rc1 (15de51c238a7340fa81cb0b80d029a05d97bfc5c):
                            https://github.com/apache/spark/tree/v1.6.1-rc1

                            The release files, including signatures, digests, etc. can be found at:
                            https://home.apache.org/~pwendell/spark-releases/spark-1.6.1-rc1-bin/

                            Release artifacts are signed with the following key:
                            https://people.apache.org/keys/committer/pwendell.asc
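
                            (A minimal signature check, assuming the usual GnuPG workflow; the artifact filename below is an example, not taken from this email:)

                            # Import the signing key from the URL above, then verify an
                            # artifact against its detached .asc signature.
                            curl -s https://people.apache.org/keys/committer/pwendell.asc | gpg --import
                            gpg --verify spark-1.6.1-bin-hadoop2.6.tgz.asc spark-1.6.1-bin-hadoop2.6.tgz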

                            The staging repository for this release can be found at:
                            https://repository.apache.org/content/repositories/orgapachespark-1180/

                            The test repository (versioned as v1.6.1-rc1) for this release can be found at:
                            https://repository.apache.org/content/repositories/orgapachespark-1179/

                            The documentation corresponding to this release can be found at:
                            https://home.apache.org/~pwendell/spark-releases/spark-1.6.1-rc1-docs/


                            =======================================
                            == How can I help test this release? ==
                            =======================================
                            If you are a Spark user, you can help us test this release by taking an existing Spark workload and running it on this release candidate, then reporting any regressions from 1.6.0.
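
                            (A minimal sketch of doing so, assuming a Hadoop 2.6 binary under the release files URL above; the tarball name and the application jar/class are illustrative, not from this email:)

                            # Fetch and unpack the release candidate, then run an existing
                            # job against it. Names below are examples only.
                            wget https://home.apache.org/~pwendell/spark-releases/spark-1.6.1-rc1-bin/spark-1.6.1-bin-hadoop2.6.tgz
                            tar xzf spark-1.6.1-bin-hadoop2.6.tgz && cd spark-1.6.1-bin-hadoop2.6
                            ./bin/spark-submit --master local[4] --class com.example.YourApp your-app.jar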

                            ================================================
                            == What justifies a -1 vote for this release? ==
                            ================================================
                            This is a maintenance release in the 1.6.x series. Bugs already present in 1.6.0, missing features, or bugs related to new features will not necessarily block this release.

                            ===============================================================
                            == What should happen to JIRA tickets still targeting 1.6.0? ==
                            ===============================================================
                            1. It is OK for documentation patches to target 1.6.1 and still go into branch-1.6, since documentation will be published separately from the release.
                            2. New features for non-alpha modules should target 1.7+.
                            3. Non-blocker bug fixes should target 1.6.2 or 2.0.0, or drop the target version.


                --
                Sincerely yours,
                Egor Pakhomov
