scalable-deeplearning 1.0.0 released

2016-09-09 Thread Ulanov, Alexander
Dear Spark users and developers, I have released version 1.0.0 of the scalable-deeplearning package. This package is based on the implementation of artificial neural networks in Spark ML. It is intended for new Spark deep learning features that have not yet been merged into Spark ML or that are too
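
For context, a minimal sketch of the Spark ML multilayer perceptron API that the package builds on (the data path, layer sizes, and app name below are illustrative, not taken from the announcement):

    import org.apache.spark.ml.classification.MultilayerPerceptronClassifier
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().appName("MLPSketch").getOrCreate()

    // Expects a DataFrame with "label" and "features" columns; the path is hypothetical.
    val data = spark.read.format("libsvm").load("data/sample_multiclass_classification_data.txt")

    // Network topology: 4 input features, two hidden layers, 3 output classes.
    val layers = Array[Int](4, 5, 4, 3)

    val trainer = new MultilayerPerceptronClassifier()
      .setLayers(layers)
      .setBlockSize(128)
      .setSeed(1234L)
      .setMaxIter(100)

    val model = trainer.fit(data)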

Re: Change the settings in AppVeyor to prevent triggering the tests in other PRs in other branches

2016-09-09 Thread Shivaram Venkataraman
The infra ticket has been updated, so I'd say let's stick to running tests on the master branch. We can of course create JIRAs for tests that fail in branch-2.0 and 1.6. Shivaram On Sep 9, 2016 09:33, "Hyukjin Kwon" wrote: > FYI, I just ran the SparkR tests on Windows for

Re: Unable to run docker jdbc integrations test ?

2016-09-09 Thread Suresh Thalamati
I agree with Josh. These tests are valuable, even if they cannot be run on Jenkins due to setup issues. It would be good to run them at least manually when JDBC data source-specific changes are made. Filed a JIRA for this problem: https://issues.apache.org/jira/browse/SPARK-17473 > On Sep

Re: Change the settings in AppVeyor to prevent triggering the tests in other PRs in other branches

2016-09-09 Thread Hyukjin Kwon
FYI, I just ran the SparkR tests on Windows for branch-2.0 and 1.6. branch-2.0 - https://github.com/spark-test/spark/pull/7 branch-1.6 - https://github.com/spark-test/spark/pull/8 2016-09-10 0:59 GMT+09:00 Hyukjin Kwon : > Yes, if we don't have any PRs to other branches

Re: Change the settings in AppVeyor to prevent triggering the tests in other PRs in other branches

2016-09-09 Thread Hyukjin Kwon
Yes, if we don't have any PRs to other branches on branch-1.5 and lower versions, I think it'd be fine. One concern is, I am not sure if the SparkR tests can pass on branch-1.6 (I checked that they pass on branch-2.0 before). I can try to check whether they pass and identify the related causes if they do not

Re: Change the settings in AppVeyor to prevent triggering the tests in other PRs in other branches

2016-09-09 Thread Shivaram Venkataraman
One thing we could do is to backport the commit to branch-2.0 and branch-1.6 -- Do you think that will fix the problem? On Fri, Sep 9, 2016 at 8:50 AM, Hyukjin Kwon wrote: > Ah, thanks! I wasn't too sure on this so I thought asking here somehow > reaches out to who's in

Re: Change the settings in AppVeyor to prevent triggering the tests in other PRs in other branches

2016-09-09 Thread Hyukjin Kwon
Ah, thanks! I wasn't too sure about this, so I thought asking here might somehow reach whoever is in charge of the account :). On 10 Sep 2016 12:41 a.m., "Shivaram Venkataraman" <shiva...@eecs.berkeley.edu> wrote: > Thanks for debugging - I'll reply on >

Re: Change the settings in AppVeyor to prevent triggering the tests in other PRs in other branches

2016-09-09 Thread Shivaram Venkataraman
Thanks for debugging - I'll reply on https://issues.apache.org/jira/browse/INFRA-12590 and ask for this change. FYI, I don't think any of the committers have access to the AppVeyor account, which is at https://ci.appveyor.com/project/ApacheSoftwareFoundation/spark . To request changes that need to be

Change the settings in AppVeyor to prevent triggering the tests in other PRs in other branches

2016-09-09 Thread Hyukjin Kwon
Hi all, Currently, it seems the settings in AppVeyor are the defaults and run some tests on PRs against other branches. For example, https://github.com/apache/spark/pull/15023 https://github.com/apache/spark/pull/15022 It seems this happens only in other branches, as they don't have appveyor.yml and try to
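
As a sketch of one way to restrict this, assuming AppVeyor's standard branch and commit filter syntax (the actual fix discussed in this thread went through the project's AppVeyor account settings, and the file paths below are illustrative), an appveyor.yml could limit which builds trigger:

    # Only build commits and PRs targeting the master branch.
    branches:
      only:
        - master

    # Optionally, only trigger when relevant files change (paths are illustrative).
    only_commits:
      files:
        - appveyor.yml
        - R/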

Re: @scala.annotation.varargs or @_root_.scala.annotation.varargs?

2016-09-09 Thread Sean Owen
Oh, I get it now. It was necessary in the past. Sure, seems like it could be standardized now. On Fri, Sep 9, 2016 at 1:13 AM, Reynold Xin wrote: > Yea but the earlier email was asking why they were introduced in the first > place. > > > On Friday, September 9, 2016, Marcelo
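
For reference, a minimal Scala sketch of the two spellings being discussed (the class and method names here are illustrative, not from Spark's source):

    class Functions {
      // Fully qualified from the root package; _root_ only matters if a relative
      // package named `scala` could shadow the top-level one.
      @_root_.scala.annotation.varargs
      def struct(cols: String*): Seq[String] = cols

      // The plain form resolves to the same annotation in ordinary code.
      @scala.annotation.varargs
      def array(cols: String*): Seq[String] = cols
    }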

Video analytics on SPark

2016-09-09 Thread Priya Ch
Hi All, I have video surveillance data and it needs to be processed in Spark. I am looking into Spark + OpenCV. How can I load .mp4 files into an RDD? Can we do this directly, or does the video need to be converted to a SequenceFile? Thanks, Padma CH
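
One possible starting point (a sketch only, under the assumption that each video file fits in a single task; decoding frames would still need OpenCV or JavaCV on the executors, which is not shown) is to read the raw files as binary:

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().appName("VideoIngest").getOrCreate()
    val sc = spark.sparkContext

    // binaryFiles returns an RDD of (path, PortableDataStream); each element is
    // one whole .mp4 file, so files must be small enough to handle per task.
    // The input path is hypothetical.
    val videos = sc.binaryFiles("hdfs:///data/videos/*.mp4")

    // Example: materialize the bytes and report file sizes.
    val sizes = videos.mapValues(stream => stream.toArray().length)
    sizes.collect().foreach { case (path, bytes) => println(s"$path -> $bytes bytes") }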