scalable-deeplearning 1.0.0 released

2016-09-09 Thread Ulanov, Alexander
Dear Spark users and developers,

I have released version 1.0.0 of the scalable-deeplearning package. This
package is based on the implementation of artificial neural networks in Spark
ML. It is intended for new Spark deep learning features that have not yet been
merged into Spark ML, or that are too specific to be merged. The package
provides an ML pipeline API, distributed training, optimized numerical
processing via a tensor library, and an extensible API for developers. Current
features are the multilayer perceptron classifier and the stacked autoencoder.

As a Spark package: 
https://spark-packages.org/package/avulanov/scalable-deeplearning

The source code: https://github.com/avulanov/scalable-deeplearning
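
To get a feel for the pipeline API, here is a minimal sketch using the
multilayer perceptron classifier through the standard Spark ML API that this
package builds on; the data path and layer sizes are placeholders:

    import org.apache.spark.ml.classification.MultilayerPerceptronClassifier
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().appName("mlp-example").getOrCreate()
    // Placeholder path: any LIBSVM-formatted multiclass dataset works here.
    val train = spark.read.format("libsvm").load("data/multiclass_data.txt")

    // Layers: 4 input features, one hidden layer of 5 units, 3 output classes.
    val mlp = new MultilayerPerceptronClassifier()
      .setLayers(Array(4, 5, 3))
      .setBlockSize(128)
      .setMaxIter(100)

    val model = mlp.fit(train)
    val predictions = model.transform(train)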

Contributions are very welcome! Please let me know if you have any comments or
questions.

Best regards, Alexander


Re: Change the settings in AppVeyor to prevent triggering the tests in other PRs in other branches

2016-09-09 Thread Shivaram Venkataraman
The infra ticket has been updated, so I'd say let's stick to running tests
on the master branch. We can of course create JIRAs for tests that fail in
branch-2.0 and branch-1.6.

Shivaram

On Sep 9, 2016 09:33, "Hyukjin Kwon"  wrote:

> FYI, I just ran the SparkR tests on Windows for branch-2.0 and 1.6.
>
> branch-2.0 - https://github.com/spark-test/spark/pull/7
> branch-1.6 - https://github.com/spark-test/spark/pull/8
>
>
>
>
> 2016-09-10 0:59 GMT+09:00 Hyukjin Kwon :
>
>> Yes, if we don't have any PRs to other branches on branch-1.5 and lower
>> versions, I think it'd be fine.
>>
>> One concern is, I am not sure if the SparkR tests can pass on branch-1.6
>> (I checked that they pass on branch-2.0 before).
>>
>> I can try to check if it passes and identify the related causes if it
>> does not pass.
>>
>> On 10 Sep 2016 12:52 a.m., "Shivaram Venkataraman" <
>> shiva...@eecs.berkeley.edu> wrote:
>>
>>> One thing we could do is to backport the commit to branch-2.0 and
>>> branch-1.6 -- Do you think that will fix the problem ?
>>>
>>> On Fri, Sep 9, 2016 at 8:50 AM, Hyukjin Kwon 
>>> wrote:
>>> > Ah, thanks! I wasn't too sure about this, so I thought asking here would
>>> > somehow reach whoever is in charge of the account :).
>>> >
>>> >
>>> > On 10 Sep 2016 12:41 a.m., "Shivaram Venkataraman"
>>> >  wrote:
>>> >>
>>> >> Thanks for debugging - I'll reply on
>>> >> https://issues.apache.org/jira/browse/INFRA-12590 and ask for this
>>> >> change.
>>> >>
>>> >> FYI, I don't think any of the committers have access to the AppVeyor
>>> >> account, which is at
>>> >> https://ci.appveyor.com/project/ApacheSoftwareFoundation/spark .
>>> >> To request changes that need to be done in the UI, we need to open an
>>> >> INFRA ticket.
>>> >>
>>> >> Thanks
>>> >> Shivaram
>>> >>
>>> >> On Fri, Sep 9, 2016 at 6:55 AM, Hyukjin Kwon 
>>> wrote:
>>> >> > Hi all,
>>> >> >
>>> >> >
>>> >> > Currently, it seems the AppVeyor settings are the defaults, and tests
>>> >> > run on PRs against other branches as well. For example,
>>> >> >
>>> >> >
>>> >> > https://github.com/apache/spark/pull/15023
>>> >> >
>>> >> > https://github.com/apache/spark/pull/15022
>>> >> >
>>> >> >
>>> >> > It seems this happens only on other branches, as they don't have an
>>> >> > appveyor.yml and fall back to the configuration on the web (although
>>> >> > I have to test this).
>>> >> >
>>> >> >
>>> >> > It'd be great if someone authorized could set the branch to test to
>>> >> > the master branch only, as described in
>>> >> >
>>> >> >
>>> >> > https://github.com/apache/spark/blob/master/dev/appveyor-guide.md#specifying-the-branch-for-building-and-setting-the-build-schedule
>>> >> >
>>> >> >
>>> >> > I just manually tested this. With the setting, it would not trigger
>>> >> > the test for another branch, for example,
>>> >> > https://github.com/spark-test/spark/pull/5
>>> >> >
>>> >> > Currently, with the default settings, it will run the tests on
>>> >> > another branch, for example, https://github.com/spark-test/spark/pull/4
>>> >> >
>>> >> >
>>> >> > Thanks.
>>> >> >
>>> >> >
>>> >> >
>>> >>
>>> >> -
>>> >> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>> >>
>>> >
>>>
>>
>


Re: Unable to run docker jdbc integration tests?

2016-09-09 Thread Suresh Thalamati
I agree with Josh. These tests are valuable, even if they cannot be run on
Jenkins due to setup issues. It would be good to run them at least manually
when JDBC data source specific changes are made. I filed a JIRA for this
problem:

https://issues.apache.org/jira/browse/SPARK-17473
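
For reference, the manual invocation (taken from my report quoted below):

    # Build Spark first, then run the Docker JDBC integration tests.
    build/mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phive-thriftserver -Phive -DskipTests clean install
    build/mvn -Pdocker-integration-tests -pl :spark-docker-integration-tests_2.11 test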



> On Sep 7, 2016, at 4:58 PM, Luciano Resende  wrote:
> 
> That might be a reasonable and much simpler approach to try... but if we
> resolve these issues, we should make it part of some frequent build to make
> sure the build doesn't regress and that the actual functionality doesn't
> regress either. Let me look into this again...
> 
> On Wed, Sep 7, 2016 at 2:46 PM, Josh Rosen  wrote:
> I think that these tests are valuable so I'd like to keep them. If possible, 
> though, we should try to get rid of our dependency on the Spotify 
> docker-client library, since it's a dependency hell nightmare. Given our 
> relatively simple use of Docker here, I wonder whether we could just write 
> some simple scripting over the `docker` command-line tool instead of pulling 
> in such a problematic library.
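> Something like this minimal sketch, driving the docker CLI through
> scala.sys.process (the image, port, and password are illustrative only,
> not necessarily what the suite would use):
>
>     import scala.sys.process._
>
>     // Start a throwaway MySQL container and capture its container id.
>     val containerId =
>       Seq("docker", "run", "-d", "-p", "3306:3306",
>           "-e", "MYSQL_ROOT_PASSWORD=rootpass", "mysql:5.7").!!.trim
>
>     try {
>       // ... run the JDBC tests against jdbc:mysql://localhost:3306 ...
>     } finally {
>       // Always tear the container down, even if the tests fail.
>       Seq("docker", "rm", "-f", containerId).!
>     }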
> 
> On Wed, Sep 7, 2016 at 2:36 PM Luciano Resende  wrote:
> It looks like nobody is running these tests, and after some dependency
> upgrades in Spark 2.0 they stopped working. I have tried to bring them up,
> but I am having some issues with getting the right dependencies loaded and
> satisfying the docker-client expectations.
> 
> The question then is: does the community find value in having these tests
> available? If so, we can focus on bringing them up, and I can push my
> previous experiments as a WIP PR. Otherwise we should just get rid of these
> tests.
> 
> Thoughts?
> 
> 
> On Tue, Sep 6, 2016 at 4:05 PM, Suresh Thalamati  wrote:
> Hi, 
> 
> 
> I am getting the following error when I try to run the JDBC Docker
> integration tests on my laptop. Any ideas what I might be doing wrong?
> 
> build/mvn -Pyarn -Phadoop-2.6 -Dhadoop.version=2.6.0 -Phive-thriftserver -Phive -DskipTests clean install
> build/mvn -Pdocker-integration-tests -pl :spark-docker-integration-tests_2.11 compile test
> 
> Java HotSpot(TM) 64-Bit Server VM warning: ignoring option MaxPermSize=512m; 
> support was removed in 8.0
> Discovery starting.
> Discovery completed in 200 milliseconds.
> Run starting. Expected test count is: 10
> MySQLIntegrationSuite:
> 
> Error:
> 16/09/06 11:52:00 INFO BlockManagerMaster: Registered BlockManager 
> BlockManagerId(driver, 9.31.117.25, 51868)
> *** RUN ABORTED ***
>   java.lang.AbstractMethodError:
>   at 
> org.glassfish.jersey.model.internal.CommonConfig.configureAutoDiscoverableProviders(CommonConfig.java:622)
>   at 
> org.glassfish.jersey.client.ClientConfig$State.configureAutoDiscoverableProviders(ClientConfig.java:357)
>   at 
> org.glassfish.jersey.client.ClientConfig$State.initRuntime(ClientConfig.java:392)
>   at 
> org.glassfish.jersey.client.ClientConfig$State.access$000(ClientConfig.java:88)
>   at 
> org.glassfish.jersey.client.ClientConfig$State$3.get(ClientConfig.java:120)
>   at 
> org.glassfish.jersey.client.ClientConfig$State$3.get(ClientConfig.java:117)
>   at 
> org.glassfish.jersey.internal.util.collection.Values$LazyValueImpl.get(Values.java:340)
>   at 
> org.glassfish.jersey.client.ClientConfig.getRuntime(ClientConfig.java:726)
>   at 
> org.glassfish.jersey.client.ClientRequest.getConfiguration(ClientRequest.java:285)
>   at 
> org.glassfish.jersey.client.JerseyInvocation.validateHttpMethodAndEntity(JerseyInvocation.java:126)
>   ...
> 16/09/06 11:52:00 INFO SparkContext: Invoking stop() from shutdown hook
> 16/09/06 11:52:00 INFO MapOutputTrackerMasterEndpoint: 
> MapOutputTrackerMasterEndpoint stopped!
> 
> 
> 
> Thanks
> -suresh
> 
> 
> 
> 
> -- 
> Luciano Resende
> http://twitter.com/lresende1975 
> http://lresende.blogspot.com/ 
> 
> 
> -- 
> Luciano Resende
> http://twitter.com/lresende1975 
> http://lresende.blogspot.com/ 


Re: Change the settings in AppVeyor to prevent triggering the tests in other PRs in other branches

2016-09-09 Thread Hyukjin Kwon
FYI, I just ran the SparkR tests on Windows for branch-2.0 and 1.6.

branch-2.0 - https://github.com/spark-test/spark/pull/7
branch-1.6 - https://github.com/spark-test/spark/pull/8




2016-09-10 0:59 GMT+09:00 Hyukjin Kwon :

> Yes, if we don't have any PRs to other branches on branch-1.5 and lower
> versions, I think it'd be fine.
>
> One concern is, I am not sure if the SparkR tests can pass on branch-1.6
> (I checked that they pass on branch-2.0 before).
>
> I can try to check if it passes and identify the related causes if it does
> not pass.
>
> On 10 Sep 2016 12:52 a.m., "Shivaram Venkataraman" <
> shiva...@eecs.berkeley.edu> wrote:
>
>> One thing we could do is to backport the commit to branch-2.0 and
>> branch-1.6 -- Do you think that will fix the problem ?
>>
>> On Fri, Sep 9, 2016 at 8:50 AM, Hyukjin Kwon  wrote:
>> > Ah, thanks! I wasn't too sure about this, so I thought asking here would
>> > somehow reach whoever is in charge of the account :).
>> >
>> >
>> > On 10 Sep 2016 12:41 a.m., "Shivaram Venkataraman"
>> >  wrote:
>> >>
>> >> Thanks for debugging - I'll reply on
>> >> https://issues.apache.org/jira/browse/INFRA-12590 and ask for this
>> >> change.
>> >>
>> >> FYI, I don't think any of the committers have access to the AppVeyor
>> >> account, which is at
>> >> https://ci.appveyor.com/project/ApacheSoftwareFoundation/spark .
>> >> To request changes that need to be done in the UI, we need to open an
>> >> INFRA ticket.
>> >>
>> >> Thanks
>> >> Shivaram
>> >>
>> >> On Fri, Sep 9, 2016 at 6:55 AM, Hyukjin Kwon 
>> wrote:
>> >> > Hi all,
>> >> >
>> >> >
>> >> > Currently, it seems the AppVeyor settings are the defaults, and tests
>> >> > run on PRs against other branches as well. For example,
>> >> >
>> >> >
>> >> > https://github.com/apache/spark/pull/15023
>> >> >
>> >> > https://github.com/apache/spark/pull/15022
>> >> >
>> >> >
>> >> > It seems this happens only on other branches, as they don't have an
>> >> > appveyor.yml and fall back to the configuration on the web (although
>> >> > I have to test this).
>> >> >
>> >> >
>> >> > It'd be great if someone authorized could set the branch to test to
>> >> > the master branch only, as described in
>> >> >
>> >> >
>> >> > https://github.com/apache/spark/blob/master/dev/appveyor-guide.md#specifying-the-branch-for-building-and-setting-the-build-schedule
>> >> >
>> >> >
>> >> > I just manually tested this. With the setting, it would not trigger
>> >> > the test for another branch, for example,
>> >> > https://github.com/spark-test/spark/pull/5
>> >> >
>> >> > Currently, with the default settings, it will run the tests on
>> >> > another branch, for example, https://github.com/spark-test/spark/pull/4
>> >> >
>> >> >
>> >> > Thanks.
>> >> >
>> >> >
>> >> >
>> >>
>> >> -
>> >> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>> >>
>> >
>>
>


Re: Change the settings in AppVeyor to prevent triggering the tests in other PRs in other branches

2016-09-09 Thread Hyukjin Kwon
Yes, if we don't have any PRs to other branches on branch-1.5 and lower
versions, I think it'd be fine.

One concern is, I am not sure if the SparkR tests can pass on branch-1.6
(I checked that they pass on branch-2.0 before).

I can try to check if it passes and identify the related causes if it does
not pass.

On 10 Sep 2016 12:52 a.m., "Shivaram Venkataraman" <
shiva...@eecs.berkeley.edu> wrote:

> One thing we could do is to backport the commit to branch-2.0 and
> branch-1.6 -- Do you think that will fix the problem ?
>
> On Fri, Sep 9, 2016 at 8:50 AM, Hyukjin Kwon  wrote:
> > Ah, thanks! I wasn't too sure about this, so I thought asking here would
> > somehow reach whoever is in charge of the account :).
> >
> >
> > On 10 Sep 2016 12:41 a.m., "Shivaram Venkataraman"
> >  wrote:
> >>
> >> Thanks for debugging - I'll reply on
> >> https://issues.apache.org/jira/browse/INFRA-12590 and ask for this
> >> change.
> >>
> >> FYI, I don't think any of the committers have access to the AppVeyor
> >> account, which is at
> >> https://ci.appveyor.com/project/ApacheSoftwareFoundation/spark .
> >> To request changes that need to be done in the UI, we need to open an
> >> INFRA ticket.
> >>
> >> Thanks
> >> Shivaram
> >>
> >> On Fri, Sep 9, 2016 at 6:55 AM, Hyukjin Kwon 
> wrote:
> >> > Hi all,
> >> >
> >> >
> >> > Currently, it seems the AppVeyor settings are the defaults, and tests
> >> > run on PRs against other branches as well. For example,
> >> >
> >> >
> >> > https://github.com/apache/spark/pull/15023
> >> >
> >> > https://github.com/apache/spark/pull/15022
> >> >
> >> >
> >> > It seems this happens only on other branches, as they don't have an
> >> > appveyor.yml and fall back to the configuration on the web (although I
> >> > have to test this).
> >> >
> >> >
> >> > It'd be great if someone authorized could set the branch to test to the
> >> > master branch only, as described in
> >> >
> >> >
> >> > https://github.com/apache/spark/blob/master/dev/appveyor-guide.md#specifying-the-branch-for-building-and-setting-the-build-schedule
> >> >
> >> >
> >> > I just manually tested this. With the setting, it would not trigger the
> >> > test for another branch, for example,
> >> > https://github.com/spark-test/spark/pull/5
> >> >
> >> > Currently, with the default settings, it will run the tests on another
> >> > branch, for example, https://github.com/spark-test/spark/pull/4
> >> >
> >> >
> >> > Thanks.
> >> >
> >> >
> >> >
> >>
> >> -
> >> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
> >>
> >
>


Re: Change the settings in AppVeyor to prevent triggering the tests in other PRs in other branches

2016-09-09 Thread Shivaram Venkataraman
One thing we could do is to backport the commit to branch-2.0 and
branch-1.6 -- Do you think that will fix the problem ?

On Fri, Sep 9, 2016 at 8:50 AM, Hyukjin Kwon  wrote:
> Ah, thanks! I wasn't too sure about this, so I thought asking here would
> somehow reach whoever is in charge of the account :).
>
>
> On 10 Sep 2016 12:41 a.m., "Shivaram Venkataraman"
>  wrote:
>>
>> Thanks for debugging - I'll reply on
>> https://issues.apache.org/jira/browse/INFRA-12590 and ask for this
>> change.
>>
>> FYI, I don't think any of the committers have access to the AppVeyor account,
>> which is at https://ci.appveyor.com/project/ApacheSoftwareFoundation/spark .
>> To request changes that need to be done in the UI, we need to open an
>> INFRA ticket.
>>
>> Thanks
>> Shivaram
>>
>> On Fri, Sep 9, 2016 at 6:55 AM, Hyukjin Kwon  wrote:
>> > Hi all,
>> >
>> >
>> > Currently, it seems the AppVeyor settings are the defaults, and tests run
>> > on PRs against other branches as well. For example,
>> >
>> >
>> > https://github.com/apache/spark/pull/15023
>> >
>> > https://github.com/apache/spark/pull/15022
>> >
>> >
>> > It seems this happens only on other branches, as they don't have an
>> > appveyor.yml and fall back to the configuration on the web (although I
>> > have to test this).
>> >
>> >
>> > It'd be great if someone authorized could set the branch to test to the
>> > master branch only, as described in
>> >
>> >
>> > https://github.com/apache/spark/blob/master/dev/appveyor-guide.md#specifying-the-branch-for-building-and-setting-the-build-schedule
>> >
>> >
>> > I just manually tested this. With the setting, it would not trigger the
>> > test for another branch, for example,
>> > https://github.com/spark-test/spark/pull/5
>> >
>> > Currently, with the default settings, it will run the tests on another
>> > branch, for example, https://github.com/spark-test/spark/pull/4
>> >
>> >
>> > Thanks.
>> >
>> >
>> >
>>
>> -
>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>>
>

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Re: Change the settings in AppVeyor to prevent triggering the tests in other PRs in other branches

2016-09-09 Thread Hyukjin Kwon
Ah, thanks! I wasn't too sure about this, so I thought asking here would
somehow reach whoever is in charge of the account :).

On 10 Sep 2016 12:41 a.m., "Shivaram Venkataraman" <
shiva...@eecs.berkeley.edu> wrote:

> Thanks for debugging - I'll reply on
> https://issues.apache.org/jira/browse/INFRA-12590 and ask for this
> change.
>
> FYI, I don't think any of the committers have access to the AppVeyor account,
> which is at https://ci.appveyor.com/project/ApacheSoftwareFoundation/spark .
> To request changes that need to be done in the UI, we need to open an
> INFRA ticket.
>
> Thanks
> Shivaram
>
> On Fri, Sep 9, 2016 at 6:55 AM, Hyukjin Kwon  wrote:
> > Hi all,
> >
> >
> > Currently, it seems the AppVeyor settings are the defaults, and tests run
> > on PRs against other branches as well. For example,
> >
> >
> > https://github.com/apache/spark/pull/15023
> >
> > https://github.com/apache/spark/pull/15022
> >
> >
> > It seems this happens only on other branches, as they don't have an
> > appveyor.yml and fall back to the configuration on the web (although I
> > have to test this).
> >
> >
> > It'd be great if someone authorized could set the branch to test to the
> > master branch only, as described in
> >
> > https://github.com/apache/spark/blob/master/dev/appveyor-guide.md#specifying-the-branch-for-building-and-setting-the-build-schedule
> >
> >
> > I just manually tested this. With the setting, it would not trigger the
> > test for another branch, for example,
> > https://github.com/spark-test/spark/pull/5
> >
> > Currently, with the default settings, it will run the tests on another
> > branch, for example, https://github.com/spark-test/spark/pull/4
> >
> >
> > Thanks.
> >
> >
> >
>
> -
> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>
>


Re: Change the settings in AppVeyor to prevent triggering the tests in other PRs in other branches

2016-09-09 Thread Shivaram Venkataraman
Thanks for debugging - I'll reply on
https://issues.apache.org/jira/browse/INFRA-12590 and ask for this
change.

FYI, I don't think any of the committers have access to the AppVeyor account,
which is at https://ci.appveyor.com/project/ApacheSoftwareFoundation/spark .
To request changes that need to be done in the UI, we need to open an
INFRA ticket.

Thanks
Shivaram

On Fri, Sep 9, 2016 at 6:55 AM, Hyukjin Kwon  wrote:
> Hi all,
>
>
> Currently, it seems the AppVeyor settings are the defaults, and tests run
> on PRs against other branches as well. For example,
>
>
> https://github.com/apache/spark/pull/15023
>
> https://github.com/apache/spark/pull/15022
>
>
> It seems this happens only on other branches, as they don't have an
> appveyor.yml and fall back to the configuration on the web (although I have
> to test this).
>
>
> It'd be great if someone authorized could set the branch to test to the
> master branch only, as described in
>
> https://github.com/apache/spark/blob/master/dev/appveyor-guide.md#specifying-the-branch-for-building-and-setting-the-build-schedule
>
>
> I just manually tested this. With the setting, it would not trigger the
> test for another branch, for example,
> https://github.com/spark-test/spark/pull/5
>
> Currently, with the default settings, it will run the tests on another
> branch, for example, https://github.com/spark-test/spark/pull/4
>
>
> Thanks.
>
>
>

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Change the settings in AppVeyor to prevent triggering the tests in other PRs in other branches

2016-09-09 Thread Hyukjin Kwon
Hi all,


Currently, it seems the AppVeyor settings are the defaults, and tests run on
PRs against other branches as well. For example,


https://github.com/apache/spark/pull/15023

https://github.com/apache/spark/pull/15022


It seems this happens only on other branches, as they don't have an
appveyor.yml and fall back to the configuration on the web (although I have
to test this).


It'd be great if someone authorized could set the branch to test to the
master branch only, as described in

https://github.com/apache/spark/blob/master/dev/appveyor-guide.md#specifying-the-branch-for-building-and-setting-the-build-schedule


I just manually tested this. With the setting, it would not trigger the
test for another branch, for example,
https://github.com/spark-test/spark/pull/5

Currently, with the default settings, it will run the tests on another
branch, for example, https://github.com/spark-test/spark/pull/4
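
It'd also be possible to pin this in an appveyor.yml committed to each branch;
a minimal sketch, assuming AppVeyor's standard branch-filter syntax (branches
without the file would still need the web UI setting):

    # Build only commits and PRs targeting master.
    branches:
      only:
        - master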


Thanks.




Re: @scala.annotation.varargs or @_root_.scala.annotation.varargs?

2016-09-09 Thread Sean Owen
Oh, I get it now. It was necessary in the past. Sure, seems like it
could be standardized now.
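
For context, a minimal sketch of the standardized form, with an import
instead of the fully qualified path (the object and method here are
illustrative, not Spark code):

    import scala.annotation.varargs

    object StringFunctions {
      // @varargs generates a Java-friendly overload taking String[],
      // so Java callers don't have to construct a Scala Seq.
      @varargs
      def concat(parts: String*): String = parts.mkString
    }

As Jakob notes below, the _root_ prefix would only be needed if the
annotation were used inside a nested package itself named "scala".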

On Fri, Sep 9, 2016 at 1:13 AM, Reynold Xin  wrote:
> Yeah, but the earlier email was asking why they were introduced in the
> first place.
>
>
> On Friday, September 9, 2016, Marcelo Vanzin  wrote:
>>
>> Not after SPARK-14642, right?
>>
>> On Thu, Sep 8, 2016 at 5:07 PM, Reynold Xin  wrote:
>> > There is a package called scala.
>> >
>> >
>> > On Friday, September 9, 2016, Hyukjin Kwon  wrote:
>> >>
>> >> I was also wondering why it is written like this.
>> >>
>> >> I took a look at this before and wanted to fix them, but I found
>> >> https://github.com/apache/spark/pull/12077/files#r58041468
>> >>
>> >> So I kind of persuaded myself that the committers already know about it
>> >> and there is a reason for it.
>> >>
>> >> I'd like to know the full details of why we don't import it but write
>> >> the full path, though.
>> >>
>> >>
>> >> On 9 Sep 2016 5:28 a.m., "Jakob Odersky"  wrote:
>> >>>
>> >>> +1 to Sean's answer, importing varargs.
>> >>> In this case the _root_ is also unnecessary (it would only be required
>> >>> if you were using it in a nested package itself called "scala").
>> >>>
>> >>> On Thu, Sep 8, 2016 at 9:27 AM, Sean Owen  wrote:
>> >>> > I think the @_root_ version is redundant, and even the full
>> >>> > @scala.annotation.varargs path is unnecessary. Actually, wouldn't
>> >>> > we just import varargs and write @varargs?
>> >>> >
>> >>> > On Thu, Sep 8, 2016 at 1:24 PM, Jacek Laskowski 
>> >>> > wrote:
>> >>> >> Hi,
>> >>> >>
>> >>> >> The code is not consistent in its use of the
>> >>> >> @scala.annotation.varargs annotation. There are classes with
>> >>> >> @scala.annotation.varargs, like DataFrameReader or functions, as
>> >>> >> well as examples of @_root_.scala.annotation.varargs, e.g. Window
>> >>> >> or UserDefinedAggregateFunction.
>> >>> >>
>> >>> >> I think it should be consistent: @scala.annotation.varargs only.
>> >>> >> WDYT?
>> >>> >>
>> >>> >> Pozdrawiam,
>> >>> >> Jacek Laskowski
>> >>> >> 
>> >>> >> https://medium.com/@jaceklaskowski/
>> >>> >> Mastering Apache Spark 2.0 http://bit.ly/mastering-apache-spark
>> >>> >> Follow me at https://twitter.com/jaceklaskowski
>> >>> >>
>> >>> >>
>> >>> >> -
>> >>> >> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>> >>> >>
>> >>> >
>> >>> >
>> >>> > -
>> >>> > To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>> >>> >
>> >>>
>> >>> -
>> >>> To unsubscribe e-mail: dev-unsubscr...@spark.apache.org
>> >>>
>> >
>>
>>
>>
>> --
>> Marcelo

-
To unsubscribe e-mail: dev-unsubscr...@spark.apache.org



Video analytics on Spark

2016-09-09 Thread Priya Ch
Hi All,

I have video surveillance data that needs to be processed in Spark. I am
looking into Spark + OpenCV. How do I load .mp4 video into an RDD? Can we do
this directly, or does the video need to be converted to a SequenceFile?
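
So far I am considering something like the following sketch: load the raw
bytes with Spark's binaryFiles and decode each file with OpenCV on the
executors (the decoding step is only outlined, and the paths are
placeholders):

    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().appName("video-frames").getOrCreate()
    val sc = spark.sparkContext

    // Each element is (path, PortableDataStream) holding one video's raw
    // bytes; no SequenceFile conversion is needed for this step.
    val videos = sc.binaryFiles("hdfs:///surveillance/*.mp4")

    val decoded = videos.map { case (path, stream) =>
      // OpenCV's VideoCapture reads from a local file, so spill the bytes
      // to a temp file on the executor before decoding.
      val tmp = java.io.File.createTempFile("video", ".mp4")
      java.nio.file.Files.write(tmp.toPath, stream.toArray())
      // ... decode frames here, e.g. with org.opencv.videoio.VideoCapture ...
      (path, tmp.length())
    }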

Thanks,
Padma CH