[ https://issues.apache.org/jira/browse/MAHOUT-1603?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14088433#comment-14088433 ]

ASF GitHub Bot commented on MAHOUT-1603:
----------------------------------------

Github user dlyubimov commented on the pull request:

    https://github.com/apache/mahout/pull/40#issuecomment-51409540
  
    On Wed, Aug 6, 2014 at 3:55 PM, Pat Ferrel <notificati...@github.com> wrote:
    
    > Sorry, I was off the internet during a move (curse you, giant nameless
    > cable company!)
    >
    > Anyway, these tests are substantially changed in #36
    > <https://github.com/apache/mahout/pull/36>, but I haven't been able to get
    > the new build until now; I'll check and push 36 first.
    >
    > As to building and tearing down contexts, I'm not helping things. For each
    > driver test, DistributedSparkSuite creates a context in beforeEach, so I
    > use that to start the test. Then the driver I am using needs to start a
    > context, so every time I call a driver I precede it with the "afterEach"
    > call to shut down the context. Then I call the driver, then call
    > "beforeEach" to restore the test context. I also had to tell the driver,
    > via a special invisible option "--dontAddMahoutJars", not to load the
    > Mahout jars. So the context is being built 3 times for every test. But
    > that hasn't changed; it's always been that way.
    >
    > We could reuse a single context per test but it would require disabling
    > some stuff in the driver along the lines of what I had to do with
    > "--dontAddMahoutJars". Since I've already had to do this I don't think it
    > would be a big deal to disable a little more. I'll look at it once 36 is
    > pushed.
    >
    > Is there any reason to build the context more than once per suite?
    >
    Usually there isn't, and that's exactly what this branch is moving towards
    (note: this PR is not against master but against a side branch called
    `spark-1.0.x`). That's also what they seem to have done in Spark 1.0.
    
    There is sometimes (in my other projects) a need to create a custom
    context, but not in the Mahout codebase.
    
    
    > Seems like if I disable the context things in the driver we could run all
    > tests in a single context, right?
    >
    Right. This branch has already switched to doing that. All algebra tests
    seem to be fine, but these tests are failing now; not sure why, it seems
    functional to me.
    
    > —
    > Reply to this email directly or view it on GitHub
    > <https://github.com/apache/mahout/pull/40#issuecomment-51408987>.
    >
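
    The per-suite lifecycle discussed above can be sketched in plain Scala.
    This is a hypothetical illustration, not Mahout code: `Context` stands in
    for `org.apache.spark.SparkContext`, and `PerSuiteContext` approximates
    ScalaTest's beforeAll/afterAll hooks; the names `Created`, `SuiteSketch`,
    and `runTest` are invented for the sketch.

```scala
// Sketch of "one context per suite" instead of the create/stop/create
// dance around each driver invocation. `Context` is a stand-in for
// org.apache.spark.SparkContext (assumption; not the real class).

class Context {
  var stopped = false
  def stop(): Unit = stopped = true
}

object Created { var count = 0 }

trait PerSuiteContext {
  // Created lazily on first use and reused by every test in the suite.
  lazy val context: Context = { Created.count += 1; new Context }
  // Torn down once, after the whole suite, not around each test.
  def afterAll(): Unit = context.stop()
}

object SuiteSketch extends PerSuiteContext {
  def runTest(name: String): Unit =
    assert(!context.stopped, s"$name saw a stopped context")

  def main(args: Array[String]): Unit = {
    Seq("testA", "testB", "testC").foreach(runTest)
    afterAll()
    println(Created.count) // the whole suite built exactly one context
  }
}
```

    The point of the lazy initialization is that the driver no longer needs to
    build (or tear down) its own context, which is the direction the
    "--dontAddMahoutJars"-style disabling in the driver would generalize.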


> Tweaks for Spark 1.0.x 
> -----------------------
>
>                 Key: MAHOUT-1603
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-1603
>             Project: Mahout
>          Issue Type: Task
>    Affects Versions: 0.9
>            Reporter: Dmitriy Lyubimov
>            Assignee: Dmitriy Lyubimov
>             Fix For: 1.0
>
>
> Tweaks necessary to run the current codebase on top of Spark 1.0.x



--
This message was sent by Atlassian JIRA
(v6.2#6252)
