[ https://issues.apache.org/jira/browse/MAHOUT-1603?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14088427#comment-14088427 ]

ASF GitHub Bot commented on MAHOUT-1603:
----------------------------------------

Github user pferrel commented on the pull request:

    https://github.com/apache/mahout/pull/40#issuecomment-51408987
  
    Sorry, I was off the internet during a move (curse you, giant nameless
cable company!)
    
    Anyway, these tests are substantially changed in
https://github.com/apache/mahout/pull/36, but I haven't been able to get the
new build until now; I'll check and push 36 first.
    
    As for building and tearing down contexts, I'm not helping things. For
each driver test, DistributedSparkSuite creates a context in beforeEach, and I
use that to start the test. The driver I'm calling needs to start its own
context, so every time I invoke a driver I precede it with an "afterEach" call
to shut down the test context, then call the driver, then call "beforeEach" to
restore the test context. I also had to tell the driver, via a special hidden
option, "--dontAddMahoutJars", not to load the Mahout jars. So the context is
being built three times for every test, but that hasn't changed; it's always
been that way. The pattern is sketched below.
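    
    A minimal sketch of that pattern, assuming a ScalaTest suite that mixes
in DistributedSparkSuite from the Spark bindings test package (the driver name
and its input/output arguments are made up for illustration; only
"--dontAddMahoutJars" and the beforeEach/afterEach calls come from the
description above):
    
    import org.scalatest.FunSuite
    import org.apache.mahout.sparkbindings.test.DistributedSparkSuite

    class DriverRoundTripSuite extends FunSuite with DistributedSparkSuite {

      test("driver round trip") {
        // DistributedSparkSuite.beforeEach has already created the test context.

        // Tear down the suite's context so the driver can create its own.
        afterEach()

        // Hypothetical driver invocation; --dontAddMahoutJars keeps the driver
        // from re-loading the Mahout jars inside the test JVM.
        SomeDriver.main(Array(
          "--input", "in.tsv",
          "--output", "out",
          "--dontAddMahoutJars"))

        // Restore the test context so assertions can read the driver's output.
        beforeEach()

        // ... assertions against "out" go here ...
      }
    }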
    
    We could reuse a single context per test, but it would require disabling
some things in the driver, along the lines of what I had to do with
"--dontAddMahoutJars". Since I've already had to do that, I don't think it
would be a big deal to disable a little more. I'll look at it once 36 is
pushed.
    
    Is there any reason to build the context more than once per suite? It
seems that if I disable the context handling in the driver, we could run all
the tests in a single context, right? A sketch of what that could look like is
below.
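    
    For reference, a rough sketch of one context per suite, using ScalaTest's
BeforeAndAfterAll. The mahoutSparkContext/DistributedContext names are how I
read the Spark bindings, and "--useExistingContext" is a hypothetical flag
standing in for whatever "disable the context handling in the driver" turns
out to be:
    
    import org.scalatest.{BeforeAndAfterAll, FunSuite}
    import org.apache.mahout.math.drm.DistributedContext
    import org.apache.mahout.sparkbindings._

    class DriverSuite extends FunSuite with BeforeAndAfterAll {

      var mahoutCtx: DistributedContext = _

      override def beforeAll(): Unit = {
        // Build the Spark-backed Mahout context once for the whole suite.
        mahoutCtx = mahoutSparkContext(masterUrl = "local[2]", appName = "driver-suite")
      }

      override def afterAll(): Unit = {
        // Tear it down once, after every test in the suite has run.
        mahoutCtx.close()
      }

      test("driver reuses the suite context") {
        // Hypothetical: the driver skips jar loading and context creation.
        SomeDriver.main(Array(
          "--input", "in.tsv",
          "--output", "out",
          "--dontAddMahoutJars",
          "--useExistingContext"))
      }
    }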


> Tweaks for Spark 1.0.x 
> -----------------------
>
>                 Key: MAHOUT-1603
>                 URL: https://issues.apache.org/jira/browse/MAHOUT-1603
>             Project: Mahout
>          Issue Type: Task
>    Affects Versions: 0.9
>            Reporter: Dmitriy Lyubimov
>            Assignee: Dmitriy Lyubimov
>             Fix For: 1.0
>
>
> Tweaks necessary for running the current codebase on top of Spark 1.0.x.



--
This message was sent by Atlassian JIRA
(v6.2#6252)
