The hack I have only takes MAHOUT_OPTS.

It normally makes more sense to set them there anyway, since Spark
options are too numerous and too long to enter on the command line.

So I'd say we need to support MAHOUT_OPTS at minimum, or both.
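
For context, a minimal sketch of how the MAHOUT_OPTS route can land in
the Spark config. It assumes bin/mahout forwards MAHOUT_OPTS to the JVM
as plain JVM options, so -Dspark.*=... entries surface as Java system
properties; the variable names below are illustrative, not Mahout's
actual code:

{code}
import org.apache.spark.SparkConf

// e.g. export MAHOUT_OPTS="-Dspark.executor.memory=4g" before launching
// (assumption: bin/mahout passes MAHOUT_OPTS through to the JVM).
// A SparkConf built with loadDefaults = true (the default) already picks
// up every system property whose name starts with "spark.":
val conf = new SparkConf()

// Equivalent explicit form, copying the spark.* system properties by hand:
val explicit = new SparkConf(loadDefaults = false)
  .setAll(sys.props.filter { case (k, _) => k.startsWith("spark.") }.toSeq)
{code}

That default behavior of SparkConf is what makes MAHOUT_OPTS the
low-effort path: nothing has to be parsed on our side.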

On Thu, Mar 5, 2015 at 4:04 PM, Andrew Palumbo (JIRA) <j...@apache.org> wrote:
>
> [ https://issues.apache.org/jira/browse/MAHOUT-1643?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14349655#comment-14349655 ]
>
> Andrew Palumbo commented on MAHOUT-1643:
> ----------------------------------------
>
> Yeah - talking about the shell.  Do we want to process CLI args here, i.e.:
>
> {code}
> $ bin/mahout spark-shell -D:k=n
> {code}
>
> or should I just close this and we'll go off of MAHOUT_OPTS?
>
>
>> CLI arguments are not being processed in spark-shell
>> ----------------------------------------------------
>>
>>                 Key: MAHOUT-1643
>>                 URL: https://issues.apache.org/jira/browse/MAHOUT-1643
>>             Project: Mahout
>>          Issue Type: Bug
>>          Components: CLI, spark
>>    Affects Versions: 1.0
>>         Environment: spark spark-shell
>>            Reporter: Andrew Palumbo
>>              Labels: DSL, scala, spark, spark-shell
>>             Fix For: 1.0
>>
>>
>> The CLI arguments are not being processed in spark-shell.  Most importantly, 
>> the Spark options are not being passed to the Spark configuration via:
>> {code}
>> $ mahout spark-shell -D:k=n
>> {code}
>> The arguments are preserved through {code}$ bin/mahout{code}. There should 
>> be a relatively easy fix, either by using the MahoutOptionParser, Scopt, or 
>> by simply parsing the args array.
>
>
>
> --
> This message was sent by Atlassian JIRA
> (v6.3.4#6332)
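
For reference, the "simply parsing the args array" option from the
ticket could look roughly like the following. This is a sketch only: the
-D:key=value syntax is the one quoted above, but the function name and
the SparkConf wiring are illustrative assumptions, not Mahout's actual
code:

{code}
import org.apache.spark.SparkConf

// Sketch: collect "-D:key=value" pairs from the raw args array and apply
// them to a SparkConf. Malformed entries (no '=') are silently dropped
// here; real code would likely want to warn instead.
def applyDefines(args: Array[String], conf: SparkConf): SparkConf = {
  val defines = args.iterator
    .filter(_.startsWith("-D:"))
    .map(_.stripPrefix("-D:").split("=", 2))
    .collect { case Array(k, v) => k -> v }
  conf.setAll(defines.toSeq)
}

// Usage: val conf = applyDefines(args, new SparkConf())
{code}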
