rawkintrevo closed pull request #349:
URL: https://github.com/apache/mahout/pull/349
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
Andrew Palumbo created MAHOUT-2027:
--
Summary: spark-ec2 launch scripts with ViennaCL/JCuda installation
Key: MAHOUT-2027
URL: https://issues.apache.org/jira/browse/MAHOUT-2027
Project: Mahout
[
https://issues.apache.org/jira/browse/MAHOUT-1914?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Trevor Grant updated MAHOUT-1914:
-
Fix Version/s: (was: 0.13.1)
0.13.2
> Spark tests are not picking up OpenCL/OpenMP jars
[
https://issues.apache.org/jira/browse/MAHOUT-1914?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Andrew Palumbo updated MAHOUT-1914:
---
Sprint: (was: Jan/Feb-2017)
Fix Version/s: (was: 0.13.0)
[
https://issues.apache.org/jira/browse/MAHOUT-1914?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Andrew Palumbo updated MAHOUT-1914:
---
Priority: Major (was: Blocker)
Honestly I am not sure this is a huge issue. Bringing it
[
https://issues.apache.org/jira/browse/MAHOUT-1914?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15879018#comment-15879018
]
Saikat Kanjilal commented on MAHOUT-1914:
-
[~Andrew_Palumbo] just wondering how much progress
Assignee: Trevor Grant
>Priority: Minor
> Fix For: 0.13.0
>
>
> The spark dependency reduced jar includes:
> org.apache.mahout:mahout-native-viennacl_2.10
> but not
> org.apache.mahout:mahout-native-viennacl-omp_2.10
> In mahout/spark/src/main/assembly/depend
at:
https://github.com/apache/mahout/pull/270
> Spark Dependency Reduced Jar Needs Open-MP
> --
>
> Key: MAHOUT-1920
> URL: https://issues.apache.org/jira/browse/MAHOUT-1920
> Project: Mahout
>
> org.apache.mahout:mahout-native-viennacl_2.10
> but not
> org.apache.mahout:mahout-native-viennacl-omp_2.10
> In mahout/spark/src/main/assembly/dependency-reduced.xml
--
This message was sent by Atlassian JIRA
(v6.3.15#6346)
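A minimal sketch of what the fix described in the issue could look like in dependency-reduced.xml, assuming the standard Maven assembly-descriptor syntax; only the two artifact coordinates come from the issue text, the surrounding structure is illustrative:

```xml
<dependencySets>
  <dependencySet>
    <includes>
      <include>org.apache.mahout:mahout-native-viennacl_2.10</include>
      <!-- the artifact the issue reports as missing from the jar -->
      <include>org.apache.mahout:mahout-native-viennacl-omp_2.10</include>
    </includes>
  </dependencySet>
</dependencySets>
```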
Trevor Grant created MAHOUT-1920:
Summary: Spark Dependency Reduced Jar Needs Open-MP
Key: MAHOUT-1920
URL: https://issues.apache.org/jira/browse/MAHOUT-1920
Project: Mahout
Issue Type: Bug
Andrew Palumbo created MAHOUT-1914:
--
Summary: Spark tests are not picking up OpenCL/OpenMP jars
Key: MAHOUT-1914
URL: https://issues.apache.org/jira/browse/MAHOUT-1914
Project: Mahout
Issue
[
https://issues.apache.org/jira/browse/MAHOUT-1788?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15760262#comment-15760262
]
Andrew Palumbo commented on MAHOUT-1788:
[~pferrel], [~shashidongur] Is there anything to be done
[
https://issues.apache.org/jira/browse/MAHOUT-1788?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15574059#comment-15574059
]
Suneel Marthi commented on MAHOUT-1788:
---
Is someone still working on this?
> spark-itemsimilarity integration test script cleanup
okay thanks - i’ll run those tests. i actually ran a few others as well like
the MatrixWritableTest.
> On Apr 18, 2016, at 8:22 PM, Dmitriy Lyubimov wrote:
>
> I am not sure of your question about tests...
>
> there are in-memory tests which you can run by 'mvn test' in
On Tue, Apr 19, 2016 at 11:08 AM, Khurrum Nasim
wrote:
> Thank you Dimitry.
>
> So is there an architectural blueprint for mahout ? What I mean is how
> I can get the 1,000-foot overview? Or the bird's-eye view of the project.
> I do see Mahout is very modularized -
I am not sure of your question about tests...
there are in-memory tests which you can run by 'mvn test' in /math-scala
module; distributed tests are done per engine under 'spark', 'h2o' or
'flink' modules.
On Mon, Apr 18, 2016 at 5:19 PM, Dmitriy Lyubimov wrote:
> i meant "not
i meant "not so much a library"
On Mon, Apr 18, 2016 at 5:18 PM, Dmitriy Lyubimov wrote:
> Khurrum,
>
> mahout is so much a library at this point.
>
> if you mean if it can be used to build networks with 2d inputs, yes i did
> some of that. multi-epoch SGD based systems
Khurrum,
mahout is so much a library at this point.
if you mean if it can be used to build networks with 2d inputs, yes i did
some of that. multi-epoch SGD based systems should be easy enough to build,
and will probably have a reasonable performance -- although I think
dedicated CNN systems
Hi Guys,
Can Mahout be used for things like face detection? Also which unit tests or
integration tests do you recommend I should run just to get a better feel of
the execution flow.
I’m still slowly acclimating to the project. But hopefully should come up to
speed soon.
Many thanks,
[
https://issues.apache.org/jira/browse/MAHOUT-1788?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15225099#comment-15225099
]
shashi bushan dongur commented on MAHOUT-1788:
--
[~smarthi] I currently have mahout installed
[
https://issues.apache.org/jira/browse/MAHOUT-1788?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15223282#comment-15223282
]
Suneel Marthi commented on MAHOUT-1788:
---
[~shashidongur] Any progress on this yet? Do you need any
Thanks everyone - I’m glad to be a part of this.
Khurrum
> On Mar 30, 2016, at 3:10 PM, Suneel Marthi wrote:
>
> Thanks Khurrum for stepping up.
>
> You just need basic programming skills - Java/Scala to be able to
> contribute. We can help you with the algorithms and
Thanks Khurrum for stepping up.
You just need basic programming skills - Java/Scala to be able to
contribute. We can help you with the algorithms and linear algebra stuff.
Welcome aboard !!
On Wed, Mar 30, 2016 at 3:05 PM, Khurrum Nasim
wrote:
> Thanks for the
Thanks for the advice Dimitry. I’m already signed up on ASF jira. My handle
is “nasimk”.
Do I need to be a linear algebra expert and/or a math PhD to contribute?
I have 10-plus years of computer programming experience. My background is comp
sci.
Khurrum
> On Mar 30, 2016, at 2:57
PS You may also want to sign up with ASF Jira so we can assign issues to
yourself.
On Wed, Mar 30, 2016 at 11:52 AM, Dmitriy Lyubimov
wrote:
>
>
> On Wed, Mar 30, 2016 at 11:43 AM, Khurrum Nasim
> wrote:
>
>> Thanks Dimirtry.
>>
>> I take a look at
On Wed, Mar 30, 2016 at 11:43 AM, Khurrum Nasim
wrote:
> Thanks Dimirtry.
>
> I'll take a look and see where I can start pitching in. Do I need contributor
> access ? how would I create feature branch of my work ?
>
Khurrum,
you only need github account. What you need
Thanks Dimirtry.
I'll take a look and see where I can start pitching in. Do I need contributor
access ? how would I create feature branch of my work ?
Khurrum
> On Mar 30, 2016, at 1:12 PM, Dmitriy Lyubimov wrote:
>
> Oh but of course! please do!
>
> You may work on any
[
https://issues.apache.org/jira/browse/MAHOUT-1788?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15218531#comment-15218531
]
Dmitriy Lyubimov commented on MAHOUT-1788:
--
[~shashidongur]
Oh but of course! please do!
You
Oh but of course! please do!
You may work on any issue, this or any other of your choice, or even on any
new issue you can think of (for sizeable contributions it is recommended to
start a discussion on the @dev list first, to make sure to benefit
from the experience of others). Please file any
[
https://issues.apache.org/jira/browse/MAHOUT-1788?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=15218216#comment-15218216
]
shashi bushan dongur commented on MAHOUT-1788:
--
Hello. I would like to start contributing to
[
https://issues.apache.org/jira/browse/MAHOUT-1788?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Pat Ferrel updated MAHOUT-1788:
---
Fix Version/s: (was: 0.12.0)
1.0.0
Issue Type: Improvement (was: Bug)
Pat Ferrel created MAHOUT-1788:
--
Summary: spark-itemsimilarity integration test script cleanup
Key: MAHOUT-1788
URL: https://issues.apache.org/jira/browse/MAHOUT-1788
Project: Mahout
Issue Type
[
https://issues.apache.org/jira/browse/MAHOUT-1670?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=14972270#comment-14972270
]
Andrew Palumbo commented on MAHOUT-1670:
It would be good to get a brief Flink overview similar
[
https://issues.apache.org/jira/browse/MAHOUT-1670?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Andrew Palumbo resolved MAHOUT-1670.
Resolution: Done
Makes more sense to resolve this and start a new one for flink.
> Spark
[
https://issues.apache.org/jira/browse/MAHOUT-1670?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel=14972064#comment-14972064
]
Suneel Marthi commented on MAHOUT-1670:
---
[~Andrew_Palumbo] Can this be marked as Resolved?
> Spark
on the pull request:
https://github.com/apache/mahout/pull/146#issuecomment-120956293
Committed to master on 7/10.. Closing manually.. did not close by commit
message.
Spark 1.3
-
Key: MAHOUT-1653
URL: https://issues.apache.org/jira/browse
the pull request at:
https://github.com/apache/mahout/pull/146
Spark 1.3
-
Key: MAHOUT-1653
URL: https://issues.apache.org/jira/browse/MAHOUT-1653
Project: Mahout
Issue Type: Dependency upgrade
Reporter: Andrew Musselman
.
—
Reply to this email directly or view it on GitHub
https://github.com/apache/mahout/pull/146#issuecomment-120956293.
Spark 1.3
-
Key: MAHOUT-1653
URL: https://issues.apache.org/jira/browse/MAHOUT-1653
Project: Mahout
if there are no
objections.
Spark 1.3
-
Key: MAHOUT-1653
URL: https://issues.apache.org/jira/browse/MAHOUT-1653
Project: Mahout
Issue Type: Dependency upgrade
Reporter: Andrew Musselman
Assignee: Andrew Palumbo
if it is not already set, e.g. in MAHOUT_OPTS.
This should be good to go. I'll push it this afternoon if there are no
objections.
—
Reply to this email directly or view it on GitHub
https://github.com/apache/mahout/pull/146#issuecomment-120433440.
Spark 1.3
/apache/mahout/sparkbindings/drm/CheckpointedDrmSpark.scala
* CHANGELOG
* spark-shell/src/main/scala/org/apache/mahout/sparkbindings/shell/MahoutSparkILoop.scala
* spark/src/main/scala/org/apache/mahout/sparkbindings/drm/package.scala
Spark 1.3
-
Key: MAHOUT-1653
request at:
https://github.com/apache/mahout/pull/136
Spark 1.3
-
Key: MAHOUT-1653
URL: https://issues.apache.org/jira/browse/MAHOUT-1653
Project: Mahout
Issue Type: Dependency upgrade
Reporter: Andrew Musselman
on the pull request:
https://github.com/apache/mahout/pull/136#issuecomment-120506390
Andy pushed his 1.3 branch to master so no longer needed
Spark 1.3
-
Key: MAHOUT-1653
URL: https://issues.apache.org/jira/browse/MAHOUT-1653
Project
further
than that.
Spark 1.3
-
Key: MAHOUT-1653
URL: https://issues.apache.org/jira/browse/MAHOUT-1653
Project: Mahout
Issue Type: Dependency upgrade
Reporter: Andrew Musselman
Assignee: Andrew Palumbo
:
-
Builds and passes tests with Spark 1.4 as well, though I have not tested 1.4
further than that.
was (Author: andrew_palumbo):
Builds and passes tests with Spark 1.4 as well. though have not tested further
than that.
Spark 1.3
-
Key: MAHOUT-1653
(spark.executor.memory, 1g)
-    sparkContext = mahoutSparkContext(
+    sdc = mahoutSparkContext(
       masterUrl = master,
       appName = "Mahout Spark Shell",
       customJars = jars,
       sparkConf = conf
     )
-    echo(Created spark context
(spark.executor.memory, 1g)
-    sparkContext = mahoutSparkContext(
+    _interp.sparkContext = mahoutSparkContext(
       masterUrl = master,
       appName = "Mahout Spark Shell",
       customJars = jars,
       sparkConf = conf
     )
--- End diff
https://github.com/apache/spark/blob/branch-1.3/repl/scala-2.10/src/main/scala/org/apache/spark/repl/SparkILoopInit.scala#L121
Let me know if you think there's a better way around it.
Spark 1.3
-
Key: MAHOUT-1653
URL: https
on the pull request:
https://github.com/apache/mahout/pull/146#issuecomment-118941874
I will push this in the next couple of days if there are no objections.
Spark 1.3
-
Key: MAHOUT-1653
URL: https://issues.apache.org/jira/browse/MAHOUT-1653
/sparkbindings/shell/Main.scala
Spark 1.3
-
Key: MAHOUT-1653
URL: https://issues.apache.org/jira/browse/MAHOUT-1653
Project: Mahout
Issue Type: Dependency upgrade
Reporter: Andrew Musselman
Assignee
is
not fully compatible with 1.3 it seems.
Spark 1.3
-
Key: MAHOUT-1653
URL: https://issues.apache.org/jira/browse/MAHOUT-1653
Project: Mahout
Issue Type: Dependency upgrade
Reporter: Andrew Musselman
Assignee
on the pull request:
https://github.com/apache/mahout/pull/136#issuecomment-118213955
oh man.. @pferrel I didn't realize that i based that branch off of 0.10.x.
I just did it with another one. I'll have to fix them up.
Spark 1.3
-
Key: MAHOUT-1653
tested). I can push it as part of #146 when
that's ready. or if you need it ahead of that you can push now or whenever.
Spark 1.3
-
Key: MAHOUT-1653
URL: https://issues.apache.org/jira/browse/MAHOUT-1653
Project: Mahout
Issue
on the right to filter sprint (i.e. active issues) on
per release basis. handy.
Spark 1.3
-
Key: MAHOUT-1653
URL: https://issues.apache.org/jira/browse/MAHOUT-1653
Project: Mahout
Issue Type: Dependency upgrade
:
--
no, it wasn't.
there is version tab on the left to filter sprint (i.e. active issues) on
per release basis. handy.
was (Author: dlyubimov):
no, it wasnt.
there is version tab on the right to filter sprint (i.e. active issues) on
per release basis. handy.
Spark 1.3
but release target should still be the same. the
sprint is called July-`15 or something just to indicate what we are currently
working on (since there's a really big pile in the backlog which is hard to dig
thru).
Spark 1.3
-
Key: MAHOUT-1653
URL: https
that we were going to upgrade the
0.10.x branch to 1.3-- was pretty sure that wasn't the plan. Would be nice to
make it possible (though keeping the default at Spark 1.2.1) but the developer
API for the shell isn't backward compatible.
Spark 1.3
-
Key: MAHOUT
[
https://issues.apache.org/jira/browse/MAHOUT-1653?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Dmitriy Lyubimov updated MAHOUT-1653:
-
Sprint: 0.10.2-Jul-2015
Spark 1.3
-
Key: MAHOUT-1653
branch.
If I can get this to build against the current master/0.11.0-SNAPSHOT I'd
be inclined to push it so someone speak up if there is an objection. Once
again, it will disable the shell build, for now.
Spark 1.3
-
Key: MAHOUT-1653
URL: https
that is..)
Spark 1.3
-
Key: MAHOUT-1653
URL: https://issues.apache.org/jira/browse/MAHOUT-1653
Project: Mahout
Issue Type: Dependency upgrade
Reporter: Andrew Musselman
Assignee: Andrew Palumbo
Fix
]) {
  rddInput.toDrmRdd()
    .map(x => (new LongWritable(x._1.asInstanceOf[Long]), new VectorWritable(x._2)))
    .saveAsSequenceFile(path)
} else throw new IllegalArgumentException("Do not know how to convert
class tag %s to Writable.".format(ktag))```
Spark 1.3
deprecate dfsWrite? Meaning leave it as is
with a deprecated Spark function for now and replace it wherever it's called?
Then when Spark removes the deprecated function we remove dfsWrite?
Can we handle your code with an implicit conversion that will get called
before dfsWrite?
Spark 1.3
(...)` we shouldn't incur any additional
overhead.
Actually we may just be able to call `.saveAsHadoopFile(...)` directly on a
mapped-to-Writable RDD from `.dfsWrite(...)`.
Spark 1.3
-
Key: MAHOUT-1653
URL: https://issues.apache.org/jira/browse
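The approach discussed in this thread can be sketched as follows. This is a hypothetical simplification, not the real CheckpointedDrmSpark code: `writeDrmRows` and the signatures are illustrative names, and the real API keys rows generically rather than by `Long`. The idea is to map the keyed row RDD to `(LongWritable, VectorWritable)` pairs and write it directly, keeping `dfsWrite` as a deprecated shim until callers migrate:

```scala
import org.apache.hadoop.io.LongWritable
import org.apache.mahout.math.{Vector, VectorWritable}
import org.apache.spark.SparkContext._
import org.apache.spark.rdd.RDD

// Sketch only: write a (key, vector) row RDD as a Hadoop sequence file,
// staying distributed the whole way (no collect on the driver).
def writeDrmRows(rows: RDD[(Long, Vector)], path: String): Unit =
  rows
    .map { case (k, v) => (new LongWritable(k), new VectorWritable(v)) }
    .saveAsSequenceFile(path)

// Old entry point kept for source compatibility; removed once Spark drops
// the deprecated call path it wraps.
@deprecated("use writeDrmRows instead", "0.11.0")
def dfsWrite(rows: RDD[(Long, Vector)], path: String): Unit =
  writeDrmRows(rows, path)
```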
tests):
https://github.com/andrewpalumbo/mahout/blob/MAHOUT-1653-jun/spark/src/main/scala/org/apache/mahout/sparkbindings/drm/CheckpointedDrmSpark.scala#L157
Spark 1.3
-
Key: MAHOUT-1653
URL: https://issues.apache.org/jira/browse/MAHOUT
Serializable: ```org.apache.spark.SparkException: Task not
serializable```. Any thoughts on how to do this correctly?
Spark 1.3
-
Key: MAHOUT-1653
URL: https://issues.apache.org/jira/browse/MAHOUT-1653
Project: Mahout
Issue Type
[
https://issues.apache.org/jira/browse/MAHOUT-1670?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=14592470#comment-14592470
]
Andrew Palumbo commented on MAHOUT-1670:
the h2o doc is actually already on the
the pull request at:
https://github.com/apache/mahout/pull/117
Spark 1.3
-
Key: MAHOUT-1653
URL: https://issues.apache.org/jira/browse/MAHOUT-1653
Project: Mahout
Issue Type: Dependency upgrade
Reporter: Andrew Musselman
on the pull request:
https://github.com/apache/mahout/pull/117#issuecomment-113284486
see #136
Spark 1.3
-
Key: MAHOUT-1653
URL: https://issues.apache.org/jira/browse/MAHOUT-1653
Project: Mahout
Issue Type: Dependency
to be tinkered with.
Spark 1.3
-
Key: MAHOUT-1653
URL: https://issues.apache.org/jira/browse/MAHOUT-1653
Project: Mahout
Issue Type: Dependency upgrade
Reporter: Andrew Musselman
Assignee: Andrew Palumbo
.
Spark 1.3
-
Key: MAHOUT-1653
URL: https://issues.apache.org/jira/browse/MAHOUT-1653
Project: Mahout
Issue Type: Dependency upgrade
Reporter: Andrew Musselman
Assignee: Andrew Palumbo
Fix
-keyed.
Spark 1.3
-
Key: MAHOUT-1653
URL: https://issues.apache.org/jira/browse/MAHOUT-1653
Project: Mahout
Issue Type: Dependency upgrade
Reporter: Andrew Musselman
Assignee: Andrew Palumbo
Fix
.
Spark 1.3
-
Key: MAHOUT-1653
URL: https://issues.apache.org/jira/browse/MAHOUT-1653
Project: Mahout
Issue Type: Dependency upgrade
Reporter: Andrew Musselman
Assignee: Andrew Palumbo
Fix For: 0.11.0
on the pull request:
https://github.com/apache/mahout/pull/117#issuecomment-111293993
This should be closed in favor of #136 so it goes into 0.11.0
Spark 1.3
-
Key: MAHOUT-1653
URL: https://issues.apache.org/jira/browse/MAHOUT-1653
reflection to get the Writable factory from
the RDD by setting it to null. not sure exactly how to do that here.
Spark 1.3
-
Key: MAHOUT-1653
URL: https://issues.apache.org/jira/browse/MAHOUT-1653
Project: Mahout
Issue Type
(path: String) = {
val ktag = implicitly[ClassTag[K]]
+val vtag = implicitly[ClassTag[Vector]]
--- End diff --
I think this line can come out for now.
Spark 1.3
-
Key: MAHOUT-1653
URL: https://issues.apache.org/jira/browse
as a stop-gap while we
investigate the shell issues
--- End diff --
yeah- this definitely can wait.. I'll re-synch it after #135 is pushed and
we can keep it here as a patch for people that want to use 1.3 on the 0.10.x
branch.
Spark 1.3
-
Key: MAHOUT-1653
-gap while we
investigate the shell issues
--- End diff --
ok stop-gap but perhaps we can do it on top of #135 ..
Spark 1.3
-
Key: MAHOUT-1653
URL: https://issues.apache.org/jira/browse/MAHOUT-1653
Project: Mahout
with this.
remove this line and rebuild Mahout.
https://github.com/apache/mahout/blob/mahout-0.10.x/spark/src/main/scala/org/apache/mahout/drivers/TextDelimitedReaderWriter.scala#L157
The errant line reads:
interactions.collect()
This forces the user action data into memory, a bad thing
(), forcing all interaction data into
memory of the client/driver. Increasing the executor memory will not help with
this.
remove this line and rebuild Mahout.
https://github.com/apache/mahout/blob/mahout-0.10.x/spark/src/main/scala/org/apache/mahout/drivers/TextDelimitedReaderWriter.scala#L157
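The fix described above can be sketched as follows; the variable and method names here are hypothetical, and the actual surrounding code in TextDelimitedReaderWriter.scala differs:

```scala
import org.apache.spark.rdd.RDD

// Errant pattern (commented out): materializes every interaction row in
// driver memory, which no amount of executor memory can fix.
//   val interactionsArray = interactions.collect()

// Sketched fix: keep the data distributed and let the writer consume the
// RDD directly, with no driver-side materialization.
def writeInteractions(interactions: RDD[String], path: String): Unit =
  interactions.saveAsTextFile(path)
```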
[
https://issues.apache.org/jira/browse/MAHOUT-1653?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Stevo Slavic updated MAHOUT-1653:
-
Affects Version/s: (was: 0.11.0)
Spark 1.3
-
Key: MAHOUT-1653
[
https://issues.apache.org/jira/browse/MAHOUT-1653?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Stevo Slavic updated MAHOUT-1653:
-
Fix Version/s: 0.11.0
Spark 1.3
-
Key: MAHOUT-1653
request:
https://github.com/apache/mahout/pull/117
MAHOUT-1653 Spark-1.3 for 0.10.0
Uses a deprecated method in `CheckpointedDrmSpark.scala` and has the shell
build disabled.
You can merge this pull request into a Git repository by running:
$ git pull https://github.com
[
https://issues.apache.org/jira/browse/MAHOUT-1670?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Stevo Slavic updated MAHOUT-1670:
-
Fix Version/s: (was: 0.10.0)
0.11.0
Spark and h2o Engine Documentation
the pull request at:
https://github.com/apache/mahout/pull/82
Spark 1.3
-
Key: MAHOUT-1653
URL: https://issues.apache.org/jira/browse/MAHOUT-1653
Project: Mahout
Issue Type: Dependency upgrade
Affects Versions: 0.11.0