Decoupling MLlib and core is difficult... it is not intended that you run
Spark core 1.5 with a Spark MLlib 1.6 snapshot. Core is more stable, while
new algorithms keep getting added to MLlib, so you might sometimes be
tempted to mix the versions, but it is not recommended.
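A minimal build.sbt sketch of the safe pairing, assuming the sbt route and an
illustrative 1.5.x release (the only point being that both artifacts share the
same version):

  // build.sbt: keep spark-core and spark-mllib on the exact same release
  libraryDependencies ++= Seq(
    "org.apache.spark" %% "spark-core"  % "1.5.2",
    "org.apache.spark" %% "spark-mllib" % "1.5.2"
  )
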
On Nov 21, 2015 8:04 PM, "Reynold Xin" <r...@databricks.com> wrote:

> You can use MLlib and Spark directly without "installing anything". Just
> run Spark in local mode.
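
As a minimal sketch of that local-mode route (assuming the RDD-based MLlib API
of the 1.x line and an illustrative k-means call), something like the following
runs entirely inside the application's JVM, with no Spark installation:

  import org.apache.spark.{SparkConf, SparkContext}
  import org.apache.spark.mllib.clustering.KMeans
  import org.apache.spark.mllib.linalg.Vectors

  object LocalMLlibExample {
    def main(args: Array[String]): Unit = {
      // "local[*]" runs Spark in-process on all available cores; no cluster needed
      val conf = new SparkConf().setAppName("local-mllib").setMaster("local[*]")
      val sc = new SparkContext(conf)

      // toy data: two well-separated clusters
      val points = sc.parallelize(Seq(
        Vectors.dense(0.0, 0.0), Vectors.dense(0.1, 0.1),
        Vectors.dense(9.0, 9.0), Vectors.dense(9.1, 9.1)
      ))

      // train k-means (k = 2, 20 iterations) and print the learned centers
      val model = KMeans.train(points, 2, 20)
      model.clusterCenters.foreach(println)

      sc.stop()
    }
  }
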
>
>
> On Sat, Nov 21, 2015 at 4:05 PM, Rad Gruchalski <ra...@gruchalski.com>
> wrote:
>
>> Bowen,
>>
>> What Andy is doing in the notebook is a slightly different thing. He’s
>> using sbt to bring in all the Spark jars (core, mllib, repl, what have
>> you). You could use Maven for that. He then creates a repl and submits
>> all the Spark code into it.
>> Pretty sure the Spark unit tests cover similar use cases. Maybe not mllib
>> per se, but this kind of submission.
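
For reference, a minimal build.sbt sketch of pulling the repl artifact in
alongside core and mllib, as spark-notebook does; the artifact name is the
one published to Maven Central, and the version here is only illustrative:

  // hypothetical build.sbt line: the Spark repl on top of spark-core and spark-mllib
  libraryDependencies += "org.apache.spark" %% "spark-repl" % "1.5.2"
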
>>
>> Kind regards,
>> Radek Gruchalski
>> ra...@gruchalski.com
>> de.linkedin.com/in/radgruchalski/
>>
>>
>>
>> On Sunday, 22 November 2015 at 01:01, bowen zhang wrote:
>>
>> Thanks, Rad, for the info. I looked into the repo and saw some .snb files
>> using Spark MLlib. Can you point me to a more specific place to look at
>> for how the MLlib functions are invoked? What if I just want to invoke
>> some of the ML functions from my HelloWorld.java?
>>
>> ------------------------------
>> *From:* Rad Gruchalski <ra...@gruchalski.com>
>> *To:* bowen zhang <bowenzhang...@yahoo.com>
>> *Cc:* "dev@spark.apache.org" <dev@spark.apache.org>
>> *Sent:* Saturday, November 21, 2015 3:43 PM
>> *Subject:* Re: Using spark MLlib without installing Spark
>>
>> Bowen,
>>
>> One project to look at could be spark-notebook:
>> https://github.com/andypetrella/spark-notebook
>> It uses Spark in the way you intend to use it.
>>
>> Kind regards,
>> Radek Gruchalski
>> ra...@gruchalski.com
>> de.linkedin.com/in/radgruchalski/
>>
>>
>>
>>
>> On Sunday, 22 November 2015 at 00:38, bowen zhang wrote:
>>
>> Hi folks,
>> I am a big fan of Spark's MLlib package. I have a Java web app where I
>> want to run some ML jobs inside the web app. My question is: is there a
>> way to just import the spark-core and spark-mllib jars and invoke my ML
>> jobs without installing the entire Spark package? All the tutorials
>> related to Spark seem to indicate that installing Spark is a
>> precondition for this.
>>
>> Thanks,
>> Bowen
>>
>>
>>
>>
>>
>>
>
