Hello, I know this is not relevant to this topic, but I've been looking
desperately for a solution. I am facing an exception:
http://apache-spark-user-list.1001560.n3.nabble.com/how-to-resolve-you-must-build-spark-with-hive-exception-td27390.html

Please help; I couldn't find any solution.

On Fri, Jul 22, 2016 at 6:12 PM, Sean Owen <so...@cloudera.com> wrote:

> No, there isn't anything in particular, beyond the various bits of
> serialization support that write out something to put in your storage
> to begin with. What you do with it after reading and before writing is
> up to your app, on purpose.
>
> If you mean you're producing data outside the model that your model
> uses, your model data might be produced by an RDD operation, and saved
> that way. There it's no different from anything else you do with RDDs.
>
> What part are you looking to automate beyond those things? That's most
> of it.
>
> On Fri, Jul 22, 2016 at 2:04 PM, Sergio Fernández <wik...@apache.org>
> wrote:
> > Hi Sean,
> >
> > On Fri, Jul 22, 2016 at 12:52 PM, Sean Owen <so...@cloudera.com> wrote:
> >>
> >> If you mean, how do you distribute a new model in your application,
> >> then there's no magic to it. Just reference the new model in the
> >> functions you're executing in your driver.
> >>
> >> If you implemented some other manual way of deploying model info, just
> >> do that again. There's no special thing to know.
> >
> >
> > Well, because some models are huge, we typically bundle the logic
> > (pipeline/application) and the models separately. Normally we use a
> > shared store (e.g., HDFS) or coordinated distribution of the models.
> > But I wanted to know whether there is any infrastructure in Spark that
> > specifically addresses such a need.
> >
> > Thanks.
> >
> > Cheers,
> >
> > P.S.: Sorry Jacek, by "ml" I meant "Machine Learning". I thought it was
> > a fairly widespread acronym. Sorry for any confusion.
> >
> >
> > --
> > Sergio Fernández
> > Partner Technology Manager
> > Redlink GmbH
> > m: +43 6602747925
> > e: sergio.fernan...@redlink.co
> > w: http://redlink.co
>
> ---------------------------------------------------------------------
> To unsubscribe e-mail: user-unsubscr...@spark.apache.org
>
>
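
The pattern discussed above — writing model data out to a shared store and then simply referencing the loaded model in the functions the driver executes — can be sketched in plain Python. This is only an illustration of the deployment pattern, not Spark API: `train`, `score`, and the temporary directory standing in for HDFS are all hypothetical names invented for this sketch.

```python
import pickle
import tempfile
from pathlib import Path

def train(samples):
    # Stand-in for an RDD operation that produces the model data; here the
    # "model" is just the mean of the training samples.
    return {"mean": sum(samples) / len(samples)}

def score(model, x):
    # A function that closes over the loaded model, as driver-side code would.
    return x - model["mean"]

# A local temp dir stands in for the shared store (e.g., HDFS) in the thread.
shared_store = Path(tempfile.mkdtemp())
model_path = shared_store / "model.pkl"

# "Deploy": serialize the freshly trained model to the shared store.
model_path.write_bytes(pickle.dumps(train([1.0, 2.0, 3.0])))

# "Driver side": load the current model and reference it in the processing logic.
model = pickle.loads(model_path.read_bytes())
results = [score(model, x) for x in [2.0, 4.0]]
print(results)  # → [0.0, 2.0]
```

Swapping in a new model is then just writing a new file to the shared path and re-loading it, which matches the "no magic to it" point above: the distribution mechanism lives in your app, not in Spark.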
