Hi,

The new operator is literally MapOperator under a different name. To test
creating a new operator, I copied the code of MapOperator and changed only
the name; all the functionality is the same.

However, it is possible, and I'll check the input and output channels.
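
For completeness, here is roughly what I expect the channel declarations to
look like (a sketch modeled on Wayang's SparkMapOperator; the descriptor and
method names are taken from that class, so they may differ slightly in your
version):

```java
// Sketch: channel declarations as in Wayang's SparkMapOperator.
// If the new Spark operator does not declare RddChannel descriptors,
// the enumerator cannot concatenate it with neighbouring operators.
@Override
public List<ChannelDescriptor> getSupportedInputChannels(int index) {
    // Accept both cached and uncached RDDs as input.
    return Arrays.asList(RddChannel.UNCACHED_DESCRIPTOR, RddChannel.CACHED_DESCRIPTOR);
}

@Override
public List<ChannelDescriptor> getSupportedOutputChannels(int index) {
    // Produce an uncached RDD as output.
    return Collections.singletonList(RddChannel.UNCACHED_DESCRIPTOR);
}
```

If the declared descriptors don't match what the neighbouring operators
produce or consume, the enumerator cannot concatenate the partial plans,
which would fit the exception below.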

Best regards,
Matthew

On Fri, Aug 26, 2022, 12:22 Zoi Kaoudi <[email protected]> wrote:

>  It could also be a problem with the input or output channels of your
> operator, in which case the enumerator cannot concatenate the partial plans.
> Are you using the RddChannel or something else?
> Best
> --
> Zoi
>
>     On Friday, 26 August 2022 at 10:17:39 AM CEST, MatthewJ Sanyoto
> <[email protected]> wrote:
>
>  Hi,
>
> I have already registered the new mapping in the Spark platform's mappings,
> and the mapping maps the new operator to the new Spark operator.
>
> It could also be that Spark doesn't recognize the implementation or that the
> mapping isn't done correctly on my side; however, if that were the case, I
> should get a "no plan to enumerate" error. I believe Spark recognizes the
> operator, but the plan enumerator doesn't know what to do with it. That is
> my guess.
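>
> For reference, the registration looks roughly like this (a sketch, not the
> exact code; the class and field names mirror Wayang's Spark mapping list,
> so they may differ in detail in your checkout):
>
> ```java
> // Sketch: registering the new mapping alongside the existing Spark mappings.
> // Names mirror org.apache.wayang.spark.mapping.Mappings; adjust as needed.
> public class Mappings {
>     public static Collection<Mapping> BASIC_MAPPINGS = Arrays.asList(
>             new MapMapping(),
>             new MapHackItMapping() // the new mapping for the custom operator
>             // ... remaining mappings elided
>     );
> }
> ```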
>
> Best regards,
> Matthew
>
> On Fri, Aug 26, 2022, 10:02 Zoi Kaoudi <[email protected]> wrote:
>
> >  Hi Matthew,
> > did you register the new mapping? There should be a class or structure
> > that lists all the mappings available.
> > Best
> > --
> > Zoi
> >
> >    On Friday, 26 August 2022 at 09:20:24 AM CEST, jorge Arnulfo Quiané
> > Ruiz <[email protected]> wrote:
> >
> >  I see. It seems the mapping is not being done correctly, as Wayang is not
> > able to find a Spark implementation for your Wayang operator.
> >
> > I could walk through the code with you early next week. In the meantime,
> > perhaps Bertty or Rodrigo had a similar issue before. @Bertty and @Rodrigo,
> > did you face a similar problem before?
> >
> > —
> > Jorge
> >
> > On Thu, 25 Aug 2022 at 9:00 PM MatthewJ Sanyoto <
> [email protected]
> > >
> > wrote:
> >
> > > Hi,
> > >
> > > So I created a new operator MapHackItOperator in the wayang basic
> modules
> > > operator. This functions essentially the same as MapOperator in the
> same
> > > module.
> > >
> > > Since Wayang needs a platform to execute this operator, to test it I
> > > updated the mappings in the wayang-java platform module and added
> > > MapHackItMapping and JavaMapHackItOperator. This covers the translation
> > > part as well as the execution, and it works with the JavaPlugin.
> > >
> > > The problem is only with the SparkPlugin. In the wayang-spark platform
> > > module I created MapHackItMapping and SparkMapHackItOperator, which
> > > extends MapHackItOperator with the same code as SparkMapOperator, to
> > > translate it into a Spark operator. I think the translation part is
> > > working, but according to the exception, Wayang doesn't know how to
> > > enumerate the plan, or doesn't know where to output the results? Correct
> > > me if I'm wrong.
> > >
> > > I hope this doesn't make things more confusing.
> > >
> > > Best regards,
> > > Matthew
> > >
> > > On Thu, Aug 25, 2022, 20:45 jorge Arnulfo Quiané Ruiz <
> > [email protected]
> > > >
> > > wrote:
> > >
> > > > Hello Matthew,
> > > >
> > > > To better understand the problem: are you trying to copy the
> > > > JavaMapOperator into a Spark operator? Or are you trying to copy a
> > > > SparkOperator into a new Spark operator, as you did for Java?
> > > >
> > > > Best,
> > > > Jorge
> > > >
> > > > On Thu, 25 Aug 2022 at 5:56 PM MatthewJ Sanyoto <
> > > [email protected]
> > > > >
> > > > wrote:
> > > >
> > > > > Hi,
> > > > >
> > > > > I have a problem implementing a new custom operator on Spark. I was
> > > > > trying to map a new operator from the Wayang basic operators to the
> > > > > Spark platform to execute it in Spark. I first tried the Java
> > > > > platform (only the JavaPlugin): I copied the code (MapOperator,
> > > > > Mapping, JavaMapOperator, and MapMapping) under different operator
> > > > > names as a test, and it worked, but it does not work for the Spark
> > > > > platform (only the SparkPlugin).
> > > > >
> > > > > Could someone here please explain why I got this error?
> > > > > Caused by: org.apache.wayang.core.api.exception.WayangException: No
> > > > > implementations that concatenate out@Alternative[2x
> > > > > ~Alternative[[MapHackIt[1->1, id=3c947bc5]]], 1e683a3e] with
> > > > > [in@Alternative[2x
> > > > > ~Alternative[[LocalCallbackSink[1->0, id=69fb6037]]], 6babf3bf]].
> > > > >
> > > > > Maybe I am missing something for Spark?
> > > > >
> > > > > Best Regards,
> > > > > Matthew
> > > > >
> > > >
> > >
> >
>
