I see. It seems the mapping is not being applied correctly, as Wayang is not
able to find a Spark implementation for your Wayang operator.
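
One thing worth double-checking on your side: the new mapping has to be
registered with the Spark plugin, otherwise the optimizer never sees a Spark
alternative for MapHackIt and can fail with exactly this kind of "No
implementations that concatenate ..." error. I don't have the code in front of
me, so the following is only a rough sketch from memory; the exact class and
field names on the Spark side may differ, and MapHackItMapping is your class:

    // In the wayang-spark module there is a class that collects all Spark
    // mappings and hands them to the Spark plugin (sketch, names assumed):
    import java.util.Arrays;
    import java.util.Collection;
    import org.apache.wayang.core.mapping.Mapping;

    public class Mappings {
        public static final Collection<Mapping> BASIC_MAPPINGS = Arrays.asList(
                new MapMapping(),
                new MapHackItMapping()  // <- the new mapping must be listed here
                // ... the remaining Spark mappings
        );
    }

If MapHackItMapping never ends up in the collection that the Spark plugin
returns from its getMappings(), the translation code is never invoked, no
matter how correct it is.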

I could walk through the code with you early next week. In the meantime,
perhaps Bertty or Rodrigo have run into this before. @Bertty and @Rodrigo, did
you face a similar problem?
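
Also, since the exception complains that the MapHackIt output cannot be
concatenated with the sink input, it is worth verifying that
SparkMapHackItOperator implements SparkExecutionOperator and declares which
channels it consumes and produces; without that, the enumerator cannot connect
it to its neighbours. Again only a hedged sketch from memory, assuming your
operator mirrors SparkMapOperator (method and descriptor names may differ
slightly in the current code base):

    // Channel declarations a Spark execution operator typically needs
    // (sketch; compare with SparkMapOperator in the wayang-spark module).
    @Override
    public List<ChannelDescriptor> getSupportedInputChannels(int index) {
        // Spark operators usually consume RDD channels (cached or uncached).
        return Arrays.asList(RddChannel.UNCACHED_DESCRIPTOR,
                             RddChannel.CACHED_DESCRIPTOR);
    }

    @Override
    public List<ChannelDescriptor> getSupportedOutputChannels(int index) {
        // ...and produce an uncached RDD that downstream operators or channel
        // conversions (e.g., towards the LocalCallbackSink) can pick up.
        return Collections.singletonList(RddChannel.UNCACHED_DESCRIPTOR);
    }

If those declarations are in place and the mapping is registered, the
enumeration should find a Spark alternative; if not, we can dig into it
together next week.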

—
Jorge

On Thu, 25 Aug 2022 at 9:00 PM MatthewJ Sanyoto <[email protected]>
wrote:

> Hi,
>
> So I created a new operator, MapHackItOperator, in the operators package of
> the wayang-basic module. It functions essentially the same as MapOperator in
> the same module.
>
> Since Wayang needs a platform to execute and test this operator, in the
> wayang-java platform module I updated the Mappings and added MapHackItMapping
> and JavaMapHackItOperator; this covers the translation part as well as the
> execution, and it works with the Java plugin.
>
> The problem is only with the SparkPlugin. I updated the wayang-spark platform
> module for the SparkPlugin: I created the Mapping, MapHackItMapping, and
> SparkMapHackItOperator (which extends MapHackItOperator) with the same code
> as the SparkMap classes to translate it into a Spark operator. I think the
> translation part is working, but according to the exception Wayang doesn't
> know how to enumerate the plan, or doesn't know where to output the results?
> Correct me if I'm wrong.
>
> I hope this doesn't make it more confusing.
>
> Best regards,
> Matthew
>
> On Thu, Aug 25, 2022, 20:45 Jorge Arnulfo Quiané Ruiz <[email protected]>
> wrote:
>
> > Hello Matthew,
> >
> > To better understand the problem, are you trying to copy the
> > JavaMapOperator into a Spark operator? Or are you trying to copy a
> > SparkOperator into a new Spark operator, as you did for Java?
> >
> > Best,
> > Jorge
> >
> > On Thu, 25 Aug 2022 at 5:56 PM MatthewJ Sanyoto <[email protected]>
> > wrote:
> >
> > > Hi,
> > >
> > > I have a problem implementing a new custom operator on Spark.
> > > I was trying to map a new operator from the Wayang basic operators to the
> > > Spark platform so it executes in Spark.
> > > I tried the Java platform (Java plugin only) and just copied the code
> > > (MapOperator, Mapping, JavaMapOperator, and MapMapping), but with
> > > different operator names to test, and it worked; it does not work for the
> > > Spark platform (Spark plugin only), though.
> > >
> > > Maybe someone here could please explain to me why I got this error:
> > > Caused by: org.apache.wayang.core.api.exception.WayangException: No
> > > implementations that concatenate out@Alternative[2x
> > > ~Alternative[[MapHackIt[1->1, id=3c947bc5]]], 1e683a3e] with
> > > [in@Alternative[2x
> > > ~Alternative[[LocalCallbackSink[1->0, id=69fb6037]]], 6babf3bf]].
> > >
> > > Maybe I am missing something for Spark?
> > >
> > > Best Regards,
> > > Matthew
> > >
> >
>
