Hi,

I'm trying to use a Map[String, String] returned by a Scala UDF
(scala.collection.immutable.Map) as a valid data type in the Table API,
namely via the Java type java.util.Map, as recommended here:
<https://stackoverflow.com/questions/45471503/flink-table-api-sql-and-map-types-scala>

However, I get the error below.

Any idea of the right way to proceed?
If so, is there a way to generalize the conversion to a (nested) Scala
object of type Map[String, Any]?


*Code*
*Scala UDF*

import org.apache.flink.table.functions.ScalarFunction

class dummyMap() extends ScalarFunction {
  def eval() = {
    val whatevermap = Map("key1" -> "val1", "key2" -> "val2")
    whatevermap.asInstanceOf[java.util.Map[java.lang.String, java.lang.String]]
  }
}
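For what it's worth, outside of Flink the Scala-to-Java conversion itself is usually done with JavaConverters' asJava rather than a cast — a minimal sketch (MapConversion is just an illustrative name):

```scala
import scala.collection.JavaConverters._

object MapConversion {
  // asJava wraps the Scala Map in a java.util.Map view.
  // A plain asInstanceOf cast would fail at runtime with a
  // ClassCastException, since the two Map types are unrelated classes.
  def toJava(m: Map[String, String]): java.util.Map[String, String] = m.asJava
}
```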

*Sink*

my_sink_ddl = f"""
    create table mySink (
        output_of_dummyMap_udf MAP<STRING,STRING>
    ) with (
        ...
    )
"""

*Error*

Py4JJavaError: An error occurred while calling o430.execute.
: org.apache.flink.table.api.ValidationException: Field types of query
result and registered TableSink
`default_catalog`.`default_database`.`mySink` do not match.
Query result schema: [output_of_my_scala_udf: GenericType<java.util.Map>]
TableSink schema:    [output_of_my_scala_udf: Map<String, String>]

Thanks for your support!

Best regards,

-- 
Pierre