Hi, I have a very simple setup where one Python StateFun application reads from one Kafka topic and writes to another. Those topics are produced and consumed by a separate Python script, as in the Python Flink Walkthrough <https://nightlies.apache.org/flink/flink-statefun-docs-release-2.0/getting-started/python_walkthrough.html#what-are-you-building>. Is there a way to read and write plain strings (JSON) on those topics instead of using Protobuf?
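For context, this is roughly the kind of payload handling I'd like inside the function: plain JSON on raw bytes, no Protobuf types anywhere. The helper names (`decode_payload`, `encode_payload`) are just my own sketch, not any StateFun API:

```python
import json

def decode_payload(message_bytes: bytes) -> dict:
    """Parse a raw Kafka record value as JSON."""
    return json.loads(message_bytes.decode("utf-8"))

def encode_payload(payload: dict) -> bytes:
    """Serialize a response back to bytes for the egress topic."""
    return json.dumps(payload).encode("utf-8")

# What I'd like a function invocation to look like:
incoming = b'{"name": "flink", "count": 1}'
event = decode_payload(incoming)
event["count"] += 1
outgoing = encode_payload(event)
print(outgoing)  # b'{"name": "flink", "count": 2}'
```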
More concretely: I'm using StateFun because some libraries I need exist in Python but not in Java or Scala. So I'm combining embedded Flink applications with StateFun applications, running a master and a worker in Docker, with the embedded applications on the JobManager and TaskManager. All the embedded applications communicate via JSON; now that I want to put a StateFun application in between, is there a way for it to communicate using JSON rather than Protobuf? Thanks in advance.