I found the solution to this problem: it was a dependency issue, and I had to
exclude "xml-apis" to fix it. The s3-presto jar also provides better error
messages, which was helpful.
Thanks,
Vishwas
On Thu, Jul 18, 2019 at 8:14 PM Vishwas Siravara
wrote:
> I am using ecs S3 instance to
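For anyone hitting the same conflict, the exclusion could look like the sketch below in a Maven pom. The groupId/artifactId of the dependency that pulls in xml-apis are hypothetical here; run `mvn dependency:tree` to find the actual offender in your build.

```xml
<!-- Hypothetical coordinates: replace with whichever dependency
     `mvn dependency:tree` shows as pulling in xml-apis. -->
<dependency>
  <groupId>com.example</groupId>
  <artifactId>some-s3-connector</artifactId>
  <version>1.0.0</version>
  <exclusions>
    <exclusion>
      <groupId>xml-apis</groupId>
      <artifactId>xml-apis</artifactId>
    </exclusion>
  </exclusions>
</dependency>
```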
Hi Ravi,
I tried both the new and legacy modes: it works locally, but on the cluster I
am getting this exception. I am passing Jackson's ObjectNode class, which
should be serializable. What do you think?
On Sat, 20 Jul 2019, 12:11 Ravi Bhushan Ratnakar <ravibhushanratna...@gmail.com> wrote:
> Hi Vinay,
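Whether a given object actually survives plain Java serialization can be probed locally with a small stdlib check like the one below. This is only an illustration (the class and method names are made up for the example); Flink may also fall back to Kryo for types it cannot handle natively, so a cluster-side failure can have other causes.

```java
import java.io.ByteArrayOutputStream;
import java.io.ObjectOutputStream;

public class SerializableCheck {

    // Attempts to write the object through Java serialization.
    // Returns true if it round-trips, false if a
    // NotSerializableException (or similar) is thrown.
    public static boolean isJavaSerializable(Object o) {
        try (ObjectOutputStream out =
                 new ObjectOutputStream(new ByteArrayOutputStream())) {
            out.writeObject(o);
            return true;
        } catch (Exception e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println(isJavaSerializable("a string"));   // String is Serializable
        System.out.println(isJavaSerializable(new Object())); // plain Object is not
    }
}
```

Running this against an instance of the exact type you pass between operators (rather than the class in the abstract) is the quickest way to confirm or rule out serializability as the culprit.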
Hi Dante,
Nice find! I had just missed this powerful project :-)
Best,
tison.
Dante Van den Broeke wrote on Sat, Jul 20, 2019 at 5:28 PM:
Hi Tison, Jeff,
Thanks a lot for the help! I'll definitely look into the Python API and py4j
support next. I was also thinking about trying to build the pipeline with
Beam and Flink instead of Kafka and Flink, since I see that Python is a
first-class citizen in the Beam framework!
Regards,
Hi,
I am trying to run a pipeline on Flink 1.8.1 and am getting the following
exception:
java.lang.StackOverflowError
        at java.lang.Exception.<init>(Exception.java:66)
        at java.lang.ReflectiveOperationException.<init>(ReflectiveOperationException.java:56)
        at
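The trace is cut off above, but a StackOverflowError with repeating constructor frames usually means some recursion never bottoms out (in Flink jobs this is often deeply nested or self-referential types during serialization, though the truncated trace does not confirm that here). As a minimal stdlib illustration of the mechanism, any unbounded recursion exhausts the thread stack the same way:

```java
public class OverflowDemo {

    // Unbounded recursion: every call adds a stack frame,
    // so the JVM eventually throws StackOverflowError.
    public static int depth(int n) {
        return depth(n + 1);
    }

    public static void main(String[] args) {
        try {
            depth(0);
        } catch (StackOverflowError e) {
            System.out.println("caught StackOverflowError");
        }
    }
}
```

If the cause is genuinely a deep but finite recursion rather than a cycle, raising the JVM thread stack size (the `-Xss` option) can help; for a true cycle, the recursive structure itself has to be broken.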