nsivabalan commented on issue #4978:
URL: https://github.com/apache/hudi/issues/4978#issuecomment-1302901252
@danny0405 @xiarixiaoyao : do we know if this has been fixed at any point?
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
nsivabalan commented on issue #4978:
URL: https://github.com/apache/hudi/issues/4978#issuecomment-1216664212
@CrazyBeeline : gentle ping.
nsivabalan commented on issue #4978:
URL: https://github.com/apache/hudi/issues/4978#issuecomment-1210062619
@CrazyBeeline : can you put up a patch with the fix you have? Happy to review
and get it landed. BTW, are you using HBase or some other setup? Wondering
how you ended up with a fi
nsivabalan commented on issue #4978:
URL: https://github.com/apache/hudi/issues/4978#issuecomment-1125084414
@xiarixiaoyao : do we have a tracking ticket for this? Alexey recently did a
revamp of all the query engine code paths with 0.11. Do we still see this issue
after 0.11? Do you have any
nsivabalan commented on issue #4978:
URL: https://github.com/apache/hudi/issues/4978#issuecomment-1125082789
But it might be good to get it fixed regardless. Just trying to gauge
the use-case.
nsivabalan commented on issue #4978:
URL: https://github.com/apache/hudi/issues/4978#issuecomment-1125081869
Is this written using Flink? With Spark, we won't create log files
directly: base (data) files are created first, and only then are log files
created for the file groups.
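To make the anomaly concrete, here is a minimal sketch that flags file groups
whose listing contains log files but no base file (the situation the comment
above says should not happen with Spark writers). The file-name regexes follow
common Hudi MOR naming conventions as an assumption, and the helper name is
illustrative, not a Hudi API:

```python
import re

# Assumed Hudi MOR naming conventions (illustrative, verify against your version):
#   base file: <fileId>_<writeToken>_<instantTime>.parquet
#   log file:  .<fileId>_<baseInstant>.log.<version>_<writeToken>
BASE_RE = re.compile(r"^(?P<file_id>[0-9a-f-]+)_\d+-\d+-\d+_\d+\.parquet$")
LOG_RE = re.compile(r"^\.(?P<file_id>[0-9a-f-]+)_\d+\.log\.\d+.*$")

def log_only_file_groups(listing):
    """Return file-group ids that have log files but no base file."""
    bases, logs = set(), set()
    for name in listing:
        if (m := BASE_RE.match(name)):
            bases.add(m.group("file_id"))
        elif (m := LOG_RE.match(name)):
            logs.add(m.group("file_id"))
    return logs - bases
```

For example, a partition listing with a base file for one file group and only a
log file for another would report the second group as log-only.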