Hi devs,

I'm not sure I'll hear back quickly since Spark Summit is just around the
corner, but I'd like to post this and wait.

While playing with Spark 2.4.0-SNAPSHOT, I found that the nc command exits
before the actual data is read, so the query also exits with an error.

The reason is that a temporary reader is launched to read the schema, then
closed, and then the reader is re-opened. A robust socket server would
handle this without any issue, but nc normally can't handle multiple
connections and simply exits when the temporary reader is closed.
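
For reference, here's a minimal sketch of the kind of query that hits this.
It's essentially the socket source example from the Structured Streaming
programming guide (host, port, and the nc invocation are arbitrary), run
against a plain `nc -l 9999` listener in another terminal:

    // Terminal 1: start a single-connection listener, e.g. `nc -l 9999`.
    // Terminal 2: run this (e.g. paste into spark-shell on 2.4.0-SNAPSHOT).
    import org.apache.spark.sql.SparkSession

    val spark = SparkSession.builder().appName("socket-nc-repro").getOrCreate()

    // Standard socket source from the programming guide.
    val lines = spark.readStream
      .format("socket")
      .option("host", "localhost")
      .option("port", 9999)
      .load()

    // With the behavior described above, a temporary reader connects to read
    // the schema and is then closed; nc exits on that first close, so the
    // real reader's connection fails and the query errors out instead of
    // reading data typed into nc.
    val query = lines.writeStream
      .format("console")
      .start()

    query.awaitTermination()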

I would like to file an issue and contribute a fix if we think this is a
bug (otherwise we would need to replace the nc utility with another one,
maybe our own implementation?), but I'm not sure we are happy to apply a
workaround for a specific source.

I would like to hear opinions before giving it a shot.

Thanks,
Jungtaek Lim (HeartSaVioR)
