Is the connection pool configured for MongoDB full?
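If it is, the ceiling can be adjusted where the connector builds its
client. A minimal sketch, assuming the connector passes standard
connection-string options such as maxPoolSize through to the underlying
MongoDB Java driver (host, database, and collection names below are
placeholders):

    import org.apache.spark.sql.SparkSession

    // maxPoolSize caps the number of connections per MongoClient; the
    // Java driver's default is 100. All names here are placeholders.
    val spark = SparkSession.builder()
      .appName("mongo-read")
      .config("spark.mongodb.input.uri",
        "mongodb://db-host:27017/mydb.mycollection?maxPoolSize=20")
      .getOrCreate()

    val df = spark.read
      .format("com.mongodb.spark.sql.DefaultSource")
      .load()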

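On why a single read opens so many cursors: the 2.x connector splits the
collection into partitions, and each partition is read through its own
cursor, so the cursor count tracks the number of partitions rather than
the one logical read. Fewer, larger partitions should therefore mean
fewer simultaneous cursors. A sketch assuming the documented 2.x
partitioner options (the 512 MB value is illustrative, not a
recommendation):

    import org.apache.spark.sql.SparkSession

    // A larger partitionSizeMB yields fewer partitions, and so fewer
    // cursors open against the database at once.
    val spark = SparkSession.builder()
      .appName("mongo-read")
      .config("spark.mongodb.input.uri",
        "mongodb://db-host:27017/mydb.mycollection")
      .config("spark.mongodb.input.partitioner", "MongoSamplePartitioner")
      .config("spark.mongodb.input.partitionerOptions.partitionSizeMB", "512")
      .getOrCreate()
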
Daniel Stojanov <m...@danielstojanov.com> wrote on Mon, Oct 26, 2020 at 10:28 AM:

> Hi,
>
>
> I receive an error message from the MongoDB server when too many
> Spark applications (about 3 or 4) try to access the database at the
> same time: "Cannot open a new cursor since too many cursors are
> already opened." I am not sure how to remedy this, and I am not sure
> how the plugin behaves while it is pulling data.
>
> It appears that a single running application opens many connections
> to the database. The total number of cursors open on the database is
> far greater than the number of read operations occurring in Spark.
>
>
> Does the plugin keep a connection/cursor open to the database even
> after it has pulled the data into a DataFrame?
>
> Why are there so many open cursors for a single read operation?
>
> Does catching the exception, sleeping for a while, then trying again
> make sense? If cursors are kept open throughout the life of the
> application, this would not make sense.
>
>
> Plugin version: org.mongodb.spark:mongo-spark-connector_2.12:2.4.1

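On the retry question above: catching the error, sleeping, and retrying
only makes sense if the open cursors belong to the other applications
and are released as those jobs finish; if your own job holds its cursors
for its whole lifetime, retrying frees nothing. Note also that Spark
reads are lazy, so the error may only surface when an action runs, not
at load(). A minimal sketch of a bounded retry with exponential backoff;
matching on the message text is an assumption, since executor-side Mongo
exceptions typically reach the driver wrapped in a SparkException:

    import org.apache.spark.sql.{DataFrame, SparkSession}

    // Bounded retry with exponential backoff. The attempt count and
    // delay are illustrative values, not recommendations.
    def readWithRetry(spark: SparkSession, attempts: Int = 5,
                      delayMs: Long = 10000L): DataFrame = {
      try {
        val df = spark.read
          .format("com.mongodb.spark.sql.DefaultSource")
          .load()
        df.count() // force evaluation so a cursor error surfaces here
        df
      } catch {
        case e: Exception
            if attempts > 1 && e.getMessage != null &&
              e.getMessage.contains("too many cursors") =>
          Thread.sleep(delayMs)
          readWithRetry(spark, attempts - 1, delayMs * 2)
      }
    }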