Batch insert using Python API executemany()
I'm investigating the executemany() API call to insert batches of records consumed from Kafka. Does this call operate on the batch as a whole, or does it invoke execute() once per insert? The source code suggests the latter, but I'm not 100% sure. If executemany() does operate on the batch as a whole, that would be the preferred path for this workload.
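For reference, executemany() is part of the DB-API 2.0 (PEP 249) cursor interface that phoenixdb implements. The sketch below uses the stdlib sqlite3 driver purely to illustrate the interface shape; whether a given driver sends the whole batch in one round trip or loops execute() per row is driver-specific, so the phoenixdb source is the authoritative answer for Phoenix.

```python
import sqlite3

# DB-API 2.0 (PEP 249) executemany() pattern. sqlite3 stands in here only to
# show the interface; table and column names are placeholders. Note that
# Phoenix itself uses UPSERT rather than INSERT syntax.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, payload TEXT)")

rows = [(1, "a"), (2, "b"), (3, "c")]  # e.g. records consumed from Kafka
cur.executemany("INSERT INTO events (id, payload) VALUES (?, ?)", rows)
conn.commit()

cur.execute("SELECT COUNT(*) FROM events")
print(cur.fetchone()[0])  # → 3
```

Even when a driver loops internally, batching with executemany() plus a single commit() can still beat per-row commits, since the commit (and for Phoenix, the server round trip it triggers) is the expensive part.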
Does the Spark Plugin support Phoenix as a Spark Structured Streaming sink?
Does the Spark Plugin (https://phoenix.apache.org/phoenix_spark.html) support Phoenix as a Spark Structured Streaming sink? If so, can anyone share an example of how to do this?