Hi,

I hope this is the correct mailing list. I've been adding v2 support to the
MongoDB Spark connector using Spark 2.3.1. I've noticed that one of my tests
passes when using the original DefaultSource but errors with my v2
implementation.

The code I'm running is:
val df = spark.loadDS[Character]()
df.createOrReplaceTempView("people")
spark.sql("INSERT INTO table people SELECT 'Mort', 1000")

The error I see is:
unresolved operator 'InsertIntoTable DataSourceV2Relation [name#0, age#1], MongoDataSourceReader ...
'InsertIntoTable DataSourceV2Relation [name#0, age#1], MongoDataSourceReader ...
+- Project [Mort AS Mort#7, 1000 AS 1000#8]
   +- OneRowRelation

My DefaultSource V2 implementation extends DataSourceV2 with ReadSupport
with ReadSupportWithSchema with WriteSupport; a simplified skeleton is below.
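
(Bodies elided; the real createReader methods return the
MongoDataSourceReader shown in the plan above, and createWriter returns
the connector's writer. This is just the shape of the class against the
Spark 2.3 sources.v2 API:)

import java.util.Optional

import org.apache.spark.sql.SaveMode
import org.apache.spark.sql.sources.v2.{DataSourceOptions, DataSourceV2, ReadSupport, ReadSupportWithSchema, WriteSupport}
import org.apache.spark.sql.sources.v2.reader.DataSourceReader
import org.apache.spark.sql.sources.v2.writer.DataSourceWriter
import org.apache.spark.sql.types.StructType

class DefaultSource extends DataSourceV2
    with ReadSupport with ReadSupportWithSchema with WriteSupport {

  // ReadSupport: reader that infers the schema from MongoDB
  override def createReader(options: DataSourceOptions): DataSourceReader = ???

  // ReadSupportWithSchema: reader honouring a user-supplied schema
  override def createReader(schema: StructType, options: DataSourceOptions): DataSourceReader = ???

  // WriteSupport: invoked on the DataFrameWriter save() path
  override def createWriter(jobId: String, schema: StructType, mode: SaveMode,
      options: DataSourceOptions): Optional[DataSourceWriter] = ???
}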

I'm wondering if there is something I'm not implementing, if there is a bug
in my implementation, or if it's an issue with Spark itself?

Any pointers would be great,

Ross
