Data source v2 catalog support (table/view) is still in progress. There are
several threads on the dev list discussing it; please join the discussion
if you are interested. Thanks for trying it out!
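Until the catalog work lands, one way to sidestep the unresolved `InsertIntoTable` is to append rows through the `DataFrameWriter` API, which goes straight to the connector's `WriteSupport` rather than through SQL `INSERT INTO` resolution. A minimal sketch, assuming a Spark session is available and that the v2 source is addressable by a format name (the `"mongodb-v2"` short name and the row values below are illustrative, not the connector's actual API):

```scala
import org.apache.spark.sql.{SaveMode, SparkSession}

val spark = SparkSession.builder().appName("insert-workaround").getOrCreate()
import spark.implicits._

// Build the rows that the failing `INSERT INTO table people SELECT ...` produced
val newRows = Seq(("Mort", 1000)).toDF("name", "age")

// Append via the DataFrameWriter path instead of InsertIntoTable
newRows.write
  .format("mongodb-v2") // hypothetical short name for the v2 DefaultSource
  .mode(SaveMode.Append)
  .save()
```

This only works for the write side of the test, of course; the SQL `INSERT INTO` path itself will need the v2 catalog support discussed above.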

On Thu, Sep 6, 2018 at 7:23 PM Ross Lawley <ross.law...@gmail.com> wrote:

> Hi,
>
> I hope this is the correct mailing list. I've been adding v2 support to the
> MongoDB Spark connector using Spark 2.3.1. I've noticed that one of my tests
> passes when using the original DefaultSource but errors with my v2
> implementation:
>
> The code I'm running is:
> val df = spark.loadDS[Character]()
> df.createOrReplaceTempView("people")
> spark.sql("INSERT INTO table people SELECT 'Mort', 1000")
>
> The error I see is:
> unresolved operator 'InsertIntoTable DataSourceV2Relation [name#0, age#1],
> MongoDataSourceReader ...
> 'InsertIntoTable DataSourceV2Relation [name#0, age#1],
> MongoDataSourceReader ....
> +- Project [Mort AS Mort#7, 1000 AS 1000#8]
>    +- OneRowRelation
>
> My DefaultSource V2 implementation extends DataSourceV2 with ReadSupport
> with ReadSupportWithSchema with WriteSupport
>
> I'm wondering if there is something I'm not implementing, if there is a
> bug in my implementation, or if it's an issue with Spark?
>
> Any pointers would be great,
>
> Ross
>
