I'm trying to insert data into a Cassandra table with Spark SQL as follows:

    String query = "CREATE TEMPORARY TABLE my_table USING
org.apache.spark.sql.cassandra OPTIONS (table \"my_table\",keyspace
\"my_keyspace\", pushdown \"true\")";
                    spark.sparkSession.sql(query);
                    spark.sparkSession.sql("INSERT INTO
my_keyspace.my_table (column0, column1) VALUES ('value0', 'value1');

However, it fails with the following exception:

    Exception in thread "main" org.apache.spark.sql.catalyst.parser.ParseException:
    mismatched input 'column0' expecting {'(', 'SELECT', 'FROM', 'VALUES', 'TABLE', 'INSERT', 'MAP', 'REDUCE'}(line 1, pos 33)

I tried the same INSERT without the column names and it worked. My goal is to insert data for only some of the columns, not all of them.
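
For reference, this is roughly the form that did work for me, i.e. the same INSERT with the column list dropped (the values shown are just placeholders):

    // Works for me: no explicit column list, so values are supplied for
    // the table's columns in schema order (placeholder values shown).
    spark.sparkSession.sql(
            "INSERT INTO my_keyspace.my_table VALUES ('value0', 'value1')");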
