http://www.sparkexpert.com/2015/04/17/save-apache-spark-dataframe-to-database/
Hi, I tried to load DataFrames (Parquet files) into MySQL using the approach in the link above, and it worked. But when I try to load the same data into a Vertica database, I get this error:

Exception in thread "main" java.sql.SQLSyntaxErrorException: [Vertica][VJDBC](5108) ERROR: Type "TEXT" does not exist
    at com.vertica.util.ServerErrorData.buildException(Unknown Source)
    at com.vertica.io.ProtocolStream.readExpectedMessage(Unknown Source)
    at com.vertica.dataengine.VDataEngine.prepareImpl(Unknown Source)
    at com.vertica.dataengine.VDataEngine.prepare(Unknown Source)
    at com.vertica.dataengine.VDataEngine.prepare(Unknown Source)
    at com.vertica.jdbc.common.SPreparedStatement.(Unknown Source)
    at com.vertica.jdbc.jdbc4.S4PreparedStatement.(Unknown Source)
    at com.vertica.jdbc.VerticaJdbc4PreparedStatementImpl.(Unknown Source)
    at com.vertica.jdbc.VJDBCObjectFactory.createPreparedStatement(Unknown Source)
    at com.vertica.jdbc.common.SConnection.prepareStatement(Unknown Source)
    at org.apache.spark.sql.DataFrameWriter.jdbc(DataFrameWriter.scala:275)
    at org.apache.spark.sql.DataFrame.createJDBCTable(DataFrame.scala:1611)
    at com.sparkread.SparkVertica.JdbctoVertica.main(JdbctoVertica.java:51)
Caused by: com.vertica.support.exceptions.SyntaxErrorException: [Vertica][VJDBC](5108) ERROR: Type "TEXT" does not exist
    ... 13 more

The error occurs because Vertica does not support the TEXT column type that Spark generates for the DataFrame's String columns (from the Parquet file). I don't want to cast the columns, since that would be a performance issue: we are looking to load around 280 million rows. Could you please suggest the best way to load the data into Vertica?

--
View this message in context: http://apache-spark-user-list.1001560.n3.nabble.com/Loading-dataframes-to-vertica-database-tp25229.html
Sent from the Apache Spark User List mailing list archive at Nabble.com.
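One approach (a sketch, not tested against Vertica) is to register a custom Spark JdbcDialect so that the JDBC writer emits VARCHAR instead of TEXT when it generates the CREATE TABLE statement. The type mapping only affects DDL generation, not the row data, so it adds no per-row casting cost. The VARCHAR(65000) length and the jdbc:vertica URL prefix below are assumptions to adjust for your schema and driver:

```scala
import java.sql.Types

import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects, JdbcType}
import org.apache.spark.sql.types._

// Hypothetical dialect: map Spark's StringType to VARCHAR, which
// Vertica accepts, instead of the default TEXT.
object VerticaDialect extends JdbcDialect {
  // Apply this dialect only to Vertica connection URLs.
  override def canHandle(url: String): Boolean =
    url.startsWith("jdbc:vertica")

  // Override the DDL type for String columns; return None for other
  // types so Spark falls back to its default mappings.
  override def getJDBCType(dt: DataType): Option[JdbcType] = dt match {
    case StringType => Some(JdbcType("VARCHAR(65000)", Types.VARCHAR))
    case _          => None
  }
}

// Register once, before calling the DataFrame's JDBC write method.
JdbcDialects.registerDialect(VerticaDialect)
```

Alternatively, if you create the target table in Vertica yourself with the column types you want, Spark never issues a CREATE TABLE at all and you can append rows into the existing table, sidestepping the type-mapping question entirely.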
--------------------------------------------------------------------- To unsubscribe, e-mail: user-unsubscr...@spark.apache.org For additional commands, e-mail: user-h...@spark.apache.org