Re: Saving to JDBC

2015-12-18 Thread Akhil Das
You will have to order the columns properly before writing, or you can
change the column order in the actual table to match your job's output.
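For example (a sketch, not tested against your schema; it assumes the table's
physical column order is (name, value) and the sqlCtx/dbconn_string names from
your post):

```python
# Desired physical column order of the JDBC table (an assumption here).
table_columns = ["name", "value"]

# Row(**kwargs) sorts its fields alphabetically, so select() the columns
# explicitly in table order before the JDBC write, e.g.:
#
#   df = sqlCtx.createDataFrame(rdd).select(*table_columns)
#   df.write.jdbc(dbconn_string, tablename, mode="append")

# The same reordering shown with plain Python data:
row_fields = dict(value=1, name="a")           # keyword args, order not kept
alphabetical = sorted(row_fields)              # what Row() effectively gives you
reordered = [row_fields[c] for c in table_columns]

print(alphabetical)  # ['name', 'value']
print(reordered)     # ['a', 1]
```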

Thanks
Best Regards

On Tue, Dec 15, 2015 at 1:47 AM, Bob Corsaro  wrote:

> Is there any way to map pyspark.sql.Row columns to JDBC table columns, or
> do I have to just put them in the right order before saving?
>
> I'm using code like this:
>
> ```
> rdd = rdd.map(lambda i: Row(name=i.name, value=i.value))
> sqlCtx.createDataFrame(rdd).write.jdbc(dbconn_string, tablename,
> mode='append')
> ```
>
> Since the Row class orders its fields alphabetically, they are inserted into
> the SQL table in alphabetical order instead of matching Row columns to table
> columns.
>


Saving to JDBC

2015-12-14 Thread Bob Corsaro
Is there any way to map pyspark.sql.Row columns to JDBC table columns, or do
I have to just put them in the right order before saving?

I'm using code like this:

```
rdd = rdd.map(lambda i: Row(name=i.name, value=i.value))
sqlCtx.createDataFrame(rdd).write.jdbc(dbconn_string, tablename,
mode='append')
```

Since the Row class orders its fields alphabetically, they are inserted into
the SQL table in alphabetical order instead of matching Row columns to table
columns.