Yes, I can do that; it's what we are doing now, but I think the best
approach would be to delegate the create table action to Spark.

On Tue, May 3, 2016 at 8:17 PM, Mich Talebzadeh <mich.talebza...@gmail.com>
wrote:

> Can you create the MSSQL (target) table first with the correct column
> settings and insert data from Spark into it with JDBC, as opposed to
> letting JDBC create the target table itself?
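>
> A minimal sketch of that approach, assuming an existing DataFrame df and
> placeholder connection details (the URL, table name, and credentials below
> are made up); with SaveMode.Append, Spark inserts into the pre-created
> table instead of issuing its own CREATE TABLE:
>
>   import java.util.Properties
>   import org.apache.spark.sql.{DataFrame, SaveMode}
>
>   def appendToExistingTable(df: DataFrame): Unit = {
>     val props = new Properties()
>     props.setProperty("user", "spark_user")   // placeholder credentials
>     props.setProperty("password", "secret")
>     // dbo.target_table already exists in MSSQL with the desired column types
>     df.write
>       .mode(SaveMode.Append)
>       .jdbc("jdbc:sqlserver://dbhost:1433;databaseName=mydb",
>             "dbo.target_table", props)
>   }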
>
>
>
> Dr Mich Talebzadeh
>
>
>
> LinkedIn:
> https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>
>
>
> http://talebzadehmich.wordpress.com
>
>
>
> On 3 May 2016 at 22:19, Andrés Ivaldi <iaiva...@gmail.com> wrote:
>
>> OK, Spark's MSSQL data type mapping is not right for me, i.e. String
>> becomes TEXT instead of VARCHAR(MAX), so how can I override the default
>> SQL mapping?
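>>
>> A minimal sketch of one way to do this, assuming Spark 1.4+ where the
>> org.apache.spark.sql.jdbc.JdbcDialects registry is available (the dialect
>> object name here is made up for illustration):
>>
>>   import org.apache.spark.sql.jdbc.{JdbcDialect, JdbcDialects, JdbcType}
>>   import org.apache.spark.sql.types._
>>
>>   object SqlServerFixDialect extends JdbcDialect {
>>     // Claim JDBC URLs that point at SQL Server.
>>     override def canHandle(url: String): Boolean =
>>       url.startsWith("jdbc:sqlserver")
>>
>>     // Override only the problematic mappings; None defers to the defaults.
>>     override def getJDBCType(dt: DataType): Option[JdbcType] = dt match {
>>       case StringType  => Some(JdbcType("VARCHAR(MAX)", java.sql.Types.VARCHAR))
>>       case BooleanType => Some(JdbcType("BIT", java.sql.Types.BIT))
>>       case _           => None
>>     }
>>   }
>>
>>   // Register once, before any df.write.jdbc(...) call:
>>   JdbcDialects.registerDialect(SqlServerFixDialect)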
>>
>> regards.
>>
>> On Sun, May 1, 2016 at 5:23 AM, Mich Talebzadeh <
>> mich.talebza...@gmail.com> wrote:
>>
>>> Well, if MSSQL cannot create that column then it is more of a
>>> compatibility issue between Spark and the RDBMS.
>>>
>>> What value does that column have in MSSQL? Can you create the table in
>>> the MSSQL database first, or map it in Spark to a valid column type
>>> before opening the JDBC connection?
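>>>
>>> For the second option, a minimal sketch, assuming df is the DataFrame
>>> being written and "flag" is a hypothetical Boolean column name:
>>>
>>>   import org.apache.spark.sql.functions.col
>>>   import org.apache.spark.sql.types.IntegerType
>>>
>>>   // Cast the Boolean column to a type whose default JDBC mapping the
>>>   // target database accepts, before calling df.write.jdbc(...).
>>>   val writable = df.withColumn("flag", col("flag").cast(IntegerType))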
>>>
>>> HTH
>>>
>>> Dr Mich Talebzadeh
>>>
>>>
>>>
>>> LinkedIn:
>>> https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw
>>>
>>>
>>>
>>> http://talebzadehmich.wordpress.com
>>>
>>>
>>>
>>> On 29 April 2016 at 16:16, Andrés Ivaldi <iaiva...@gmail.com> wrote:
>>>
>>>> Hello, Spark is executing a CREATE TABLE statement (using JDBC) against
>>>> MSSQLServer with a column type mapping like ColName BIT(1) for boolean
>>>> types. This CREATE TABLE statement cannot be executed on MSSQLServer,
>>>> since BIT takes no length argument there.

>>>> In the JdbcDialect class the mapping for the Boolean type is BIT(1), so
>>>> the question is: is this a problem in Spark, or in the JDBC driver that
>>>> is not mapping it correctly?

>>>> Either way, is it possible to override that mapping in Spark?
>>>>
>>>> Regards
>>>>
>>>> --
>>>> Ing. Ivaldi Andres
>>>>
>>>
>>>
>>
>>
>> --
>> Ing. Ivaldi Andres
>>
>
>


-- 
Ing. Ivaldi Andres
