Hi Dirk,

Happy to report that there are more projects using the DW.  I have the same 
issues here.  I'm using Azure SQL DW at the moment and building a serverless 
function app that reads data and sends it back to the SQL DW.
Did you eventually find a solution other than looping through the dataframe?

Cheers,
Nicole
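One possible workaround (a sketch only, not something confirmed in this thread): pass an explicit `dtype` mapping to `DataFrame.to_sql()` so pandas does not default string columns to VARCHAR(max), which is what the columnstore-index error further down complains about. The table and column names below are taken from the dbo.DSI definition in the quoted messages; the example runs against an in-memory SQLite engine since no Azure SQL DW instance is at hand, so treat the Azure behavior as an assumption.

```python
# Sketch only (assumptions: table "DSI", column names/types from the
# thread's dbo.DSI definition; demonstrated on in-memory SQLite for lack
# of an Azure SQL DW instance).  The key idea: an explicit dtype mapping
# stops pandas from emitting VARCHAR(max) for string columns.
import pandas as pd
from sqlalchemy import create_engine, types

engine = create_engine("sqlite://")  # would be mssql+pyodbc://... for Azure

df = pd.DataFrame({
    "b": ["x", "y"],
    "c": ["some text", "more text"],
    "f": [0.123456, 0.654321],
})

# Match the pre-existing column definitions instead of letting pandas guess.
dtypes = {
    "b": types.VARCHAR(10),
    "c": types.VARCHAR(100),
    "f": types.DECIMAL(8, 6),
}

df.to_sql("DSI", engine, if_exists="append", index=False, dtype=dtypes)
df.to_sql("DSI", engine, if_exists="append", index=False, dtype=dtypes)

print(len(pd.read_sql("SELECT * FROM DSI", engine)))  # 4 rows after two appends
```

The unexpected CREATE TABLE in the quoted log also suggests pandas' table-existence check missed the schema-qualified name 'dbo.MSODS_DSI'; passing the bare table name plus `schema='dbo'` to `to_sql()` may help there.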

On Wednesday, September 13, 2017 at 5:39:47 PM UTC-5, dirk.biesinger wrote:
>
> yeah, I ran into some 'nice' 'features' in this project.
> the azure datawarehouse does not behave or work like a typical old-school 
> sql server.
> There might be some potential there, just not sure how many projects would 
> be using the datawarehouse....
>
> On Wednesday, September 13, 2017 at 3:33:20 PM UTC-7, Mike Bayer wrote:
>>
>> On Wed, Sep 13, 2017 at 6:13 PM, dirk.biesinger 
>> <dirk.bi...@gmail.com> wrote: 
>> > "oh that is very interesting." he says and then it's getting eerily quiet. 
>> > I guess Mike and Dilly are somewhere in the depth of code and docs... 
>>
>>
>> unfortunately not.  Azure seems to offer free trials if you are willing 
>> to give them a credit card number, so I can perhaps eventually get 
>> around to working with that, but at the moment it would be preferable 
>> if someone wants to work on an Azure variant of the SQL Server 
>> dialect.    It shouldn't be hard, but the various glitches need to be 
>> understood for anything to be committed. 
>>
>> > 
>> > On Wednesday, September 13, 2017 at 2:05:22 PM UTC-7, Mike Bayer wrote: 
>> >> 
>> >> On Wed, Sep 13, 2017 at 4:29 PM, dirk.biesinger 
>> >> <dirk.bi...@gmail.com> wrote: 
>> >> > 
>> >> > as for the rollback call, 
>> >> > adding it after the print command does not change anything. 
>> >> > When I insert it between cursor.execute() and rows = cursor.fetchall(), 
>> >> > I get an error as expected. 
>> >> 
>> >> oh that is very interesting. 
>> >> 
>> >> 
>> >> > 
>> >> > 
>> >> >> On Wednesday, September 13, 2017 at 1:16:59 PM UTC-7, Mike Bayer wrote: 
>> >> >> 
>> >> >> On Wed, Sep 13, 2017 at 3:58 PM, dirk.biesinger 
>> >> >> <dirk.bi...@gmail.com> wrote: 
>> >> >> > using 
>> >> >> > 
>> >> >> > connection = pyodbc.connect(....) 
>> >> >> > connection.autocommit = 0 
>> >> >> > cursor = connection.cursor() 
>> >> >> > cursor.execute([proper sql statement that references a table]) 
>> >> >> > rows = cursor.fetchall() 
>> >> >> > print(rows) 
>> >> >> > cursor.close() 
>> >> >> > 
>> >> >> > gives me the output out of the table that I expect. 
>> >> >> > So if you were wondering if a raw pyodbc connection works, this is 
>> >> >> > confirmed. 
>> >> >> 
>> >> >> did you call: 
>> >> >> 
>> >> >> connection.rollback() 
>> >> >> 
>> >> >> the stack traces you have given me indicate this method cannot be 
>> >> >> called else Azure raises an error.  This must be illustrated as 
>> >> >> definitely the problem, and not a side effect of something else. 
>> >> >> 
>> >> >> This is why this would go a lot quicker if someone had a *blank* 
>> >> >> azure database on a cloud node somewhere for me to log into.  I 
>> >> >> don't need your customer data. 
>> >> >> 
>> >> >> 
>> >> >> 
>> >> >> 
>> >> >> 
>> >> >> 
>> >> >> > 
>> >> >> > On Wednesday, September 13, 2017 at 12:49:33 PM UTC-7, Mike Bayer wrote: 
>> >> >> >> 
>> >> >> >> On Wed, Sep 13, 2017 at 3:27 PM, dirk.biesinger 
>> >> >> >> <dirk.bi...@gmail.com> wrote: 
>> >> >> >> > I have the 'patched' pyodbc.py file active. 
>> >> >> >> > Executing your code snippet does NOT produce an error or any 
>> >> >> >> > output for that matter. 
>> >> >> >> > 
>> >> >> >> > 
>> >> >> >> > On Wednesday, September 13, 2017 at 12:22:30 PM UTC-7, Mike Bayer wrote: 
>> >> >> >> >> 
>> >> >> >> >> On Wed, Sep 13, 2017 at 3:14 PM, dirk.biesinger 
>> >> >> >> >> <dirk.bi...@gmail.com> wrote: 
>> >> >> >> >> > Got ya, 
>> >> >> >> >> > 
>> >> >> >> >> > so we could solve the issue on the sqlalchemy end with the 
>> >> >> >> >> > alteration of the pyodbc.py file. 
>> >> >> >> >> > I assume you'll include this in the next release? 
>> >> >> >> >> 
>> >> >> >> >> um. 
>> >> >> >> >> 
>> >> >> >> >> can you just confirm for me this makes the error? 
>> >> >> >> >> 
>> >> >> >> >> 
>> >> >> >> >> connection = pyodbc.connect(....) 
>> >> >> >> >> connection.autocommit = 0 
>> >> >> >> >> connection.rollback() 
>> >> >> >> 
>> >> >> >> 
>> >> >> >> try it like this: 
>> >> >> >> 
>> >> >> >> 
>> >> >> >> connection = pyodbc.connect(....) 
>> >> >> >> connection.autocommit = 0 
>> >> >> >> cursor = connection.cursor() 
>> >> >> >> cursor.execute("SELECT 1") 
>> >> >> >> cursor.close() 
>> >> >> >> 
>> >> >> >> connection.rollback() 
>> >> >> >> 
>> >> >> >> 
>> >> >> >> >> 
>> >> >> >> >> 
>> >> >> >> >> 
>> >> >> >> >> > The issue with creating a table when the option 
>> >> >> >> >> > "if_exists='append'" is set in the df.to_sql() call is a 
>> >> >> >> >> > pandas problem. 
>> >> >> >> >> > 
>> >> >> >> >> > Thank you for your help. 
>> >> >> >> >> > 
>> >> >> >> >> > Best, 
>> >> >> >> >> > DB 
>> >> >> >> >> > 
>> >> >> >> >> > On Wednesday, September 13, 2017 at 11:45:15 AM UTC-7, Mike Bayer wrote: 
>> >> >> >> >> >> 
>> >> >> >> >> >> On Wed, Sep 13, 2017 at 2:41 PM, dirk.biesinger 
>> >> >> >> >> >> <dirk.bi...@gmail.com> wrote: 
>> >> >> >> >> >> > I don't get why the table is getting created in the first 
>> >> >> >> >> >> > place. A table with this name exists, and the option 
>> >> >> >> >> >> > "if_exists='append'" should append the dataframe to the 
>> >> >> >> >> >> > existing table. There should not be a dropping of the 
>> >> >> >> >> >> > table (which I have not seen) nor creation of the table. 
>> >> >> >> >> >> > 
>> >> >> >> >> >> > And in case of creating the table, I think it should be 
>> >> >> >> >> >> > possible to define the length of the field, so 
>> >> >> >> >> >> > varchar([variable_to_be_submitted]). 
>> >> >> >> >> >> > In my case I expect this particular table to grow to 
>> >> >> >> >> >> > several hundred million rows, so assigned storage space 
>> >> >> >> >> >> > is a factor. 
>> >> >> >> >> >> > 
>> >> >> >> >> >> > the existing table was created like this: 
>> >> >> >> >> >> > 
>> >> >> >> >> >> > CREATE TABLE dbo.DSI 
>> >> >> >> >> >> > ( 
>> >> >> >> >> >> > a datetime, 
>> >> >> >> >> >> > b varchar(10) null, 
>> >> >> >> >> >> > c varchar(100) null, 
>> >> >> >> >> >> > d varchar(10) null, 
>> >> >> >> >> >> > e varchar(100) null, 
>> >> >> >> >> >> > f decimal (8,6) null, 
>> >> >> >> >> >> > g decimal (8,6) null 
>> >> >> >> >> >> > ) 
>> >> >> >> >> >> > 
>> >> >> >> >> >> > If I read and understand the error stack correctly, the 
>> >> >> >> >> >> > cause is the create table statement, which I would very 
>> >> >> >> >> >> > strongly expect to cause an error, as the table exists. 
>> >> >> >> >> >> > Should the create table statement not be omitted if the 
>> >> >> >> >> >> > option "if_exists='append'" is set? 
>> >> >> >> >> >> 
>> >> >> >> >> >> This is all on the Pandas side so you'd need to talk to them. 
>> >> >> >> >> >> 
>> >> >> >> >> >> 
>> >> >> >> >> >> > 
>> >> >> >> >> >> > On Wednesday, September 13, 2017 at 11:15:27 AM UTC-7, Mike Bayer wrote: 
>> >> >> >> >> >> >> 
>> >> >> >> >> >> >> OK so....don't use VARCHAR(max)?  What datatype would 
>> >> >> >> >> >> >> you like?  We have most of them and you can make new 
>> >> >> >> >> >> >> ones too. 
>> >> >> >> >> >> >> 
>> >> >> >> >> >> >> On Wed, Sep 13, 2017 at 1:07 PM, dirk.biesinger 
>> >> >> >> >> >> >> <dirk.bi...@gmail.com> wrote: 
>> >> >> >> >> >> >> > Mike, 
>> >> >> >> >> >> >> > 
>> >> >> >> >> >> >> > here's the error stack (I had to mask some details): 
>> >> >> >> >> >> >> > The columns (dataformats) in the create table statement 
>> >> >> >> >> >> >> > are wrong.  Also, this table does not have an index. 
>> >> >> >> >> >> >> > 
>> >> >> >> >> >> >> > 2017-09-13 15:07:50,200 INFO sqlalchemy.engine.base.Engine SELECT SERVERPROPERTY('ProductVersion') 
>> >> >> >> >> >> >> > 2017-09-13 15:07:50,202 INFO sqlalchemy.engine.base.Engine () 
>> >> >> >> >> >> >> > 2017-09-13 15:07:50,246 INFO sqlalchemy.engine.base.Engine SELECT schema_name() 
>> >> >> >> >> >> >> > 2017-09-13 15:07:50,247 INFO sqlalchemy.engine.base.Engine () 
>> >> >> >> >> >> >> > 2017-09-13 15:07:50,522 INFO sqlalchemy.engine.base.Engine SELECT CAST('test plain returns' AS VARCHAR(60)) AS anon_1 
>> >> >> >> >> >> >> > 2017-09-13 15:07:50,522 INFO sqlalchemy.engine.base.Engine () 
>> >> >> >> >> >> >> > 2017-09-13 15:07:50,562 INFO sqlalchemy.engine.base.Engine SELECT CAST('test unicode returns' AS NVARCHAR(60)) AS anon_1 
>> >> >> >> >> >> >> > 2017-09-13 15:07:50,563 INFO sqlalchemy.engine.base.Engine () 
>> >> >> >> >> >> >> > 2017-09-13 15:07:50,604 INFO sqlalchemy.engine.base.Engine SELECT 
>> >> >> >> >> >> >> > [INFORMATION_SCHEMA].[COLUMNS].[TABLE_SCHEMA], 
>> >> >> >> >> >> >> > [INFORMATION_SCHEMA].[COLUMNS].[TABLE_NAME], 
>> >> >> >> >> >> >> > [INFORMATION_SCHEMA].[COLUMNS].[COLUMN_NAME], 
>> >> >> >> >> >> >> > [INFORMATION_SCHEMA].[COLUMNS].[IS_NULLABLE], 
>> >> >> >> >> >> >> > [INFORMATION_SCHEMA].[COLUMNS].[DATA_TYPE], 
>> >> >> >> >> >> >> > [INFORMATION_SCHEMA].[COLUMNS].[ORDINAL_POSITION], 
>> >> >> >> >> >> >> > [INFORMATION_SCHEMA].[COLUMNS].[CHARACTER_MAXIMUM_LENGTH], 
>> >> >> >> >> >> >> > [INFORMATION_SCHEMA].[COLUMNS].[NUMERIC_PRECISION], 
>> >> >> >> >> >> >> > [INFORMATION_SCHEMA].[COLUMNS].[NUMERIC_SCALE], 
>> >> >> >> >> >> >> > [INFORMATION_SCHEMA].[COLUMNS].[COLUMN_DEFAULT], 
>> >> >> >> >> >> >> > [INFORMATION_SCHEMA].[COLUMNS].[COLLATION_NAME] 
>> >> >> >> >> >> >> > FROM [INFORMATION_SCHEMA].[COLUMNS] 
>> >> >> >> >> >> >> > WHERE [INFORMATION_SCHEMA].[COLUMNS].[TABLE_NAME] = CAST(? AS NVARCHAR(max)) 
>> >> >> >> >> >> >> > AND [INFORMATION_SCHEMA].[COLUMNS].[TABLE_SCHEMA] = CAST(? AS NVARCHAR(max)) 
>> >> >> >> >> >> >> > 2017-09-13 15:07:50,605 INFO sqlalchemy.engine.base.Engine ('dbo.MSODS_DSI', 'dbo') 
>> >> >> >> >> >> >> > 2017-09-13 15:07:52,151 INFO sqlalchemy.engine.base.Engine 
>> >> >> >> >> >> >> > CREATE TABLE [dbo.MSODS_DSI] ( 
>> >> >> >> >> >> >> > [a] DATETIME NULL, 
>> >> >> >> >> >> >> > [b] VARCHAR(max) NULL, 
>> >> >> >> >> >> >> > [c] VARCHAR(max) NULL, 
>> >> >> >> >> >> >> > [d] VARCHAR(max) NULL, 
>> >> >> >> >> >> >> > [e] VARCHAR(max) NULL, 
>> >> >> >> >> >> >> > [f] FLOAT(53) NULL, 
>> >> >> >> >> >> >> > [g] FLOAT(53) NULL 
>> >> >> >> >> >> >> > ) 
>> >> >> >> >> >> >> > 
>> >> >> >> >> >> >> > 2017-09-13 15:07:52,152 INFO sqlalchemy.engine.base.Engine () 
>> >> >> >> >> >> >> > 2017-09-13 15:07:54,374 INFO sqlalchemy.engine.base.Engine ROLLBACK 
>> >> >> >> >> >> >> > 
>> >> >> >> >> >> >> > 
>> >> >> >> >> >> >> > 
>> >> >> >> >> >> >> > 
>> >> >> >> >> >> >> > 
>> >> >> >> >> >> >> > 
>> >> >> >> >> >> >> > 
>> >> >> >> >> >> >> > --------------------------------------------------------------------------- 
>> >> >> >> >> >> >> > ProgrammingError                          Traceback (most recent call last) 
>> >> >> >> >> >> >> > 
>> >> >> >> >> >> >> > /home/saravji/anaconda3/lib/python3.6/site-packages/sqlalchemy/engine/base.py 
>> >> >> >> >> >> >> > in _execute_context(self, dialect, constructor, statement, parameters, *args) 
>> >> >> >> >> >> >> >    1181                         parameters, 
>> >> >> >> >> >> >> > -> 1182                         context) 
>> >> >> >> >> >> >> >    1183         except BaseException as e: 
>> >> >> >> >> >> >> > 
>> >> >> >> >> >> >> > /home/saravji/anaconda3/lib/python3.6/site-packages/sqlalchemy/engine/default.py 
>> >> >> >> >> >> >> > in do_execute(self, cursor, statement, parameters, context) 
>> >> >> >> >> >> >> >     469     def do_execute(self, cursor, statement, parameters, context=None): 
>> >> >> >> >> >> >> > --> 470         cursor.execute(statement, parameters) 
>> >> >> >> >> >> >> >     471 
>> >> >> >> >> >> >> > 
>> >> >> >> >> >> >> > ProgrammingError: ('42000', "[42000] [Microsoft][ODBC Driver 13 for SQL 
>> >> >> >> >> >> >> > Server][SQL Server]The statement failed. Column 'b' has a data type that 
>> >> >> >> >> >> >> > cannot participate in a columnstore index.\r\nOperation cancelled by user. 
>> >> >> >> >> >> >> > (35343) (SQLExecDirectW)") 
>> >> >> >> >> >> >> > 
>> >> >> >> >> >> >> > The above exception was the direct cause of the following exception: 
>> >> >> >> >> >> >> > 
>> >> >> >> >> >> >> > ProgrammingError                          Traceback (most recent call last) 
>> >> >> >> >> >> >> > <ipython-input-16-290e9c1020c9> in <module>() 
>> >> >> >> >> >> >> >      17 #cnxn = pyodbc.connect(connection_str) 
>> >> >> >> >> >> >> >      18 #engn.connect() 
>> >> >> >> >> >> >> > ---> 19 df.to_sql(tbl_server_out, engn, if_exists='append', index=False) 
>> >> >> >> >> >> >> > 
>> >> >> >> >> >> >> > /home/saravji/anaconda3/lib/python3.6/site-packages/pandas/core/generic.py 
>> >> >> >> >> >> >> > in to_sql(self, name, con, flavor, schema, if_exists, index, index_label, chunksize, dtype) 
>> >> >> >> >> >> >> >    1343         sql.to_sql(self, name, con, flavor=flavor, schema=schema, 
>> >> >> >> >> >> >> >    1344                    if_exists=if_exists, index=index, index_label=index_label, 
>> >> >> >> >> >> >> > -> 1345                    chunksize=chunksize, dtype=dtype) 
>> >> >> >> >> >> >> >    1346 
>> >> >> >> >> >> >> >    1347     def to_pickle(self, path, compression='infer'): 
>> >> >> >> >> >> >> > 
>> >> >> >> >> >> >> > /home/saravji/anaconda3/lib/python3.6/site-packages/pandas/io/sql.py 
>> >> >> >> >> >> >> > in to_sql(frame, name, con, flavor, schema, if_exists, index, index_label, chunksize, dtype) 
>> >> >> >> >> >> >> >     469     pandas_sql.to_sql(frame, name, if_exists=if_exists, index=index, 
>> >> >> >> >> >> >> >     470                       index_label=index_label, schema=schema, 
>> >> >> >> >> >> >> > --> 471                       chunksize=chunksize, dtype=dtype) 
>> >> >> >> >> >> >> >     472 
>> >> >> >> >> >> >> >     473 
>> >> >> >> >> >> >> > 
>> >> >> >> >> >> >> > /home/saravji/anaconda3/lib/python3.6/site-packages/pandas/io/sql.py 
>> >> >> >> >> >> >> > in to_sql(self, frame, name, if_exists, index, index_label, schema, chunksize, dtype) 
>> >> >> >> >> >> >> >    1148                          if_exists=if_exists, index_label=index_label, 
>> >> >> >> >> >> >> >    1149                          schema=schema, dtype=dtype) 
>> >> >> >> >> >> >> > -> 1150         table.create() 
>> >> >> >> >> >> >> >    1151         table.insert(chunksize) 
>> >> >> >> >> >> >> >    1152         if (not name.isdigit() and not name.islower()): 
>> >> >> >> >> >> >> > 
>> >> >> >> >> >> >> > /home/saravji/anaconda3/lib/python3.6/site-packages/pandas/io/sql.py 
>> >> >> >> >> >> >> > in create(self) 
>> >> >> >> >> >> >> >     596                     "'{0}' is not valid for if_exists".format(self.if_exists)) 
>> >> >> >> >> >> >> >     597         else: 
>> >> >> >> >> >> >> > --> 598             self._execute_create() 
>> >> >> >> >> >> >> >     599 
>> >> >> >> >> >> >> >     600     def insert_statement(self): 
>> >> >> >> >> >> >> > 
>> >> >> >> >> >> >> > /home/saravji/anaconda3/lib/python3.6/site-packages/pandas/io/sql.py 
>> >> >> >> >> >> >> > in _execute_create(self) 
>> >> >> >> >> >> >> >     581         # Inserting table into database, add to MetaData object 
>> >> >> >> >> >> >> >     582         self.table = self.table.tometadata(self.pd_sql.meta) 
>> >> >> >> >> >> >> > --> 583         self.table.create() 
>> >> >> >> >> >> >> >     584 
>> >> >> >> >> >> >> >     585     def create(self): 
>> >> >> >> >> >> >> > 
>> >> >> >> >> >> >> > /home/saravji/anaconda3/lib/python3.6/site-packages/sqlalchemy/sql/schema.py 
>> >> >> >> >> >> >> > in create(self, bind, checkfirst) 
>> >> >> >> >> >> >> >     754         bind._run_visitor(ddl.SchemaGenerator, 
>> >> >> >> >> >> >> >     755                           self, 
>> >> >> >> >> >> >> > --> 756                           checkfirst=checkfirst) 
>> >> >> >> >> >> >> >     757 
>> >> >> >> >> >> >> >     758     def drop(self, bind=None, checkfirst=False): 
>> >> >> >> >> >> >> > 
>> >> >> >> >> >> >> > /home/saravji/anaconda3/lib/python3.6/site-packages/sqlalchemy/engine/base.py 
>> >> >> >> >> >> >> > in _run_visitor(self, visitorcallable, element, connection, **kwargs) 
>> >> >> >> >> >> >> >    1927                      connection=None, **kwargs): 
>> >> >> >> >> >> >> >    1928         with self._optional_conn_ctx_manager(connection) as conn: 
>> >> >> >> >> >> >> > -> 1929             conn._run_visitor(visitorcallable, element, **kwargs) 
>> >> >> >> >> >> >> >    1930 
>> >> >> >> >> >> >> >    1931     class _trans_ctx(object): 
>> >> >> >> >> >> >> > 
>> >> >> >> >> >> >> > /home/saravji/anaconda3/lib/python3.6/site-packages/sqlalchemy/engine/base.py 
>> >> >> >> >> >> >> > in _run_visitor(self, visitorcallable, element, **kwargs) 
>> >> >> >> >> >> >> >    1536     def _run_visitor(self, visitorcallable, element, **kwargs): 
>> >> >> >> >> >> >> >    1537         visitorcallable(self.dialect, self, 
>> >> >> >> >> >> >> > -> 1538                         **kwargs).traverse_single(element) 
>> >> >> >> >> >> >> >    1539 
>> >> >> >> >> >> >> >    1540 
>> >> >> >> >> >> >> > 
>> >> >> >> >> >> >> > /home/saravji/anaconda3/lib/python3.6/site-packages/sqlalchemy/sql/visitors.py 
>> >> >> >> >> >> >> > in traverse_single(self, obj, **kw) 
>> >> >> >> >> >> >> >     119             meth = getattr(v, "visit_%s" % obj.__visit_name__, 
>
>

-- 
SQLAlchemy - 
The Python SQL Toolkit and Object Relational Mapper

http://www.sqlalchemy.org/

To post example code, please provide an MCVE: Minimal, Complete, and Verifiable 
Example.  See  http://stackoverflow.com/help/mcve for a full description.
--- 
You received this message because you are subscribed to the Google Groups 
"sqlalchemy" group.
To unsubscribe from this group and stop receiving emails from it, send an email 
to sqlalchemy+unsubscr...@googlegroups.com.
To post to this group, send email to sqlalchemy@googlegroups.com.
Visit this group at https://groups.google.com/group/sqlalchemy.
For more options, visit https://groups.google.com/d/optout.
