[ https://issues.apache.org/jira/browse/DRILL-8005?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17432779#comment-17432779 ]

ASF GitHub Bot commented on DRILL-8005:
---------------------------------------

cgivre commented on pull request #2327:
URL: https://github.com/apache/drill/pull/2327#issuecomment-949248528


   @dzamo 
   Per your request, I thought about this some more and added the ability to 
configure the batch size for `INSERT` queries.  The user can now set the 
batch size to suit their environment and the database to which they are 
inserting data.  
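
   For anyone following along, the batch size is exposed as a storage plugin 
config option.  A writable JDBC plugin config might look roughly like the 
snippet below; the field names and default shown here (in particular 
`writerBatchSize`) are illustrative rather than authoritative, so please 
check the PR for the exact option.

   ```json
   {
     "type": "jdbc",
     "driver": "com.mysql.cj.jdbc.Driver",
     "url": "jdbc:mysql://localhost:3306",
     "username": "user",
     "password": "password",
     "writable": true,
     "writerBatchSize": 10000,
     "enabled": true
   }
   ```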
   
   The unit tests pass, and I ran this locally with a 1M row CSV insert into 
a MySQL database, which worked perfectly.  Previously, this ran into MySQL's 
`max_allowed_packet` limit, but that is no longer an issue. 
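
   To make the mechanism concrete: rather than sending one enormous INSERT, 
the writer can flush rows every `batchSize` records so that no single 
statement or packet grows unbounded.  The sketch below uses plain JDBC 
batching (`addBatch`/`executeBatch`) to illustrate the idea; it is a 
simplified stand-in with assumed table and column names, not the code in 
this PR.

   ```java
   import java.sql.Connection;
   import java.sql.DriverManager;
   import java.sql.PreparedStatement;
   import java.util.List;

   public class BatchedInsertSketch {

     // Illustrative only: batchSize stands in for the configurable option,
     // and target_table/col1/col2 are placeholder names.
     public static void insertRows(String jdbcUrl, List<String[]> rows,
                                   int batchSize) throws Exception {
       try (Connection conn = DriverManager.getConnection(jdbcUrl);
            PreparedStatement stmt = conn.prepareStatement(
                "INSERT INTO target_table (col1, col2) VALUES (?, ?)")) {
         int pending = 0;
         for (String[] row : rows) {
           stmt.setString(1, row[0]);
           stmt.setString(2, row[1]);
           stmt.addBatch();
           // Flush every batchSize rows so each round trip stays small.
           if (++pending % batchSize == 0) {
             stmt.executeBatch();
           }
         }
         stmt.executeBatch();  // flush any remaining rows
       }
     }
   }
   ```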




> Add Writer to JDBC Storage Plugin
> ---------------------------------
>
>                 Key: DRILL-8005
>                 URL: https://issues.apache.org/jira/browse/DRILL-8005
>             Project: Apache Drill
>          Issue Type: Improvement
>          Components: Storage - JDBC
>    Affects Versions: 1.19.0
>            Reporter: Charles Givre
>            Assignee: Charles Givre
>            Priority: Major
>             Fix For: 1.20.0
>
>
> The current implementation of Drill only allows writing to file systems. 
> This issue proposes extending the JDBC plugin to allow writing to JDBC 
> data sources. It will do so by implementing: 
> CREATE TABLE AS
> DROP TABLE
>



