Re: PostgreSQL JDBC Connections

2017-01-05 Thread t p
YMMV and I don’t think my approach will work for your use case, but here is a suggestion based on what I’ve done. In the first paragraph you can register tables with code like this: %spark val example = sqlContext.read.format("jdbc").options( Map("url" ->
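The snippet above is cut off, but the pattern being described is the standard Spark 1.x JDBC data-source read, registered as a temp table so it can be queried from `%sql`. A minimal sketch of such a Zeppelin paragraph (host, database, table, and credentials are placeholders, not values from the thread):

```scala
// Hypothetical Zeppelin %spark paragraph: read a PostgreSQL table over JDBC
// and register it so %sql paragraphs can query it. All connection details
// below are placeholders.
%spark
val example = sqlContext.read.format("jdbc").options(Map(
  "url"      -> "jdbc:postgresql://db-host:5432/reports",
  "driver"   -> "org.postgresql.Driver",
  "dbtable"  -> "public.orders",
  "user"     -> "analyst",
  "password" -> "secret"
)).load()
example.registerTempTable("orders") // Spark 1.x API, matching sqlContext above
```

With this approach the connection lifecycle is managed by Spark's JDBC data source rather than by a long-lived interpreter session.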

Re: PostgreSQL JDBC Connections

2017-01-05 Thread Benjamin Kim
We are using the JDBC interpreter. The business analysts only know SQL and run ad-hoc queries for their report exports to CSV. Cheers, Ben

Re: PostgreSQL JDBC Connections

2017-01-05 Thread t p
Are you using JDBC or the PSQL interpreter? I had encountered something similar while using the PSQL interpreter, and I had to restart Zeppelin. My experience using PSQL (PostgreSQL, HAWQ) was not as good as using Spark/Scala wrappers (JDBC data source) to connect via JDBC and then register

Re: JDBC Connections

2016-10-18 Thread Hyung Sung Shim
Hello. AFAIK the connections are not closed until the JDBC interpreter is restarted, so https://github.com/apache/zeppelin/pull/1396 introduces a connection pool to manage sessions. 2016-10-19 2:43 GMT+09:00 Benjamin Kim : > We are using Zeppelin 0.6.0 as a self-service for our clients to
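The idea behind pooling, as referenced above, is that a fixed set of connections is created once and reused, instead of opening a new session per query and never closing it. A minimal sketch of that mechanism (not the actual code from the PR; `Conn` stands in for `java.sql.Connection`):

```scala
import java.util.concurrent.ArrayBlockingQueue

// Stand-in for java.sql.Connection, to keep the sketch self-contained.
final case class Conn(id: Int)

// A fixed-size pool: callers borrow a connection, use it, and return it,
// so the number of open sessions never grows beyond `size`.
class SimplePool(size: Int) {
  private val pool = new ArrayBlockingQueue[Conn](size)
  (1 to size).foreach(i => pool.put(Conn(i)))

  // Block until a connection is free, hand it to `f`, then return it.
  def withConnection[A](f: Conn => A): A = {
    val c = pool.take()
    try f(c) finally pool.put(c)
  }

  def available: Int = pool.size
}
```

After every `withConnection` call the borrowed connection goes back to the queue, which is why pooled interpreters stop leaking idle sessions.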

JDBC Connections

2016-10-18 Thread Benjamin Kim
We are using Zeppelin 0.6.0 as a self-service for our clients to query our PostgreSQL databases. We are noticing that the connections are not closing after each one of them is done. What is the normal operating procedure to have these connections close when idle? Our scope for the JDBC
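For context, the setup being described is a Zeppelin JDBC interpreter pointed at PostgreSQL. A hedged sketch of the typical interpreter properties involved (property names follow the Zeppelin JDBC interpreter conventions; values are placeholders, not the poster's actual configuration):

```
default.driver    org.postgresql.Driver
default.url       jdbc:postgresql://db-host:5432/reports
default.user      analyst
default.password  ********
```

In Zeppelin 0.6.x each such interpreter session holds its database connection open, which is the behavior the thread is asking about.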