YMMV, and I'm not sure my approach will work for your use case, but here is a
suggestion based on what I’ve done. In the first paragraph of the note you can
register tables with code like this (the connection details below are placeholders):
%spark
val example = sqlContext.read.format("jdbc").options(
  Map("url" -> "jdbc:postgresql://<host>:5432/<db>?user=<user>&password=<pass>",  // placeholder connection details
      "dbtable" -> "schema.example_table")).load()                                // placeholder table name
example.registerTempTable("example")
We are using the JDBC interpreter. The business analysts only know SQL and run
ad-hoc queries for their report exports to CSV.
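For example, once a table is registered like that, an analyst can run a
paragraph such as the one below and export the result to CSV (the column names
here are just placeholders, not our actual schema):

%sql
SELECT customer_id, SUM(total_amount) AS total
FROM example
GROUP BY customer_id
ORDER BY total DESC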
Cheers,
Ben
> On Jan 5, 2017, at 2:21 PM, t p wrote:
>
> Are you using JDBC or the PSQL interpreter? I had encountered something
> similar
Are you using JDBC or the PSQL interpreter? I had encountered something similar
while using the PSQL interpreter, and I had to restart Zeppelin.
My experience using PSQL (PostgreSQL, HAWQ) was not as good as using
Spark/Scala wrappers (the JDBC data source) to connect via JDBC and then register
temp tables.
Hello.
AFAIK the connections do not close until the JDBC interpreter is restarted,
so https://github.com/apache/zeppelin/pull/1396 uses a connection pool to
control sessions.
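For reference, the general idea looks roughly like this with Commons DBCP2
(just an illustrative sketch, not the actual code in that PR; the host,
database, and credentials are placeholders): idle connections are evicted
after a timeout, and close() returns a connection to the pool instead of
leaving it open on the server.

import org.apache.commons.dbcp2.BasicDataSource

// Pool that caps concurrent connections and evicts idle ones.
val ds = new BasicDataSource()
ds.setDriverClassName("org.postgresql.Driver")
ds.setUrl("jdbc:postgresql://<host>:5432/<db>")   // placeholder connection details
ds.setUsername("<user>")
ds.setPassword("<password>")
ds.setMaxTotal(10)                                // at most 10 open connections
ds.setTimeBetweenEvictionRunsMillis(30000)        // check for idle connections every 30s
ds.setMinEvictableIdleTimeMillis(60000)           // evict connections idle for 60s

val conn = ds.getConnection()                     // borrow a connection from the pool
try {
  val rs = conn.createStatement().executeQuery("SELECT 1")
  while (rs.next()) println(rs.getInt(1))
} finally {
  conn.close()                                    // returns the connection to the pool
}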
2016-10-19 2:43 GMT+09:00 Benjamin Kim :
> We are using Zeppelin 0.6.0 as a self-service for our clients to
We are using Zeppelin 0.6.0 as a self-service for our clients to query our
PostgreSQL databases. We are noticing that the connections are not closing
after each one of them is done. What is the normal operating procedure to have
these connections close when idle? Our scope for the JDBC