Re: ingesting data from Postgres server to Druid, not a single data set, many datasets

2019-08-08 Thread Sashidhar Thallam
As Gaurav suggested, SqlFirehoseFactory can be used. See https://druid.apache.org/docs/latest/ingestion/firehose.html On Fri, Aug 9, 2019 at 12:25 AM Gaurav Bhatnagar wrote: > Here is an ingestion spec for MySQL. You can change this spec for > PostgreSQL. Make sure you add the extension for PostgreSQL
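A minimal sketch of what an ioConfig using the SQL firehose might look like for PostgreSQL, based on the firehose documentation linked above. The connection URI, credentials, and table names are placeholders, and the surrounding dataSchema is elided; check the linked page for the full task spec of your Druid version.

```json
{
  "ioConfig": {
    "type": "index",
    "firehose": {
      "type": "sql",
      "database": {
        "type": "postgresql",
        "connectorConfig": {
          "connectURI": "jdbc:postgresql://your-host:5432/your_db",
          "user": "your_user",
          "password": "your_password"
        }
      },
      "sqls": [
        "SELECT * FROM table1",
        "SELECT * FROM table2"
      ]
    }
  }
}
```

Each entry in "sqls" becomes one query whose rows are ingested, so for many tables you would generate one SELECT per table.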

Re: ingesting data from Postgres server to Druid, not a single data set, many datasets

2019-08-08 Thread Gaurav Bhatnagar
Here is an ingestion spec for MySQL. You can change this spec for PostgreSQL. Make sure you add the extension for PostgreSQL to your extensions list. This ingestion spec will need to be updated with your values, e.g. database name, connection URI, column names, table name, user name, etc. { "type": "ind
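Regarding the extensions list: loading the PostgreSQL connector is typically done in common.runtime.properties. A sketch, assuming the extension name used by Druid at the time ("postgresql-metadata-storage", which also provides the JDBC connector for SQL ingestion):

```
druid.extensions.loadList=["postgresql-metadata-storage"]
```

Restart the Druid services after changing the load list so the extension is picked up.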

Re: ingesting data from Postgres server to Druid, not a single data set, many datasets

2019-08-07 Thread Sashidhar Thallam
Hi Uday, Druid supports CSV and TSV formats, among others, for data ingestion. One way is to export your tables into one of these formats, if possible, and then ingest them. Also, each row needs a timestamp. Alternatively, you could dump the data to HDFS and use Hadoop batch ingestion. https://druid.apac
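For the export step, psql's \copy command can dump one table to CSV with a header row. With ~1000 tables, generating the commands from a table list is easier than writing them by hand. A small sketch (the table names and output directory are hypothetical placeholders):

```python
# Sketch: build per-table psql \copy commands that export each table to CSV.
# In practice the table list would come from pg_catalog or a config file.

def copy_command(table: str, out_dir: str = "/tmp/export") -> str:
    """Return a psql \\copy command exporting one table to CSV with a header."""
    return f"\\copy {table} TO '{out_dir}/{table}.csv' WITH (FORMAT csv, HEADER)"

tables = ["orders", "events"]  # placeholder: your ~1000 table names
commands = [copy_command(t) for t in tables]
for c in commands:
    print(c)
```

Each printed line can be fed to psql (e.g. via -c or a script file); the resulting CSV files can then be ingested into Druid, remembering that each row still needs a timestamp column.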

ingesting data from Postgres server to Druid, not a single data set, many datasets

2019-08-07 Thread yadavelli uday
Hi team, I want to ingest data from Postgres to Druid from many tables (about 1000 tables). How can I do that? Please provide examples if possible. - To unsubscribe, e-mail: dev-unsubscr...@druid.apache.org For additional commands, e-mail: dev-h...@