Hi Uday,

Druid supports CSV and TSV formats, among others, for data ingestion. One
option is to export your tables into one of these formats, if possible, and
ingest the resulting files with native batch ingestion; note that each row
needs a timestamp. Alternatively, you could dump the data to HDFS and use
Hadoop batch ingestion. For example:
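A quick way to export each table to CSV is psql's \copy. This is only a
rough sketch; the database name, table names, and output path below are
placeholders you would need to adapt:

  # export one table to CSV (client-side copy, no header row)
  psql -d mydb -c "\copy my_table TO '/tmp/exports/my_table.csv' WITH (FORMAT csv)"

  # for many tables, loop over a list of table names
  for t in table1 table2 table3; do
    psql -d mydb -c "\copy $t TO '/tmp/exports/$t.csv' WITH (FORMAT csv)"
  done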

https://druid.apache.org/docs/latest/ingestion/data-formats.html
https://druid.apache.org/docs/latest/ingestion/batch-ingestion.html
https://druid.apache.org/docs/latest/ingestion/native_tasks.html
https://druid.apache.org/docs/latest/ingestion/hadoop.html
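For reference, a minimal native batch ("index") task spec for one of those
CSV files could look roughly like the sketch below. The datasource name,
timestamp column, dimension names, and file path are assumptions for
illustration only and would need to match your schema:

{
  "type": "index",
  "spec": {
    "dataSchema": {
      "dataSource": "my_table",
      "parser": {
        "type": "string",
        "parseSpec": {
          "format": "csv",
          "columns": ["created_at", "id", "name", "status"],
          "timestampSpec": { "column": "created_at", "format": "auto" },
          "dimensionsSpec": { "dimensions": ["id", "name", "status"] }
        }
      },
      "metricsSpec": [],
      "granularitySpec": {
        "type": "uniform",
        "segmentGranularity": "day",
        "queryGranularity": "none",
        "rollup": false
      }
    },
    "ioConfig": {
      "type": "index",
      "firehose": {
        "type": "local",
        "baseDir": "/tmp/exports",
        "filter": "my_table.csv"
      },
      "appendToExisting": false
    },
    "tuningConfig": { "type": "index" }
  }
}

You would submit one such spec per table to the Overlord's task endpoint
(/druid/indexer/v1/task); with ~1000 tables you would probably script the
spec generation from your Postgres schema.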

Thanks,
Sashi

On Wed, Aug 7, 2019 at 8:25 PM yadavelli uday <mailmetoyadave...@gmail.com>
wrote:

> Hi team,
>
> I want to ingest data from Postgres into Druid from many tables (around
> 1000 tables). How can I do that? Please share examples if possible.
>
> ---------------------------------------------------------------------
> To unsubscribe, e-mail: dev-unsubscr...@druid.apache.org
> For additional commands, e-mail: dev-h...@druid.apache.org
>
>
