As Gaurav suggested, the SqlFirehoseFactory can be used; see
https://druid.apache.org/docs/latest/ingestion/firehose.html
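In case it helps, a rough sketch of the sql firehose section for PostgreSQL could
look like the snippet below (host, port, database name, credentials, and table
names are placeholders, and the postgresql-metadata-storage extension has to be
on the extensions list):

"firehose": {
  "type": "sql",
  "database": {
    "type": "postgresql",
    "connectorConfig": {
      "connectURI": "jdbc:postgresql://your-host:5432/your_database",
      "user": "your_user",
      "password": "your_password"
    }
  },
  "sqls": ["SELECT * FROM table1", "SELECT * FROM table2"]
}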
On Fri, Aug 9, 2019 at 12:25 AM Gaurav Bhatnagar wrote:
> Here is an ingestion spec for MySQL. You can adapt this spec for
> PostgreSQL. Make sure you add the PostgreSQL extension to your extensions list.
Here is an ingestion spec for MySQL. You can adapt this spec for
PostgreSQL. Make sure you add the PostgreSQL extension to your extensions
list. This ingestion spec will need to be updated with your values, e.g.
database name, connection URI, column names, table name, user name, etc.
{
"type": "ind
Hi Uday,
Druid supports CSV and TSV formats, among others, for data ingestion. One way
is to export your tables into one of these formats, if possible, and ingest
them. Note that each row needs a timestamp.
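As a rough sketch, the relevant pieces of a native batch index task reading
exported CSV files could look like this (columns, the timestamp column, baseDir,
and filter are placeholders). The parser, inside dataSchema:

"parser": {
  "type": "string",
  "parseSpec": {
    "format": "csv",
    "columns": ["updated_at", "col1", "col2"],
    "timestampSpec": { "column": "updated_at", "format": "auto" },
    "dimensionsSpec": { "dimensions": ["col1", "col2"] }
  }
}

and the ioConfig pointing at the exported files:

"ioConfig": {
  "type": "index",
  "firehose": {
    "type": "local",
    "baseDir": "/path/to/exported/files",
    "filter": "*.csv"
  }
}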
Alternatively you could dump the data to HDFS and use Hadoop batch
ingestion.
https://druid.apac
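For the Hadoop route, the task type is "index_hadoop" and the ioConfig and
tuningConfig change roughly as below (the HDFS path is a placeholder), while the
dataSchema stays similar to a native batch spec:

"ioConfig": {
  "type": "hadoop",
  "inputSpec": {
    "type": "static",
    "paths": "hdfs://namenode:8020/druid/export/my_table.csv"
  }
},
"tuningConfig": {
  "type": "hadoop"
}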
Hi team,
I want to ingest data from Postgres into Druid from many tables (around 1000
tables). How can I do that? Please share examples if possible.