Hi Syed,

If I follow correctly, are you asking how to do a bulk load first and then use DeltaStreamer on top of that dataset to apply binlogs from Kafka?
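In case it helps, here is a rough sketch of how that two-phase flow could look with HoodieDeltaStreamer: a one-time BULK_INSERT from the Sqoop output on S3, then the same tool running continuously against the Kafka topic. All paths, table names, topic names, and field names below are placeholders for your setup, not tested values:

```shell
# Phase 1: one-time bulk load of the Sqoop dump (assumed Parquet on S3) into the Hudi table.
# "updated_at", the bucket path, and the table name are placeholders.
spark-submit \
  --class org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer \
  hudi-utilities-bundle.jar \
  --table-type COPY_ON_WRITE \
  --op BULK_INSERT \
  --source-class org.apache.hudi.utilities.sources.ParquetDFSSource \
  --source-ordering-field updated_at \
  --target-base-path s3://your-bucket/hudi/your_table \
  --target-table your_table \
  --props dfs-source.properties  # sets hoodie.deltastreamer.source.dfs.root, record key / partition fields

# Phase 2: point the same target table at the binlog stream and run continuously.
spark-submit \
  --class org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer \
  hudi-utilities-bundle.jar \
  --table-type COPY_ON_WRITE \
  --op UPSERT \
  --source-class org.apache.hudi.utilities.sources.JsonKafkaSource \
  --source-ordering-field updated_at \
  --target-base-path s3://your-bucket/hudi/your_table \
  --target-table your_table \
  --continuous \
  --props kafka-source.properties  # sets hoodie.deltastreamer.source.kafka.topic, bootstrap.servers, key/partition fields
```

Because both phases write to the same --target-base-path with the same record key config, the UPSERTs from the binlog stream will apply on top of the bulk-loaded snapshot.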
Thanks,
Vinoth

On Mon, Jan 13, 2020 at 12:39 AM Syed Abdul Kather <[email protected]> wrote:

> Hi Team,
>
> We have onboarded a few tables that have a really huge number of records
> (~100M). The plan is to enable the binlog for the database; that is no
> issue, as the stream can handle the load. But for loading the snapshot,
> we have used Sqoop to import the whole table to S3.
>
> What we require here:
> Can we load the whole Sqooped dump into a Hudi table and then use the
> stream (binlog data comes via Kafka)?
>
> Thanks and Regards,
> S SYED ABDUL KATHER
> *Bigdata [email protected]*
> * +91-7411011661*
