Hi All,

One of our business requirements is to offload a large dataset, on the order of
thousands of terabytes, to Hadoop for processing over a period of a few months.
I am curious to learn and understand the possibilities around the points below:

1. What is an efficient data ingestion framework that can help with this?
2. How do we choose the right tools to overcome big data ingestion problems?
3. What is a good data ingestion model for synchronizing, slicing, splitting,
etc.? (A rough sketch of what I mean by slicing follows below.)
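
For context on point 3, here is a minimal sketch of the kind of slicing I have
in mind, assuming a date-partitioned source; the names and the tool mentioned
in the comments are only placeholders, not a recommendation. The idea is to
split a multi-month range into small, independent slices so each one can be
ingested and retried on its own:

    # Sketch only: split a date-partitioned source into day-sized
    # ingestion slices that can be loaded and retried independently.
    from datetime import date, timedelta

    def date_slices(start, end, days_per_slice=1):
        """Yield (slice_start, slice_end) pairs covering [start, end)."""
        step = timedelta(days=days_per_slice)
        cur = start
        while cur < end:
            yield cur, min(cur + step, end)
            cur += step

    for lo, hi in date_slices(date(2015, 1, 1), date(2015, 1, 8)):
        # Placeholder: each slice would map to one ingestion job,
        # e.g. a Sqoop import or DistCp run filtered to this range.
        print(f"ingest slice {lo} .. {hi}")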
Any help, suggestions, and pointers would be very much appreciated.

Regards,
Durga
