In GCP the equivalent of HDFS is Google Cloud Storage. You have to change
the URLs from hdfs:// to gs://.
The MapReduce APIs will work as-is with this change. You run MapReduce
jobs on a Google Dataproc cluster, with your storage in a Google Cloud
Storage bucket. Refer to the GCP documentation for details.
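For example, submitting an existing MapReduce jar to Dataproc with gs://
paths looks roughly like this (cluster, bucket, jar, and class names below
are placeholders, not from your setup):

```shell
# Submit an existing Hadoop MapReduce jar to a Dataproc cluster.
# The only change from an on-prem run is that input/output paths
# (and the jar location) use gs:// instead of hdfs://.
gcloud dataproc jobs submit hadoop \
  --cluster=my-cluster \
  --region=us-central1 \
  --class=com.example.WordCount \
  --jars=gs://my-bucket/jars/wordcount.jar \
  -- gs://my-bucket/input gs://my-bucket/output
```

Dataproc clusters ship with the GCS connector preinstalled, which is why
the gs:// scheme works without code changes in the job itself.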
Any help here ?
On Thu, Jun 13, 2019 at 12:38 PM Amit Kabra wrote:
> Hello,
>
> I have a requirement where I need to read/write data to public cloud via
> map reduce job.
>
> Our systems currently read and write data from HDFS using MapReduce and
> it's working well, we write data in