Hi Folks, good morning!

I'm using the GDAL Python bindings to pre-process some satellite images,
currently on CPU instances in AWS.
I'm struggling to find a way to run this processing on GPUs, more
specifically on the NVIDIA V100.
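
For context, my current CPU-side pre-processing is along these lines (a
minimal sketch, not my exact pipeline; the file names and the warp step
are just placeholders):

    from osgeo import gdal

    gdal.UseExceptions()

    # Hypothetical example: reproject one scene to WGS84 with cubic
    # resampling. "scene.tif" / "scene_wgs84.tif" are placeholder paths.
    gdal.Warp(
        "scene_wgs84.tif",
        "scene.tif",
        dstSRS="EPSG:4326",
        resampleAlg="cubic",
        multithread=True,  # multiple CPU threads, but still no GPU involved
    )

I'd like to move steps like this warp onto the V100 if that's possible.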

Has anyone achieved that, and could you share how you've done it?

Thanks,
Paulo Petersen.
paulo.a...@gmail.com
