Hi,

scikit-learn doesn't support computations on the GPU, unfortunately. Specifically for random forests, there is CudaTree, which implements a GPU version of scikit-learn's random forests. The library doesn't look actively developed (hard to tell whether that's a good or a bad thing -- it may simply be stable enough that it hasn't needed updates). Anyway, it may be worth a try: https://github.com/EasonLiao/CudaTree
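If you do try it, my understanding from the project's README is that it mimics scikit-learn's estimator interface. A rough, untested sketch of what usage might look like -- the constructor arguments, the float32/int32 expectations, and the data shapes are all assumptions on my part, so please check them against the repo:

    import numpy as np
    from cudatree import RandomForestClassifier  # CudaTree's class (unverified)

    # Toy stand-in for your satellite data: one row per pixel,
    # one column per band value (hypothetical shapes)
    x_train = np.random.rand(10000, 8).astype(np.float32)   # GPU code presumably wants float32
    y_train = np.random.randint(0, 5, size=10000).astype(np.int32)

    forest = RandomForestClassifier(n_estimators=50, bootstrap=False)
    forest.fit(x_train, y_train)          # training would run on the GPU
    preds = forest.predict(x_train[:100])

If CudaTree doesn't work out, note that plain scikit-learn forests can at least parallelize across CPU cores via RandomForestClassifier(n_jobs=-1), which often helps a lot on large image stacks.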
Otherwise, I can imagine there are probably alternative implementations out there.

Best,
Sebastian

> On Aug 8, 2018, at 7:50 PM, hoang trung Ta <[email protected]> wrote:
>
> Dear all members,
>
> I am using random forests for classifying satellite images. I have a bunch
> of images, so the processing is quite slow. I searched on the Internet, and
> people said that a GPU can accelerate the process.
>
> I have an NVIDIA GeForce GTX 1080 Ti installed in the computer.
>
> Do you know how to use the GPU in scikit-learn? I mean, which packages to
> use, and is there sample code that uses the GPU for random forest
> classification?
>
> Thank you very much
>
> --
> Ta Hoang Trung (Mr)
>
> Master student
> Graduate School of Life and Environmental Sciences
> University of Tsukuba, Japan
>
> Mobile: +81 70 3846 2993
> Email : [email protected]
>         [email protected]
>         [email protected]
> ----
> Mapping Technician
> Department of Surveying and Mapping Vietnam
> No 2, Dang Thuy Tram street, Hanoi, Viet Nam
>
> Mobile: +84 1255151344
> Email : [email protected]
