My lab had a GPU library for Random Forests on Images.
It was pretty fast but probably needs some updates:
https://github.com/deeplearningais/curfil
On 8/8/18 10:01 PM, Tommy Tracy wrote:
Dear Ta Hoang,
Accelerating decision tree ensembles (including Random Forest) is actually
a current area of computer architecture research; in fact, it is a principal
component of my dissertation. As Sebastian Raschka said, the GPU is not
an ideal architecture for decision tree inference because
Dear Ta Hoang,
GPU processing can be done with Python libraries such as TensorFlow, Keras,
or Theano.
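For instance, a small Keras model will run on the GPU automatically if TensorFlow is installed with GPU support. A rough sketch (assuming a TensorFlow 2.x-style install; the feature width, class count, and random data are just placeholders for your satellite features):

import numpy as np
import tensorflow as tf

# Placeholder per-pixel feature vectors and land-cover labels.
X = np.random.rand(10000, 16).astype("float32")
y = np.random.randint(0, 5, size=10000)

# Empty list here means TensorFlow will silently fall back to the CPU.
print("GPUs visible:", tf.config.list_physical_devices("GPU"))

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(16,)),
    tf.keras.layers.Dense(64, activation="relu"),
    tf.keras.layers.Dense(5, activation="softmax"),
])
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
model.fit(X, y, epochs=3, batch_size=256)

A network like this is of course not a drop-in replacement for a Random Forest; it is only meant to show where the GPU actually gets used.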
However, sklearn's implementation of RandomForestClassifier is
outstandingly fast, and a previous effort to develop a GPU RandomForest
was abandoned as a result:
https://github.com/EasonL
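For what it's worth, the easiest speed-up inside scikit-learn itself is CPU parallelism: RandomForestClassifier builds and queries its trees in parallel when n_jobs is set. A minimal sketch, with random placeholder data standing in for the image features:

import numpy as np
from sklearn.ensemble import RandomForestClassifier

# Placeholder feature matrix and labels standing in for the satellite data.
X = np.random.rand(100000, 16)
y = np.random.randint(0, 5, size=100000)

# n_jobs=-1 uses every available CPU core for both fitting and prediction.
clf = RandomForestClassifier(n_estimators=100, n_jobs=-1)
clf.fit(X, y)
print(clf.predict(X[:5]))

On a machine with many cores this alone often cuts the training time considerably, since the trees of a forest can be grown independently.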
Hi,
scikit-learn doesn't support computations on the GPU, unfortunately.
Specifically for random forests, there's CudaTree, which implements a GPU
version of scikit-learn's random forests. It doesn't look like the library is
actively developed (hard to tell whether that's a good thing or a bad thing).
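If you want to experiment with it anyway, CudaTree advertises a scikit-learn-style interface. Roughly, and written from memory, so treat the import path and arguments as assumptions and check the project's README; it also expects 32-bit inputs as far as I recall:

import numpy as np
from cudatree import RandomForestClassifier  # GPU forest with an sklearn-like API

# Placeholder data; cast to 32-bit types for the GPU kernels.
X = np.random.rand(10000, 16).astype(np.float32)
y = np.random.randint(0, 5, size=10000).astype(np.int32)

forest = RandomForestClassifier(n_estimators=50)
forest.fit(X, y)
print(forest.predict(X[:5]))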
Dear all members,
I am using Random Forest for classifying satellite images. I have a
bunch of images, so the processing is quite slow. I searched on the
Internet and found that a GPU can accelerate the process.
I have an NVIDIA GeForce GTX 1080 Ti GPU installed in the computer.
Do you know how to use the GPU in scikit-learn?