damccorm commented on code in PR #23218:
URL: https://github.com/apache/beam/pull/23218#discussion_r973217823
##########
website/www/site/content/en/documentation/sdks/python-machine-learning.md:
##########
@@ -83,6 +83,14 @@ You need to provide a path to a file that contains the pickled Scikit-learn model
`model_uri=<path_to_pickled_file>` and `model_file_type: <ModelFileType>`, where you can specify
`ModelFileType.PICKLE` or `ModelFileType.JOBLIB`, depending on how the model was serialized.
+### Use custom models
+
+In fact, the RunInference API is designed flexibly to allow you to use any custom machine learning models. You only need to create your own `ModelHandler` or `KeyedModelHandler` to handle how the ML models are loaded from a location that the pipeline can access and how to use these models to run the inference.
Review Comment:
```suggestion
If you would like to use a model that isn't specified by one of the supported frameworks, the RunInference API is designed flexibly to allow you to use any custom machine learning models.
You only need to create your own `ModelHandler` or `KeyedModelHandler` with logic to load your model and use it to run the inference.
```
Mostly a wording nit, but I'd also like to emphasize that this shouldn't be
the default option (the default is to use one of the existing model handlers).
Ideally, over time we will support enough frameworks that creating a custom
model handler is no longer a common need at all, but in general we want to
point people to our handlers first so that they get any improvements for free.
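For context on what "create your own `ModelHandler`" entails, here is a minimal, framework-free sketch of the two methods a custom handler implements: `load_model` and `run_inference`. This mirrors the interface of `apache_beam.ml.inference.base.ModelHandler` but deliberately avoids importing Beam so it stands alone; `DoublingModel` and `CustomModelHandler` are hypothetical names, and a real handler would subclass `ModelHandler` and be passed to `RunInference` in a pipeline.

```python
from typing import Any, Iterable, Sequence


class DoublingModel:
    """Hypothetical stand-in for a trained model deserialized from storage."""

    def predict(self, x: int) -> int:
        return 2 * x


class CustomModelHandler:
    """Sketch of the interface a custom Beam ModelHandler provides.

    A real implementation would subclass
    apache_beam.ml.inference.base.ModelHandler instead of a plain class.
    """

    def load_model(self) -> DoublingModel:
        # In a real handler, load the serialized model from a location
        # the pipeline workers can access (e.g. local disk or GCS).
        return DoublingModel()

    def run_inference(
        self,
        batch: Sequence[int],
        model: DoublingModel,
        inference_args: Any = None,
    ) -> Iterable[int]:
        # Apply the loaded model to a batch of examples and return
        # one prediction per input.
        return [model.predict(x) for x in batch]


handler = CustomModelHandler()
model = handler.load_model()
print(list(handler.run_inference([1, 2, 3], model)))  # [2, 4, 6]
```

In a pipeline, Beam calls `load_model` once per worker and `run_inference` per batch, which is why keeping the two concerns separate (loading vs. scoring) is the core of the handler contract.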
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]