[ https://issues.apache.org/jira/browse/OPENNLP-1515?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=17774502#comment-17774502 ]

ASF GitHub Bot commented on OPENNLP-1515:
-----------------------------------------

kinow commented on PR #551:
URL: https://github.com/apache/opennlp/pull/551#issuecomment-1759586922

   > What about using `<profiles>` and `<os>`-based activation via Maven?
   
   I think that would apply to the build only? Unless we published separate 
artifacts to Maven per OS (I think I saw that once, some different setting on 
the dependency in the Maven pom.xml, perhaps a classifier, but I have never 
actually used it).
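
   For context, a minimal sketch of what `<profiles>` with `<os>`-based 
   activation could look like in the opennlp-dl pom.xml. The profile ids and 
   the choice of activation conditions are illustrative, not from the PR:

       <profiles>
         <!-- Default: use the runtime that supports CPU and GPU. -->
         <profile>
           <id>onnxruntime-gpu</id>
           <activation>
             <activeByDefault>true</activeByDefault>
           </activation>
           <dependencies>
             <dependency>
               <groupId>com.microsoft.onnxruntime</groupId>
               <artifactId>onnxruntime_gpu</artifactId>
               <version>${onnxruntime.version}</version>
             </dependency>
           </dependencies>
         </profile>
         <!-- macOS: fall back to the CPU-only runtime. -->
         <profile>
           <id>onnxruntime-cpu</id>
           <activation>
             <os>
               <family>mac</family>
             </os>
           </activation>
           <dependencies>
             <dependency>
               <groupId>com.microsoft.onnxruntime</groupId>
               <artifactId>onnxruntime</artifactId>
               <version>${onnxruntime.version}</version>
             </dependency>
           </dependencies>
         </profile>
       </profiles>

   Whether that is enough for consumers of the published artifact, or whether 
   per-OS artifacts would still be needed, is the open question above.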




> Default to onnxruntime instead of onnxruntime-gpu
> -------------------------------------------------
>
>                 Key: OPENNLP-1515
>                 URL: https://issues.apache.org/jira/browse/OPENNLP-1515
>             Project: OpenNLP
>          Issue Type: Task
>          Components: Deep Learning
>            Reporter: Jeff Zemerick
>            Assignee: Jeff Zemerick
>            Priority: Major
>
> The onnxruntime-gpu dependency is currently being included by opennlp-dl.
>     <dependency>
>       <groupId>com.microsoft.onnxruntime</groupId>
>       <!-- This dependency supports CPU and GPU -->
>       <artifactId>onnxruntime_gpu</artifactId>
>       <version>${onnxruntime.version}</version>
>     </dependency>
> The problem is that GPU support is only available on Linux and Windows, not 
> on macOS. I think it would be best to use the onnxruntime dependency instead, 
> but we still need to make it easy for OpenNLP to use a GPU.
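
A sketch of the consumer side (not from the issue): if opennlp-dl defaulted to 
the CPU-only onnxruntime artifact, a downstream project that wants GPU support 
could exclude it and declare onnxruntime_gpu itself. The coordinates and 
version properties below are placeholders:

    <dependency>
      <groupId>org.apache.opennlp</groupId>
      <artifactId>opennlp-dl</artifactId>
      <version>${opennlp.version}</version>
      <exclusions>
        <exclusion>
          <groupId>com.microsoft.onnxruntime</groupId>
          <artifactId>onnxruntime</artifactId>
        </exclusion>
      </exclusions>
    </dependency>
    <dependency>
      <groupId>com.microsoft.onnxruntime</groupId>
      <artifactId>onnxruntime_gpu</artifactId>
      <version>${onnxruntime.version}</version>
    </dependency>

That would keep the default build working on macOS while leaving a 
straightforward opt-in path for GPU users.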



--
This message was sent by Atlassian Jira
(v8.20.10#820010)
