[ https://issues.apache.org/jira/browse/OPENNLP-1729?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel ]
Martin Wiesner resolved OPENNLP-1729.
-------------------------------------
Resolution: Fixed
> Provide easier loading of Models for given model lang and type
> ---------------------------------------------------------------
>
> Key: OPENNLP-1729
> URL: https://issues.apache.org/jira/browse/OPENNLP-1729
> Project: OpenNLP
> Issue Type: New Feature
> Components: Models
> Affects Versions: 2.5.4
> Reporter: Martin Wiesner
> Assignee: Martin Wiesner
> Priority: Minor
> Fix For: 2.5.5
>
> Time Spent: 40m
> Remaining Estimate: 0h
>
> Currently, a fair amount of glue code is required to load a model for a given
> language code and model type, that is, LemmatizerModel, TokenizerModel, etc.
> Consequently, some users find it easier - or more attractive - to stick
> with DownloadUtil's simple way of obtaining a model via the local user-home
> ".bin" cache, avoiding a switch towards bundled OpenNLP model jars.
> Aims:
> * Provide a short path to obtaining a ready-to-use model instance from the
> classpath for a given language.
> * Extract {{ModelType}} from {{DownloadUtil}} for re-use in scenarios such
> as this one.
> * Introduce new methods in the existing {{ClassPathModelLoader}}, such as
> {{<T extends BaseModel> T load(Set<ClassPathModelEntry> modelsInClassPath,
> String lang, ModelType type, Class<T> modelType)}} (see the usage sketch below).
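> A rough usage sketch of the proposed method. Package names and the enum constant
> are assumptions (ModelType's final location after extraction is not fixed here),
> and the set of {{ClassPathModelEntry}} instances is expected to come from
> OpenNLP's classpath model scanning facilities:
> {code:java}
> import java.io.IOException;
> import java.util.Set;
>
> import opennlp.tools.models.ClassPathModelEntry;
> import opennlp.tools.models.ClassPathModelLoader;
> import opennlp.tools.models.ModelType;
> import opennlp.tools.tokenize.TokenizerModel;
>
> public class ClassPathModelLoaderSketch {
>
>   // Picks the English tokenizer model out of the bundled model jars on the classpath.
>   static TokenizerModel loadEnglishTokenizer(ClassPathModelLoader loader,
>                                              Set<ClassPathModelEntry> modelsInClassPath)
>       throws IOException {
>     // Proposed convenience method: select by language and model type in one call,
>     // instead of iterating the entries and matching metadata manually.
>     return loader.load(modelsInClassPath, "en", ModelType.TOKENIZER, TokenizerModel.class);
>   }
> }
> {code}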
--
This message was sent by Atlassian Jira
(v8.20.10#820010)