rahul3 opened a new pull request #18:
URL: https://github.com/apache/incubator-nlpcraft/pull/18


   This pull request addresses the objectives of the following JIRA tickets:
   
   1. [NLPCRAFT-394](https://issues.apache.org/jira/browse/NLPCRAFT-394)
   2. [NLPCRAFT-411](https://issues.apache.org/jira/browse/NLPCRAFT-411)
   3. [NLPCRAFT-413](https://issues.apache.org/jira/browse/NLPCRAFT-413)
   4. [NLPCRAFT-393](https://issues.apache.org/jira/browse/NLPCRAFT-393)
   5. [NLPCRAFT-186](https://issues.apache.org/jira/browse/NLPCRAFT-186)
   
   This change does the following:
   
   1. Creates a `<USER_HOME>/.nlpcraft-python` folder in which the Python setup and dependencies are managed. This is done in the `pom.xml` of the `nlpcraft` module.
   2. Creates a conda environment inside the `.nlpcraft-python` folder so that the system Python version and its dependencies are left completely untouched. This should also be much better suited to a production-style deployment.
   3. Requires `conda`, which is used to pin the Python version for the project. Currently I have set this to `3.8`; this is also configured in the `pom.xml`.
   4. Adds a `python_setup.py` file that manages the installation of `fasttext`, `pytorch` and the other Python dependencies in an OS-independent way (a sketch of this setup flow follows the list).
   5. Fixed the `start_server.sh` script that starts the `ctxserver` and made it location-independent, i.e. it can be run from any directory.
   6. Added TODOs
   7. The installation is all done during the Maven build, partly by a Python script that I have tried to make OS-independent.
   8. Updated some README files.
   9. Replaced the multiple Flask servers with a single server that uses REST naming and exposes the different endpoints as `blueprints`. This makes adding further features such as `tokenProviders` as simple as creating another blueprint and registering it (see the blueprint sketch after this list).
   10. Set up Python to read settings from the `nlpcraft.conf` file and behave accordingly (a configuration-reading sketch also follows this list).
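
   For items 1-4, here is a minimal, hypothetical sketch of what such an OS-independent setup flow could look like. The environment location, package list and use of `conda run` are assumptions for illustration, not necessarily what `python_setup.py` and the `pom.xml` actually do.

```python
# Hypothetical sketch of an OS-independent setup flow similar in spirit to
# python_setup.py. Folder layout, environment location and package list are
# assumptions, not the actual values used in this PR.
import subprocess
from pathlib import Path

NLPCRAFT_PY_HOME = Path.home() / ".nlpcraft-python"  # <USER_HOME>/.nlpcraft-python
CONDA_ENV_DIR = NLPCRAFT_PY_HOME / "env"             # assumed environment location
PYTHON_VERSION = "3.8"                               # version pinned in pom.xml

def run(cmd):
    """Run a command and fail the build on a non-zero exit code."""
    print("Running:", " ".join(cmd))
    subprocess.run(cmd, check=True)

def main():
    # 1. Create the dedicated folder so the system Python stays untouched.
    NLPCRAFT_PY_HOME.mkdir(parents=True, exist_ok=True)

    # 2. Create an isolated conda environment inside it (skipped if it
    #    already exists, so repeated builds stay cheap).
    if not CONDA_ENV_DIR.exists():
        run(["conda", "create", "--yes", "--prefix", str(CONDA_ENV_DIR),
             f"python={PYTHON_VERSION}"])

    # 3. Install the Python dependencies into that environment only.
    #    'conda run' behaves the same way on Linux, macOS and Windows.
    run(["conda", "run", "--prefix", str(CONDA_ENV_DIR),
         "pip", "install", "torch", "fasttext", "flask"])

if __name__ == "__main__":
    main()
```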
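   For item 9, this is a hypothetical illustration of the single-server-plus-blueprints layout. The blueprint name, URL prefix and handler are made up for the example and are not the actual endpoints in this PR.

```python
# Hypothetical illustration of one Flask server exposing its endpoints as
# blueprints. Blueprint names, URL prefixes and the handler body are assumptions.
from flask import Blueprint, Flask, jsonify, request

# One blueprint per REST area; adding something like a tokenProvider endpoint
# would just mean creating and registering another blueprint like this one.
suggestion_bp = Blueprint("suggestion", __name__, url_prefix="/suggestion")

@suggestion_bp.route("/synonyms", methods=["POST"])
def synonyms():
    data = request.get_json(force=True)
    # ... delegate to the actual model here ...
    return jsonify({"sentence": data.get("sentence"), "synonyms": []})

def create_app():
    app = Flask(__name__)
    app.register_blueprint(suggestion_bp)  # register additional blueprints here
    return app

if __name__ == "__main__":
    create_app().run(host="0.0.0.0", port=5000)
```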
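   For item 10, a sketch of how the Python side could read `nlpcraft.conf` (a HOCON file). The `pyhocon` dependency, the file location and the configuration keys are assumptions; the PR may resolve the file and its settings differently.

```python
# Hypothetical sketch of reading settings from nlpcraft.conf (HOCON format)
# on the Python side. The pyhocon dependency, file location and key names are
# assumptions, not the actual code in this PR.
from pathlib import Path
from pyhocon import ConfigFactory

def load_ctx_server_config(conf_path: Path):
    conf = ConfigFactory.parse_file(str(conf_path))
    # Fall back to defaults when keys are absent (key names are illustrative).
    host = conf.get_string("nlpcraft.ctxserver.host", "localhost")
    port = conf.get_int("nlpcraft.ctxserver.port", 5000)
    return host, port

if __name__ == "__main__":
    host, port = load_ctx_server_config(Path.home() / "nlpcraft.conf")
    print(f"ctxserver settings: {host}:{port}")
```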
   
   *CTXServer Note:* Currently, the `ctxserver` needs to be started manually via the `start_server.sh` script. Starting it automatically is covered by another JIRA ticket (NLPCRAFT-373) that I will work on as well.
   
   **Build Note**: `mvn clean package verify` works for me on Ubuntu 20.04 and macOS Big Sur. I have not tried Windows, as I don't have a development system for it yet.
   
   #### Pom.xml 
   
   - The `validate` phase verifies the conda version and creates the `.nlpcraft-python` folder (see the version-check sketch below).
   - The `compile` phase creates the conda environment and installs `pytorch`, `fasttext` and the other Python dependencies.
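
   A minimal, hypothetical sketch of a stricter conda check that the `validate` phase could invoke, related to the first to-do item below. The minimum version and the output parsing are assumptions.

```python
# Hypothetical sketch of a stricter conda version check for the validate phase.
# The minimum required version is an assumption.
import re
import subprocess
import sys

MIN_CONDA = (4, 8)  # assumed minimum supported conda version

def conda_version():
    """Return the installed conda version as a tuple, or None if unavailable."""
    try:
        proc = subprocess.run(["conda", "--version"],
                              capture_output=True, text=True, check=True)
    except (FileNotFoundError, subprocess.CalledProcessError):
        return None
    # Older conda releases print the version to stderr, newer ones to stdout.
    match = re.search(r"conda (\d+)\.(\d+)", proc.stdout + proc.stderr)
    return tuple(map(int, match.groups())) if match else None

if __name__ == "__main__":
    version = conda_version()
    if version is None or version < MIN_CONDA:
        sys.exit(f"conda >= {MIN_CONDA[0]}.{MIN_CONDA[1]} is required, found: {version}")
    print("conda version OK:", version)
```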
   
   Still to do:
    - Make the conda verification more robust by checking the installed version more accurately
    - Fix the Windows ctxserver start script
    - Add more documentation

