Hello German,

Firstly, I apologize for the delayed response. I am currently in the midst
of midterm exams, which have kept me quite busy.
I will definitely use the knowledge from the previous work to build my
code. I also found a number of errors that need to be rectified. I will
work on those as well, and in case I find it hard to resolve the errors in
the old code, I will do the implementation from scratch.

Thank you for your time.

Best regards,
Adarsh

On Mon, Mar 20, 2023 at 9:00 PM Germán Lancioni <germanlanci...@gmail.com>
wrote:

> Hi Adarsh,
>
> Thanks for reaching out. I like your topic idea. Have you seen the
> previous work in XGBoost? It would be beneficial if you capitalized on
> pre-existing work instead of starting from scratch, unless you see a
> problem with that. Please take a look here:
> https://github.com/RishabhGarg108/GSoC-Final-Eval
>
> Thanks,
> German
> ------------------------------
> *From:* Adarsh Santoria <adarshsanto...@gmail.com>
> *Sent:* Sunday, March 19, 2023 12:23 PM
> *To:* mlpack@lists.mlpack.org <mlpack@lists.mlpack.org>
> *Subject:* [mlpack] GSoC Proposal: Improvement in Ensemble Trees with
> XGBoost Implementation
>
> Dear mlpack community,
>
> My name is Adarsh Santoria, and I am a sophomore at IIT Mandi, India. I am
> writing to submit my proposal for the GSoC project on improving ensemble
> trees with XGBoost implementation. You can access the document through this
> link:
> https://docs.google.com/document/d/1mQx5e7thE42zIlEPO2U5aUkk4sZfvDZxBWtTYgytrNY/edit?usp=sharing,
> which outlines my project plan and timeline in detail. XGBoost is a
> gradient-boosting algorithm that uses decision trees as base learners and
> is known for its high accuracy, interpretability, scalability, built-in
> feature importance, and robustness to noisy or incomplete data.
> Implementing XGBoost in mlpack is a necessary step towards enhancing the
> performance of ensemble trees, making it an important contribution to the
> mlpack community.
>
> In summary, my proposal includes the following:
> ● Implementing the Random Forest Regressor and adding tests
> ● Parallelizing the decision tree, random forest, and XGBoost with OpenMP
> ● Adding bindings for the decision tree, random forest, and XGBoost
> ● Adding the XGBoost Classifier and Regressor along with some split
> methods and loss functions
> ● Adding tutorials and sample code snippets
>
> I believe that with my skills and experience, I can make significant
> contributions to mlpack and enhance the performance of ensemble trees with
> XGBoost implementation.
> Thank you for considering my proposal for the GSoC project.
>
> Best regards,
> Adarsh Santoria
> Github link: https://github.com/AdarshSantoria
>
_______________________________________________
mlpack mailing list -- mlpack@lists.mlpack.org
To unsubscribe send an email to mlpack-le...@lists.mlpack.org