This repository contains the code and datasets for the paper "Multi-Task Gradient Boosting".
It provides an implementation of the Multi-Task Gradient Boosting (MT-GB) algorithm, which extends the popular Gradient Boosting method to multi-task classification and regression problems.
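To illustrate the multi-task setting that MT-GB addresses, the sketch below builds a simple baseline: all tasks are stacked into one dataset with a task-indicator feature, and a single scikit-learn GradientBoostingRegressor is fit on the result. This is only an illustration of the problem setting, not the MT-GB algorithm from the paper.

```python
# Illustrative sketch only: a naive multi-task baseline that stacks all
# tasks into one dataset with a task-id column and fits one shared
# scikit-learn GradientBoostingRegressor. NOT the MT-GB algorithm itself.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)

# Two related regression tasks sharing the same input space.
X1 = rng.normal(size=(100, 3))
y1 = X1[:, 0] + 0.1 * rng.normal(size=100)                    # task 1
X2 = rng.normal(size=(100, 3))
y2 = X2[:, 0] + 0.5 * X2[:, 1] + 0.1 * rng.normal(size=100)   # task 2

# Stack the tasks and append a task-id column so one model serves both.
X = np.vstack([X1, X2])
task_id = np.repeat([0, 1], 100).reshape(-1, 1)
X = np.hstack([X, task_id])
y = np.concatenate([y1, y2])

model = GradientBoostingRegressor(n_estimators=100, random_state=0)
model.fit(X, y)
print(round(model.score(X, y), 3))  # training R^2 on the stacked data
```

MT-GB improves on this kind of flat stacking by sharing information across tasks inside the boosting procedure itself.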
The package is licensed under the GNU Lesser General Public License v2.1.
If you use this package, please cite the paper below or use the provided BibTeX entry.
Emami, S., Ruiz Pastor, C., Martínez-Muñoz, G. (2023). Multi-Task Gradient Boosting. In: García Bringas, P., et al. Hybrid Artificial Intelligent Systems. HAIS 2023. Lecture Notes in Computer Science, vol 14001. Springer, Cham. https://doi.org/10.1007/978-3-031-40725-3_9
The code covers both multi-task regression and classification problems. It is implemented in Python and builds on the scikit-learn library. To run it, clone this repository, install the required libraries, and run the mtgb.py file to train and test the multi-task gradient boosting algorithm on the provided datasets.
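The train-and-test workflow that mtgb.py performs can be sketched with plain scikit-learn. Here GradientBoostingClassifier stands in for the MT-GB estimator; the actual class name and parameters exposed by mtgb.py may differ.

```python
# Sketch of a train/test workflow, assuming a scikit-learn-style API.
# GradientBoostingClassifier is a stand-in for the MT-GB estimator; the
# real estimator lives in mtgb.py and may expose different parameters.
from sklearn.datasets import load_breast_cancer
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import train_test_split

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0
)

clf = GradientBoostingClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
acc = clf.score(X_test, y_test)
print(f"test accuracy: {acc:.3f}")
```

Because MT-GB follows the scikit-learn estimator convention, swapping the stand-in for the package's estimator should leave the rest of the workflow unchanged.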
To get started with this project, please refer to the Wiki.
To install the package, clone the repository and install it with pip from its root directory:
pip install .
We warmly welcome contributions to MT-GB! You can help enhance this algorithm by opening an issue to report a bug or suggest an improvement, or by forking the project and submitting a pull request to the development branch.
Gonzalo Martínez-Muñoz, Carlos Ruiz Pastor, Seyedsaman Emami
Version: 0.0.1
08 May 2023