easyPheno: A state-of-the-art and easy-to-use Python framework for plant phenotype prediction

Python 3.8

easyPheno is a Python framework that enables the rigorous training, comparison and analysis of phenotype predictions for a variety of different models. easyPheno includes multiple state-of-the-art prediction models. Besides common genomic selection approaches, such as best linear unbiased prediction (BLUP) and models from the Bayesian alphabet, our framework includes several machine learning methods. These range from classical models, such as regularized linear regression, via ensemble learners, e.g. XGBoost, to deep learning-based architectures, such as convolutional neural networks (CNNs). To enable automatic hyperparameter optimization, we leverage state-of-the-art and efficient Bayesian optimization techniques. In addition, our framework is designed to allow an easy and straightforward integration of further prediction models.
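The automatic hyperparameter optimization mentioned above can be illustrated with a minimal, self-contained sketch. Note that this is not easyPheno's own interface: the synthetic data, the choice of Optuna with a TPE sampler, the XGBoost model and all search ranges below are assumptions made for demonstration only; see the documentation for how easyPheno actually configures and runs these searches.

```python
# Minimal sketch of Bayesian hyperparameter optimization for phenotype prediction.
# NOTE: illustrative only, not easyPheno's API; data, model choice and search
# ranges are assumptions chosen for demonstration.
import numpy as np
import optuna
from sklearn.model_selection import cross_val_score
from xgboost import XGBRegressor

# Synthetic stand-in for a genotype matrix (samples x markers) and a phenotype vector.
rng = np.random.default_rng(42)
X = rng.integers(0, 3, size=(200, 500)).astype(float)  # 0/1/2 allele dosages
y = X[:, :10].sum(axis=1) + rng.normal(scale=1.0, size=200)

def objective(trial: optuna.Trial) -> float:
    # Hyperparameters proposed by Optuna's TPE (Bayesian) sampler.
    params = {
        "n_estimators": trial.suggest_int("n_estimators", 50, 500),
        "max_depth": trial.suggest_int("max_depth", 2, 8),
        "learning_rate": trial.suggest_float("learning_rate", 1e-3, 0.3, log=True),
        "subsample": trial.suggest_float("subsample", 0.5, 1.0),
    }
    model = XGBRegressor(**params, random_state=42)
    # 5-fold cross-validated R^2 as the optimization target.
    return cross_val_score(model, X, y, cv=5, scoring="r2").mean()

study = optuna.create_study(
    direction="maximize", sampler=optuna.samplers.TPESampler(seed=42)
)
study.optimize(objective, n_trials=20)
print("Best R^2:", study.best_value)
print("Best hyperparameters:", study.best_params)
```

The same pattern, suggesting hyperparameters, fitting a model and returning a cross-validated score, applies in principle to any of the prediction models listed above.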

Documentation

For more information, installation guides, tutorials and much more, see our documentation: https://easypheno.readthedocs.io/

Case Study

In the folder case_study, you can find all data that we used for the case study included as supplementary material in our publication.
For more information on this case study, see our publication and its supplementary material (doi: 10.1093/bioadv/vbad035), cited below. For general information, see our documentation linked above.

Contributors

This pipeline is developed and maintained by members of the Bioinformatics lab led by Prof. Dr. Dominik Grimm:

Citation

When using easyPheno, please cite our publication:

easyPheno: An easy-to-use and easy-to-extend Python framework for phenotype prediction using Bayesian optimization.
Florian Haselbeck*, Maura John* and Dominik G Grimm.
Bioinformatics Advances, 2023. doi: 10.1093/bioadv/vbad035
*These authors have contributed equally to this work and share first authorship.