※ KprFunc INTRODUCTION:
As a newly uncovered post-translational modification, lysine propionylation (Kpr) was first reported in 2007 as a histone modification. As research advanced, propionylation was also identified on non-histone proteins and was found to participate in numerous crucial biological processes, including metabolic processes and cellular stress responses. Despite the functional importance of propionylation, fewer propionylation sites have been identified to date than for other lysine modifications, and very few propionylation sites carry functional annotations. Thus, the identification of novel propionylation sites, as well as their corresponding functions, will greatly help decipher the fundamental mechanisms of propionylation.
In this work, we developed a computational framework named KprFunc (Kpr sites with Functional relevance), which combines conventional machine learning and deep learning methods to predict both general and functional propionylation sites. Through the integration of our own data with public data from CPLM 4.0 (http://cplm.biocuckoo.cn/) and other studies, 1,707 non-redundant propionylation sites were used for initial model training. After pre-training with our GPS algorithm, the initial model of KprFunc (KprFunc-i) demonstrated superior performance over other algorithms. Given the small number of propionylation sites with known functions, MAML (Model-Agnostic Meta-Learning) was adopted to construct a new small-sample learning framework for predicting the functional relevance of propionylation sites, with 13 manually collected functional propionylation sites taken as the training data.
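To illustrate the small-sample learning strategy mentioned above, the following is a minimal, self-contained sketch of the MAML inner/outer-loop structure on a toy one-parameter regression family. The model, tasks, and hyper-parameters here are hypothetical placeholders chosen for clarity; they are not the actual KprFunc architecture or data.

```python
import numpy as np

# Toy MAML sketch: meta-learn an initialization that adapts to a new task
# from only a few samples (the "small-sample" regime). Tasks are y = w*x
# with a task-specific slope w; the model has a single parameter theta.
rng = np.random.default_rng(0)

def make_task():
    """Sample a hypothetical task: 8 points from y = w*x, w in [0.5, 2.0]."""
    w = rng.uniform(0.5, 2.0)
    x = rng.uniform(-1.0, 1.0, size=8)
    return x, w * x

def loss_grad(theta, x, y):
    """Mean squared error and its gradient for y_hat = theta * x."""
    err = theta * x - y
    return np.mean(err ** 2), 2.0 * np.mean(err * x)

theta = 0.0               # meta-initialization (a single scalar here)
alpha, beta = 0.1, 0.05   # inner- and outer-loop learning rates

for step in range(200):
    tasks = [make_task() for _ in range(4)]
    meta_grad = 0.0
    for x, y in tasks:
        # Inner loop: one gradient step on the task's few "support" samples.
        _, g = loss_grad(theta, x[:4], y[:4])
        theta_adapted = theta - alpha * g
        # Outer loop: evaluate the adapted parameter on held-out "query"
        # samples; backpropagate through the inner step (chain rule: the
        # inner update is linear in theta for this model).
        _, g_query = loss_grad(theta_adapted, x[4:], y[4:])
        meta_grad += g_query * (1.0 - 2.0 * alpha * np.mean(x[:4] ** 2))
    theta -= beta * meta_grad / len(tasks)

# After meta-training, theta lies inside the task family's slope range,
# so a single inner step adapts it well to an unseen task.
print(theta)
```

In KprFunc itself the adapted model is a deep network trained on the 13 functional sites, but the same two-level loop applies: fast per-task adaptation inside, and a meta-update of the shared initialization outside.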
KprFunc is freely available for academic research at: http://kprfunc.biocuckoo.cn/.
For publication of results, please cite the following article: Ke Shui, Chenwei Wang, Xuedi Zhang, Shanshan Ma, Wanshan Ning, Weizhi Zhang, Miaomiao Chen, Di Peng, Hui Hu, Anyuan Guo, Guanjun Gao, Luoying Zhang*, Yu Xue*. Small-sample learning reveals propionylation in determining global protein homeostasis. 2022, Submitted.