Article

Simultaneous feature selection and parameter optimisation using an artificial ant colony: case study of melting point prediction

Journal

CHEMISTRY CENTRAL JOURNAL
Volume 2, Issue -, Pages -

Publisher

SPRINGEROPEN
DOI: 10.1186/1752-153X-2-21

Keywords

-

Funding

  1. BBSRC [BB/C51320X/1]
  2. Pfizer
  3. Centre for Molecular Science Informatics
  4. Biotechnology and Biological Sciences Research Council [BB/C51320X/1] Funding Source: researchfish


Background: We present a novel feature selection algorithm, Winnowing Artificial Ant Colony (WAAC), that performs simultaneous feature selection and model parameter optimisation for the development of predictive quantitative structure-property relationship (QSPR) models. The WAAC algorithm is an extension of the modified ant colony algorithm of Shen et al. (J Chem Inf Model 2005, 45:1024-1029). We test the ability of the algorithm to develop a predictive partial least squares (PLS) model for the Karthikeyan dataset (J Chem Inf Model 2005, 45:581-590) of melting point values. We also test its ability to perform feature selection on a support vector machine (SVM) model for the same dataset.

Results: Starting from an initial set of 203 descriptors, the WAAC algorithm selected a PLS model with 68 descriptors, which has an RMSE on an external test set of 46.6 °C and an R² of 0.51. The number of components chosen for the model was 49, which was close to optimal for this feature selection. The selected SVM model has 28 descriptors (cost of 5, epsilon of 0.21), an RMSE of 45.1 °C and an R² of 0.54. This model outperforms a kNN model (RMSE of 48.3 °C, R² of 0.47) on the same data and has similar performance to a Random Forest model (RMSE of 44.5 °C, R² of 0.55). However, it is much less prone to bias at the extremes of the range of melting points, as shown by the slope of the line through the residuals: -0.43 for WAAC/SVM versus -0.53 for Random Forest.

Conclusion: With a careful choice of objective function, the WAAC algorithm can be used to optimise machine learning and regression models that suffer from overfitting. Where model parameters also need to be tuned, as is the case with support vector machine and partial least squares models, it can optimise these simultaneously. The moving probabilities used by the algorithm are easily interpreted in terms of the best and current models of the ants, and the winnowing procedure promotes the removal of irrelevant descriptors.
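To make the idea concrete, the following is a minimal sketch, not the authors' WAAC implementation, of ant-colony-style simultaneous descriptor selection and SVM parameter choice driven by a cross-validated RMSE objective. The function name `waac_like_select`, the parameter grids, the probability update rate, and the winnowing threshold are all illustrative assumptions; only the general mechanism (ants sample subsets plus parameters, inclusion probabilities drift towards the best model found, rarely used descriptors are winnowed out) follows the abstract.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import cross_val_score

def waac_like_select(X, y, n_ants=20, n_iter=30, seed=None):
    """Illustrative ant-colony-style selection sketch (not the published WAAC code).

    Each 'ant' samples a descriptor subset and SVM parameters; per-descriptor
    inclusion probabilities drift towards the best subset found so far, and
    descriptors whose probability falls below a threshold are winnowed out.
    """
    rng = np.random.default_rng(seed)
    n_feat = X.shape[1]
    p = np.full(n_feat, 0.5)             # inclusion probability per descriptor
    cost_grid = [1, 5, 10, 50]           # illustrative SVM cost values
    eps_grid = [0.05, 0.1, 0.2, 0.5]     # illustrative epsilon values
    best_rmse, best_mask, best_params = np.inf, None, None

    for _ in range(n_iter):
        for _ in range(n_ants):
            mask = rng.random(n_feat) < p
            if not mask.any():
                continue
            C, eps = rng.choice(cost_grid), rng.choice(eps_grid)
            # Objective: cross-validated RMSE, so overfitted subsets are penalised.
            scores = cross_val_score(SVR(C=C, epsilon=eps), X[:, mask], y,
                                     scoring="neg_root_mean_squared_error", cv=5)
            rmse = -scores.mean()
            if rmse < best_rmse:
                best_rmse, best_mask, best_params = rmse, mask, (C, eps)
        if best_mask is not None:
            # Move probabilities towards the best subset, then winnow rare descriptors.
            p = 0.9 * p + 0.1 * best_mask
            p[p < 0.05] = 0.0
    return best_mask, best_params, best_rmse
```

As in the paper's argument, the key design choice is that the objective is an estimate of generalisation error (here, cross-validated RMSE) rather than training error, so the search does not simply reward ever-larger descriptor sets.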
