References
1. International format
[1] D. E. Rumelhart, G. E. Hinton, and R. J. Williams, “Learning representations by back-propagating errors,” Nature, vol. 323, no. 6088, pp. 533–536, 1986.
[2] T. M. Cover and P. E. Hart, “Nearest neighbor pattern classification,” IEEE Transactions on Information Theory, vol. 13, no. 1, pp. 21–27, January 1967.
2. Chinese (domestic) format
[1] Rumelhart D E, Hinton G E, Williams R J. Learning representations by back-propagating errors[J]. Nature, 1986, 323(6088): 533-536.
[2] Cover T M, Hart P E. Nearest neighbor pattern classification[J]. IEEE Transactions on Information Theory, 1967, 13(1): 21-27.
[3] Dalal N, Triggs B. Histograms of Oriented Gradients for Human Detection[C]// IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2005.
[4] Kazemi V, Sullivan J. One Millisecond Face Alignment with an Ensemble of Regression Trees[C]// IEEE Conference on Computer Vision and Pattern Recognition (CVPR). IEEE, 2014: 1867-1874.
[5] David J. Hand and Robert J. Till (2001). A Simple Generalization of the Area Under the ROC Curve for Multiple Class Classification Problems. Machine Learning, 45(2), 171–186.
I. General Topics
Zhou Zhihua, Machine Learning, Tsinghua University Press, 2016
Li Hang, Statistical Learning Methods, Tsinghua University Press, 2012
Scikit-learn,https://scikit-learn.org/stable/index.html
Gabriel Moreira, Feature Engineering, QCon 2017
Thomas M. Cover, Joy A. Thomas. Elements of Information Theory. Wiley-Interscience, 2006
Christopher M. Bishop. Pattern Recognition and Machine Learning. Springer-Verlag, 2006
II. Prediction Topics
1. ML prediction reference articles
1. sklearn documentation for RandomForestRegressor, http://scikit-learn.org/stable/modules/generated/sklearn.ensemble.RandomForestRegressor.html
2. Leo Breiman. (2001). “Random Forests.” Machine Learning, 45(1): 5–32. doi:10.1023/A:1010933404324
3. J. H. Friedman. “Greedy Function Approximation: A Gradient Boosting Machine,” https://statweb.stanford.edu/~jhf/ftp/trebst.pdf
4. L. Breiman. “Bagging Predictors,” http://statistics.berkeley.edu/sites/default/files/techreports/421.pdf
5. Tin Kam Ho. (1998). “The Random Subspace Method for Constructing Decision Forests.” IEEE Transactions on Pattern Analysis and Machine Intelligence, 20(8): 832–844. doi:10.1109/34.709601
6. J. H. Friedman. “Stochastic Gradient Boosting,” https://statweb.stanford.edu/~jhf/ftp/stobst.pdf
7. sklearn documentation for GradientBoostingRegressor, http://scikit-learn.org/stable/modules/generated/sklearn.ensemble.GradientBoostingRegressor.html
8. sklearn documentation for RandomForestClassifier, http://scikit-learn.org/stable/modules/generated/sklearn.ensemble.RandomForestClassifier.html
9. sklearn documentation for GradientBoostingClassifier, http://scikit-learn.org/stable/modules/generated/sklearn.ensemble.GradientBoostingClassifier.html
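The scikit-learn documentation entries above describe the four ensemble estimators referenced in this material. A minimal usage sketch follows, assuming scikit-learn is installed; the synthetic data and parameter values are illustrative only and are not taken from the cited articles.

# Minimal sketch of the ensemble estimators referenced above.
# Assumptions: scikit-learn is installed; the data is synthetic, for illustration only.
from sklearn.datasets import make_regression, make_classification
from sklearn.ensemble import (
    RandomForestRegressor,
    GradientBoostingRegressor,
    RandomForestClassifier,
    GradientBoostingClassifier,
)

# Regression: random forest and gradient boosting on synthetic data
X_reg, y_reg = make_regression(n_samples=200, n_features=10, random_state=0)
rf_reg = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_reg, y_reg)
gb_reg = GradientBoostingRegressor(n_estimators=100, random_state=0).fit(X_reg, y_reg)

# Classification: the corresponding classifier variants
X_clf, y_clf = make_classification(n_samples=200, n_features=10, random_state=0)
rf_clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_clf, y_clf)
gb_clf = GradientBoostingClassifier(n_estimators=100, random_state=0).fit(X_clf, y_clf)

# Training-set scores, just to confirm the estimators run end to end
print(rf_reg.score(X_reg, y_reg), gb_reg.score(X_reg, y_reg))
print(rf_clf.score(X_clf, y_clf), gb_clf.score(X_clf, y_clf))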