Contents
Related Articles
ML: Binary classification prediction on a custom dataset using Logistic, the gradient descent algorithm (GD), LoR logistic regression, Perceptron, SVM (support vector machine), and LDA (linear discriminant analysis), with decision-boundary visualization
ML: Binary classification prediction on a custom dataset using Logistic, the gradient descent algorithm (GD), LoR logistic regression, Perceptron, SVM (support vector machine), and LDA (linear discriminant analysis), with decision-boundary visualization — Implementation
Binary classification prediction on a custom dataset using Logistic, the gradient descent algorithm (GD), LoR logistic regression, Perceptron, support vector machines (SVM_Linear, SVM_Rbf), and LDA (linear discriminant analysis), with decision-boundary visualization
Design Approach
Output
w_target.shape: (3,)
[ 1.17881511 -5.13265596 -6.55556511]
Pre_Logistic_function
<class 'function'>
Product_x_function
[1.         0.10262954 0.43893794]
data_x
(300, 3)
[[ 1.         -0.15378708  0.9615284 ]
 [ 1.          0.36965948 -0.0381362 ]
 [ 1.         -0.21576496 -0.31364397]
 [ 1.          0.45809941 -0.12285551]
 [ 1.         -0.88064421 -0.20391149]]
Core Code
import numpy as np
from sklearn import linear_model, svm


def gradient_descent(data_x, data_y, w_h=None, eta=1.0, max_iterations=10000, epsilon=0.001):
    # Batch gradient descent on the logistic loss; labels are expected in {-1, +1}.
    if w_h is None:
        w_h = np.array([0.0 for i in range(data_x.shape[1])])
    w_h_i = [np.copy(w_h)]

    for i in range(max_iterations):
        subset_indices = range(data_x.shape[0])
        # Gradient of the in-sample logistic error, averaged over all points
        grad_E_in = np.mean(np.tile(-data_y[subset_indices] /
                                    (1.0 + np.exp(data_y[subset_indices] * w_h.dot(data_x[subset_indices].T))),
                                    (data_x.shape[1], 1)).T * data_x[subset_indices], axis=0)
        w_h -= eta * grad_E_in
        w_h_i.append(np.copy(w_h))
        # Stop once the gradient is small relative to the weight norm
        if np.linalg.norm(grad_E_in) <= np.linalg.norm(w_h) * epsilon:
            break
    return np.array(w_h_i)


# LoR logistic regression: fit, predict, and fill the background grid for plotting
LoR = linear_model.LogisticRegression()
LoR.fit(data_x, data_y)
y_train = LoR.predict(data_x)

LoRpp_function = lambda z: LoR.predict_proba(z)[:, 0]
BG_Grid_BaseLoR = apply_to_fill(z_grid, LoRpp_function)

full_N_fig = plot_dataset_and_hypothesis(3, data_x, data_y, xy_1, xy_2, BG_Grid_BaseLoR,
                                         title=r'LoR: Hypothesis, $N={:}$'.format(N))


# Linear-kernel SVM: same fit/predict/plot pattern
SVM_Linear = svm.SVC(kernel='linear')
SVM_Linear.fit(data_x, data_y)

SVM_LinearPre_function = lambda z: SVM_Linear.predict(z)
BG_Grid_BaseSVM_Linear = apply_to_fill(z_grid, SVM_LinearPre_function)

full_N_fig = plot_dataset_and_hypothesis(5, data_x, data_y, xy_1, xy_2, BG_Grid_BaseSVM_Linear,
                                         title=r'SVM_Linear: Hypothesis, $N={:}$'.format(N))
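To see `gradient_descent` work end to end, the sketch below runs it on a hypothetical toy dataset (not the article's dataset): 200 points with a bias column and two features, labeled in {-1, +1} by the sign of an assumed linear target `w_true`. Only NumPy is needed; the plotting helpers (`apply_to_fill`, `plot_dataset_and_hypothesis`) are omitted here.

```python
import numpy as np


def gradient_descent(data_x, data_y, w_h=None, eta=1.0, max_iterations=10000, epsilon=0.001):
    # Batch gradient descent on the logistic loss; labels in {-1, +1}.
    if w_h is None:
        w_h = np.zeros(data_x.shape[1])
    w_h_i = [np.copy(w_h)]
    for i in range(max_iterations):
        grad_E_in = np.mean(
            np.tile(-data_y / (1.0 + np.exp(data_y * w_h.dot(data_x.T))),
                    (data_x.shape[1], 1)).T * data_x, axis=0)
        w_h -= eta * grad_E_in
        w_h_i.append(np.copy(w_h))
        if np.linalg.norm(grad_E_in) <= np.linalg.norm(w_h) * epsilon:
            break
    return np.array(w_h_i)


# Hypothetical dataset: bias column + 2 uniform features, linearly separable labels
rng = np.random.default_rng(0)
X = np.hstack([np.ones((200, 1)), rng.uniform(-1.0, 1.0, (200, 2))])
w_true = np.array([0.5, -2.0, 1.5])   # assumed target weights, for illustration only
y = np.sign(X.dot(w_true))

w_path = gradient_descent(X, y, eta=1.0, max_iterations=2000)
w_final = w_path[-1]                  # last row is the fitted weight vector
train_acc = np.mean(np.sign(X.dot(w_final)) == y)
print(w_path.shape)                   # full weight trajectory, one row per iteration
print(train_acc)
```

The returned trajectory `w_path` is what makes the decision-boundary animation/visualization possible: each row is an intermediate hypothesis. The SVM_Rbf and LDA variants named in the title would follow the same fit/predict/plot pattern as `SVM_Linear` above (e.g. `svm.SVC(kernel='rbf')`), differing only in the estimator used.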