ML kNNC: Classifying the Iris Dataset with kNN (PCA Preprocessing + 3D Scatter-Plot Visualization)

Overview: Classification prediction on the iris dataset with the kNN algorithm, using PCA preprocessing and a 3D scatter plot for visualization.


Contents

Classifying the Iris Dataset with kNN (PCA Preprocessing + 3D Scatter-Plot Visualization)

Design Approach

Output

Core Code

Related Articles

ML kNNC: Classifying the Iris Dataset with kNN (PCA Preprocessing + 3D Scatter-Plot Visualization)

ML kNNC: Implementing Classification on the Iris Dataset with kNN (PCA Preprocessing + 3D Scatter-Plot Visualization)

Classifying the Iris Dataset with kNN (PCA Preprocessing + 3D Scatter-Plot Visualization)

Design Approach

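Judging from the title and the output below, the flow is: load the iris data, reduce the four features to three principal components with PCA, plot the components as a 3D scatter colored by class, then score kNN both on the raw features and on the PCA-reduced ones. A minimal sketch of that pipeline follows; it assumes sklearn's built-in iris loader in place of the article's CSV file, and the cross-validation fold counts are assumptions chosen to match the lengths of the score arrays in the Output section.

# A minimal sketch of the pipeline (assumptions: built-in iris loader
# instead of the article's CSV; cv fold counts chosen to match the
# score arrays shown in the Output section).
import matplotlib.pyplot as plt
from mpl_toolkits.mplot3d import Axes3D  # noqa: F401, registers the '3d' projection
from sklearn.datasets import load_iris
from sklearn.decomposition import PCA
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

# Reduce the 4 iris features to 3 principal components
pca = PCA(n_components=3)
X_pca = pca.fit_transform(X)

# Visualize the 3 components as a 3D scatter plot, colored by class
fig = plt.figure()
ax = fig.add_subplot(111, projection='3d')
ax.scatter(X_pca[:, 0], X_pca[:, 1], X_pca[:, 2], c=y)
ax.set_xlabel('PC1')
ax.set_ylabel('PC2')
ax.set_zlabel('PC3')
plt.show()

# Score kNN by cross-validation on raw and on PCA-reduced features
knn = KNeighborsClassifier(n_neighbors=5)
print(cross_val_score(knn, X, y, cv=5))      # raw features
print(cross_val_score(knn, X_pca, y, cv=3))  # PCA-reduced features

Reducing to three components keeps the data plottable in 3D while preserving most of the variance of the four iris measurements.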
Output

(Figures omitted: 3D scatter plots of the PCA-reduced iris data.)
(149, 5)
   5.1  3.5  1.4  0.2  Iris-setosa
0  4.9  3.0  1.4  0.2  Iris-setosa
1  4.7  3.2  1.3  0.2  Iris-setosa
2  4.6  3.1  1.5  0.2  Iris-setosa
3  5.0  3.6  1.4  0.2  Iris-setosa
4  5.4  3.9  1.7  0.4  Iris-setosa
(149, 5)
   Sepal_Length  Sepal_Width  Petal_Length  Petal_Width            type
0           4.5          2.3           1.3          0.3     Iris-setosa
1           6.3          2.5           5.0          1.9  Iris-virginica
2           5.1          3.4           1.5          0.2     Iris-setosa
3           6.3          3.3           6.0          2.5  Iris-virginica
4           6.8          3.2           5.9          2.3  Iris-virginica
Split point: 29
label_classes: ['Iris-setosa', 'Iris-versicolor', 'Iris-virginica']
kNNDIY model prediction, on the original data: 0.95
kNN model prediction, on the original data: [0.96666667 1.         0.93333333 1.         0.93103448]
kNN model prediction, after PCA on the original data: [1.         0.96       0.95918367]
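
The kNNDIY score above comes from a hand-rolled classifier whose source the article does not reproduce. A minimal sketch of what such a classifier typically looks like (the function name and details here are assumptions, not the article's code): for each query point, take the majority label among the k training points at the smallest Euclidean distance.

# A minimal DIY kNN sketch (illustrative; not the article's kNNDIY code)
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, X_test, k=5):
    """Majority vote among the k nearest training points (Euclidean)."""
    X_train, y_train = np.asarray(X_train), np.asarray(y_train)
    y_pred = []
    for x in np.asarray(X_test):
        # distance from the query point to every training point
        dists = np.sqrt(((X_train - x) ** 2).sum(axis=1))
        # indices of the k nearest neighbors
        nearest = np.argsort(dists)[:k]
        # majority vote among the neighbors' labels
        y_pred.append(Counter(y_train[nearest]).most_common(1)[0][0])
    return np.array(y_pred)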
Core Code

# class KNeighborsClassifier, found at: sklearn.neighbors._classification

class KNeighborsClassifier(NeighborsBase, KNeighborsMixin,
                           SupervisedIntegerMixin, ClassifierMixin):
    """Classifier implementing the k-nearest neighbors vote.

    Read more in the :ref:`User Guide <classification>`.

    Parameters
    ----------
    n_neighbors : int, default=5
        Number of neighbors to use by default for :meth:`kneighbors` queries.

    weights : {'uniform', 'distance'} or callable, default='uniform'
        weight function used in prediction.  Possible values:

        - 'uniform' : uniform weights.  All points in each neighborhood
          are weighted equally.
        - 'distance' : weight points by the inverse of their distance.
          in this case, closer neighbors of a query point will have a
          greater influence than neighbors which are further away.
        - [callable] : a user-defined function which accepts an
          array of distances, and returns an array of the same shape
          containing the weights.

    algorithm : {'auto', 'ball_tree', 'kd_tree', 'brute'}, default='auto'
        Algorithm used to compute the nearest neighbors:

        - 'ball_tree' will use :class:`BallTree`
        - 'kd_tree' will use :class:`KDTree`
        - 'brute' will use a brute-force search.
        - 'auto' will attempt to decide the most appropriate algorithm
          based on the values passed to :meth:`fit` method.

        Note: fitting on sparse input will override the setting of
        this parameter, using brute force.

    leaf_size : int, default=30
        Leaf size passed to BallTree or KDTree.  This can affect the
        speed of the construction and query, as well as the memory
        required to store the tree.  The optimal value depends on the
        nature of the problem.

    p : int, default=2
        Power parameter for the Minkowski metric. When p = 1, this is
        equivalent to using manhattan_distance (l1), and euclidean_distance
        (l2) for p = 2. For arbitrary p, minkowski_distance (l_p) is used.

    metric : str or callable, default='minkowski'
        the distance metric to use for the tree.  The default metric is
        minkowski, and with p=2 is equivalent to the standard Euclidean
        metric. See the documentation of :class:`DistanceMetric` for a
        list of available metrics.
        If metric is "precomputed", X is assumed to be a distance matrix and
        must be square during fit. X may be a :term:`sparse graph`,
        in which case only "nonzero" elements may be considered neighbors.

    metric_params : dict, default=None
        Additional keyword arguments for the metric function.

    n_jobs : int, default=None
        The number of parallel jobs to run for neighbors search.
        ``None`` means 1 unless in a :obj:`joblib.parallel_backend` context.
        ``-1`` means using all processors. See :term:`Glossary <n_jobs>`
        for more details.
        Doesn't affect :meth:`fit` method.

    Attributes
    ----------
    classes_ : array of shape (n_classes,)
        Class labels known to the classifier

    effective_metric_ : str or callable
        The distance metric used. It will be same as the `metric` parameter
        or a synonym of it, e.g. 'euclidean' if the `metric` parameter set to
        'minkowski' and `p` parameter set to 2.

    effective_metric_params_ : dict
        Additional keyword arguments for the metric function. For most
        metrics will be same with `metric_params` parameter, but may also
        contain the `p` parameter value if the `effective_metric_` attribute
        is set to 'minkowski'.

    outputs_2d_ : bool
        False when `y`'s shape is (n_samples, ) or (n_samples, 1) during fit
        otherwise True.

    Examples
    --------
    >>> X = [[0], [1], [2], [3]]
    >>> y = [0, 0, 1, 1]
    >>> from sklearn.neighbors import KNeighborsClassifier
    >>> neigh = KNeighborsClassifier(n_neighbors=3)
    >>> neigh.fit(X, y)
    KNeighborsClassifier(...)
    >>> print(neigh.predict([[1.1]]))
    [0]
    >>> print(neigh.predict_proba([[0.9]]))
    [[0.66666667 0.33333333]]

    See also
    --------
    RadiusNeighborsClassifier
    KNeighborsRegressor
    RadiusNeighborsRegressor
    NearestNeighbors

    Notes
    -----
    See :ref:`Nearest Neighbors <neighbors>` in the online documentation
    for a discussion of the choice of ``algorithm`` and ``leaf_size``.

    .. warning::

       Regarding the Nearest Neighbors algorithms, if it is found that two
       neighbors, neighbor `k+1` and `k`, have identical distances
       but different labels, the results will depend on the ordering of the
       training data.

    https://en.wikipedia.org/wiki/K-nearest_neighbor_algorithm
    """
    @_deprecate_positional_args
    def __init__(self, n_neighbors=5, *,
                 weights='uniform', algorithm='auto', leaf_size=30,
                 p=2, metric='minkowski', metric_params=None, n_jobs=None,
                 **kwargs):
        super().__init__(n_neighbors=n_neighbors, algorithm=algorithm,
                         leaf_size=leaf_size, metric=metric, p=p,
                         metric_params=metric_params, n_jobs=n_jobs, **kwargs)
        self.weights = _check_weights(weights)

    def predict(self, X):
        """Predict the class labels for the provided data.

        Parameters
        ----------
        X : array-like of shape (n_queries, n_features), \
                or (n_queries, n_indexed) if metric == 'precomputed'
            Test samples.

        Returns
        -------
        y : ndarray of shape (n_queries,) or (n_queries, n_outputs)
            Class labels for each data sample.
        """
        X = check_array(X, accept_sparse='csr')

        neigh_dist, neigh_ind = self.kneighbors(X)
        classes_ = self.classes_
        _y = self._y
        if not self.outputs_2d_:
            _y = self._y.reshape((-1, 1))
            classes_ = [self.classes_]

        n_outputs = len(classes_)
        n_queries = _num_samples(X)
        weights = _get_weights(neigh_dist, self.weights)

        y_pred = np.empty((n_queries, n_outputs), dtype=classes_[0].dtype)
        for k, classes_k in enumerate(classes_):
            if weights is None:
                mode, _ = stats.mode(_y[neigh_ind, k], axis=1)
            else:
                mode, _ = weighted_mode(_y[neigh_ind, k], weights, axis=1)

            mode = np.asarray(mode.ravel(), dtype=np.intp)
            y_pred[:, k] = classes_k.take(mode)

        if not self.outputs_2d_:
            y_pred = y_pred.ravel()

        return y_pred

    def predict_proba(self, X):
        """Return probability estimates for the test data X.

        Parameters
        ----------
        X : array-like of shape (n_queries, n_features), \
                or (n_queries, n_indexed) if metric == 'precomputed'
            Test samples.

        Returns
        -------
        p : ndarray of shape (n_queries, n_classes), or a list of n_outputs
            of such arrays if n_outputs > 1.
            The class probabilities of the input samples. Classes are ordered
            by lexicographic order.
        """
        X = check_array(X, accept_sparse='csr')

        neigh_dist, neigh_ind = self.kneighbors(X)

        classes_ = self.classes_
        _y = self._y
        if not self.outputs_2d_:
            _y = self._y.reshape((-1, 1))
            classes_ = [self.classes_]

        n_queries = _num_samples(X)

        weights = _get_weights(neigh_dist, self.weights)
        if weights is None:
            weights = np.ones_like(neigh_ind)

        all_rows = np.arange(X.shape[0])
        probabilities = []
        for k, classes_k in enumerate(classes_):
            pred_labels = _y[:, k][neigh_ind]
            proba_k = np.zeros((n_queries, classes_k.size))

            # a simple ':' index doesn't work right
            for i, idx in enumerate(pred_labels.T):  # loop is O(n_neighbors)
                proba_k[all_rows, idx] += weights[:, i]

            # normalize 'votes' into real [0,1] probabilities
            normalizer = proba_k.sum(axis=1)[:, np.newaxis]
            normalizer[normalizer == 0.0] = 1.0
            proba_k /= normalizer

            probabilities.append(proba_k)

        if not self.outputs_2d_:
            probabilities = probabilities[0]

        return probabilities
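
For reference, a short usage sketch of the class above on the iris data (illustrative, not the article's exact script): fit on a train/test split, report the mean accuracy, and inspect the per-class vote probabilities.

# Usage sketch for KNeighborsClassifier (illustrative parameters)
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

clf = KNeighborsClassifier(n_neighbors=5, weights='distance')
clf.fit(X_train, y_train)
print(clf.score(X_test, y_test))      # mean accuracy on the held-out split
print(clf.predict_proba(X_test[:3]))  # distance-weighted vote probabilities

With weights='distance', closer neighbors get larger votes, which often helps when class regions overlap; the default 'uniform' weights match the plain majority vote.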
