Machine Learning 6: K-Nearest Neighbors (KNN)

Level 1: KNN Principles

1. Which of the following statements are correct?
A. The prediction efficiency of the kNN algorithm depends on the size of the training set
B. The kNN algorithm can only be used for binary classification
C. The kNN algorithm can only be used for regression
D. The kNN algorithm is a supervised learning method

A D

2. Which of the following statements is incorrect?
A. The kNN algorithm builds a model during its training phase
B. The parameter k in the kNN algorithm can be tuned to the situation at hand
C. The kNN algorithm supports multi-class classification
D. The kNN algorithm is not an unsupervised learning method

A
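
A minimal sketch backing up both answers above, using scikit-learn's KNeighborsClassifier on a hypothetical toy dataset: fit() does little more than store (and index) the training data, which is why kNN has no real model-building phase, and the same estimator handles a three-class problem unchanged.

from sklearn.neighbors import KNeighborsClassifier

# Hypothetical toy data: one feature, three well-separated classes
X = [[0], [1], [2], [10], [11], [12], [20], [21], [22]]
y = [0, 0, 0, 1, 1, 1, 2, 2, 2]

# fit() only stores/indexes the training samples (lazy learning);
# the real work (distance search + majority vote) happens at predict time
knn = KNeighborsClassifier(n_neighbors=3)
knn.fit(X, y)

# Multi-class prediction works out of the box
print(knn.predict([[1.5], [11.5], [21.5]]))  # -> [0 1 2]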

Level 2: A Closer Look at K-Nearest Neighbors

1. How many parameters does the k-nearest neighbors method have?
A. 1
B. 2
C. 3
D. 4

A

2. k-nearest neighbors classification can only be used for binary classification problems.
A. True
B. False

B

3. When using the k-nearest neighbors method, the data should be split into a training set and a test set.
A. True
B. False

B

4. How do you choose an appropriate value of k?
A. Trial and error
B. Exhaustive enumeration
C. Cross-validation
D. Random choice

C
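
As a concrete illustration of answer C, the sketch below picks k by cross-validation; the iris dataset, the 5 folds, and the candidate range 1-30 are assumptions made for the example.

from sklearn.datasets import load_iris
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

# Score each candidate k with 5-fold cross-validation
scores = {}
for k in range(1, 31):
    knn = KNeighborsClassifier(n_neighbors=k)
    scores[k] = cross_val_score(knn, X, y, cv=5).mean()

# Keep the k with the best mean accuracy
best_k = max(scores, key=scores.get)
print(best_k, scores[best_k])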

Level 3: A First Try at K-Nearest Neighbors

# Import the KNeighborsClassifier class
from sklearn.neighbors import KNeighborsClassifier

X = [[0], [1], [2], [3]]
n = int(input())
if n == 0:
    y = [1, 1, 0, 0]
else:
    y = [0, 0, 1, 1]

# Fill in the blank using KNeighborsClassifier and fit
# ********** Begin ********** #
neigh = KNeighborsClassifier(n_neighbors=3)
neigh.fit(X, y)
# ********** End ********** #

# Print the predicted class of 1.1 and the probability of 0.9
# belonging to each of the two classes
print(neigh.predict([[1.1]]))
print(neigh.predict_proba([[0.9]]))
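
For reference, with input n = 0 (so y = [1, 1, 0, 0]): the three nearest training points to 1.1 are 1, 2, and 0, with labels 1, 0, 1, so predict returns [1]; the three nearest to 0.9 are 1, 0, and 2, with labels 1, 1, 0, so predict_proba returns approximately [[0.333 0.667]] (probability of class 0 first, then class 1).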

Level 4: K-Nearest Neighbors in Practice

import numpy as np
import matplotlib.pyplot as plt
from matplotlib.colors import ListedColormap
from sklearn import neighbors
from sklearn import datasets
import warnings
warnings.filterwarnings("ignore")   # suppress warnings


# set the number of neighbors
n_neighbors = 15

# import the iris dataset
#------------------begin--------------------
iris = datasets.load_iris()
# only take the first two features
X = iris.data[:, :2]
y = iris.target
#-------------------end---------------------

h = .02  # step size in the mesh

# Create color maps
cmap_light = ListedColormap(['#FFAAAA', '#AAFFAA', '#AAAAFF'])
cmap_bold = ListedColormap(['#FF0000', '#00FF00', '#0000FF'])

# Plot the decision boundaries based on the KNN classification results
#------------------begin--------------------
for weights in ['uniform', 'distance']:
    # create an instance of a KNN classifier and fit the data
    clf = neighbors.KNeighborsClassifier(n_neighbors, weights=weights)
    clf.fit(X, y)
    # Plot the decision boundary. For that, we will assign a color to each
    # point in the mesh [x_min, x_max]x[y_min, y_max].
    x_min, x_max = X[:, 0].min() - 1, X[:, 0].max() + 1
    y_min, y_max = X[:, 1].min() - 1, X[:, 1].max() + 1
    xx, yy = np.meshgrid(np.arange(x_min, x_max, h),
                         np.arange(y_min, y_max, h))
    Z = clf.predict(np.c_[xx.ravel(), yy.ravel()])

#-------------------end---------------------

    # Put the result into a color plot
    Z = Z.reshape(xx.shape)
    plt.figure()
    plt.pcolormesh(xx, yy, Z, cmap=cmap_light)

    # Plot also the training points
    plt.scatter(X[:, 0], X[:, 1], c=y, cmap=cmap_bold,
                edgecolor='k', s=20)
    plt.xlim(xx.min(), xx.max())
    plt.ylim(yy.min(), yy.max())
    plt.title("3-Class classification (k = %i, weights = '%s')"
              % (n_neighbors, weights))

plt.savefig("step3/结果/result.png")
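
The loop produces two figures: with weights='uniform' every neighbor's vote counts equally, while weights='distance' weights each vote by the inverse of its distance to the query point, which tends to tighten the boundary around isolated points. Note that plt.savefig saves the current (i.e. last) figure, so result.png captures the weights='distance' plot.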

from sklearn.ensemble import RandomForestClassifier

X, y = datasets.make_classification(n_samples=1000, n_features=4,
                                    n_informative=2, n_redundant=0,
                                    random_state=0, shuffle=False)
clf_rf = RandomForestClassifier(n_estimators=100, max_depth=2,
                                random_state=0)
clf_rf.fit(X, y)

# Print clf_rf's feature importances and the predicted class of [0, 0, 0, 0]
#------------------begin--------------------
print(clf_rf.feature_importances_)
print(clf_rf.predict([[0,0,0,0]]))
#-------------------end---------------------
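
For context: feature_importances_ reports scikit-learn's impurity-based importances (mean decrease in impurity, averaged over the forest's trees). Because make_classification was called with shuffle=False, the two informative features are the first two columns, so most of the weight should land there; predict([[0, 0, 0, 0]]) returns the predicted class for a single all-zero sample.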
