Tuesday, 19 September 2017

Comparison of different Neighbor Classifiers & Regressors

Neighbors Techniques

Several neighbor-based classifiers and regressors from scikit-learn's neighbors module are fitted on the first two features of the Iris dataset and their decision surfaces are plotted for comparison, alongside linear and quadratic discriminant analysis; a score-comparison sketch follows the plotting loop below.

Python program:

>>> import numpy as np
>>> import matplotlib.pyplot as plt
>>> from matplotlib.colors import ListedColormap
>>> from sklearn import neighbors, datasets
>>> n_neighbors = 24          # k used by the k-nearest-neighbor models below
>>> iris = datasets.load_iris()
>>> x = iris.data[:, :2]      # only the first two features, so the decision surface can be drawn in 2-D
>>> y = iris.target
>>> h = .02                   # step size of the mesh
>>> cmap_bold = ListedColormap(['firebrick', 'lime', 'blue'])             # colors of the training points
>>> cmap_light = ListedColormap(['pink', 'lightgreen', 'paleturquoise'])  # colors of the decision regions

//Instantiating the different neighbor estimators and two discriminant analysis estimators//
>>> clf14 = neighbors.NearestNeighbors()                             # unsupervised neighbor search (no predict method)
>>> clf15 = neighbors.KNeighborsClassifier(n_neighbors=n_neighbors)  # k-NN classifier, k = 24
>>> clf16 = neighbors.RadiusNeighborsClassifier()                    # votes among all neighbors within a fixed radius
>>> clf17 = neighbors.RadiusNeighborsRegressor()                     # averages targets within a fixed radius
>>> clf18 = neighbors.KNeighborsRegressor(n_neighbors=n_neighbors)   # k-NN regressor, k = 24
>>> clf19 = neighbors.NearestCentroid()                              # classifies by the nearest class centroid
>>> from sklearn import discriminant_analysis
>>> clf20 = discriminant_analysis.LinearDiscriminantAnalysis()
>>> clf21 = discriminant_analysis.QuadraticDiscriminantAnalysis()
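clf14 (NearestNeighbors) is an unsupervised neighbor-search object with no predict method, which is why it cannot join the plotting loop below. As a minimal sketch of how it is queried instead (the three-neighbor query and the first-five-rows slice are illustrative choices, not part of the original program):

>>> nn = neighbors.NearestNeighbors(n_neighbors=3).fit(x)   # index the feature matrix; no targets involved
>>> dist, ind = nn.kneighbors(x[:5])                        # distances and row indices of the 3 closest samples
>>> ind.shape
(5, 3)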

//Fitting each supervised estimator and plotting its decision surface; clf14 (unsupervised) and clf16 (would need an outlier_label for mesh points with no neighbor inside its radius) are left out of the loop//
>>> for clf in [clf15, clf17, clf18, clf19, clf20, clf21]:
...     clf.fit(x, y)
...     # build a mesh covering the feature space with a margin of 1 around the data
...     x_min, x_max = x[:, 0].min() - 1, x[:, 0].max() + 1
...     y_min, y_max = x[:, 1].min() - 1, x[:, 1].max() + 1
...     xx, yy = np.meshgrid(np.arange(x_min, x_max, h), np.arange(y_min, y_max, h))
...     # predict over the mesh; the radius-based regressor may return NaN for
...     # grid points with no training sample inside its default radius
...     z = clf.predict(np.c_[xx.ravel(), yy.ravel()])
...     z = z.reshape(xx.shape)
...     plt.figure()
...     plt.pcolormesh(xx, yy, z, cmap=cmap_light)      # decision surface
...     plt.scatter(x[:, 0], x[:, 1], c=y, cmap=cmap_bold, edgecolor='k', s=24)
...     plt.xlim(xx.min(), xx.max())
...     plt.ylim(yy.min(), yy.max())
...     plt.title(clf.__class__.__name__)               # label each figure with the estimator name
...
>>> plt.show()
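
The opening line of the post promises an accuracy comparison, yet the loop above only draws decision surfaces. Below is a minimal sketch of how the plotted estimators could be scored on a held-out split; the 70/30 split and random_state=0 are arbitrary assumptions, and the radius-based models (clf16, clf17) are skipped because they can error or return NaN for test points with no neighbor inside the default radius.

>>> from sklearn.model_selection import train_test_split
>>> x_train, x_test, y_train, y_test = train_test_split(x, y, test_size=0.3, random_state=0)
>>> for clf in [clf15, clf18, clf19, clf20, clf21]:
...     clf.fit(x_train, y_train)
...     # score() is mean accuracy for the classifiers and R^2 for KNeighborsRegressor
...     print(clf.__class__.__name__, clf.score(x_test, y_test))
...

Because score() returns mean accuracy for the classifiers but the R² coefficient for KNeighborsRegressor, the regressor's number is not directly comparable to the others.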

Output:

[One decision-surface plot per fitted estimator: KNeighborsClassifier, RadiusNeighborsRegressor, KNeighborsRegressor, NearestCentroid, LinearDiscriminantAnalysis and QuadraticDiscriminantAnalysis.]