Python-ConfusionMatrix
# Build the iris dataset
import numpy as np
import pandas as pd
from sklearn.datasets import load_iris

iris = load_iris()
iris.data
iris.feature_names
iris.target
iris.target_names

iris_df = pd.DataFrame(iris.data, columns=iris.feature_names)
iris_df["target"] = iris.target
iris_df["target_names"] = iris.target_names[iris.target]
iris_df[:5]

# Split into training and test sets
from sklearn.model_selection import train_test_split
train_set, test_set = train_test_split(iris_df, test_size=0.3)
train_set.shape
test_set.shape

# kNN
import sklearn.neighbors as nn
knn = nn.KNeighborsClassifier(n_neighbors=1)

# Train (DataFrame.ix has been removed from recent pandas, so use iloc for the four feature columns)
knn.fit(X=train_set.iloc[:, [0, 1, 2, 3]], y=train_set.target)

# Predict on the test set
pred = knn.predict(X=test_set.iloc[:, [0, 1, 2, 3]])

# Measure performance
# confusion_matrix
# classification_report
from sklearn.metrics import confusion_matrix
from sklearn.metrics import classification_report

c_mat = confusion_matrix(test_set.target.values, pred)
print('\n\nConfusion Matrix\n', c_mat)
print('\n', classification_report(test_set.target.values, pred, target_names=iris.target_names))

Result:
Confusion Matrix
 [[12  0  0]
 [ 0 21  1]
 [ 0  1 10]]

              precision    recall  f1-score   support

      setosa       1.00      1.00      1.00        12
  versicolor       0.95      0.95      0.95        22
   virginica       0.91      0.91      0.91        11

 avg / total       0.96      0.96      0.96        45
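As a cross-check, the per-class figures in the report can be recomputed straight from the confusion matrix: rows are true classes and columns are predicted classes, so a column sum gives the predicted count and a row sum the true count for a class. A minimal standalone sketch for versicolor, with the matrix values copied from the output above:

import numpy as np

# Confusion matrix from the run above (rows = true class, columns = predicted class)
c_mat = np.array([[12,  0,  0],
                  [ 0, 21,  1],
                  [ 0,  1, 10]])

i = 1  # versicolor
precision = c_mat[i, i] / c_mat[:, i].sum()  # 21 / (21 + 1)
recall = c_mat[i, i] / c_mat[i, :].sum()     # 21 / (21 + 1)
f1 = 2 * precision * recall / (precision + recall)
print(precision, recall, f1)  # each ~0.95, matching the versicolor row of the report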
In the confusion matrix, the diagonal entries count the correctly classified samples (predicted class equals true class), so the more predictions land on the diagonal, the better the result.
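To make the diagonal point concrete, overall accuracy is simply the sum of the diagonal divided by the total number of test samples. A short sketch, again hard-coding the matrix from the output above:

import numpy as np

c_mat = np.array([[12,  0,  0],
                  [ 0, 21,  1],
                  [ 0,  1, 10]])

accuracy = np.trace(c_mat) / c_mat.sum()  # (12 + 21 + 10) / 45
print(accuracy)  # about 0.956: 43 of the 45 test samples sit on the diagonal, consistent with the 0.96 avg/total row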