.. note::
    :class: sphx-glr-download-link-note

    Click :ref:`here <sphx_glr_download_packages_scikit-learn_auto_examples_plot_compare_classifiers.py>`
    to download the full example code.

.. rst-class:: sphx-glr-example-title

.. _sphx_glr_packages_scikit-learn_auto_examples_plot_compare_classifiers.py:

Compare classifiers on the digits data
======================================

Compare the performance of a variety of classifiers on a test set for the
digits data.

.. rst-class:: sphx-glr-script-out

 Out:

 .. code-block:: none

    LinearSVC: 0.9341800269333108
    GaussianNB: 0.8332741681010101
    KNeighborsClassifier: 0.9804562804949924
    ------------------
    LinearSVC(loss='hinge'): 0.9294570108037394
    LinearSVC(loss='squared_hinge'): 0.9341371852581549
    -------------------
    KNeighbors(n_neighbors=1): 0.9913675218842191
    KNeighbors(n_neighbors=2): 0.9848442068835102
    KNeighbors(n_neighbors=3): 0.9867753449543099
    KNeighbors(n_neighbors=4): 0.9803719053818863
    KNeighbors(n_neighbors=5): 0.9804562804949924
    KNeighbors(n_neighbors=6): 0.9757924194139573
    KNeighbors(n_neighbors=7): 0.9780645792142071
    KNeighbors(n_neighbors=8): 0.9780645792142071
    KNeighbors(n_neighbors=9): 0.9780645792142071
    KNeighbors(n_neighbors=10): 0.9755550897728812

|

.. code-block:: python

    from sklearn import model_selection, datasets, metrics
    from sklearn.svm import LinearSVC
    from sklearn.naive_bayes import GaussianNB
    from sklearn.neighbors import KNeighborsClassifier

    digits = datasets.load_digits()
    X = digits.data
    y = digits.target
    X_train, X_test, y_train, y_test = model_selection.train_test_split(
        X, y, test_size=0.25, random_state=0)

    for Model in [LinearSVC, GaussianNB, KNeighborsClassifier]:
        clf = Model().fit(X_train, y_train)
        y_pred = clf.predict(X_test)
        print('%s: %s' % (Model.__name__,
                          metrics.f1_score(y_test, y_pred, average="macro")))

    print('------------------')

    # test SVC loss
    for loss in ['hinge', 'squared_hinge']:
        clf = LinearSVC(loss=loss).fit(X_train, y_train)
        y_pred = clf.predict(X_test)
        print("LinearSVC(loss='{0}'): {1}".format(
            loss, metrics.f1_score(y_test, y_pred, average="macro")))

    print('-------------------')

    # test the number of neighbors
    for n_neighbors in range(1, 11):
        clf = KNeighborsClassifier(n_neighbors=n_neighbors).fit(X_train, y_train)
        y_pred = clf.predict(X_test)
        print("KNeighbors(n_neighbors={0}): {1}".format(
            n_neighbors, metrics.f1_score(y_test, y_pred, average="macro")))

**Total running time of the script:** ( 0 minutes 1.025 seconds)

.. _sphx_glr_download_packages_scikit-learn_auto_examples_plot_compare_classifiers.py:

.. only:: html

 .. container:: sphx-glr-footer
    :class: sphx-glr-footer-example

  .. container:: sphx-glr-download

     :download:`Download Python source code: plot_compare_classifiers.py <plot_compare_classifiers.py>`

  .. container:: sphx-glr-download

     :download:`Download Jupyter notebook: plot_compare_classifiers.ipynb <plot_compare_classifiers.ipynb>`

.. only:: html

 .. rst-class:: sphx-glr-signature

    `Gallery generated by Sphinx-Gallery <https://sphinx-gallery.github.io>`_
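The manual ``n_neighbors`` sweep in the example scores each setting on a single
held-out test set. A minimal sketch of an alternative, not part of the original
example: scikit-learn's ``GridSearchCV`` can run the same sweep with
cross-validation on the training set, then report a final score on the test set.

.. code-block:: python

    # A sketch, assuming the same digits split as the example above.
    from sklearn import datasets, model_selection
    from sklearn.neighbors import KNeighborsClassifier

    digits = datasets.load_digits()
    X_train, X_test, y_train, y_test = model_selection.train_test_split(
        digits.data, digits.target, test_size=0.25, random_state=0)

    # Cross-validate the same neighbor counts as the manual loop (1 through 10),
    # scoring with macro-averaged F1 to match the example.
    grid = model_selection.GridSearchCV(
        KNeighborsClassifier(),
        param_grid={'n_neighbors': list(range(1, 11))},
        scoring='f1_macro',
        cv=5)
    grid.fit(X_train, y_train)

    print('best n_neighbors:', grid.best_params_['n_neighbors'])
    print('held-out macro F1: %.3f' % grid.score(X_test, y_test))

Because the best setting is chosen by cross-validation rather than by peeking at
the test set, the final test-set score is a less biased estimate of performance.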