[Machine Learning] Machine Learning Notes 11 - A Simple Neural Network Implementation

Principles

See the previous note: [Machine Learning] Machine Learning Notes 10 - Neural Network Algorithm

1. About non-linear transformation functions

A sigmoid function (an S-shaped curve) is used as the activation function; the two common choices are listed below, with a small code sketch after the list:

 1.1 The hyperbolic tangent (tanh)
 1.2 The logistic function
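
A minimal sketch of these two functions and their derivatives (the derivatives are what backpropagation uses); tanh maps its input into (-1, 1) and the logistic function into (0, 1):

import numpy as np

def tanh(x):
    return np.tanh(x)

def tanh_deriv(x):
    # d/dx tanh(x) = 1 - tanh(x)^2
    return 1.0 - np.tanh(x) ** 2

def logistic(x):
    return 1.0 / (1.0 + np.exp(-x))

def logistic_deriv(x):
    # d/dx logistic(x) = logistic(x) * (1 - logistic(x))
    fx = logistic(x)
    return fx * (1.0 - fx)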

2. Implementing a simple neural network algorithm



#!/usr/bin/python
# -*- coding: utf-8 -*-

# Each image is 8x8 pixels; the task is to recognize the digits 0-9.

import numpy as np
from sklearn.datasets import load_digits
from sklearn.metrics import confusion_matrix, classification_report
from sklearn.preprocessing import LabelBinarizer
from sklearn.model_selection import train_test_split  # sklearn.cross_validation was removed in newer versions

from NeuralNetwork import NeuralNetwork  # local module, see the sketch below


digits = load_digits()
X = digits.data
y = digits.target
X -= X.min()  # normalize the pixel values to bring them into the range 0-1
X /= X.max()

# 64 input units (one per pixel), 100 hidden units, 10 output units (one per digit)
nn = NeuralNetwork([64, 100, 10], 'logistic')
X_train, X_test, y_train, y_test = train_test_split(X, y)

# one-hot encode the labels so each digit maps to a 10-dimensional target vector
labels_train = LabelBinarizer().fit_transform(y_train)
labels_test = LabelBinarizer().fit_transform(y_test)

print("start fitting")
nn.fit(X_train, labels_train, epochs=3000)

# predict each test sample and take the index of the largest output unit as the digit
predictions = []
for i in range(X_test.shape[0]):
    o = nn.predict(X_test[i])
    predictions.append(np.argmax(o))

print(confusion_matrix(y_test, predictions))
print(classification_report(y_test, predictions))
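
The script above imports a NeuralNetwork class from a local NeuralNetwork.py module that is not included in this note. The following is a minimal sketch of what such a module could look like, assuming only the constructor signature NeuralNetwork([64, 100, 10], 'logistic') and the fit/predict calls used above; the default learning_rate, the weight initialization range and the bias handling are arbitrary choices made for this sketch, not the original module:

import numpy as np


# activation functions and their derivatives (same as in section 1)
def tanh(x):
    return np.tanh(x)

def tanh_deriv(x):
    return 1.0 - np.tanh(x) ** 2

def logistic(x):
    return 1.0 / (1.0 + np.exp(-x))

def logistic_deriv(x):
    fx = logistic(x)
    return fx * (1.0 - fx)


class NeuralNetwork:
    def __init__(self, layers, activation='tanh'):
        # layers: list of layer sizes, e.g. [64, 100, 10]
        if activation == 'logistic':
            self.activation = logistic
            self.activation_deriv = logistic_deriv
        else:
            self.activation = tanh
            self.activation_deriv = tanh_deriv
        # random initial weights in [-0.25, 0.25];
        # for simplicity, only the input layer gets a bias term (+1)
        sizes = [layers[0] + 1] + list(layers[1:])
        self.weights = [
            (2 * np.random.random((sizes[i], sizes[i + 1])) - 1) * 0.25
            for i in range(len(sizes) - 1)
        ]

    def fit(self, X, y, learning_rate=0.2, epochs=10000):
        # append a bias column of ones to the inputs
        X = np.atleast_2d(X)
        X = np.hstack([X, np.ones((X.shape[0], 1))])
        y = np.array(y)

        for k in range(epochs):
            # stochastic training: one randomly chosen sample per iteration
            i = np.random.randint(X.shape[0])

            # forward pass: keep pre-activations (zs) and activations (a)
            a = [X[i]]
            zs = []
            for l in range(len(self.weights)):
                zs.append(np.dot(a[l], self.weights[l]))
                a.append(self.activation(zs[-1]))

            # backward pass: output error, then propagate deltas back
            error = y[i] - a[-1]
            deltas = [error * self.activation_deriv(zs[-1])]
            for l in range(len(self.weights) - 1, 0, -1):
                deltas.append(deltas[-1].dot(self.weights[l].T)
                              * self.activation_deriv(zs[l - 1]))
            deltas.reverse()

            # gradient step for every weight matrix
            for j in range(len(self.weights)):
                layer = np.atleast_2d(a[j])
                delta = np.atleast_2d(deltas[j])
                self.weights[j] += learning_rate * layer.T.dot(delta)

    def predict(self, x):
        # append the bias term, then run a single forward pass
        a = np.append(np.asarray(x, dtype=float), 1.0)
        for l in range(len(self.weights)):
            a = self.activation(np.dot(a, self.weights[l]))
        return a

Note that this fit performs one weight update per randomly chosen training sample, so epochs=3000 in the script above counts single-sample updates rather than full passes over the training set.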

Run results

(Screenshot: the confusion matrix and classification report printed by the script.)
