The (Modified) Perceptron Algorithm for the Weight Vector of a Linear Discriminant Function [Python implementation]
The (modified) perceptron algorithm for the weight vector of a linear discriminant function applies to two-class (\(\omega_i/\omega_j\)) classification problems: it seeks a weight vector \(W\) such that the linear discriminant \(G(X) = W^\top X\) is positive for samples of \(\omega_i\) and negative for samples of \(\omega_j\).
Algorithm procedure
1. Initialization:
   - Iteration step counter \(k = 0\)
   - Fixed correction increment \(\rho\): a constant satisfying \(0 < \rho \leq 1\)
   - Initial weight vector \(W(0)\): any arbitrarily chosen vector
   - Counter of consecutive correct classifications \(N_c = 0\)
2. Normalization (a worked example follows this list):
   - Augment every training sample \(X_m\,(m = 0, 1, \ldots, N-1)\) by appending a constant component 1; then, if \(X_m \in \omega_j\), set \(X_m = -X_m\)
   - This yields the normalized training set \(\{X_0, X_1, \ldots, X_{N-1}\}\)
3. Iteration:
   - Take the sample \(X = X_{k \bmod N}\) and compute the discriminant \(G(X) = W(k)^\top X\)
   - If \(G(X) \leq 0\), update the weight vector \(W(k+1) = W(k) + \rho X\) and reset \(N_c = 0\)
   - Otherwise, keep \(W(k+1) = W(k)\) and increment the counter: \(N_c = N_c + 1\)
4. Termination check:
   - If \(N_c \geq N\), output \(W(k)\) and stop
   - Otherwise, set \(k = k + 1\) and return to step 3
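As a concrete illustration of the normalization step, take the first test case below: \(\omega_1 = \{(0,0), (0,1)\}\), \(\omega_2 = \{(1,0), (1,1)\}\). Augmenting each sample with a trailing 1 and negating the \(\omega_2\) samples gives

\[
\{X_0, X_1, X_2, X_3\} = \{(0,0,1),\ (0,1,1),\ (-1,0,-1),\ (-1,-1,-1)\},
\]

so the algorithm only has to find a \(W\) with \(W^\top X_m > 0\) for all four vectors.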
Python implementation
import numpy as np


def perceptron(omega1, omega2, W0, rho):
    """Modified perceptron algorithm (with sample normalization)."""
    k = 0
    Nc = 0
    W = [W0]
    # Augment every sample with a trailing constant 1.
    omega1 = np.column_stack((omega1, np.ones((len(omega1), 1))))
    omega2 = np.column_stack((omega2, np.ones((len(omega2), 1))))
    # Normalize: negate the omega2 samples, so a solution satisfies W^T X > 0 for all samples.
    X = np.vstack((omega1, -omega2))
    N = len(X)
    while Nc < N:
        G = np.dot(W[k], X[k % N])
        if G <= 0:
            # Misclassified sample: correct the weight vector and reset the counter.
            W.append(W[k] + rho * X[k % N])
            Nc = 0
        else:
            # Correctly classified: keep the weight vector.
            W.append(W[k])
            Nc += 1
        # print(k, G, W[k])
        k += 1
    return W[-1]


def perceptron_unmodified(omega1, omega2, W0, rho):
    """Unmodified perceptron algorithm (no sample normalization)."""
    k = 0
    Nc = 0
    W = [W0]
    # Augment every sample with a trailing constant 1.
    omega1 = np.column_stack((omega1, np.ones((len(omega1), 1))))
    omega2 = np.column_stack((omega2, np.ones((len(omega2), 1))))
    X = np.vstack((omega1, omega2))
    N = len(X)
    while Nc < N:
        G = np.dot(W[k], X[k % N])
        if (X[k % N] == omega1).all(axis=1).any() and G <= 0:
            # omega1 sample on the wrong side: move W towards it.
            W.append(W[k] + rho * X[k % N])
            Nc = 0
        elif (X[k % N] == omega2).all(axis=1).any() and G >= 0:
            # omega2 sample on the wrong side: move W away from it.
            W.append(W[k] - rho * X[k % N])
            Nc = 0
        else:
            W.append(W[k])
            Nc += 1
        # print(k, G, W[k])
        k += 1
    return W[-1]


if __name__ == '__main__':
    omega1 = [[0, 0], [0, 1]]
    omega2 = [[1, 0], [1, 1]]
    W0 = [1, 1, 1]
    print(perceptron(omega1, omega2, W0, 1))
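perceptron_unmodified is never exercised by the main block above, but it takes exactly the same arguments. A minimal comparison sketch, assuming it is appended inside the same if __name__ == '__main__': block (so omega1, omega2 and W0 are already defined):

    # Since adding rho * (-X) for a negated omega2 sample is the same update as
    # subtracting rho * X, both variants should return the same weights on this data.
    print(perceptron_unmodified(omega1, omega2, W0, 1))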
Test cases
Input: \(\omega_1 = \{(0,0), (0,1)\}\), \(\omega_2 = \{(1,0), (1,1)\}\) (samples given in unaugmented form), \(W(0) = (1,1,1)\), \(\rho = 1\). Output: \(W^* = (-3, 0, 1)\).

Input: \(\omega_1 = \{(1,0,1), (0,1,1)\}\), \(\omega_2 = \{(1,1,0), (0,1,0)\}\), \(W(0) = (1,1,1,1)\), \(\rho = 1\). Output: \(W^* = (0, -1, 3, 0)\).
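For reference, the second test case can be reproduced by replacing the sample sets in the main block; a minimal sketch:

    omega1 = [[1, 0, 1], [0, 1, 1]]
    omega2 = [[1, 1, 0], [0, 1, 0]]
    W0 = [1, 1, 1, 1]
    print(perceptron(omega1, omega2, W0, 1))  # expected: W* = (0, -1, 3, 0), printed as a float array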
Reference: Wang Zengfu (汪增福), Pattern Recognition (《模式识别》), pp. 38-44.