(1) Input layer to H1:
z1 = 1*1 + (-1)*(-2) + 1*(-1) + 1*1 + 1 = 3, a1 = sigmoid(z1) = 0.9526
z2 = 1*2 + (-1)*(-1) + 1*(-2) + 1*0 + 0 = 3, a2 = sigmoid(z2) = 0.9526

H1 to H2:
z3 = 0*2 + 0*(-1) + 0*(-2) + 0*1 + 0 = 0, a3 = sigmoid(z3) = 0.5
z4 = 0*2 + 0*(-1) + 0*(-2) + 0*(-1) + 0 = 0, a4 = sigmoid(z4) = 0.5

H2 to the output layer:
z5 = (-2)*3 + 2*1 + 0*(-1) + 0*(-1) + 2 = -4, a5 = sigmoid(z5) = 0.0179
z6 = 2*3 + 0*1 + (-2)*(-1) + 0*4 + 2 = 8, a6 = sigmoid(z6) = 0.9997
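For reference, this kind of forward pass can also be written as a short script. The sketch below is a minimal NumPy version that assumes a fully connected 2-2-2-2 network and a particular assignment of the listed weights w1–w4 to the two neurons of each layer (row 1 of each matrix taking w1 and w3, row 2 taking w2 and w4); the actual wiring depends on the figure, which is not reproduced here, so the matrix layout and the numbers it prints are an assumption, not a transcription of the diagram.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Assumed layout: each layer is fully connected, row i of W holds the
# weights feeding neuron i of that layer, and b holds the biases (b1, b2).
W1 = np.array([[ 1.0, -1.0],   # input -> H1: w1, w3
               [-2.0,  1.0]])  #              w2, w4
b1 = np.array([1.0, 0.0])

W2 = np.array([[ 2.0, -2.0],   # H1 -> H2: w1, w3
               [-1.0, -1.0]])  #           w2, w4
b2 = np.array([0.0, 0.0])

W3 = np.array([[ 3.0, -1.0],   # H2 -> output: w1, w3
               [-1.0,  4.0]])  #               w2, w4
b3 = np.array([-2.0, 2.0])

x = np.array([1.0, -1.0])        # network input (x1, x2)

a_h1  = sigmoid(W1 @ x    + b1)  # H1 activations
a_h2  = sigmoid(W2 @ a_h1 + b2)  # H2 activations
a_out = sigmoid(W3 @ a_h2 + b3)  # output activations
print(a_h1, a_h2, a_out)
```

With a different wiring only the weight matrices change; the per-layer pattern sigmoid(W @ a + b) stays the same.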

(2) The error at the output layer is: E = (0.5 - 0.0179)^2 + (1 - 0.9997)^2 ≈ 0.2324
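As a quick arithmetic check, the sum-of-squares error written above can be evaluated directly from the two output activations; plain Python is enough here.

```python
y = [0.5, 1.0]        # target values for the two outputs
a = [0.0179, 0.9997]  # output activations a5, a6 from the forward pass
E = sum((yi - ai) ** 2 for yi, ai in zip(y, a))
print(E)  # ~0.2324
```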

To reduce the error on this sample, the backpropagation algorithm can be used to update the network parameters. The steps are as follows:

  1. Compute the output-layer error terms:
     δ5 = (y5 - a5) * a5 * (1 - a5) = (0.5 - 0.0179) * 0.0179 * (1 - 0.0179) = 0.0084
     δ6 = (y6 - a6) * a6 * (1 - a6) = (1 - 0.9997) * 0.9997 * (1 - 0.9997) = 0.0001

  2. Compute the H2-layer error terms:
     δ3 = δ5 * w5 * a3 * (1 - a3) = 0.0084 * (-2) * 0.5 * (1 - 0.5) = -0.0021
     δ4 = δ6 * w6 * a4 * (1 - a4) = 0.0001 * 2 * 0.5 * (1 - 0.5) = 0.000025

  3. Update the weights and bias from H2 to the output layer:
     w5 = w5 + η * δ5 * a3 = 3 - 0.1 * 0.0084 * 0.5 = 2.9986
     w6 = w6 + η * δ6 * a4 = -1 - 0.1 * 0.0001 * 0.5 = -1.000005
     b2 = b2 + η * δ5 = 2 - 0.1 * 0.0084 = 1.99916

  4. Update the weights and bias from H1 to H2:
     w3 = w3 + η * δ3 * a2 = -1 - 0.1 * (-0.0021) * 0.9526 = -1.0002
     w4 = w4 + η * δ4 * a2 = 1 - 0.1 * 0.000025 * 0.9526 = 0.999997
     b1 = b1 + η * δ3 = 1 - 0.1 * (-0.0021) = 1.00021

  5. Update the weights and biases from the input layer to H1:
     w1 = w1 + η * δ1 * x1 = 1 - 0.1 * (-0.0021) * 1 = 1.00021
     w2 = w2 + η * δ2 * x2 = -2 - 0.1 * (-0.0021) * (-1) = -1.99979
     b1 = b1 + η * δ1 = 1 - 0.1 * (-0.0021) = 1.00021
     b2 = b2 + η * δ2 = 2 - 0.1 * (-0.0021) = 2.00021

Here η is the learning rate, which can be adjusted as needed for the problem at hand; a code sketch of one such update step follows below.
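The five steps above follow the usual delta rule for sigmoid units with a squared-error loss. The sketch below implements one generic gradient step of that kind, assuming the same matrix layout as the forward-pass sketch earlier and the update direction w ← w + η·δ·a used in the formulas above; the function name backprop_step and its argument layout are illustrative choices, not taken from the original text.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def backprop_step(x, y, weights, biases, eta=0.1):
    """One backpropagation update on a single sample (x, y).

    weights, biases: lists [W1, W2, W3] and [b1, b2, b3] laid out as in
    the forward-pass sketch above (that layout is an assumption).
    """
    # Forward pass, keeping every layer's activation.
    activations = [np.asarray(x, dtype=float)]
    for W, b in zip(weights, biases):
        activations.append(sigmoid(W @ activations[-1] + b))

    # Output-layer error term: delta = (y - a) * a * (1 - a).
    a_out = activations[-1]
    delta = (np.asarray(y, dtype=float) - a_out) * a_out * (1.0 - a_out)

    # Walk backwards through the layers.
    for i in reversed(range(len(weights))):
        a_prev = activations[i]
        grad_W = np.outer(delta, a_prev)  # one row per neuron of layer i+1
        grad_b = delta
        if i > 0:
            # Propagate the error term with the *old* weights before updating.
            delta = (weights[i].T @ delta) * a_prev * (1.0 - a_prev)
        weights[i] = weights[i] + eta * grad_W
        biases[i] = biases[i] + eta * grad_b

    return weights, biases
```

Called with the weight matrices from the forward-pass sketch, x = [1, -1], y = [0.5, 1] and η = 0.1, it performs exactly one update of every weight and bias; repeating the call drives the error on this sample down.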

Read the following neural-network structure diagram. From left to right it contains two hidden layers, H1 and H2, and one output layer O; the activation function is Sigmoid. (1) For the input (1, -1), give the forward computation of each layer's output. Input layer to H1: w1=1, w2=-2, w3=-1, w4=1, b1=1, b2=0. H1 to H2: w1=2, w2=-1, w3=-2, w4=-1, b1=0, b2=0. H2 to the output layer: w1=3, w2=-1, w3=-1, w4=4, b1=-2, b2=2. (2) If the corresponding true output values are (0.5, 1), give the resulting error. If, for this sample, the…

