Friday, 1 March 2013
The effect of the learning rate.
% Six input patterns (three features each) and their targets
P = [0.5152 0.8173 1.0000;
     0.8173 1.0000 0.7308;
     1.0000 0.7308 0.1390;
     0.7308 0.1390 0.1087;
     0.1390 0.1087 0.3520;
     0.1087 0.3520 0.0000]';          % transpose: columns are samples, rows are inputs
T = [0.7308 0.1390 0.1087 0.3520 0.0000 0.3761];
% 3 inputs in [0,1], a hidden layer of 5 tansig neurons, one logsig output,
% trained with plain gradient descent (traingd)
net = newff([0 1;0 1;0 1],[5,1],{'tansig','logsig'},'traingd');
net.trainParam.epochs = 1000;         % maximum number of training epochs
net.trainParam.goal   = 0.01;         % stop once the mse falls below this value
net.trainParam.lr     = 0.01;         % learning rate used by traingd
net = train(net,P,T);
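To check how well the trained network fits the data, a minimal sketch (assuming the net, P and T defined above) is to simulate the network on the training inputs and compute the mean squared error, which is the performance value reported by train:

Y = sim(net,P);                       % outputs of the trained network on the training inputs
mse_train = mean((T - Y).^2)          % mean squared error over the six samples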
If the learning rate is too large, the weights may oscillate around the minimum, or even diverge, and never settle on a good solution. If the learning rate is too small, training converges very slowly.
learning rate = 0.01: performance = 0.128
learning rate = 0.1:  performance = 0.185
learning rate = 0.2:  performance = 0.337
learning rate = 0.5:  performance = 0.308
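These results can be reproduced, up to the effect of the random weight initialisation, with a small sweep over learning rates. The following is a sketch that reuses the P and T defined above; net.trainParam.lr is the field that traingd reads its learning rate from:

% Sweep several learning rates and record the final performance.
% Results vary from run to run because the initial weights are random.
lrs  = [0.01 0.1 0.2 0.5];
perf = zeros(size(lrs));
for k = 1:numel(lrs)
    net = newff([0 1;0 1;0 1],[5,1],{'tansig','logsig'},'traingd');
    net.trainParam.epochs = 1000;
    net.trainParam.goal   = 0.01;
    net.trainParam.lr     = lrs(k);       % gradient-descent step size
    net.trainParam.showWindow = false;    % suppress the training GUI
    net = train(net,P,T);
    Y = sim(net,P);
    perf(k) = mean((T - Y).^2);           % final mse for this learning rate
end
disp([lrs' perf'])                        % learning rate vs. final performance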