
I want to solve a classification problem with 3 classes using a multilayer neural network with the backpropagation algorithm. I'm using MATLAB 2012a. I'm having trouble with the newff function. I want to build a network with one hidden layer, and there should be 3 neurons in the output layer, one for each class. Please advise me with an example.

Here is my code

clc

%parameters
nodesInHL=7;
nodesInOutput=3;
iteration=1000;
HLtransfer='tansig';
outputTransfer='tansig';
trainFunc='traingd';
learnRate=0.05;
performanceFunc='mse';


%rand('seed',0);
%randn('seed',0);
rng('shuffle');

net=newff(trainX,trainY,[nodesInHL],{HLtransfer,outputTransfer},trainFunc,'learngd',performanceFunc);
net=init(net);

%setting parameters
net.trainParam.epochs=iteration;
net.trainParam.lr=learnRate;

%training
[net,tr]=train(net,trainX,trainY);

Thanks.

1 Answer


The newff function is obsolete. The recommended replacement is feedforwardnet, or in your case (classification), patternnet.
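For the network described in the question, that newff call maps onto patternnet roughly like this (a sketch: trainX and trainY stand for your own training data, and only the parameters from the question are set; everything else keeps the toolbox defaults):

```matlab
%# one hidden layer with 7 nodes; patternnet infers input/output sizes from the data
net = patternnet(7);

net.trainFcn = 'traingd';              %# gradient-descent training, as in the question
net.trainParam.epochs = 1000;          %# max number of iterations
net.trainParam.lr = 0.05;              %# learning rate
net.performFcn = 'mse';                %# mean-squared error
net.layers{1}.transferFcn = 'tansig';  %# hidden-layer transfer function

[net, tr] = train(net, trainX, trainY);
```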

You could also use the nprtool GUI, which provides a wizard that guides you step-by-step through building your network. It can even generate the equivalent code at the end of the experiment.

Here is an example:

%# load sample dataset
%#   simpleclassInputs: 2x1000 matrix (1000 points of 2-dimensions)
%#   simpleclassTargets: 4x1000 matrix (4 possible classes)
load simpleclass_dataset

%# create ANN of one hidden layer with 7 nodes
net = patternnet(7);

%# set params
net.trainFcn = 'traingd';            %# training function
net.trainParam.epochs = 1000;        %# max number of iterations
net.trainParam.lr = 0.05;            %# learning rate
net.performFcn = 'mse';              %# mean-squared error function
net.divideFcn = 'dividerand';        %# how to divide data
net.divideParam.trainRatio = 70/100; %# training set
net.divideParam.valRatio = 15/100;   %# validation set
net.divideParam.testRatio = 15/100;  %# testing set

%# training
net = init(net);
[net,tr] = train(net, simpleclassInputs, simpleclassTargets);

%# testing
y_hat = net(simpleclassInputs);
perf = perform(net, simpleclassTargets, y_hat);
err = gsubtract(simpleclassTargets, y_hat);

view(net)

Note that the toolbox automatically sets the number of nodes in the output layer, based on the size of the target class matrix.
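For the 3-class problem in the question, you can build that target matrix from a vector of class labels with ind2vec, and recover predicted labels from the network outputs with vec2ind (a sketch: labels, inputs, and the trained net here are hypothetical stand-ins for your own data):

```matlab
%# hypothetical class labels (1..3), one per sample
labels = [1 2 3 2 1 3];
targets = full(ind2vec(labels));   %# 3x6 one-hot target matrix

%# after training, map the 3xN network outputs back to class indices
y_hat = net(inputs);               %# assumes a trained net and matching inputs
predicted = vec2ind(y_hat);        %# 1xN predicted class indices
accuracy = mean(predicted == labels);
```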

(screenshot of the network diagram produced by view(net))