Why is my regression plot inverse?

hamed on 26 Aug 2014
Commented: Greg Heath on 21 Sep 2014
Hello guys, I used nftool to classify some data.
Why is my regression plot inverse?
What should I do?

Accepted Answer

Greg Heath on 28 Aug 2014
Edited: Greg Heath on 28 Aug 2014
Oh! You have a classifier with {0,1} targets!
You are probably using the wrong function. Use patternnet (not fitnet or feedforwardnet). Then, instead of getting inappropriate regression plots, you will get confusion and ROC plots.
Compare the results from the three different functions:
net = patternnet; % Case 1 Classification and Pattern Recognition
%net = fitnet; % Case 2 Regression and Curve-fitting
%net = feedforwardnet; % Case 3 Regression and Curve-fitting
plotFcns = net.plotFcns % No semicolon
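For completeness, a minimal sketch of the patternnet route (x and t are placeholder names for your I-by-N inputs and O-by-N {0,1} targets, not variables from the question):
net = patternnet(10);          % 10 hidden nodes (the default)
[net, tr] = train(net, x, t);  % trains with the patternnet defaults (trainscg, crossentropy)
y = net(x);                    % network outputs (class scores)
plotconfusion(t, y)            % confusion matrix instead of a regression plot
plotroc(t, y)                  % ROC curves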
Hope this helps.
Thank you for formally accepting my answer
Greg

More Answers (2)

Sean de Wolski on 26 Aug 2014
You've likely overfit your model to your data. Try changing the amount of data in each of the fields (less in training, more in validation and test) to avoid overfitting.
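As a hedged sketch of that change (the ratios are only illustrative values; divideParam is the standard property for the data division):
net = patternnet(10);
net.divideFcn = 'dividerand';        % random division (the default)
net.divideParam.trainRatio = 0.60;   % less data for training
net.divideParam.valRatio   = 0.20;   % more for validation (early stopping)
net.divideParam.testRatio  = 0.20;   % and for testing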
  1 comment
Greg Heath on 21 Sep 2014
To avoid overfitting, use one or more of the following:
1. Reduce the number of hidden nodes
2. INCREASE the amount of training data
For regression, also:
3. Use msereg instead of mse
4. Use trainbr instead of trainlm (sketched after this comment)
However, for classification (trainscg, crossentropy) I'm not sure how to modify 3 and 4.
Nonetheless, overfitting may not be your problem.
Try 10 or more designs with different initial weights. See my 27 Aug answer.
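A hedged sketch of items 1 and 4 for the regression case (fitnet is assumed; x and t are placeholder inputs/targets; items 2 and 3 are not shown):
net = fitnet(5);               % item 1: fewer hidden nodes than the default 10
net.trainFcn = 'trainbr';      % item 4: Bayesian regularization instead of trainlm
[net, tr] = train(net, x, t);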



Greg Heath on 27 Aug 2014
Default solutions depend on the random trn/val/tst data division, the random weight initialization, and the choice of the number of hidden nodes (H = 10 is the default). Therefore, it is often necessary to choose the 'best' of multiple designs.
My personal approach is to try 10 random initializations for each trial value of H below the upper bound Hub = -1 + ceil((Ntrn*O - O)/(I + O + 1)). This ensures that there are more training equations Ntrneq = Ntrn*O than unknown weights Nw = (I+1)*H + (H+1)*O.
For details search for my examples using
greg Nw Ntrials
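A hedged sketch of that search (Ntrials = 10, patternnet, and the default 70/15/15 data division are assumptions; x and t are placeholder data):
[I, N]  = size(x);                             % I inputs, N samples
[O, ~]  = size(t);                             % O outputs
Ntrn    = N - 2*round(0.15*N);                 % training samples under the default 70/15/15 split
Hub     = -1 + ceil((Ntrn*O - O)/(I + O + 1)); % upper bound on the number of hidden nodes
Ntrials = 10;
bestPerf = Inf;
for H = 1:Hub
    for trial = 1:Ntrials
        net = patternnet(H);                   % fresh random weights each trial
        [net, tr] = train(net, x, t);
        if tr.best_vperf < bestPerf            % keep the best validation performance
            bestPerf = tr.best_vperf;
            bestNet  = net;
        end
    end
end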
Hope this helps.
Thank you for formally accepting my answer
Greg
