For classification accuracy, I use the Minimum Correct Classification Rate (MCCR). CCRn is the ratio of the correctly classified test points in class n to the total number of test points in class n, and MCCR is defined as the minimum of CCR1 and CCR2. The MCCR for the linear data set is zero using a polynomial of order 3, while for the wave-alike data the MCCR = 0.94.

The reason for not achieving a perfect MCCR = 1 on the wave-alike data is that classification with Least Squares Regression is prone to outliers: it tries to fit a function such that all the training points give small squared errors. Figure 3, taken from Chapter 4 of "Pattern Recognition and Machine Learning" by Bishop, illustrates this. Both images in the figure show, in purple, the classification decision boundary obtained from Least Squares Regression as detailed above. The decision boundary is good until some outlier data points are added to the blue class, as in the image on the right.
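To make this concrete, here is a minimal sketch of both ideas: a least-squares classifier fitted to one-hot class targets, and the MCCR computed with and without outliers added to the blue class. It is not the code behind the results quoted above; the synthetic blobs, the random seed, and the helper names (fit_least_squares, predict, mccr) are illustrative assumptions.

```python
# Minimal NumPy-only sketch of least-squares classification and the MCCR metric.
import numpy as np

rng = np.random.default_rng(0)

def fit_least_squares(X, y):
    """Least-squares fit of one-hot class targets: W = argmin ||Xb W - T||^2."""
    Xb = np.c_[np.ones(len(X)), X]             # prepend a bias column
    T = np.eye(2)[y]                            # one-hot targets for the two classes
    W, *_ = np.linalg.lstsq(Xb, T, rcond=None)
    return W

def predict(W, X):
    Xb = np.c_[np.ones(len(X)), X]
    return np.argmax(Xb @ W, axis=1)            # class with the larger fitted value

def mccr(y_true, y_pred):
    """Minimum Correct Classification Rate = min over classes of per-class CCR."""
    ccrs = [np.mean(y_pred[y_true == c] == c) for c in np.unique(y_true)]
    return min(ccrs)

# Two well-separated Gaussian blobs standing in for the post's data sets.
X = np.vstack([rng.normal([-1.5,  1.5], 0.8, size=(50, 2)),   # class 0 ("blue")
               rng.normal([ 1.5, -1.5], 0.8, size=(50, 2))])  # class 1 ("red")
y = np.r_[np.zeros(50, int), np.ones(50, int)]

W = fit_least_squares(X, y)
print("MCCR without outliers:", mccr(y, predict(W, X)))

# Add far-away points to the blue class.  Least squares also tries to give these
# points a small squared error, so the boundary is dragged toward them and some
# blue points near the original boundary are typically misclassified.
X_out = np.vstack([X, rng.normal([-12, 12], 0.8, size=(15, 2))])
y_out = np.r_[y, np.zeros(15, int)]
W_out = fit_least_squares(X_out, y_out)
print("MCCR with outliers:   ", mccr(y_out, predict(W_out, X_out)))
```

The outliers are placed far on the blue side of the boundary, mirroring Bishop's example: least squares penalizes points that are "too correct", so the boundary tilts toward them and the blue class's CCR, and hence the MCCR, typically drops.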