Maximizing Margin Quality And Quantity
Yuanzhe Bei, Pengyu Hong

The large-margin principle has been widely applied to learn classifiers with good generalization power. While tremendous effort has been devoted to developing machine learning techniques that maximize margin quantity, little attention has been paid to ensuring margin quality. In this paper, we propose a new framework that aims to achieve superior generalizability by considering not only margin quantity but also margin quality. We derive an instantiation of the framework by deploying a max-min entropy principle to maximize margin quality in addition to using a traditional means of maximizing margin quantity, and we develop an iterative learning algorithm to solve this instantiation. We compared the algorithm with several widely used machine learning techniques (e.g., Support Vector Machines, decision trees, the naive Bayes classifier, and k-nearest neighbors) and several other large-margin learners (e.g., RELIEF, Simba, G-flip, and LOGO) on a number of UCI machine learning datasets and gene expression datasets. The results demonstrate the effectiveness of our new framework and algorithm.