Feature selection is of vital importance to reduce information redundancy and to deal with the failure of basic classification approaches on massive datasets with too many features. To improve classification accuracy and decrease time complexity, we propose an algorithm, called EEGA, that combines an intelligently optimized genetic algorithm with weight distribution based on information entropy. The information entropy of features is used as the population label in the GA rather than the direct iteration of individual fitness. Experiments have been performed on several standard databases with four fitness algorithms. The results show that EEGA achieves higher classification accuracy; furthermore, it significantly reduces the time required to reach better results.

Owing to the pervasive use of the Internet and the rapid advance of information technology, people are exposed to an increasing amount of information every day. Such huge amounts of information are hard to handle and can cause trouble in information processing, such as the curse of dimensionality. Reducing the dimensionality of the samples can effectively mitigate information redundancy, but unreasonable dimensionality reduction causes information loss that degrades data analysis and classification accuracy. It is therefore vitally important to retain the essential information at low dimensionality, a class of techniques called feature reduction. Feature reduction can be divided into two categories: feature extraction and feature selection. Feature extraction transforms raw features into a set of features with statistical significance or kernel structure, while feature selection means choosing the best feature subset from the full feature set. In recent years, research on feature selection has largely fallen into three approaches: filter, wrapper, and embedded methods. Todorov applies a valid distance metric to Relief, a classic filter-based feature selection algorithm. Other work proposes a wrapper algorithm named FACO, which combines ant colony optimization with feature selection; applies an artificial bee colony algorithm to decision trees, an embedded method achieving global optimization; and improves the AdaBoost approach with weighted feature selection in traditional filters, which yields a significant boost in classification accuracy.
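To make the entropy-weighting idea concrete, the following is a minimal sketch (not the authors' implementation) of how per-feature Shannon entropy can be normalized into weights that bias a GA-style random choice of an initial feature subset toward more informative features. The dataset, function names, and the `+ 0.1` inclusion floor are illustrative assumptions.

```python
import math
import random
from collections import Counter

def shannon_entropy(values):
    """Shannon entropy (in bits) of one discretized feature column."""
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def entropy_weights(columns):
    """Normalize per-feature entropies so the weights sum to 1."""
    ents = [shannon_entropy(col) for col in columns]
    total = sum(ents) or 1.0  # guard against an all-constant dataset
    return [e / total for e in ents]

# Hypothetical discretized dataset: 3 features, 6 samples each.
features = [
    [0, 0, 0, 0, 0, 0],   # constant feature -> entropy 0
    [0, 1, 0, 1, 0, 1],   # balanced binary  -> entropy 1
    [0, 1, 2, 0, 1, 2],   # three even levels -> entropy log2(3)
]
weights = entropy_weights(features)

# Bias the sampling of an initial GA individual (a feature subset)
# toward higher-entropy features; the small floor keeps some diversity.
random.seed(0)
subset = [i for i, w in enumerate(weights) if random.random() < w + 0.1]
```

A full EEGA-style loop would then evolve such subsets with crossover and mutation, evaluating each subset by classifier accuracy; this sketch only shows the entropy-based weighting step.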