Mixture-of-experts (ME) models can be useful for solving complicated real-world classification problems. However, training an ME model with not only labeled data but also unlabeled data, which are easier to obtain, requires a new learning algorithm that accounts for the characteristics of the ME model. We propose global-local co-training (GLCT), a hybrid of the supervised learning (SL) method for ME models and co-training, which trains the ME model in a semi-supervised learning (SSL) manner. GLCT uses a global model and a local model together, since using the local model alone yields low accuracy due to the lack of labeled training data. The two models enlarge the labeled data set from the unlabeled one and are trained on it, supplementing each other. To evaluate the method, we performed experiments on benchmark data sets from the UCI machine learning repository. The results confirmed the feasibility of GLCT. Moreover, comparison experiments showed that GLCT outperforms an alternative method.
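The abstract's core idea, two models that enlarge a shared labeled pool by pseudo-labeling unlabeled data for each other, follows the general co-training scheme. The sketch below is a minimal, hypothetical illustration of that scheme only; it is not the authors' GLCT implementation, and it stands in simple nearest-centroid learners on two feature views for the paper's global and local ME models.

```python
# Minimal co-training sketch (hypothetical; NOT the authors' GLCT code).
# Two nearest-centroid learners, each trained on a different feature view,
# take turns pseudo-labeling their most confident unlabeled point and
# adding it to a shared labeled pool.

def centroid_fit(X, y):
    """Return per-class centroids for one feature view."""
    cents = {}
    for c in sorted(set(y)):
        pts = [x for x, lab in zip(X, y) if lab == c]
        cents[c] = [sum(col) / len(pts) for col in zip(*pts)]
    return cents

def centroid_predict(cents, x):
    """Predict the nearest-centroid class; confidence is the
    negative distance (higher = more confident)."""
    def dist(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b)) ** 0.5
    best = min(cents, key=lambda c: dist(cents[c], x))
    return best, -dist(cents[best], x)

def co_train(Xa_l, Xb_l, y, Xa_u, Xb_u, rounds=4):
    """Xa_*/Xb_*: the two feature views of the labeled (_l) and
    unlabeled (_u) sets, kept index-aligned across views."""
    Xa_l, Xb_l, y = list(Xa_l), list(Xb_l), list(y)
    Xa_u, Xb_u = list(Xa_u), list(Xb_u)
    for _ in range(rounds):
        for view_l, view_u in ((Xa_l, Xa_u), (Xb_l, Xb_u)):
            if not view_u:
                break
            cents = centroid_fit(view_l, y)
            # pick the single most confident unlabeled point
            i = max(range(len(view_u)),
                    key=lambda j: centroid_predict(cents, view_u[j])[1])
            lab = centroid_predict(cents, view_u[i])[0]
            # move the point (in both views) into the shared labeled pool
            Xa_l.append(Xa_u.pop(i))
            Xb_l.append(Xb_u.pop(i))
            y.append(lab)
    return Xa_l, Xb_l, y
```

On a toy two-class problem with one labeled point per class, the two learners correctly pseudo-label the remaining points and the labeled pool grows from two to four examples; GLCT's global/local pairing plays the analogous role for the ME model.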
|Title of host publication||Hybrid Artificial Intelligent Systems - 6th International Conference, HAIS 2011, Proceedings|
|Number of pages||8|
|Publication status||Published - 2011|
|Name||Lecture Notes in Computer Science (including subseries Lecture Notes in Artificial Intelligence and Lecture Notes in Bioinformatics)|
|Bibliographical note||Funding Information:|
Acknowledgement. This research was supported by the Converging Research Center Program through the Converging Research Headquarter for Human, Cognition and Environment funded by the Ministry of Education, Science and Technology (2010K001173).
All Science Journal Classification (ASJC) codes
- Theoretical Computer Science
- Computer Science(all)