Disparity in Intelligent Classification of Data Sets Due to Dominant Pattern Effect (DPE)

Abstract

The hypothesis that a dominant pattern exists which can affect the performance of a neural-network-based pattern recognition system, in terms of correct and accurate classification, pruning, and optimization, is presented, tested, and shown to hold. Two datasets subjected to the same ranking process over four main features are used to train a neural network engine, first separately and then jointly. Data transformation and statistical pre-processing are applied to the datasets before they are presented to a specifically designed multi-layer neural network employing the Weight Elimination Algorithm with Back Propagation (WEA-BP). The dynamics of the classification and weight-elimination processes are correlated and used to demonstrate the dominance of one dataset. The results show that one dataset acted aggressively towards the system and displaced the other, making classification of the displaced dataset almost impossible. This modulation of the relationships among the selected features of the affected dataset produced a mutated pattern and a subsequent re-arrangement in the ranking of its members.
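
For readers unfamiliar with weight elimination, the following minimal Python sketch illustrates the general idea behind WEA-BP: a backpropagation update combined with the standard weight-elimination penalty of Weigend et al. The network architecture, hyperparameters, and the class name WeaBpNet are illustrative assumptions and are not taken from the paper.

    # Minimal sketch of backpropagation with a weight-elimination penalty
    # (WEA-BP-style training). All sizes and hyperparameters below are
    # illustrative, not the values used in the paper.
    import numpy as np

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    class WeaBpNet:
        """Single-hidden-layer network trained with backpropagation plus a
        weight-elimination penalty that drives small weights towards zero."""

        def __init__(self, n_in, n_hidden, n_out, lr=0.1, lam=1e-3, w0=1.0):
            rng = np.random.default_rng(0)
            self.W1 = rng.normal(scale=0.5, size=(n_in, n_hidden))
            self.W2 = rng.normal(scale=0.5, size=(n_hidden, n_out))
            self.lr, self.lam, self.w0 = lr, lam, w0

        def _penalty_grad(self, W):
            # Gradient of lam * (w/w0)^2 / (1 + (w/w0)^2): shrinks small
            # weights strongly while leaving large weights nearly untouched.
            r = (W / self.w0) ** 2
            return self.lam * (2.0 * W / self.w0 ** 2) / (1.0 + r) ** 2

        def train_step(self, X, y):
            # Forward pass
            h = sigmoid(X @ self.W1)
            out = sigmoid(h @ self.W2)
            # Backward pass for a squared-error loss
            d_out = (out - y) * out * (1.0 - out)
            d_h = (d_out @ self.W2.T) * h * (1.0 - h)
            # Gradient descent step including the weight-elimination term
            self.W2 -= self.lr * (h.T @ d_out + self._penalty_grad(self.W2))
            self.W1 -= self.lr * (X.T @ d_h + self._penalty_grad(self.W1))
            return float(np.mean((out - y) ** 2))

Tracking which weights the penalty pushes to zero during training is one way to observe the pruning dynamics that the paper correlates with classification behaviour.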

Share and Cite:

Iskandarani, M. (2015) Disparity in Intelligent Classification of Data Sets Due to Dominant Pattern Effect (DPE). Journal of Intelligent Learning Systems and Applications, 7, 75-86. doi: 10.4236/jilsa.2015.73007.

Conflicts of Interest

The author declares no conflicts of interest.


Copyright © 2015 by the author and Scientific Research Publishing Inc.

Creative Commons License

This work and the related PDF file are licensed under a Creative Commons Attribution 4.0 International License.