Brief Biography

Dr Wang received his PhD in March 1995 from the School of Information Science and Engineering, Northeastern University, Shenyang, China. From September 1995 to August 1997, he worked as a Postdoctoral Fellow at the School of Electrical and Electronic Engineering, Nanyang Technological University, Singapore. He then worked as a Research Associate and, later, a Research Fellow in the Department of Computing, The Hong Kong Polytechnic University, Hong Kong, for three years, until the end of June 2001. Since July 2001 he has been with the Department of Computer Science and Computer Engineering at La Trobe University, Australia, where he currently serves as Reader and Associate Professor. Since 2010, Dr Wang has also been an adjunct Professor at the State Key Laboratory of Synthetical Automation for Process Industries, Northeastern University, China.

He is a Senior Member of the IEEE. He serves as Editor-in-Chief of the International Journal of Machine Intelligence and Sensory Signal Processing, as an Associate Editor for several international journals, including IEEE Transactions on Neural Networks and Learning Systems, IEEE Transactions on Cybernetics, Information Sciences, and Neurocomputing, and as a Subject Editor for Applied Mathematical Modelling.

Research Focus:

Dr Wang's research areas include machine learning, data mining, and computational intelligence systems for bioinformatics and engineering applications. Technically, his research focuses on subtle pattern discovery and recognition using neural networks and fuzzy systems. In recent years he has been working on randomized methods for neural networks, in particular contributing to a new framework for building randomized learner models, termed Stochastic Configuration Networks (SCNs). In contrast to existing randomized learning algorithms for single-layer feed-forward neural networks, SCNs randomly assign the input weights and biases of the hidden nodes under a supervisory mechanism, while the output weights are evaluated analytically in either a constructive or a selective manner. As a foundation for SCN-based data modelling techniques, theoretical results establishing the universal approximation property have been obtained. Experimental results indicate that SCNs outperform other randomized neural networks in terms of reduced human intervention in setting the network size, scope adaptation of the random parameters, fast learning, and sound generalization. Deep stochastic configuration networks (DeepSCNs) have also been developed and mathematically proven to be universal approximators for continuous nonlinear functions defined over compact sets. DeepSCNs can be constructed efficiently (much faster than other deep neural networks) and share many desirable features, such as representation learning and a consistency property between learning and generalization. Details about SCNs and DeepSCNs can be found in the relevant publications on this homepage or on ResearchGate.
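To make the constructive procedure described above concrete, the following is a minimal, illustrative sketch in Python of an SCN-style training loop: candidate hidden nodes are drawn at random over an adaptive scope, accepted only if they satisfy a supervisory inequality on the current residual, and the output weights are then recomputed analytically by least squares. All function and parameter names (train_scn, r, scopes, and so on) are hypothetical, and the acceptance test is a simplified form of the published condition; please consult the SCN papers for the exact formulation.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def train_scn(X, Y, max_nodes=50, n_candidates=100, r=0.99,
              scopes=(1.0, 5.0, 10.0, 50.0), tol=1e-3, seed=0):
    """Illustrative SCN-style constructive training.

    Hidden nodes are added one at a time: random candidates are sampled
    over an adaptive scope, kept only if a supervisory inequality on the
    current residual holds, and output weights are recomputed by least
    squares after each accepted node.
    """
    rng = np.random.default_rng(seed)
    n, d = X.shape
    W, B = [], []                        # accepted input weights and biases
    H = np.empty((n, 0))                 # hidden-layer output matrix
    E = Y.copy()                         # current residual, shape (n, m)
    Beta = np.zeros((0, Y.shape[1]))     # output weights
    for _ in range(max_nodes):
        best, best_score = None, -np.inf
        for lam in scopes:               # widen the random scope if needed
            for _ in range(n_candidates):
                w = rng.uniform(-lam, lam, size=d)
                b = rng.uniform(-lam, lam)
                g = sigmoid(X @ w + b)
                # Simplified supervisory condition, per output q:
                # <e_q, g>^2 / <g, g> >= (1 - r) * <e_q, e_q>
                proj = (E.T @ g) ** 2 / (g @ g)
                xi = proj - (1.0 - r) * np.sum(E * E, axis=0)
                if np.all(xi >= 0) and xi.sum() > best_score:
                    best, best_score = (w, b, g), xi.sum()
            if best is not None:
                break                    # acceptable node found at this scope
        if best is None:
            break                        # no candidate met the condition
        W.append(best[0]); B.append(best[1])
        H = np.column_stack([H, best[2]])
        # Analytic output weights via least squares over all nodes so far
        Beta, *_ = np.linalg.lstsq(H, Y, rcond=None)
        E = Y - H @ Beta
        if np.linalg.norm(E) < tol:
            break
    return np.array(W), np.array(B), Beta

# Example: fit a simple 1-D target
# X = np.random.default_rng(1).uniform(0, 1, (200, 1))
# Y = np.sin(4 * np.pi * X)
# W, B, Beta = train_scn(X, Y)
```

Recomputing all output weights globally after each accepted node corresponds to the least-squares style of construction; the SCN framework also admits variants in which output weights are obtained in other constructive or selective ways, as described in the publications.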

Supervision Information:

Students with sound mathematical knowledge and/or strong computing backgrounds are warmly welcomed to work with me towards a higher degree (PhD or Research Masters). Applicants who wish to apply for a scholarship from La Trobe University can find more detailed information via the Scholarship link. Further information on research topics can be found on our Intelligent Search and Discovery (ISD) Laboratory page.