
Boltzmann Machine and Hyperbolic Activation Function in Higher Order Neuro Symbolic Integration










Authors

Muraly Velavan, Zainor Ridzuan bin Yahya
Institute of Engineering Mathematics, Universiti Malaysia Perlis, 02600 Arau, Perlis, Malaysia
Mohamad Nazri bin Abdul Halif
School of Microelectronic Engineering, Universiti Malaysia Perlis, 02600 Arau, Perlis, Malaysia
Saratha Sathasivam
School of Mathematical Sciences, Universiti Sains Malaysia Penang, 11800 USM, Malaysia

Abstract


Higher-order network structure is important in higher-order logic programming because higher-order neural networks converge faster and have greater memory and storage capacity. Furthermore, higher-order networks offer better approximation ability and are more robust than lower-order neural networks. This paper therefore focuses on higher-order clauses for logic programming in Hopfield networks. We limit the study to fifth-order networks because of complexity issues. We employ Boltzmann machines and the hyperbolic tangent activation function to improve the performance of neuro-symbolic integration, and we use agent-based modelling to model the problem.
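To make the setting concrete, the Python sketch below illustrates the kind of update the abstract refers to: a Hopfield-style network whose local field includes higher-order connection terms, relaxed with a hyperbolic tangent activation, with a comment noting the stochastic Boltzmann-machine alternative. This is an illustrative reconstruction under assumed details, not the authors' implementation: the network size n, the random weights, and the temperature T are placeholders, the example stops at third order for brevity (the paper goes up to fifth order), and in logic programming the weights would instead be derived from the program clauses.

```python
# Illustrative sketch only (not the authors' code): a higher-order Hopfield-style
# network relaxed with a hyperbolic tangent activation, truncated at third order.
import numpy as np

rng = np.random.default_rng(0)
n = 6                               # number of neurons (one per logical atom) -- placeholder size
T = 1.0                             # pseudo-temperature -- placeholder value

# Symmetric first-, second- and third-order connection strengths (random placeholders;
# in logic programming these would be derived from the clauses of the program).
w1 = rng.normal(size=n)
w2 = rng.normal(size=(n, n))
w2 = (w2 + w2.T) / 2.0
np.fill_diagonal(w2, 0.0)
w3 = rng.normal(size=(n, n, n))
w3 = sum(w3.transpose(p) for p in
         [(0, 1, 2), (0, 2, 1), (1, 0, 2), (1, 2, 0), (2, 0, 1), (2, 1, 0)]) / 6.0

def local_field(S, i):
    """Field on neuron i from first-, second- and third-order connections."""
    return w1[i] + w2[i] @ S + S @ w3[i] @ S

S = rng.uniform(-1.0, 1.0, size=n)  # graded bipolar states in (-1, 1)

for sweep in range(50):
    for i in range(n):
        h = local_field(S, i)
        # Hyperbolic tangent activation: deterministic, graded update.
        S[i] = np.tanh(h / T)
        # A Boltzmann-machine alternative would instead set S[i] to +1 with
        # probability 1 / (1 + exp(-2 * h / T)) and to -1 otherwise.

print(np.sign(S))                   # final bipolar interpretation of the atoms
```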
