MEV Mechatronics, Electrical Power, and Vehicular Technology 05 (2014) 91-98
Mechatronics, Electrical Power, and Vehicular Technology
e-ISSN: 2088-6985, p-ISSN: 2087-3379
Accreditation Number: 432/Akred-LIPI/P2MI-LIPI/04/2012
www.mevjournal.com
© 2014 RCEPM - LIPI All rights reserved
doi: 10.14203/j.mev.2014.v5.91-98

LEARNING EFFICIENCY OF CONSCIOUSNESS SYSTEM FOR ROBOT USING ARTIFICIAL NEURAL NETWORK

Osama Shoubaky a,*, Tala M. Sharari b
a Computer and Intelligent Systems Center, P.O. Box 2150, Jordan
b Institute of Engineering & Technology, Department of Electrical Engineering, Control and Automation Laboratory, P.O. Box 81, Jordan
* Corresponding author. Phone: +962-0778679772, E-mail: o.shoubaky@yahoo.com

Received 26 October 2014; received in revised form 22 November 2014; accepted 23 November 2014. Published online 24 December 2014.

Abstract

This paper presents the learning efficiency of a consciousness system for a robot using an artificial neural network. The proposed consciousness system consists of a reason system, a feeling system, and an association system. The three systems are modelled using the Module of Nerves for Advanced Dynamics (ModNAD). An artificial neural network trained by supervised learning with back propagation is used to train the ModNAD. The reason system imitates behaviour and represents self-condition and other-condition. The feeling system represents sensation and emotion. The association system represents the behaviour of self and determines whether self is comfortable or not. A robot is asked to perform cognition and tasks using the consciousness system. Learning converges to about 0.01 within about 900 orders for the imitation, pain, solitude, and association modules. It converges to about 0.01 within about 400 orders for the comfort and discomfort modules. It can be concluded that learning in the ModNAD is completed after a relatively small number of iterations because the learning efficiency of the ModNAD artificial neural network is good. The results also show that each ModNAD has a function to imitate and cognise emotion. The consciousness system presented in this paper may be considered a fundamental step towards developing a robot having consciousness and feelings similar to humans.

Keywords: consciousness, robot, artificial neural network.

I. INTRODUCTION

Consciousness has been studied extensively in brain science, neural science, psychology, philosophy, and other academic fields, and recently in robotics [1-9]. Various robots that can communicate with consciousness and feeling functions have been reported by many researchers and companies [10-18]. The Actroid robot is one of them [1]; however, it cannot talk to others cleverly because it lacks human-like consciousness and feelings. The WE-4RII robot of Waseda University reportedly has functions for feelings [2]. An emotional equation is used in that robot system. That robot differs from the robot proposed in this paper, which expresses emotion using artificial neural models of the consciousness system. Moreover, this paper reports not only emotion but also the function of consciousness. Both human consciousness and emotion are being actively studied, and interesting research has been reported in various fields [14-18]. Nevertheless, no paper has yet elucidated the relationship between emotions and consciousness. The present paper proposes a computational model capable of realising the function of emotion and reports on experiments in which robots learn imitation behaviour. The emotions are assumed to be cognised by imitating others' behaviour.
This is expected to enable advanced self-cognition and other-cognition. The relationship between consciousness and imitation behaviours, self-cognition, and other-cognition is discussed in the next section.

Consciousness is generally considered to be the state when one is paying attention to something, thinking, or awake. With a focus on the imitation function of humans, this paper defines consciousness to arise from "a consistency of cognition and behaviour." Based on this belief, this paper presents a devised module for the consciousness system named the Module of Nerves for Advanced Dynamics (ModNAD). In addition, most of Husserl's ten functions of consciousness can be accounted for by the ModNAD [8]. A strong relationship was found between consciousness and imitation based on four important instances of consciousness: mirror neurons, mimesis theory [3-5], medical cases of imitation behaviour [6], and a study of imitation behaviour [7, 12].

The immediate objective of the present study is the realisation of self-consciousness. As the first step in the study, an attempt was made to realise the function of imitation and to verify whether self-cognition could be realised based on feedback information regarding the condition of self and the other. An implementation of self-consciousness requires self-cognition. First, the conditions of the other and of self are represented by imitating others' behaviours with the consciousness system. Second, the representations of self and others are compared. If the representation of the other is similar to the representation of self, the other is judged to be more self-like.

During the imitation experiments for the consciousness system, particular attention was paid to how an autonomous robot behaved when it confronted difficulty caused by harmful obstacles. This theme has a close relationship with emotion. Emotion is generally said to have the following three features [8, 9]:
1. Emotion is evoked by internal information representing a change in the condition of the body or by external information. For example, one feels pain upon hitting something, or feels bad when one's stomach is upset.
2. Emotion plays the role of an adjuster in the body. This is particularly important for understanding the homeostasis function [8].
3. Emotion helps people to reason and make choices.
Based on these observations, with regard to feelings of discomfort, a new hypothesis is proposed in which emotions are generated by internal or external information to the body; they make people pay attention to the cause of the discomfort and assist the person to avoid the discomfort-causing behaviour, eventually enabling the person to avoid harmful damage through this assistance. By implementing the emotional function, self-cognition and other-cognition will be able to make further progress. This will enable robots to possess human-like sociality.

II. OUTLINE OF THE CONSCIOUSNESS SYSTEM

A. ModNAD Structure

The consciousness module ModNAD is a computational model of consciousness using neural networks. As shown in Figure 1, ModNAD consists of the cognition system (a), the behaviour system (b), the primary representation (c), which is the common area of the cognition and behaviour systems, the symbolic representation (d), and the input/output units (A/B). The symbolic representation has a cognition representation RL, which represents what ModNAD cognises now, and a behaviour representation BL, which represents, as a symbol, how ModNAD will behave next. These two representations serve as the communication terminals for the higher-level modules. Without information transmitted from the higher-level module, the information of the cognition representation RL is simply copied to the behaviour representation BL. The most important feature of ModNAD is the primary representation, the common area for the cognition and behaviour systems. The primary representation enables the system to learn behaviour while it cognises and, conversely, to learn cognition while the system behaves. It is also possible to realise artificial thought and expectation using ModNAD. Because the feedback B' is included in this system as somatic sensation, self-condition can be grasped more fully.

Figure 1. ModNAD concept model
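For concreteness, the following is a minimal Python sketch of how one ModNAD might be organised, assuming simple sigmoid feed-forward layers; the class name ModNAD, the method names cognise and behave, and all layer sizes are illustrative assumptions rather than the authors' implementation. The shared primary activation is computed on both the cognition and behaviour paths, mirroring the common area (c) of Figure 1.

```python
import numpy as np

def layer(n_in, n_out, rng):
    """One fully connected layer: random weights and biases."""
    return rng.uniform(-0.5, 0.5, (n_out, n_in)), rng.uniform(-0.5, 0.5, n_out)

def forward(x, w, b):
    """Sigmoid activation, as in a standard back-propagation network."""
    return 1.0 / (1.0 + np.exp(-(w @ x + b)))

class ModNAD:
    """Sketch of one consciousness module (ModNAD).

    The cognition path maps sensory input S to the cognition representation RL;
    the behaviour path maps the behaviour representation BL to a motor output M.
    Both paths pass through the same 'primary representation' layer, the shared
    area described in Section II.A.
    """
    def __init__(self, n_in, n_primary, n_symbol, n_out, seed=0):
        rng = np.random.default_rng(seed)
        self.w_in, self.b_in = layer(n_in, n_primary, rng)        # S  -> primary
        self.w_cog, self.b_cog = layer(n_primary, n_symbol, rng)  # primary -> RL
        self.w_beh, self.b_beh = layer(n_symbol, n_primary, rng)  # BL -> primary
        self.w_out, self.b_out = layer(n_primary, n_out, rng)     # primary -> M
        self.RL = np.zeros(n_symbol)   # cognition representation
        self.BL = np.zeros(n_symbol)   # behaviour representation

    def cognise(self, S):
        """Cognition: external input S is represented at RL."""
        primary = forward(S, self.w_in, self.b_in)
        self.RL = forward(primary, self.w_cog, self.b_cog)
        return self.RL

    def behave(self, higher_level=None):
        """Behaviour: without higher-level information, RL is simply copied
        to BL; otherwise the higher-level module overrides BL."""
        self.BL = self.RL.copy() if higher_level is None else np.asarray(higher_level)
        primary = forward(self.BL, self.w_beh, self.b_beh)
        return forward(primary, self.w_out, self.b_out)   # motor output M

# Example: an imitation-style module with a 3-bit input and a 2-bit behaviour output.
mod_a = ModNAD(n_in=3, n_primary=6, n_symbol=3, n_out=2)
mod_a.cognise(np.array([0.0, 0.0, 1.0]))   # an example 3-bit input pattern (cf. Table 1)
print(mod_a.behave())                      # behaviour command M
```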
B. General Concept of the System

Figure 2 shows a conceptual model of the consciousness system proposed in this paper. An important feature of the system is that it consists of multiple ModNADs. The major components of the consciousness system are the reason, feelings, and association systems. The reason system cognises the external environment based on information received from the input unit (Figure 2 (1)). The output unit performs the decided behaviour. The reason system has a hierarchical structure of ModNADs.

The feelings system represents feelings based on information from the external environment and the internal environment reflecting bodily changes (Figure 2 (1), (2)). The feelings system also has a hierarchical structure of ModNADs. The highest layer of the feelings system holds two modules corresponding to comfort and discomfort. Another important feature of the consciousness system is that information from the reason system is also used in cognising comfort or discomfort. Information from the reason system (the cognised language label) is input into the comfort and discomfort modules (Figure 2 (3)). The reason system and the feelings system exchange information between higher and lower levels using their hierarchical structures.

The association system receives two inputs: information cognised in the reason system and the condition of self as understood by the feelings system (Figure 2 (4), (5)). Based on these two pieces of information, the association system determines the behaviour that will make self comfortable. To reflect the decision in the behaviour of the robot, the association system sends information to the reason system and the feelings system (Figure 2 (6)). This functioning of the association system modifies the representations of both the reason and feelings systems and, eventually, the reason system outputs the command to perform a certain behaviour that makes the robot comfortable (Figure 2 (7)). At this time, the association system is not a so-called homunculus, because the ModNAD of the association system is driven by information from the lower-level ModNADs.
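The numbered information flow (1)-(7) of Figure 2 can be summarised, under strong simplifying assumptions, by the stub sketch below; the function names and the simple comparison rule are hypothetical placeholders that stand in for the learned ModNADs, not the paper's exact mechanism.

```python
def reason_cognise(external):
    """(1) Reason system: cognise the external environment (stub)."""
    return "advance" if external > 0.5 else "stop"

def feelings_evaluate(external, internal, reason_label):
    """(1)-(3) Feelings system: comfort and discomfort are represented from the
    external and internal environment; the language label cognised by the reason
    system is also fed into the comfort/discomfort modules (Figure 2 (3))."""
    discomfort = internal                                   # e.g. pain from a PID error
    comfort = external if reason_label == "advance" else 0.3
    return comfort, discomfort

def association_decide(reason_label, comfort, discomfort):
    """(4)-(6) Association system: choose the behaviour expected to make self
    comfortable and return it to the reason and feelings systems."""
    return "retreat" if discomfort > comfort else reason_label

# (7) The reason system finally outputs the (possibly modified) behaviour.
label = reason_cognise(external=0.9)
comfort, discomfort = feelings_evaluate(external=0.9, internal=0.95, reason_label=label)
print(association_decide(label, comfort, discomfort))   # high internal pain -> "retreat"
```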
III. DEVELOPMENT OF THE CONSCIOUSNESS SYSTEM

This section describes the learning functions of each of the reason, feelings, and association systems. The fundamental construction of the ModNAD is shown in Figure 3. As shown in Figures 4, 5, and 6, the ModNAD is applied to the implementation of the emotional and association functions. Figure 3 also shows the network in the reason system that imitates behaviour to represent self-condition and other-condition. Figure 4 shows the lower-level networks in the feelings system that represent the feeling of pain from internal information. Figure 5 shows the higher-level network in the feelings system that represents the feeling of comfort from the lower-level emotional representation. Figure 6 shows the network in the association system that gives support in deciding the next action from the comfortable and uncomfortable representations and the representations of self-cognition and other-cognition.

Figure 2. Consciousness system concept model
Figure 3. ModNAD computational model and imitation behaviour module
Figure 4. Pain module of the consciousness system
Figure 5. Comfort module of the consciousness system
Figure 6. Association module of the consciousness system

The differences among the network forms are considered in this paper. The networks in the feelings system do not have an output unit M because they are used mainly to represent self-condition. However, an information flow circulates from the primary representation to the symbolic representation, and it can change the emotional condition and the representation. This change will generate emotional thinking. The association network does not have the feedback M'. The reason is that the association network is the higher-level ModNAD and does not have motors as somatic sensation. This network also has circulation, so it is capable of implementing thought.

A. Consciousness System for Self-Cognition

Figure 7 shows an overall image of the consciousness system developed for conducting experiments on mirror image cognition. Module A, the reason system, imitates behaviour based on external information from the infrared (IR) sensor and represents self-condition and other-condition. Modules B through E make up the feelings system: B represents "pain" based on internal information, namely an error detected by PID control; C represents "solitude" from the value of the IR sensor; D and E receive the values of representation from modules A, B, and C and represent the emotions of "comfort" and "discomfort", respectively. Module F is the association system. Three behaviours, advance, stop, and retreat, are used in the robotic experiments. Only one module A is used in the reason system, but two or more modules may be used for robots requiring more complex motions.

The robot used in the experiments is a Khepera2, which has infrared sensors and PID-controlled motors. The robot is assumed to have two sensations, pain and solitude, in the experiment. The reason why the error detected by PID control gives pain to the robot and why the sensor condition generates solitude is described below. This paper focuses on the fact that emotion plays the role of an adjuster in the body. When no error is detected by PID control, the robot moves at the set speed. When the error detected by PID control increases, some fault such as friction is present and the actual speed decreases to below the set speed. This increases the load on the motor, and the robot judges that the speed must be adjusted, which gives pain to the robot. The present consciousness system is designed to imitate behaviours and learn self-cognition using emotions. If the sensor does not react, the robot cannot learn, making self-cognition impossible. If no information arrives at the sensor, discomfort results, causing solitude. Pain and solitude prevent the robot from imitating for self-cognition, which constitutes discomfort.
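As an illustration of how the two sensations described above might be quantised into the four-stage language labels used later, here is a small sketch; the thresholds, the normalisation, and the assumption of 10-bit IR readings are illustrative and not taken from the paper.

```python
def to_stage(value, thresholds=(0.1, 0.4, 0.7)):
    """Quantise a normalised value into one of four stages (0..3)."""
    return sum(value >= t for t in thresholds)

def pain_stage(set_speed, actual_speed):
    """Pain grows with the error detected by PID control: the larger the speed
    error, the larger the load on the motors."""
    error = abs(set_speed - actual_speed) / max(set_speed, 1e-6)
    return to_stage(min(error, 1.0))

def solitude_stage(ir_readings):
    """Solitude grows when the infrared sensors receive no information."""
    activity = max(ir_readings) / 1023.0     # assuming 10-bit IR readings (0..1023)
    return to_stage(1.0 - activity)

print(pain_stage(set_speed=10.0, actual_speed=4.0))    # large error -> high pain stage
print(solitude_stage([0, 3, 1, 0, 2, 0, 0, 1]))        # quiet sensors -> high solitude
```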
This section describes the flow of information through the consciousness system that performs self-cognition (Figure 7). The value of the infrared sensor capturing external information is input into terminals In1 and In3. Any error detected by PID control, representing internal information, is input into terminal In2. In the symbolic representation of imitation module A, the behaviour of self and the other is cognised from the external information input into In1 and from the somatic sensation of A. Using the information from In2 and In3, the degree of pain and solitude is matched to one of four stages of language labels at the symbolic representations of B and C, respectively. The result of the cognition of the behaviour of self and the other at A and the degrees of pain and solitude determined at B and C are then input into the comfort and discomfort modules D and E via p1, p2, and p3. Similarly, in the comfort and discomfort modules, the degree of comfort and discomfort is matched to one of four stages of language labels at the symbolic representation. The result of the cognition by the imitation module A and the degrees of comfort and discomfort determined at D and E are eventually input into module F via p4, p5, and p6 to associate information from the reason system and the feelings system.

The association system module F represents at A-RL (Figure 6) the behaviour of self and whether self is comfortable or not at the present moment, as determined by the result of the cognition by the imitation module and the degrees of comfort and discomfort. This information from the cognition representation A-RL is copied to the behaviour representation A-BL (Figure 6).

Figure 7. Consciousness system for self-cognition

As a result, the association system module F transmits information to module A via p7, enabling it to change the "expectation of behaviour" generated in the cognition-by-imitation process of module A. This means that the reason system is directed to prepare the next behaviour according to the condition of the feelings system of self at the present moment. If A-RL of module F represents discomfort, module F transmits to the lower-level module information indicating that the current behaviour will invite discomfort. If A-RL of module F represents comfort, module F transmits to the lower-level module information indicating that the current behaviour will allow comfort to continue. Information determined by the association system module F is also transmitted to the feelings system modules D and E via p8. This information affects the feedback on comfort and discomfort. This feedback is assumed to be able to cause a change in emotional thinking without input.

Assume that the result of the cognition by imitation in the reason system is advance for both self and the other, and the robot is imitating the specified behaviour smoothly. If at this time the degree of discomfort determined by the feelings system is high, the consciousness system judges that advance is a discomfort. In response, the association system determines a new step and instructs the robot to retreat as the next behaviour.
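A toy sketch of the role of the association module F in this flow is given below, assuming symbolic labels for the inputs p4-p6 and the feedback p7-p8; the concrete rule that maps discomfort to retreat and the wording of the stage labels are assumptions made for illustration.

```python
# Four-stage language labels used at the symbolic representations (assumed wording).
STAGES = ("none", "a little", "moderate", "a lot")

def module_f(behaviour, comfort_stage, discomfort_stage):
    """Association module F: represents at A-RL the cognised behaviour of self
    and whether self is comfortable (inputs p4-p6), copies A-RL to A-BL, and
    returns feedback for the reason system (p7) and the feelings system (p8)."""
    feeling = "comfortable" if comfort_stage >= discomfort_stage else "uncomfortable"
    a_rl = (behaviour, feeling)          # cognition representation of module F
    a_bl = a_rl                          # copied to the behaviour representation
    # p7: change the "expectation of behaviour" in the reason module A.
    p7 = "retreat" if a_bl[1] == "uncomfortable" else behaviour
    # p8: feedback to the comfort and discomfort modules D and E.
    p8 = {"comfort": STAGES[comfort_stage], "discomfort": STAGES[discomfort_stage]}
    return p7, p8

# Both self and the mirror image advance, but discomfort (pain) is high:
print(module_f("advance", comfort_stage=1, discomfort_stage=3))
# -> ('retreat', {'comfort': 'a little', 'discomfort': 'a lot'})
```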
The timing of the above determination is described below. In the mirror image cognition experiments, the robot built for these experiments repeatedly imitated the motion of its image in the mirror. When the robot imitates the motion of its image in the mirror, if the robot itself advances, the other, i.e. the mirror image, always advances. By repeating this imitation of the advance behaviour, the robot eventually collides with the mirror. The collision causes an error detected by PID control; the pain module represents pain while the discomfort module E represents discomfort. The association system module F then transmits the representation of retreat to the reason system. Behaviour to avoid discomfort and invite comfort is thus implemented so that imitation for self-cognition can continue.

B. Neural Network Learning Method

The neural network learning method used in the present study is supervised learning with the back propagation method. Specifically, bit strings for S, M', and B were prepared as the learning inputs for the ModNAD of the reason system (Figure 3). Bit strings for S and B were prepared for the feelings systems. The bit strings are listed in Tables 1 to 4. Table 1 shows the input of the imitation module and Table 2 shows its output. The bit strings in Table 2 are used for self and other cognition; for example, if both self and the other advance, the bit string in the symbolic representation is 000. Tables 3 and 4 show the representations of the pain and comfort modules.

The imitation module A (Figure 4) has 27 learning patterns. Each of the feelings system modules B through E has 32 patterns, and the association system has 72 patterns. For the connection from the lower layer to the higher layer in the feelings system, a three-layer neural network is used, with 32 learning patterns. R and M in the reason and association systems become the outputs when a value is entered into the neural network of the respective ModNAD (Figure 3). R is the only output in the feelings systems.

Table 1. Bit strings of input of the system
  Condition       Bit string
  Both advance    000
  One advances    001
  Both stop       010
  One retreats    100
  Both retreat    111

Table 2. Bit strings of robotic behaviour
  Condition   Bit string
  Advance     00
  Stop        01
  Retreat     11

Table 3. Bit strings of representation of pain
  Condition       Bit string
  No pain         000
  A little pain   100
  Pain            011
  Lots of pain    111

Table 4. Bit strings of representation of comfort
  Condition              Bit string
  Not comfortable        000
  A little comfortable   001
  Comfortable            100
  Very comfortable       110

The system learns by calculating the sum of the mean square errors over the number of patterns, using the outputs R and M and the corresponding teacher signals, and then sequentially reducing this error. Learning continues until the sum of mean square errors is reduced to below a certain value (0.01).
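The learning procedure described in this subsection, supervised back propagation on bit-string patterns until the sum of mean square errors drops below 0.01, might look like the following generic sketch; the network size, the learning rate, and the particular input-output pairings are assumptions, and only a handful of Table 1 and Table 2 style patterns are used for illustration.

```python
import numpy as np

# Example bit-string patterns: input condition (Table 1) -> behaviour output (Table 2).
patterns = [([0, 0, 0], [0, 0]),   # both advance  -> advance (00)
            ([0, 1, 0], [0, 1]),   # both stop     -> stop    (01)
            ([1, 1, 1], [1, 1]),   # both retreat  -> retreat (11)
            ([0, 0, 1], [0, 0]),   # one advances  -> advance (00)  (assumed pairing)
            ([1, 0, 0], [1, 1])]   # one retreats  -> retreat (11)  (assumed pairing)

X = np.array([p[0] for p in patterns], float)
T = np.array([p[1] for p in patterns], float)

rng = np.random.default_rng(0)
w1 = rng.uniform(-0.5, 0.5, (3, 6)); b1 = np.zeros(6)   # input  -> hidden
w2 = rng.uniform(-0.5, 0.5, (6, 2)); b2 = np.zeros(2)   # hidden -> output
sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))
eta = 0.5                                               # learning rate (assumed)

for order in range(1, 100001):
    h = sigmoid(X @ w1 + b1)                 # hidden layer
    y = sigmoid(h @ w2 + b2)                 # output layer (R / M)
    err = y - T
    sse = np.sum(np.mean(err ** 2, axis=1))  # sum of mean square errors over patterns
    if sse < 0.01:                           # stop once the error falls below 0.01
        print(f"converged after {order} orders, error {sse:.4f}")
        break
    # Back propagation of the error through both layers.
    d_out = err * y * (1.0 - y)
    d_hid = (d_out @ w2.T) * h * (1.0 - h)
    w2 -= eta * h.T @ d_out; b2 -= eta * d_out.sum(axis=0)
    w1 -= eta * X.T @ d_hid; b1 -= eta * d_hid.sum(axis=0)
```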
C. Results of Learning and Discussion

Figure 8 shows the learning curves of the imitation module, the association module, and the pain and solitude modules. Figure 9 shows the learning curves of the connections and of the discomfort and comfort modules. The x-axis of each graph is the learning number (order) and the y-axis is the mean square error (error). The graphs in Figure 8 show that learning converges to about 0.01 within about 900 orders for the imitation module, the pain module, the solitude module, and the association module. The graphs in Figure 9 show that learning converges to about 0.01 within about 400 orders for the comfort and discomfort modules. However, for the connections to the comfort and discomfort modules, learning converges only after about 3,000 and 40,000 orders, respectively.

Figure 8. Results of learning experiment of imitation, association, solitude, and pain
Figure 9. Results of learning experiment of comfort, discomfort, and the connections

Learning in the ModNAD was completed after a relatively small number of iterations because the learning efficiency of the ModNAD neural network is good. The results also show that each ModNAD has a function to imitate and cognise emotion. They further show that the functions of consciousness and emotion as defined in this paper can be realised by combining the respective ModNADs. By structuring the consciousness system as described above, the system, when mounted on a robot, can imitate and cognise emotion and, if discomfort is determined, select behaviour to remove the discomfort. Self-cognition can then continue for the implementation of self-consciousness.

In the learning experiments using the consciousness system developed in this paper, the robot is considered capable of avoiding harmful obstacles if it is given an emotion function. The consciousness system will be mounted on a robot to verify that the robot actually evades obstacles using the emotion function and that the robot is capable of self-cognition. For the robotic experiments, four objects to be imitated are prepared: a robot taking an action (advance, stop, or retreat) at random; a robot possessing the consciousness system (a conscious robot); a robot controlled via cables from a conscious robot; and a mirror image of a conscious robot. A conscious robot will imitate these four objects, and it will then be shown that the robot can have self-consciousness by comparing representations of self-condition with representations of other-condition in the system proposed in this paper.

IV. CONCLUSIONS

This paper reported on research into functions that are intrinsic to humans, such as mirror neurons and mimesis theory, and defined a new notion of consciousness. This paper also established a definition of the robot's emotions based on human emotions. A consciousness system has been developed and successfully trained using artificial neural networks. A robot was asked to perform tasks using the consciousness system. Learning converges to about 0.01 within 900 orders for the imitation, pain, solitude, and association modules. It converges to about 0.01 within 400 orders for the comfort and discomfort modules. It can be concluded that the learning efficiency of the ModNAD artificial neural network is good. These results provide a fundamental step towards developing a robot having consciousness and feelings similar to humans.

ACKNOWLEDGEMENT

We would like to thank S. Masri for his help in this work.

REFERENCES

[1] Kokoro Company. (Feb. 2013). Available: http://www.kokoro-dreams.co.jp
[2] H. Miwa, et al., "Effective emotional expressions with emotion expression humanoid robot WE-4RII," in IEEE/RSJ Intl. Conf. on Intelligent Robots and Systems, Miyagi, Japan, 2004, pp. 2203-2208.
[3] V. Gallese, et al., "Action recognition in the premotor cortex," Brain, vol. 119, pp. 593-609, 1996.
[4] M. Donald, Origins of the Modern Mind. Cambridge: Harvard University Press, 1991.
[5] University of Maryland. (20 Dec. 2012). Hearing and Speech Sciences. Available: http://www.bsos.umd.edu/hesp/index.htm
[6] A. N. Meltzoff and M. K. Moore, "Imitation of facial and manual gestures by human neonates," Science, vol. 198, pp. 75-78, 1977.
Moore, "imitation of facial and manual gestures by human neonates," Science, vol. 198, pp. 75-78, 1977. [7] K. Ikegami, "Effect of extrauterine and intrauterine experiences on tongue protruding imitation in premature infants and full-term neonates," Annual Report of Grant-in-Aide for Scientific Research, Ministry of Education, Science, Sports and Culture: The Emergence of Human Cognition and Language, 3, pp 9-13, 1996. [8] A. R. Damasio, The feeling of what happens: body and emotion in the making of consciousness. Florida, U.S.A. : Harcourt Brace, Orlando, 1999. [9] A. R. Damasio, Descartes’ error: emotion, reason and the human brain. Quill, New York: G.P. Putnam's Sons, a divisions of the Putnam Berkley Group, Inc., 1994. [10] A. Alsmith and F. de Vignemont, "Embodying the mind and representing the body," Review of Philosophy and Psychology, vol. 3, pp. 1-13, 2012. [11] M. Asada, et al., "Cognitive developmental robotics: a survey," Autonomous Mental Development, IEEE Transactions on, vol. 1, pp. 12-34, 2009. [12] L. W. Barsalou, "Grounded cognition," Annu. Rev. Psychol., vol. 59, pp. 617-645, 2008. [13] R. D. Beer, "The Dynamics of active categorical perception in an evolved model agent," Adaptive Behavior, vol. 11, pp. 209- 243, December 1, 2003 2003. [14] R. Blickhan, et al., "Intelligence by mechanics," Phil. Trans. R. Soc. A, vol. 365, pp. 199-220, 2007. [15] I. Aleksander and B. Dunmall, "Axioms and tests for the presence of minimal consciousness in agents i: preamble," Journal of Consciousness Studies, vol. 10, pp. 7-18, 2003. [16] I. Aleksander, et al., "Will and emotions: a machine model that shuns illusion," in Proceedings of Symposium on Next Generation Approaches to Machine Consciousness: Imagination, Development, Intersubjectivity, and Embodiment, University of Hertfordshire, Hatfield, UK, 2005, pp. 110-116. [17] T. Bosse, et al., "Simulation and representation of body, emotion, and core consciousness," in Proceedings of Symposium on Next Generation Approaches to Machine Consciousness: Imagination, Development, Intersubjectivity, and Embodiment, University of Hertfordshire, Hatfield, UK, 2005, pp. 95-103. [18] D. Calverley, "Towards a method for determining the legal status of a conscious O. Shoubaky and T.M. Sharari/ Mechatronics, Electrical Power, and Vehicular Technology 05 (2014) 91-98 98 machine," in Proceedings of Symposium on Next Generation Approaches to Machine Consciousness: Imagination, Development, Intersubjectivity, and Embodiment, University of Hertfordshire, Hatfield, UK, 2005, pp. 75-84.