Design of Head Movement Controller System (HEMOCS) for Control Mobile Application through Head Pose Movement Detection

http://dx.doi.org/10.3991/ijim.v10i3.5552

Herman Tolle (1) and Kohei Arai (2)
1 Brawijaya University, Malang, Indonesia
2 Saga University, Saga, Japan

Abstract—Head movement has been found to be a natural way of interacting, and it can be used as an alternative control method that provides accessibility in human-computer interface solutions. The combination of head-mounted displays (HMDs) with mobile devices offers an innovative, low-cost form of human-computer interaction, and such devices are hands-free systems. In this paper, we introduce a new method for recognizing head movement as a controller for mobile applications and propose a new control system that uses head movement only. The proposed method can detect specific head pose movements and respond to them as application controls. The implementation of a music player application on an iOS device shows that the proposed method enables a new experience of real-time human-computer interaction using head movement control only.

Index Terms—head mounted display, accelerometer, head motion estimation, human computer interaction.

I. INTRODUCTION

Head movement detection has received significant attention in recent research. One specific purpose of head movement detection and tracking is to allow the user to interact with a computer or a new device such as a mobile phone. The increasing popularity of the wide range of applications in which head movement detection plays a part, such as assistive technology, virtual reality, and augmented reality, has increased the volume of research aiming to provide robust and effective techniques for real-time head movement detection and tracking [1].

There are many different approaches to head movement estimation. Most investigated methods are computationally demanding and remain difficult to implement on low-power hardware. Three approaches to head movement estimation and tracking are currently popular: camera-based image processing, sensor-based methods using accelerometers and gyroscopes, and combinations of different techniques.

Most head pose estimation methods are based on computer vision, as in [2][3][4]. Liu et al. [2] introduced a video-based technique for estimating the head pose and used it in an image processing application for a real-world problem, attention recognition for drivers. Murphy-Chutorian and Trivedi [3] presented a static head pose estimation algorithm and a visual 3-D tracking algorithm based on image processing and pattern recognition. Kupetz et al. [4] implemented a head movement tracking system using an IR camera and IR LEDs.

Another approach to head movement detection uses sensors such as gyroscopes and accelerometers. King et al. [5] implemented a hands-free head movement classification system that applies pattern recognition techniques with mathematical enhancements; a dual-axis accelerometer mounted inside a hat was used to collect head movement data. A similar method was presented by Nguyen et al. [6], which detects the movement of a user's head by analyzing data collected from a dual-axis accelerometer with pattern recognition techniques, but no application based on the proposed method was suggested. Other sensor-based approaches include [7][8].
However, these still need more theoretical proof, further experiments, and accuracy analysis.

A combination of different techniques can also be used in head tracking systems. Satoh et al. [9] proposed a head tracking method that uses a gyroscope mounted on a head-mounted display (HMD) and a fixed bird's-eye view camera responsible for observing the HMD from a third-person viewpoint. The need for a fixed camera, a customized marker, a gyroscope sensor, and a calibration process makes this proposal impractical for head tracking tasks, and the time complexity of the algorithm has not been investigated, which keeps it far from real-world applications.

Head-mounted displays (HMDs) embedded in eyeglasses are the next innovation along the path of communication techniques. Such devices are hands-free systems. Although this is not a new idea, currently released and commercially available products (such as Project Glass by Google) show the immense potential of this technology. They function as stand-alone computers; their light glass frame is equipped with a variety of sensors, and a projector displays images and information onto the eye. In our previous research work, we proposed head movement detection and tracking as a controller for a 3D object scene view [10] and the combination of the user's head and body movement as a controller for a virtual reality labyrinth game [11].

In this paper, we introduce a new type of head movement controlling system that uses the three degrees of freedom of head rotation. The method is based on recognizing the internal accelerometer and gyroscope sensor data of a mobile phone placed on the user's head in a head-mounted display (HMD) such as Google Cardboard (referred to here as a dummy HMD). A real-time mobile application is built to prove that the method works on a real-time basis. The user can easily control the hands-free application using particular head pose movements only.

II. DESIGN OF HEAD MOVEMENT CONTROL SYSTEM (HEMOCS)

The head movement controller system (HEMOCS) works with a dummy HMD combined with a smartphone that has internal inertial sensors such as an accelerometer, a gyroscope, and a magnetometer. The user wears the HMD with the smartphone as shown in Figure 1 and Figure 2, while the system displays a mobile application developed on the smartphone. The HMD integrates the smartphone with the user's head and eyes; through the smartphone's display, the user can watch through the camera while controlling something in the application. The user controls the application by moving the head in particular ways. The head movement is detected through real-time gathering and analysis of data from the mobile phone's sensors, and the method for detecting the user's head pose movement is based on the patterns in the data gathered by the internal sensors.

Figure 1. Sample of Google Cardboard as a dummy HMD with a smartphone [12]
Figure 2. User wearing a dummy HMD, with the three degrees of freedom of head movement

The human head is limited to three degrees of freedom (DOF) of rotation, which can be characterized by the pitch, roll, and yaw angles as pictured in Figure 2. In this research, we propose a control system using three types of head pose movement, each with two opposite directions, as shown in Figure 3. The movement types are axial rotation left (H1), axial rotation right (H2), flexion (H3), extension (H4), lateral bending left (H5), and lateral bending right (H6). H1 to H6 are simply codes used to name the head pose movements.

Figure 3. The six types of head pose movement, named H1 to H6
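For illustration only, the six pose codes could be represented in an iOS implementation as a small Swift enumeration; the type and case names below are our assumptions for this sketch and are not part of the system itself.

    import Foundation

    // Hypothetical sketch: one way to name the six head pose gestures
    // (H1-H6) in an iOS implementation. Names are illustrative only.
    enum HeadPose: String, CaseIterable {
        case axialRotationLeft   = "H1"  // turn the head to the left, then return
        case axialRotationRight  = "H2"  // turn the head to the right, then return
        case flexion             = "H3"  // look down, then return
        case extensionUp         = "H4"  // look up, then return
        case lateralBendingLeft  = "H5"  // tilt the head toward the left shoulder
        case lateralBendingRight = "H6"  // tilt the head toward the right shoulder
    }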
An axial rotation left means that the user turns the head to the left (by around 30 to 45 degrees) from the initial position at a specific speed and then returns the head to the initial position. The movement feels as if the user were trying to swipe something in the application with the head. The same process applies to each direction of the other movement types, giving six head pose movements that serve as gestures for controlling something in the mobile application.

A. Head Pose Gesture Control Function

In the proposed head movement control system (HEMOCS), a head movement gesture acts like a swipe in the mobile phone user experience. The control system also works as a substitute for the conventional button or tap function. The proposed head movements and the control purpose of each type are shown in Table I. Moving the head to the left or right is used for selection (previous (H1) or next (H2)). Looking down is the accept (tap or choose) control (H3), looking up is the back function (H4), tilting the head toward the left shoulder returns to the home screen (H5), and tilting the head toward the right shoulder (H6) is reserved for future functionality.

TABLE I. DESIGN OF HEAD POSE CONTROL FUNCTIONS
Code  Head Pose Type                                            Control Purpose
H1    Move to the left / Axial rotation left                    Select previous
H2    Move to the right / Axial rotation right                  Select next
H3    Head looking down / Flexion                               Choose this
H4    Head looking up / Extension                               Back to the list
H5    Tilt head toward left shoulder / Lateral bending left     Back to home
H6    Tilt head toward right shoulder / Lateral bending right   Reserved

The first thing to investigate is how to detect and recognize the head pose movement from the signal pattern. The head movement controlling method is based on four repeatable process steps, shown in Figure 4: 1) read the sensor data, 2) recognize the data/signal pattern, 3) determine the head movement, and 4) produce the controller response for the recognized head movement. First, the system reads the sensor data using a push method; second, it recognizes the pattern of the sensor data; third, it determines which head pose movement type occurred; and finally it produces the response that controls something in the mobile application corresponding to the detected head movement type.

Figure 4. Proposed head movement control system process steps

B. Preliminary Investigation of Head Movement Signal Patterns

A preliminary investigation of the patterns of the six head movement types had to be done before developing the attitude-based control system. In our previous work [10][11], the sensors of iOS devices achieved better results than those of Android-based devices [13]. On an iOS device, reading the sensor data is facilitated through CoreMotion, which provides four different types of movement data: acceleration, gravity, attitude, and rotation rate, each with three axes. We read the data for the four CoreMotion types and then analyzed the patterns. The data pattern for the first type of head movement is shown in Figure 5. Comparing the patterns of the six different head movements leads to the conclusion that the CoreMotion attitude data are suitable for our proposed control system. The basic data rate is set to 60 cycles per second.
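For illustration, the following Swift fragment is a minimal sketch, not the code used in this work, of reading the CoreMotion attitude data at 60 samples per second; the function processAttitudeSample is a hypothetical placeholder for the pattern analysis described in the next subsection.

    import CoreMotion

    let motionManager = CMMotionManager()

    func startAttitudeUpdates() {
        guard motionManager.isDeviceMotionAvailable else { return }
        motionManager.deviceMotionUpdateInterval = 1.0 / 60.0  // 60 samples per second
        motionManager.startDeviceMotionUpdates(to: .main) { motion, error in
            guard let attitude = motion?.attitude, error == nil else { return }
            // CoreMotion reports attitude in radians; convert to degrees
            // because the detection thresholds are expressed in degrees.
            let yaw   = attitude.yaw   * 180.0 / .pi
            let pitch = attitude.pitch * 180.0 / .pi
            let roll  = attitude.roll  * 180.0 / .pi
            processAttitudeSample(yaw: yaw, pitch: pitch, roll: roll)
        }
    }

    func processAttitudeSample(yaw: Double, pitch: Double, roll: Double) {
        // Pattern analysis of the attitude signal would go here (Section II.C).
    }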
From the preliminary investigation, we found that the CoreMotion attitude data are appropriate for recognizing particular head movement types.

Figure 5. Different signal patterns of the iOS CoreMotion data types for the axial rotation (H1) pose

C. Method for Detecting Head Pose Movement

The method for detecting and recognizing the head pose is based on analysis of the attitude data, namely yaw, roll, and pitch. The general algorithm is shown in Figure 4, and sample pseudocode of the head pose detection function is shown in Figure 7. The attitude data pattern for H1 and H2 is affected only by the yaw data, as shown in Figure 6. The system starts counting when the yaw angle exceeds a specific threshold. The threshold value is 10 degrees, used as the base threshold line (point 1), since the amplitudes of pitch and roll stay below this value during the H1 and H2 head movements. The number of samples counted while the head is rotated by more than 10 degrees is used to determine whether the user is performing an H1 or H2 motion (point 2). If the counter falls within a specific range between yawMin and yawMax, the system recognizes the user's head movement as an H1 or H2 pose. Whether it is H1 or H2 is decided from the peak value of the yaw angle: a positive peak is classified as H1 and a negative peak as H2. For H3-H4 and H5-H6 we use the roll and pitch data, respectively.

Figure 6. Signal pattern of the CoreMotion attitude data while the user moves the head to the left (H1)

    function detectMove() {
        yawData = CoreMotionAttitude.Yaw
        if |yawData| > yawThreshold
            yawCount++
        else
            allCount++
        if (yawCount > yawMin) && (yawCount < yawMax) {
            if yawData > 0
                MoveLeft()
            else
                MoveRight()
        }
        if allCount > 10
            yawCount = 0
    }

Figure 7. Pseudocode of the head pose detection function

There are two challenges when using the CoreMotion attitude data in this system: how to keep the accuracy high when all of the head pose detections are combined, and how to handle the user's initial head orientation. The first is an algorithmic problem of combining all of the processes with high accuracy. The second arises because the attitude data are expressed in degrees (after conversion from radians), with the initial frontal position taken as zero degrees, so the user's orientation has to be taken into account. If the user changes the orientation of his or her base frontal position, the algorithm no longer works. The user's initial frontal orientation is challenging because, in practice, the user's head can move in any direction and orientation, yet only the six specific head poses should be detected, recognized, and used as controls.

We improved the algorithm to ensure that the proposed system is robust and able to reject head movements other than the six poses defined as H1 to H6. The threshold is a static number, while the user's base frontal orientation is adaptive and depends on the user's current head orientation: if the user keeps the head away from the base orientation for more than a specific number of cycles, the base frontal orientation is changed to the current orientation.
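The following Swift fragment is a minimal sketch of this adaptive base orientation idea only; the type, the drift threshold, and the sample limit are assumptions for illustration rather than the values used in our implementation.

    // Sketch (assumption, not the published implementation): yaw is measured
    // relative to a stored reference, and the reference is re-centred when
    // the head stays away from it for more than a fixed number of samples.
    struct AdaptiveYawReference {
        private var referenceYaw = 0.0     // current "front" orientation, degrees
        private var driftCount = 0         // samples spent away from the reference
        private let driftThreshold = 10.0  // degrees; hypothetical value
        private let driftLimit = 120       // samples (about 2 s at 60 Hz); hypothetical

        // Returns yaw relative to the adaptive front orientation.
        mutating func relativeYaw(forAbsoluteYaw yaw: Double) -> Double {
            let relative = yaw - referenceYaw
            if abs(relative) > driftThreshold {
                driftCount += 1
            } else {
                driftCount = 0
            }
            if driftCount > driftLimit {
                // The user has turned and settled in a new direction:
                // adopt the current orientation as the new front.
                referenceYaw = yaw
                driftCount = 0
            }
            return relative
        }
    }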
III. IMPLEMENTATION AND EVALUATION

A. Mobile Application Using HEMOCS

Many mobile applications can adopt HEMOCS as a new control method. The method is applicable to any mobile application in situations where the user cannot use the hands to control something on the screen, for example drivers or welders. In healthcare, HEMOCS can be implemented in mobile applications for disabled people who can neither talk nor use their hands to control an application. Figure 8 illustrates an implementation of HEMOCS in an HMD-based mobile application. One of the future targets of HEMOCS is its implementation as an assistive communication device for disabled people.

Figure 8. Implementation evaluation of the HEMOC system

To prove the concept, we developed a simple music player application with overlay text that shows the song list one item at a time on the screen. The user can browse the song list and play a chosen song by moving the head only. The H1 and H2 controls are used to step through the songs in the list one by one, H1 for the next song and H2 for the previous song. H3 is used to choose a song to play, after which the system plays the song, and H4 stops the song and returns to the menu. Figure 9 shows sample views of the screen with the first menu item overlaid (9a), the fifth menu item (9b), and the view when the user chooses the first song (9c). This simple application looks like an augmented reality application, in which the user sees through the HMD while controlling the overlaid text of the song list.

Figure 9. Screenshots of the music player augmented application using HEMOCS for control
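As a further illustration, and again as an assumption rather than the code of the actual application, the mapping from detected head poses to the music player controls described above could be expressed as follows, reusing the hypothetical HeadPose enumeration sketched in Section II.

    import AVFoundation

    // Illustrative sketch only: how detected head poses could drive the
    // music player (H1/H2 to move through the song list, H3 to play,
    // H4 to stop and return to the list). Names and the player wiring
    // are assumptions, not the published implementation.
    final class HeadControlledPlayer {
        private let songs: [URL]
        private var selectedIndex = 0
        private var player: AVAudioPlayer?

        init(songs: [URL]) { self.songs = songs }

        func handle(_ pose: HeadPose) {
            guard !songs.isEmpty else { return }
            switch pose {
            case .axialRotationLeft:             // H1: next song in the list
                selectedIndex = (selectedIndex + 1) % songs.count
            case .axialRotationRight:            // H2: previous song in the list
                selectedIndex = (selectedIndex + songs.count - 1) % songs.count
            case .flexion:                       // H3: play the selected song
                player = try? AVAudioPlayer(contentsOf: songs[selectedIndex])
                player?.play()
            case .extensionUp:                   // H4: stop and go back to the list
                player?.stop()
                player = nil
            case .lateralBendingLeft, .lateralBendingRight:
                break                            // H5 back to home, H6 reserved
            }
        }
    }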
B. Accuracy Evaluation

The accuracy parameter is used to evaluate how well the proposed method detects the user's head movements. Accuracy is the overall success rate: the user moves the head in a particular way and the system responds with the corresponding control action in the application. The experimental results in Table II show that some head movements produce errors, with an average accuracy of 80%. These errors occur because users move their heads at various speeds. Using the same evaluation process but categorizing the trials by head movement duration, as shown in Table III, 100% accuracy is achieved when the user moves the head within a duration of 400 milliseconds to 1.3 seconds, with an average maximum head rotation of around 46.84°.

TABLE II. ACCURACY EVALUATION OF EACH HEAD POSE
Head Pose   Trials   Correct   Errors   Accuracy
H1          20       18        2        90%
H2          20       18        2        90%
H3          20       14        6        70%
H4          20       15        5        75%
H5          20       19        1        95%
H6          20       12        8        60%
Average                                 80%

TABLE III. ACCURACY EVALUATION BASED ON MOVEMENT DURATION
No   Movement Duration (s)   Max Degree (°)   Accuracy
1    ≤ 0.3                   29.11            0%
2    0.4 – 0.6               41.63            100%
3    0.7 – 0.9               48.40            100%
4    1 – 1.2                 50.50            100%
5    ≥ 1.3                   50.33            0%

C. Usability Evaluation

A usability evaluation was conducted to measure user satisfaction with the newly proposed controller system. Table IV shows the results for 20 users who tried the iOS-based music player application with head pose movement control. We evaluated five usability factors: functionality, easy to use, effectiveness, satisfaction, and understandable. The highest average score, 94%, was achieved for "effectiveness", which means that users consider this control type effective for controlling the application. The lowest average score, 71%, was obtained for "easy to use", which means that some users still found the new control system difficult to use even though they consider it effective. The average over all usability factors reaches 81%, which means that users are satisfied with controlling the music player application using head movements only. Some users suggested that it would be better if the speed of the head movement were calibrated for each user before they start using it as a controller, so that the system adapts to each user's head movement speed.

TABLE IV. USABILITY EVALUATION RESULTS
No   Usability Factor   Average Score
1    Functionality      78%
2    Easy to Use        71%
3    Effectiveness      94%
4    Satisfaction       79%
5    Understandable     82%
     Average            81%

IV. CONCLUSION AND FUTURE WORK

Detecting the user's head movement is possible using the internal sensors of a mobile phone placed on the user's head in a dummy HMD such as Google Cardboard. The proposed method succeeds in recognizing particular head movements of the user as gesture controls for a mobile application. It is a novel approach in that it uses a single sensor position on the user's head to recognize the type of head movement. The implementation of the head movement controller system in a music player application shows that the proposed control system is practical to implement and can be applied to other kinds of mobile applications, especially applications for disabled people who cannot use their hands to control an application.

In the near future, we will improve the sensitivity of the head movement controller system by taking into account the speed at which each user is comfortable moving the head as a controller. We could also extend the method to other HMD-based mobile applications with the proposed head movement controller system, since many areas of mobile applications are suitable for this new kind of control by head pose movement.

REFERENCES

[1] A. Al-Rahayfeh and M. Faezipour, "Eye Tracking and Head Movement Detection: A State-of-Art Survey," IEEE Journal of Translational Engineering in Health and Medicine, vol. 1, pp. 11-22, 2013. http://dx.doi.org/10.1109/JTEHM.2013.2289879
[2] K. Liu, Y. P. Luo, G. Tei, and S. Y. Yang, "Attention recognition of drivers based on head pose estimation," in Proc. IEEE VPPC, Sep. 2008, pp. 1-5.
[3] E. Murphy-Chutorian and M. M. Trivedi, "Head pose estimation and augmented reality tracking: An integrated system and evaluation for monitoring driver awareness," IEEE Trans. Intell. Transp. Syst., vol. 11, no. 2, pp. 300-311, Jun. 2010. http://dx.doi.org/10.1109/TITS.2010.2044241
[4] D. J. Kupetz, S. A. Wentzell, and B. F. BuSha, "Head motion controlled power wheelchair," in Proc. IEEE 36th Annu. Northeast Bioeng. Conf., Mar. 2010, pp. 1-2.
[5] L. M. King, H. T. Nguyen, and P. B. Taylor, "Hands-free head-movement gesture recognition using artificial neural networks and the magnified gradient function," in Proc. 27th Annu. Conf. Eng. Med. Biol., 2005, pp. 2063-2066. http://dx.doi.org/10.1109/iembs.2005.1616864
[6] S. T. Nguyen, H. T. Nguyen, P. B. Taylor, and J. Middleton, "Improved head direction command classification using an optimised Bayesian neural network," in Proc. 28th Annu. Int. Conf. EMBS, 2006, pp. 5679-5682.
[7] S. Manogna, S. Vaishnavi, and B. Geethanjali, "Head movement based assist system for physically challenged," in Proc. 4th ICBBE, 2010, pp. 1-4.
[8] S. Kim, M. Park, S. Anumas, and J. Yoo, "Head mouse system based on gyro- and opto-sensors," in Proc. 3rd Int. Conf. BMEI, vol. 4, 2010, pp. 1503-1506.
[9] K. Satoh, S. Uchiyama, and H. Yamamoto, "A head tracking method using bird's-eye view camera and gyroscope," in Proc. 3rd IEEE/ACM ISMAR, Nov. 2004, pp. 202-211.
[10] K. Arai, H. Tolle, and A. Serita, "Mobile Devices Based 3D Image Display Depending on User's Actions and Movements," International Journal of Advanced Research in Artificial Intelligence (IJARAI), vol. 2, no. 6, 2013.
[11] H. Tolle, A. Pinandito, E. M. Adams J., and K. Arai, "Virtual reality game controlled with user's head and body movement detection using smartphone sensors," ARPN Journal of Engineering and Applied Sciences, vol. 10, no. 20, pp. 9776-9782, Nov. 2015.
[12] Google Cardboard SDK, http://developers.google.com/cardboard. Accessed January 23, 2015.
[13] M. Shoaib, S. Bosch, O. D. Incel, H. Scholten, and P. J. Havinga, "Fusion of smartphone motion sensors for physical activity recognition," Sensors, vol. 14, pp. 10146-10176, 2014. http://dx.doi.org/10.3390/s140610146

AUTHORS

H. Tolle is with the Research Group of Multimedia, Game & Mobile Technology, Informatics Department, Faculty of Computer Science, Brawijaya University, Malang 65145, Indonesia (e-mail: emang@ub.ac.id, herman.saga@gmail.com).

K. Arai is a professor in the Department of Information Science, Saga University, Japan. He has also been an Adjunct Professor of the University of Arizona, USA, since 1998. He has written 33 books and published 500 journal papers (e-mail: arai@is.saga-u.ac.jp).

This research work is funded by the Dikti SAME 2015 project of the Indonesian Ministry of Research, Technology & Higher Education, as a collaborative research project between Brawijaya University, Indonesia and Saga University, Japan. Submitted 22 February 2016. Published as resubmitted by the authors on 12 May 2016.