Validation of a modular and wearable system for tracking fingers movements

ACTA IMEKO
ISSN: 2221-870X
December 2020, Volume 9, Number 4, 157-164

Michela Borghetti1, Paolo Bellitti1, Nicola Francesco Lopomo1, Mauro Serpelloni1, Emilio Sardini1
1 Department of Information Engineering, University of Brescia, via Branze 38, 25123 Brescia, Italy

Section: RESEARCH PAPER

Keywords: finger movement tracking; wearable module; inertial measurement unit; stretch sensor; industrial setting; data validation; opto-electronic systems

Citation: Michela Borghetti, Paolo Bellitti, Nicola Francesco Lopomo, Mauro Serpelloni, Emilio Sardini, Validation of a modular and wearable system for tracking fingers movements, Acta IMEKO, vol. 9, no. 4, article 21, December 2020, identifier: IMEKO-ACTA-09 (2020)-04-21

Section Editor: Francesco Bonavolonta, University of Naples Federico II, Italy

Received November 8, 2019; in final form August 8, 2020; published December 2020

Copyright: This is an open-access article distributed under the terms of the Creative Commons Attribution 3.0 License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Funding: This work was supported through grant PRIN 2015C37B25 by the Italian Ministry of Instruction, University and Research (MIUR)

Corresponding author: Michela Borghetti, e-mail: michela.borghetti@unibs.it

ABSTRACT
Supervising manual operations performed by workers in industrial environments is crucial in a smart factory. Indeed, the production of goods with superior quality at higher throughput rates and reduced costs, supported by Industry 4.0-enabling technologies, relies on the strict control of all resources inside the factory, including workers. This paper presents a protocol for validating a new wearable system for tracking finger movements. The wearable system consists of two measuring modules, worn on the thumb and index finger, that measure flexion and extension of the proximal interphalangeal (PIP) joint by a stretch sensor and rotation of the proximal phalanx (PP) by an inertial measurement unit. A marker-based opto-electronic system is used to validate the proposed device by capturing specific finger movements. Four movements that simulate typical tasks and gestures, such as grasp and pinch, were performed. The maximum root-mean-square error is 3.7 deg for the roll angle of the PP. The resistance change of the stretch sensor with respect to flexion and extension of the PIP joint is 0.47 Ω/deg. These results are useful for data interpretation when the system is adopted to monitor finger movements and gestures.

1. INTRODUCTION
In recent years, Industry 4.0 has radically revolutionised manufacturing processes and industrial production [1]. The rapid increase in the level of automation and the introduction of information and communication technologies into the manufacturing world are enabling more efficient and more flexible production processes that can fabricate higher-quality goods at reduced cost and at high production rates [2].

In this context, industrial settings are transformed into smart factories; information about processes is shared in real time through the Internet of Things and cyber-physical production systems in order to improve efficiency and throughput as well as the quality of the final products [3], [4]. Although one of the main principles of Industry 4.0 is that the cyber-physical system should make decisions and perform tasks as autonomously as possible, the worker remains one of the most important resources inside the factory; in the era of digitalisation and data exchange, the supervision of workers' actions inside the factory is crucial [5]. For example, manual tasks are often required to guarantee flexibility in the production process, especially in semi-automated or non-automated settings, and incorrect movements and actions may significantly affect the production processes [6]. In other situations, workers control and supervise industrial robots that perform repetitive and exhausting tasks or carry out the inspection and maintenance of hazardous and extreme settings in plants [7]. A valuable way to interact with robots without physical contact is the use of hand gestures [8]. A key element in Industry 4.0 is the collaborative robot, which is specially designed to work together with humans. In this context, the workers' movements and position should be monitored continuously to ensure the workers' safety and to improve the collaboration and the production workflow [9]. Finally, assembly tasks are based on the repetitive composition of different parts to produce the final product, and the quality of the product is affected by human errors, especially when the manual operations involve many steps with a specific order and many different operating objects [6], [10]. Indeed, the efficiency of the operation depends on quickly finding the individual parts in the warehouse and on the degree of training enabling the worker to follow the specific assembly order effectively [11]. Furthermore, musculoskeletal injuries known as repetitive motion disorders are the result of the combination of incorrect and repetitive movements in workshops and individual predispositions [12].

Motion tracking systems are currently used in different applications, and they are an ongoing topic of research [13]-[15].
Finger tracking systems are used for movement analysis during the execution of precise actions [16], for object design in combination with virtual reality [17], for rehabilitation purposes [18], [19] and for tracking hand gestures in human-machine interfaces [20]-[22]. Finger tracking approaches are mainly divided into vision-based and contact-based systems [8]. A vision-based system consists of a set of cameras and image processing algorithms. Mechanical constraints are reduced, especially in the case of markerless motion capture systems, and this guarantees good freedom of movement in the space. In order to increase accuracy, several active or passive markers should be placed on the hand while multiple cameras record the displacement of the markers in order to capture all the movements [23]. Opto-electronic systems are extensively used to measure human kinematics (e.g. VICON, BTS, OptiTrack) [24]. In [25], an opto-electronic marker system was used to validate a glove equipped with 11 inertial measurement units. These systems require expensive and complex equipment, an accurate calibration algorithm and a confined space, such as a laboratory [26]. Illumination is a critical condition for properly capturing all the segments [27]; moreover, waste and dust deposited on the lens may significantly affect the measurements. For all these reasons, vision-based systems are rarely adopted for industrial applications, in less-constrained settings or for continuous monitoring [28]. Contact-based systems include wearable devices equipped with sensors and electronics for processing and transmitting measurement data to an external processing unit. They are basically composed of low-cost components, and they can be easily used in many applications and environments. A wearable device suitable for industrial applications should be easily wearable and should comply with workspace safety rules.
Furthermore, the device should not introduce mechanical constraints that limit the task execution. In previous works [11] and [29], we proposed different measurement systems for tracking finger movement, composed of different wearable modules with embedded sensors and electronics. The electronics process the measurement signals and transmit the elaborated data to a mobile device, which displays and collects the data using software specifically designed for this application. The wearable systems can be worn in a minimally invasive way, and we demonstrated that the system configured with two measurement modules could be a valuable solution for many industrial applications by simulating tasks potentially performed by a worker in the industrial field.

In this paper, we show the validation of the data collected by the wearable system proposed in [11] using a marker-based opto-electronic system. In Section 2, we describe the measurement system. In Section 3, the experimental setup and protocol used for the data validation are given, and in Section 4, we show and discuss the experimental results. Finally, in the concluding section, the major findings are summarised.

2. SYSTEM DESCRIPTION
The proposed system (shown in Figure 1) was designed to be modular, light, low-cost, battery powered and cable-free. It is composed of two sections: 1) a wearable system equipped with sensors and electronics for measuring finger motion and transmitting data and 2) a local device (a laptop) for collecting and displaying the measurement data. The wearable system is configurable, and it is composed of wearable and independent measuring modules (shown in Figure 2). The number of modules depends on the application and on the physical constraints. Each module communicates with the laptop directly via Bluetooth, reducing the number of wires and cables that could obstruct the finger movements.
In this work, two modules were used to test and validate the system: one was worn on the thumb and the other on the index finger of the right hand. The module design allows the system to be worn on different fingers of both hands with few changes. The box (dimensions: 36 mm × 25 mm × 10 mm) and the ring that supports this box are made of polylactic acid, and they were made using Fused Deposition Modelling technology for an easy and tailor-made fabrication of the module according to finger size. The box encloses the electronic board and the 40 mAh polymer Li-ion rechargeable battery (20 mm × 11 mm × 3 mm), for a total weight of 14 g. A more extended description of the module is reported in [11].

The finger movements are measured by an inertial measurement unit (IMU) mounted on the electronic board and a piezoresistive sensor. The LSM9DS1, produced by STMicroelectronics, is used as a 9-degrees-of-freedom IMU, which incorporates a 3-axis accelerometer, a 3-axis gyroscope and a 3-axis magnetometer for the determination of the proximal phalanx (PP) orientation. The orientation of each sensor is shown in Figure 3. The three-dimensional (3D) coordinate systems of the accelerometer and gyroscope do not satisfy the right-hand rule. Flexion and extension of the proximal interphalangeal (PIP) joint is monitored by the stretch sensor sold by Image Scientific [30], which is placed over the dorsal side of the finger.

Figure 1. Measuring system for tracking the movements of the index finger and the thumb. In this example, the system is worn on the right hand (index finger and thumb), but it could be worn on the left hand without changes. The wearable system provides information about the flexion/extension of the PIP joint by reading the resistance of the stretch sensors and about the orientation of the proximal phalanges through the IMU's output.
The stretch sensor is made of a conductive rubber whose electrical resistance varies with the applied force and thus with the sensor elongation. The nominal cord resistance is 395 Ω/cm in its unstressed state. One sensor end is anchored to the plastic ring, and the other one is fixed to an adjustable one-wrap hook-and-loop fastener tied around the middle phalanx. The sensor length is determined by finger size. A voltage divider circuit included in the electronic board is used to convert the resistance change into a voltage measurement and thus into a digital signal. An ATmega328P microcontroller on the electronic board supervises the measuring process, including the communication with the laptop. Once the communication with the laptop is established, the microcontroller collects the measurements from the sensors and sends them to the laptop every 45 ms. The resolution of the analogue-digital converter is 3.22 mV/LSB, which corresponds to a resolution in the measurement of sensor resistance of 0.86 Ω/LSB. In order to reduce the power consumption, a Bluetooth Low Energy (BLE) module, an RN4871 from Microchip, is used for wireless communication. The Generic Attribute Profile (GATT) roles are used to exchange data in accordance with the BLE protocol. The module can properly operate with the input voltage of a 3.7 V rechargeable battery thanks to the on-board low-dropout voltage regulator (TPS71533, Texas Instruments), with an overall power consumption of 75 mW. On the laptop side, a LabVIEW Virtual Instrument developed for this specific application collects, stores and displays the data sent by the two modules on the same screen.

3. VALIDATION PROCEDURE
3.1. Experimental setup
A male subject wore the measuring system on the thumb and the index finger of his right hand. The IMUs were calibrated according to the standard calibration procedures (an example is reported in [31]).
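The resistance readout chain described in Section 2 can be sketched as follows. This is a minimal sketch, assuming a 10-bit ADC with a 3.3 V reference (3.3 V / 1024 ≈ 3.22 mV/LSB, consistent with the stated resolution) and a hypothetical fixed divider resistor R_REF; the actual divider topology and resistor value are not given in the paper.

```python
# Sketch: inverting a voltage divider read by the microcontroller's ADC
# to recover the stretch-sensor resistance. R_REF and the divider
# topology are illustrative assumptions, not values from the paper.

V_REF = 3.3          # ADC reference voltage (V); 3.3 V / 1024 LSB = 3.22 mV/LSB
ADC_LEVELS = 1024    # 10-bit analogue-digital converter
R_REF = 1000.0       # hypothetical fixed divider resistor (ohm)

def counts_to_resistance(counts: int) -> float:
    """Invert the divider V = V_REF * R_s / (R_s + R_REF) for R_s (ohm)."""
    v = counts * V_REF / ADC_LEVELS
    if v >= V_REF:   # guard against division by zero at full scale
        raise ValueError("ADC reading at full scale; divider saturated")
    return R_REF * v / (V_REF - v)
```

With this topology, a mid-scale reading (counts = 512) corresponds to a sensor resistance equal to R_REF.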
The relationship between the strain and the resistance change of the stretch sensor is reported in [29] and [32]. The calculated gauge factor was 4.73, and the linear correlation coefficient (R2) was equal to 0.98 in the range 0-10 %. A marker-based opto-electronic system (DX400, BTS Bioengineering S.p.A.) including 8 cameras acquiring at 100 fps and a dedicated marker set was used to provide a ground-truth reference for the joint kinematics. The calibrated acquisition volume was approximately 1.00 m × 1.00 m × 1.00 m. In these tests, the measurements acquired by the proposed system and by the opto-electronic system were post-processed to estimate the orientation and rotation of the fingers.

3.2. Marker Protocol
A configuration of 11 markers was used, covering the palm of the hand, the proximal and distal phalanx of the thumb and the proximal and middle phalanx of the index finger (Figure 4). In particular, three markers were located approximately at the ulna styloid process, the radius styloid process and the dorsum of the hand (i.e. the middle of the third metacarpal bone). The thumb was fitted with four markers: three at the PP (exploiting the thumb board enclosure) and one at the tip of the distal phalanx. Equivalently, four markers were used for the index finger: three markers at the PP (using the index board enclosure) and one on the distal joint of the middle phalanx. For both the static pose and each acquired movement, 3D marker coordinates were computed in every frame (100 fps) of the time sequence and used to calibrate the model and estimate joint kinematics.

3.3. Coordinate Frames and Kinematic Assessment
In order to correctly decompose joint kinematics, during the static phase a specific coordinate frame was defined for each segment involved in the kinematic chain. In particular, the coordinate frame of the hand segment was defined as in Figure 4:
• X-axis is the direction identified by the wrist flexion/extension axis (i.e. the axis connecting the markers placed on the ulnar and radial styloids).
• Z-axis is the axis normal to the hand plane defined by the two markers on the wrist and the marker on the dorsum of the hand.
• Y-axis is the cross-product between the z-axis and the x-axis of the hand.
For both the thumb PP and the index finger PP, the markers placed on the board enclosures and their specific configuration were also used to define the anatomical coordinate systems:
• Y-axis direction is defined by the two markers placed on the longer side of the enclosure.
• X-axis direction is defined by the markers on the shorter side.
• Z-axis is the cross-product between the x-axis and the y-axis.
The anatomical coordinate frames for the distal phalanges were then defined by starting from the board coordinate frames and shifting them by an appropriate quantity, taking into account the dimensions of the boards, the support and the fingers. For the distal phalanx of the thumb and the middle phalanx of the index finger, the coordinate frames were defined by keeping the hand in a neutral posture and considering the coordinate frames aligned with the most proximal ones but moved along the y-axis by the length of the proximal segments themselves. Once each coordinate frame was defined, the local position of each marker was defined.

Figure 2. Coordinate frame and description of the measuring module to be worn on the finger.
Figure 3. Electronic board including the IMU and the electronic components for conditioning, processing and transmitting measurement data. The accelerometer and the gyroscope have a different orientation with respect to the magnetometer, and this aspect is considered in the data processing algorithm used to measure the phalanx orientation.
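The hand-segment frame construction described above (x-axis from the two styloid markers, z-axis normal to the plane of the three markers, y-axis completing a right-handed triad) can be sketched as follows; the marker argument names (u, r, d) and the sign conventions are illustrative assumptions.

```python
import numpy as np

# Sketch of the hand-segment coordinate frame built from three markers:
# ulnar styloid (u), radial styloid (r) and hand dorsum (d).

def hand_frame(u, r, d):
    """Return a 3x3 rotation matrix whose columns are the x, y, z axes."""
    u, r, d = map(np.asarray, (u, r, d))
    x = r - u                      # wrist flexion/extension axis
    x = x / np.linalg.norm(x)
    z = np.cross(x, d - u)         # normal to the plane of the three markers
    z = z / np.linalg.norm(z)
    y = np.cross(z, x)             # completes the right-handed frame
    return np.column_stack((x, y, z))
```

For instance, markers at u = (0, 0, 0), r = (1, 0, 0) and d = (0, 1, 0) give the identity rotation, i.e. a frame aligned with the global axes.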
Then, for the kinematic analysis, the position and orientation of the coordinate frames were estimated by minimising the local position of each marker with respect to its global position. Minimisation was performed through least-squares optimisation. The coordinate frame of the moving IMUs was aligned as shown in Figure 2, and it was aligned with the coordinate frame of the segments (index finger and thumb) detected by the opto-electronic system. The rotation matrix associated with the IMU measurements was calculated after the acquisition by using the Kalman filter algorithm implemented in the Sensor Fusion and Tracking Toolbox (Matlab R2019b). Since the coordinate systems of the accelerometer and of the gyroscope do not satisfy the right-hand rule, the measurements of acceleration in the x-direction were swapped. The orientations of the segments and of the IMUs were referred to the neutral posture: wrist, thumb and index finger were kept unflexed, with the index finger and palm parallel to the ground. Joint kinematics (i.e. angles) was then assessed by decomposing joint rotations following Euler's XYZ notation (defined as roll, pitch, yaw).

3.4. Experimental protocol
During the experiment, the subject was seated in a chair with his forearm supported by a table. Considering the experimental setup and the final application of the proposed system in industrial settings, we selected the four simple movements shown in Figure 5. The selected movements involve the flexion/extension as well as the adduction/abduction of the fingers. Using simple movements and poses is considered a valuable method for validating wearable systems for tracking finger movements [11], [33]. All the motions started with the hand in a neutral position (all fingers were kept extended and closed). In M1, the subject grasped and released a ball, while in M2, the subject closed his hand, simulating the grasp of a small object. In M3, the subject grasped and released a screwdriver handle. Finally, in M4, the subject executed the pinch trial. Each movement was repeated five times. Before starting the test, the subject was asked to flex and extend his wrist (Figure 6) in order to determine the x-axis of the hand segment coordinate frame.

Figure 4. Marker protocol.
Figure 5. Illustration of the movements involved in this study: M1 ball grasp; M2 virtual grasp; M3 screwdriver grasp; M4 pinch.
Figure 6. Wrist flexion and extension for x-axis detection. On the left, the neutral posture is shown.

4. RESULTS AND DISCUSSION
For the opto-electronic system, according to the calibration procedure and to the experimental setup, the median root-mean-square error (RMSE) on the estimation of the single axes of the associated coordinate frame was 0.5 deg, 0.4 deg and 0.6 deg for the x-axis, y-axis and z-axis, respectively. The elaborated measurements of the index finger movements collected during the M1 test are shown in Figure 7 and Figure 8. Here, we analysed the postures of the index finger while the subject grasped and released the ball five times. The orientation of the PP of the index finger measured by the opto-electronic system and by the IMU is shown in Figure 7, where the roll, pitch and yaw angles are the extrinsic rotations of the x-axis, y-axis and z-axis of the index PP frame relative to the index PP frame defined in the neutral posture. As expected, the main rotation is around the x-axis (30 deg maximum), whereas the yaw angle is limited; indeed, the metacarpophalangeal joint has only two degrees of freedom: flexion/extension and ulnar/radial deviation [31]. The summary of the results reported in Table 1 shows the reliability of the IMU measurements, which are comparable to the ones obtained by the opto-electronic system, as highlighted by the RMSE values.
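The agreement metrics used in this comparison can be sketched as follows: the RMSE between the IMU and opto-electronic angle traces, and the nRMSE, here assumed to normalise the RMSE by the span (range of motion) measured by the opto-electronic system, as defined in Table 1.

```python
import numpy as np

# Sketch of the RMSE and nRMSE metrics used to compare the IMU angles
# with the opto-electronic (reference) angles.

def rmse(reference, measured):
    """Root-mean-square error between two angle traces (deg)."""
    reference = np.asarray(reference, dtype=float)
    measured = np.asarray(measured, dtype=float)
    return float(np.sqrt(np.mean((measured - reference) ** 2)))

def nrmse_percent(reference, measured):
    """RMSE normalised by the reference span (range of motion), in %."""
    reference = np.asarray(reference, dtype=float)
    span = float(np.max(reference) - np.min(reference))
    return 100.0 * rmse(reference, measured) / span
```

As a consistency check on Table 1, a roll RMSE of 1.7 deg over a 41.4 deg span gives an nRMSE of about 4.1 %.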
As shown in Figure 8, the stretch sensor measures a resistance change of 30 % with respect to the resistance measured keeping the hand in the neutral posture, and this result is obtained when the opto-electronic system measures a roll angle of 60 deg for the middle phalanx. In this case, the roll angle was calculated with respect to the rotating coordinate frame of the PP, and thus it represents flexion and extension of the PIP joint. The other two angles are not shown because their values are negligible; indeed, the PIP joint has only one degree of freedom [34].

Figure 7. Orientation of the PP of the index finger measured by the opto-electronic system (dashed line) and the IMU (solid line) during the M1 movement.
Figure 8. Rotation of the PIP joint of the index finger measured by the opto-electronic system (on the top) and by the stretch sensor (on the bottom) during the M1 movement.
Figure 9. Rotation of a) the PIP joint of the index finger measured by the opto-electronic system (x-axis) and by the stretch sensor (y-axis); b) the metacarpophalangeal (MCP) joint of the index finger measured by the opto-electronic system (x-axis) and by the IMU (y-axis).

Table 1. Summary of results depicted in Figure 7. For each orientation, span is the maximum range of motion measured by the opto-electronic system, RMSE is the deviation of the measurements obtained by the IMU from the ones obtained by the opto-electronic system, and nRMSE relates the RMSE to the span.

Orientation  Span (deg)  RMSE (deg)  nRMSE (%)
Roll         41.4        1.7         4.1
Pitch        17.1        1.9         11.1
Yaw          10.9        2.7         24.8

The resistance variation of the stretch sensor with respect to the roll angle variation of the PIP joint measured by the opto-electronic system is 0.45 Ω/deg. This value is the slope of the line that relates the measurements of the stretch sensor to the ones of the opto-electronic system, and it is calculated through the linear least-squares method.
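The sensitivity estimate described above can be sketched as a linear least-squares fit of the sensor resistance against the reference angle, together with the Pearson correlation used to check linearity; the data values in the example below are illustrative, not measured data from the paper.

```python
import numpy as np

# Sketch: slope (ohm/deg) of the least-squares line relating the
# stretch-sensor resistance to the PIP roll angle measured by the
# opto-electronic system, plus the Pearson correlation coefficient.

def sensitivity_and_correlation(angle_deg, resistance_ohm):
    angle = np.asarray(angle_deg, dtype=float)
    res = np.asarray(resistance_ohm, dtype=float)
    slope, intercept = np.polyfit(angle, res, 1)   # linear least squares
    r = np.corrcoef(angle, res)[0, 1]              # Pearson correlation
    return slope, intercept, r
```

For perfectly linear illustrative data with a 0.45 Ω/deg slope, the fit recovers the slope and a Pearson correlation of 1.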
The correlation of the two systems is 0.988 (Pearson correlation), and thus the sensor output is linear throughout the PIP flexion/extension range with a negligible variation of the sensitivity, as shown in Figure 9a). The line represents the average of five repetitions, and the shaded area represents the standard deviation. The relationship between the roll angle of the PP of the index finger obtained from the IMU measurements and the one obtained from the opto-electronic system is linear, as shown in Figure 9b). A similar result is obtained for the pitch angle of the PP of the index finger.

The measurements obtained by the proposed measuring system and by the opto-electronic system during all the tests M1, M2, M3 and M4 for the index finger are summarised in Figure 10 and Figure 11. For better viewing, only one repetition of each movement illustrated in Figure 5 is shown, since the results are similar for the entire duration of the test, as proved by the results of the M1 test in Figure 7 and Figure 8. The roll angles measured by the two systems are shown in Figure 10, since the values of the other angles are negligible. Depending on the type of movement, the range of motion of both the phalanges changes. As expected, the maximum rotation is obtained when the subject grasps the handle of the screwdriver (M3 test), while the minimum rotation is obtained during the M1 test. In all of the cases, the median RMSE of the roll angle is lower than 3.7 deg. The roll angles of the PIP joint and the resistance of the stretch sensor are compared in Figure 11 during one repetition of the movements M1, M2, M3 and M4. In these cases as well, the stretch sensor output can be correlated with the flexion and extension of the PIP joint. Considering all of the tests, the sensitivity of the stretch sensor resistance with respect to the roll angle of the PIP joint measured by the opto-electronic system is 0.47 Ω/deg, and it is similar in all the tests, as are the repeatability and the hysteresis.
We verified that the correlations between the measurement systems are the same as shown in Figure 9a) and Figure 9b) for all the movements. Although the flexion of the PIP joint is similar for movements M1 and M4, the combination of the measurements obtained by the IMU and the stretch sensor makes it possible to distinguish these two different gestures, and thus many others.

Figure 10. Comparison between the opto-electronic system (dashed line) and the IMU (solid line) considering the rotation of the PP of the index finger. The roll angle represents the flexion and extension angle of the MCP joint during the following movements: M1 ball grasp; M2 virtual grasp; M3 screwdriver grasp; M4 pinch. An illustration of the movements is shown in Figure 5.
Figure 11. Comparison between the opto-electronic system (on the top) and the stretch sensor (on the bottom) considering the PIP joint of the index finger. The roll angle represents the flexion and extension angle of the PIP joint during one repetition of the following movements: M1 ball grasp; M2 virtual grasp; M3 screwdriver grasp; M4 pinch. An illustration of the movements is shown in Figure 5.

5. CONCLUSIONS
The median RMSE in the roll angle of the PP of the index finger between the IMU and the opto-electronic marker system is between 1.7 deg and 3.7 deg. The smallest difference is observed when the subject grasps and releases a small ball, while the largest difference is observed during the pinch movement. The differences between the two systems could be partially explained by skin movement artifacts that cause relative marker movements and affect the estimation of the segment orientation. Drift errors of the IMUs and possible distortions of the static magnetic field could equally affect the measurements and the estimation of the segments. Finally, the decay of the resistance of the stretch sensor over time influences the measurements.
All these sources of error are widely discussed in the literature [25]. The proposed system equipped with IMUs and stretch sensors could be adopted for monitoring finger movement tasks in a variety of conditions, including industrial settings. For example, wearable electronics for finger tracking are used to evaluate workers' tasks to improve their efficiency and safety [35] and for virtual reality [36]. These devices are usually equipped with IMUs, bend sensors and piezoresistive sensors because of their weight, cost and dimensions [37]; the resulting wearable devices unobtrusively detect finger movements while the workers perform their tasks and provide reliable measurements. In a previous work [11], the proposed system was successfully used to supervise a worker during assembly operations and to identify simple gestures for human-robot collaboration. The validation tests and the results reported in this paper need to be considered for the interpretation of the data when the proposed system is used to monitor finger movements and gestures. The proposed validation protocol is adopted for the evaluation of the system's performance when the system is used to measure thumb and index finger movements. Further implementations will be introduced to detect more complex movements of the hand. A gesture recognition approach based on a deep learning model will be considered for human-computer interaction through the proposed wearable system.

REFERENCES
[1] M. Hermann, T. Pentek, B. Otto, Design principles for Industry 4.0 scenarios, Proc. 49th Hawaii Int. Conf. Systems Sciences (HICSS), Koloa, Hawaii, USA, 5-8 January 2016, pp. 3928-3937. DOI: https://doi.org/10.1109/HICSS.2016.488
[2] V. Villani, L. Sabattini, J. N. Czerniaki, A. Mertens, B. Vogel-Heuser, C. Fantuzzi, Towards modern inclusive factories: A methodology for the development of smart adaptive human-machine interfaces, Proc. 22nd IEEE Int. Conf.
Emerging Technologies and Factory Automation (ETFA), Limassol, Cyprus, 12-15 September 2017, pp. 1-7. DOI: https://doi.org/10.1109/ETFA.2017.8247634
[3] P. Samaranayake, K. Ramanathan, T. Laosirihongthong, Implementing Industry 4.0 – A technological readiness perspective, Proc. IEEE Int. Conf. Industrial Engineering and Engineering Management (IEEM), Singapore, Singapore, 10-13 December 2017, pp. 529-533. DOI: https://doi.org/10.1109/IEEM.2017.8289947
[4] T. M. Fernández-Caramés, P. Fraga-Lamas, A review on human-centered IoT-connected smart labels for the Industry 4.0, IEEE Access 6 (2018), pp. 25939-25957. DOI: https://doi.org/10.1109/ACCESS.2018.2833501
[5] L. Merkel, C. Berger, C. Schultz, S. Braunreuther, G. Reinhart, Application-specific design of assistance systems for manual work in production, Proc. of IEEE Int. Conf. Industrial Engineering and Engineering Management (IEEM), Singapore, Singapore, 10-13 December 2017, pp. 1189-1193. DOI: https://doi.org/10.1109/IEEM.2017.8290080
[6] X. Yin, Y. Gu, S. Qiu, X. Fan, VR&AR combined manual operation instruction system on industry products: A case study, Proc. of Int. Conf. Virtual Reality and Visualization, Shenyang, China, 30-31 August 2014, pp. 65-72. DOI: https://doi.org/10.1109/ICVRV.2014.55
[7] M. Di Castro, M. Ferre, A. Masi, CERNTAURO: A modular architecture for robotic inspection and telemanipulation in harsh and semi-structured environments, IEEE Access 6 (2018), pp. 37506-37522. DOI: https://doi.org/10.1109/ACCESS.2018.2849572
[8] E. A. Arkenbout, J. C. F. de Winter, P. Breedveld, Robust hand motion tracking through data fusion of 5DT data glove and Nimble VR Kinect camera measurements, Sensors 15 (2015), pp. 31644-31671. DOI: https://doi.org/10.3390/s151229868
[9] J. E. Cohn, Ultrasonic bracelet and receiver for detecting position in 2D plane, U.S. Patent 9 881 276, 30 January 2018.
[10] M. Krugh, K. Antani, L. Mears, J.
Schulte, Prediction of defect propensity for the manual assembly of automotive electrical connectors, Procedia Manuf. 5 (2016), pp. 144-157. DOI: https://doi.org/10.1016/j.promfg.2016.08.014
[11] P. Bellitti, M. Bona, M. Borghetti, E. Sardini, M. Serpelloni, Application of a modular wearable system to track workers' fingers movement in industrial environments, 2019 II Workshop on Metrology for Industry 4.0 and IoT (MetroInd4.0 IoT), Naples, Italy, 4-6 June 2019, pp. 137-142. DOI: https://doi.org/10.1109/METROI4.2019.8792859
[12] National Institute of Neurological Disorders and Stroke (NINDS), Repetitive motion disorders information page. Online [Accessed 31 October 2020]. https://www.ninds.nih.gov/disorders/all-disorders/repetitive-motion-disorders-information-page
[13] S. Ma, Q. Liu, M. Fan, P. Sheu, Projected visible light for 3D finger tracking and device augmentation on everyday objects, Internet of Things 6 (2019), 100044. DOI: https://doi.org/10.1016/j.iot.2019.02.004
[14] J. Coupier, S. Hamoudi, S. Telese-Izzi, V. Feipel, M. Rooze, S. Van Sint Jan, A novel method for in-vivo evaluation of finger kinematics including definition of healthy motion patterns, Clin. Biomech. 31 (2016), pp. 47-58. DOI: https://doi.org/10.1016/j.clinbiomech.2015.10.002
[15] V. N. Iliukhin, K. B. Mitkovskii, D. A. Bizyanova, A. A. Akopyan, The development of motion capture system based on Kinect sensor and Bluetooth-gloves, Procedia Eng. 176 (2017), pp. 506-513. DOI: https://doi.org/10.1016/j.proeng.2017.02.350
[16] K. Mitobe, M. Saitoh, N. Yoshimura, Analysis of dexterous finger movements for writing using a hand motion capture system, Proc. of IEEE Int. Conf. Virtual Environments, Human-Computer Interfaces and Measurement Systems (VECIMS), Taranto, Italy, 6-8 September 2010, pp. 60-63. DOI: https://doi.org/10.1109/VECIMS.2010.5609351
[17] R. Alkemade, F. J. Verbeek, S. G.
Lukosch, On the efficiency of a VR hand gesture-based interface for 3D object manipulations in conceptual design, Int. J. Hum. Comput. Int. 33 (2017), pp. 882- 901. DOI: https://doi.org/10.1080/10447318.2017.1296074 [18] J. Condell, K. Curran, T. Quigley, P. Gardiner, M. McNeill, E. Xie, Automated measurements of finger movements in arthritic patients using a wearable data hand glove, Proc. of IEEE Int. Workshop Medical Measurements and Applications (MeMeA), Cetraro, Italy, 29-30 May 2009, pp. 122-126. DOI: https://doi.org/10.1109/MEMEA.2009.5167968 [19] D. Lockery, J. F. Peters, S. Ramanna, B. L. Shay, T. Szturm, Store-and-feedforward adaptive gaming system for hand-finger motion tracking in telerehabilitation, IEEE Trans. Inf. Technol. https://doi.org/10.1109/HICSS.2016.488 https://doi.org/10.1109/ETFA.2017.8247634 https://doi.org/10.1109/IEEM.2017.8289947 https://doi.org/10.1109/ACCESS.2018.2833501 https://doi.org/10.1109/IEEM.2017.8290080 https://doi.org/10.1109/ICVRV.2014.55 https://doi.org/10.1109/ACCESS.2018.2849572 https://doi.org/10.3390/s151229868 https://doi.org/10.1016/j.promfg.2016.08.014 https://doi.org/10.1109/METROI4.2019.8792859 https://www.ninds.nih.gov/disorders/all-disorders/repetitive-motion-disorders-information-page https://www.ninds.nih.gov/disorders/all-disorders/repetitive-motion-disorders-information-page https://doi.org/10.1016/j.iot.2019.02.004 https://doi.org/10.1016/j.clinbiomech.2015.10.002 https://doi.org/10.1016/j.proeng.2017.02.350 https://doi.org/10.1109/VECIMS.2010.5609351 https://doi.org/10.1080/10447318.2017.1296074 https://doi.org/10.1109/MEMEA.2009.5167968 ACTA IMEKO | www.imeko.org December 2020 | Volume 9 | Number 4 | 164 Biomed. 15 (2011), pp. 467-473. DOI: https://doi.org/10.1109/TITB.2011.2125976 [20] B. Fang, F. Sun, H. Liu, C. Liu, 3D human gesture capturing and recognition by the IMMU-based data glove, Neurocomputing 277 (2018), pp. 198-207. DOI: https://doi.org/10.1016/j.neucom.2017.02.101 [21] P. Meier, K. 
Rohrmann, M. Sandner, M. Prochaska, A novel methodology for magnetic hand motion tracking in human- machine interfaces, Proc. of IEEE Int. Conf. Systems, Man, and Cybernetics (SMC), Miyazaki, Japan, 7-10 October 2018, pp. 372- 378. DOI: https://doi.org/10.1109/SMC.2018.00073 [22] A. Kulshreshth, K. Pfeil, J. J. LaViola, Enhancing the gaming experience using 3D spatial user interface technologies, IEEE Comput. Graph. Appl. 37 (2017), pp. 16-23. DOI: https://doi.org/10.1109/MCG.2017.42 [23] M. K. Fleron, N. C. H. Ubbesen, F. Battistella, D. L. Dejtiar, A. S. Oliveira, Accuracy between optical and inertial motion capture systems for assessing trunk speed during preferred gait and transition periods, Sport. Biomech. 18 (2019), pp. 366-377. DOI: https://doi.org/10.1080/14763141.2017.1409259 [24] A. M. Aurand, J. S. Dufour, W. S. Marras, Accuracy map of an optical motion capture system with 42 or 21 cameras in a large measurement volume, J. Biomech. 58 (2017), pp. 237-240. DOI: https://doi.org/10.1016/j.jbiomech.2017.05.006 [25] J. C. van den Noort, H. G. Kortier, N. van Beek, D. H. E. J. Veeger, P. H. Veltink, Measuring 3D hand and finger kinematics - A comparison between inertial sensing and an opto- electronic marker system, PLoS One 11 (2016), pp. 1-16. DOI: https://doi.org/10.1371/journal.pone.0164889 [26] E. van der Kruk, M. M. Reijne, Accuracy of human motion capture systems for sport applications; state-of-the-art review, Eur. J. Sport Sci. 18 (2018), pp. 806-819. DOI: https://doi.org/10.1080/17461391.2018.1463397 [27] S. L. Colyer, M. Evans, D. P. Cosker, A. I. T. Salo, A review of the evolution of vision-based motion analysis and the integration of advanced computer vision methods towards developing a markerless system, Sport. Med. - Open 4 (2018) 24. DOI: https://doi.org/10.1186/s40798-018-0139-y [28] C. Wong, Z. Zhang, B. P. L. Lo, G.-Z. Yang, Wearable sensing for solid biomechanics, IEEE Sens. J. 15 (2015), pp. 2747-2760. 
DOI: https://doi.org/10.1109/JSEN.2015.2393883 [29] P. Bellitti, M. Bona, M. Borghetti, E. Sardini, M. Serpelloni, Sensor analysis for a modular wearable finger 3D motion tracking system, Proc. of Eurosensors, Graz, Austria, 9-12 September 2018, vol. 2, 1051. DOI: https://doi.org/10.3390/proceedings2131051 [30] Images Scientific Instruments Inc., Flexible Stretch Sensors. Online [Accessed 31 October 2020] http://www.imagesco.com/sensors/stretch.pdf [31] V. V. Avrutov, P. M. Aksonenko, P. Henaff, L. Ciarletta, 3D- calibration of the IMU, Proc. of IEEE 37th Int. Conf. Electron. Nanotechnology (ELNANO), Kiev, Ukraine, 18-20 April 2017, pp. 374-379. DOI: https://doi.org/10.1109/ELNANO.2017.7939782 [32] M. Borghetti, M. Serpelloni, E. Sardini, O. Casas, Multisensor system for analyzing the thigh movement during walking, IEEE Sens. J. 17 (2017), pp. 4953-4961. DOI: https://doi.org/10.1109/JSEN.2017.2715857 [33] P. Zhang, Y. Chen, Y. Li, Y. Zhao, W. Wang, S. Li, L. Huang, Flexible piezoresistive sensor with the microarray structure based on self-assembly of multi-walled carbon nanotubes, Sensors 19(22) (2019), 4985, 18 pages. DOI: https://doi.org/10.3390/s19224985 [34] S. Rath, Hand kinematics: Application in clinical practice, Indian J. Plast. Surg. 44 (2011), pp. 178-185 DOI: https://doi.org/10.4103/0970-0358.85338 [35] L. Francés, P. Morer, M. I. Rodriguez, A. Cazón, Design and development of a low-cost wearable glove to track forces exerted by workers in car assembly lines, Sensors 19 (2019), pp. 296-313. DOI: https://doi.org/10.3390/s19020296 [36] B. O’Flynn, J. Torres Sanchez, J. Connolly, J. Condell, K. Curran, P. Gardiner, B. Downes, Integrated smart glove for hand motion monitoring, The Sixth International Conference on Sensor Device Technologies and Applications (IARIA), Venice, Italy, 23-28 August 2015, pp. 45-50. Online [Accessed 04 December 2020] https://ulster- staging.pure.elsevier.com/files/11555412/Sensordevices2015.pdf [37] M. Borghetti, E. Sardini, M. 
Serpelloni, Evaluation of bend sensors for limb motion monitoring, Proc. of IEEE International Symposium on Medical Measurements and Applications (MeMeA), Lisbon, Portugal, 11-12 June 2014, pp. 1-5. DOI: https://doi.org/10.1109/MeMeA.2014.6860127 https://doi.org/10.1109/TITB.2011.2125976 https://doi.org/10.1016/j.neucom.2017.02.101 https://doi.org/10.1109/SMC.2018.00073 https://doi.org/10.1109/MCG.2017.42 https://doi.org/10.1080/14763141.2017.1409259 https://doi.org/10.1016/j.jbiomech.2017.05.006 https://doi.org/10.1371/journal.pone.0164889 https://doi.org/10.1080/17461391.2018.1463397 https://doi.org/10.1186/s40798-018-0139-y https://doi.org/10.1109/JSEN.2015.2393883 https://doi.org/10.3390/proceedings2131051 http://www.imagesco.com/sensors/stretch.pdf https://doi.org/10.1109/ELNANO.2017.7939782 https://doi.org/10.1109/JSEN.2017.2715857 https://doi.org/10.3390/s19224985 https://doi.org/10.4103/0970-0358.85338 https://doi.org/10.3390/s19020296 https://ulster-staging.pure.elsevier.com/files/11555412/Sensordevices2015.pdf https://ulster-staging.pure.elsevier.com/files/11555412/Sensordevices2015.pdf https://doi.org/10.1109/MeMeA.2014.6860127