Bioscience Journal | 2021 | vol. 37, e37039 | ISSN 1981-3163

Usability of free software used for visualization and measurement of digital orthodontic models

Matheus FELTER1, Maurício Guilherme LENZA2, Wendel Minoro Muniz SHIBASAKI3, Rhonan Ferreira SILVA4

1 Postgraduate Program in Dental Science, Federal University of Goiás, Goiânia, Goiás, Brazil.
2 Department of Orthodontics, Federal University of Goiás, Goiânia, Goiás, Brazil.
3 Postgraduate Program in Dental Science, São Paulo State University (UNESP), Araraquara, São Paulo, Brazil.
4 Department of Forensic and Legal Dentistry, Federal University of Goiás, Goiânia, Goiás, Brazil.

Corresponding author: Matheus Felter
Email: contato@matheusfelter.com.br

How to cite: FELTER, M., et al. Usability of free software used for visualization and measurement of digital orthodontic models. Bioscience Journal. 2021, 37, e37039. https://doi.org/10.14393/BJ-v37n0a2021-56824

Abstract

The aim of this study was to evaluate the usability of freely available software programs that allow visualization and measurement of orthodontic digital models. Eighty graduate students of orthodontics were asked to perform pre-defined tasks on a digital model using the 3D Viewer® and 3D-Tool® software. Success in accomplishing the tasks and the time spent were recorded. Finally, each participant answered a questionnaire to express their satisfaction with each program. There was no statistically significant difference between the two programs in task completion rates or in the time participants spent on each task. Both programs were rated "slightly satisfactory" on several criteria. There is scope for manufacturers to optimize orthodontic software, since poor interface usability can discourage orthodontists from adopting new resources that could benefit their daily routine, even when these resources are freely available.

Keywords: Dental Models. Orthodontics. Technology. User-Computer Interface.

1. Introduction

Digital resources can be observed in the orthodontic diagnostic and planning stages, in which digital models tend to gradually replace plaster models, bringing benefits in storage, automation of procedures, and ease of sharing (Horton et al. 2010; Aragón et al. 2016; Rossini et al. 2016). Digital models are produced by scanning dentoalveolar structures. Scanning generates a digital file that requires specific software for visualization and that, in most cases, can be exported in STL (Standard Triangle Language) format. However, these programs represent a significant cost for orthodontists (Camardella and Vilella 2015; Burhardt et al. 2016; Colombo et al. 2018). On the other hand, free programs available on the Internet allow the visualization of digital orthodontic models, provided the models are supplied in STL format. Although STL is not yet the standard format among software used in the study of orthodontic models, it allows users to overcome the cost barrier, identified as a main factor in professionals' non-adoption of digital resources (Zande et al. 2013; Matthews et al. 2016; Park and Laslovich 2016; Jacox et al. 2019). In this way, orthodontists need only consider which software best suits their workflow.
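Since the study hinges on the STL format, a minimal sketch may help readers unfamiliar with it. The snippet below is purely illustrative and not part of the study: it assumes an ASCII STL file (scanned models are often exported as binary STL, which requires different parsing), and the file name "model.stl" is hypothetical.

```python
import math

def read_ascii_stl_vertices(path):
    """Collect the (x, y, z) vertices of an ASCII STL file.

    ASCII STL stores a triangle mesh as plain-text lines; each
    triangle contributes three lines of the form "vertex x y z".
    """
    vertices = []
    with open(path) as f:
        for line in f:
            parts = line.split()
            if parts and parts[0] == "vertex":
                vertices.append(tuple(float(v) for v in parts[1:4]))
    return vertices

def distance(p, q):
    """Euclidean distance between two 3D points; a viewer's ruler
    tool performs essentially this computation between two marked points."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(p, q)))

vertices = read_ascii_stl_vertices("model.stl")  # hypothetical file
print(f"{len(vertices)} vertices loaded")
# A linear measurement, such as a mesiodistal diameter, is the distance
# between two user-marked points on the mesh:
print(f"example distance: {distance(vertices[0], vertices[-1]):.2f} mm")
```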
Many factors may influence satisfaction, efficacy and efficiency in software use: the number of actions necessary to achieve a goal (Bauer et al. 2018); previous experience with the software (Seifert et al. 2017); receiving prior education or training and affinity with technology (Weinerth et al. 2014); provision of feedback by the software (Wozney et al. 2016); consistency between buttons/icons and actions (Joe et al. 2015); a user-friendly interface with readable fonts (O'Malley et al. 2014); the presence of fewer options in the software interface (Grindrod et al. 2014); compatibility with the operating system (Asarbakhsh and Sandars 2013); the chance to correct errors made during software use without losing what has already been done (O'Malley et al. 2014); and the perception of the practical utility of the software (Simblett et al. 2018).

Usability is the attribute that evaluates how easy software is to use, comprising three dimensions: efficacy, efficiency and satisfaction of use (ISO 2018). Employing new technologies in daily work can represent a barrier for professionals and weighs on the choice of whether to use them (Burzynski et al. 2018; Livas et al. 2019). Few studies (Westerlund et al. 2015; Wan Hassan et al. 2016; Felter et al. 2018) have evaluated the usability of software related to digital models in Orthodontics and, for the most part, they do not consider professionals' satisfaction of use. Thus, the aim of the present study is to fill this gap in the literature by evaluating the usability of two free software programs.

2. Material and Methods

This is a cross-sectional, qualitative study in which 80 orthodontic graduate students from five educational institutions were invited to perform three tasks on a single digital model using two different programs. Subsequently, participants were invited to respond to a scaled questionnaire, based on ISONORM 9241 (ISO 2018), expressing their satisfaction with each program. Only participants who had previous contact with model analysis content were included; those who had not, or who had prior experience with three-dimensional orthodontic digital models, were excluded. In addition, for each variable we performed a sample size calculation based on data collected in a pilot test, considering a study power of 80%, a 95% confidence interval and a 5% significance level. The highest value obtained was used.
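For illustration, a power-based sample size calculation of this kind can be outlined in a few lines. The sketch below is not the authors' actual computation: the pilot values are hypothetical (the paper does not report them), and statsmodels' TTestPower is used as one possible tool for a paired comparison at 80% power and 5% significance.

```python
import math
from statsmodels.stats.power import TTestPower

pilot_mean_diff = 3.0  # hypothetical paired mean difference from a pilot (s)
pilot_sd = 9.5         # hypothetical standard deviation of the differences (s)
effect_size = pilot_mean_diff / pilot_sd  # Cohen's d for paired data

# Solve for the number of participants needed to detect this effect
# with 80% power at a two-sided 5% significance level.
n = TTestPower().solve_power(effect_size=effect_size, power=0.80, alpha=0.05)
print(f"required sample size: {math.ceil(n)} participants")
```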
The present study was approved by a local Research Ethics Committee. Study participants, as well as the educational institutions, consented to participate in this research.

Software selection

Software selection was based on an electronic literature search that retrieved 71 papers discussing linear measurements in digital models with at least one program, totaling 17 different programs. We selected only those that were freely available and read the universal STL format, resulting in two programs: A, 3D Viewer® (3Shape, Copenhagen, Denmark), and B, 3D-Tool® (GmbH & Co. KG, Weinheim, Germany).

Description of tasks requested from participants

Three tasks were devised, reflecting common orthodontic practices: manipulation of a model in lateral view (task 1; Figures 1 and 2); identification of the tool for linear measurement (task 2; Figure 3); and measurement of the mesiodistal diameter of the teeth, from first molar to first molar (task 3; Figure 4). No previous instruction was given to participants for tasks 1 and 2. A time limit of 30 seconds was established for tasks 1 and 2, whereas task 3 had no time limit. All tasks started with a digital model open on the initial screen of each program. Success rates, as well as the time spent on each task, were recorded.

Data analysis

Participants' profiles were described in terms of gender, age, number of hours spent on a computer per week, and number of orthodontic patients seen monthly. Collected data were tabulated in Microsoft Excel and analyzed with SPSS (IBM SPSS Statistics for Windows, version 25.0; IBM Corp., Armonk, NY). Success rates in accomplishing the tasks were compared between the Software A and Software B groups using the McNemar test, and times spent on each task were compared with the t test. The statistical analysis considered a significance level of 5%, a study power of 80% and a 95% confidence interval. A Kolmogorov-Smirnov test was also performed to evaluate the normality of the distributions. Answers obtained in the questionnaires were analyzed both quantitatively and qualitatively. To facilitate a comparative view of the responses, scores from 1 to 7 were assigned to each question, and mean values and standard deviations were obtained. A category was also assigned to each score, making it possible to describe the data categorically (1, very bad; 2, bad; 3, slightly bad; 4, neutral; 5, slightly satisfactory; 6, satisfactory; 7, very satisfactory).
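For readers wishing to reproduce this analysis pipeline, the sketch below illustrates the tests named above on made-up data; it is not the authors' SPSS procedure. It assumes the t test was paired, since the same participants used both programs, and uses Python's scipy and statsmodels as stand-ins for SPSS.

```python
import numpy as np
from scipy import stats
from statsmodels.stats.contingency_tables import mcnemar

# Hypothetical paired completion of a task (1 = success) by the same participants.
done_a = np.array([1, 1, 0, 1, 1, 0, 1, 1])
done_b = np.array([1, 0, 1, 1, 1, 1, 1, 0])
# McNemar's test compares paired nominal outcomes via a 2x2 agreement table.
table = [[np.sum((done_a == 1) & (done_b == 1)), np.sum((done_a == 1) & (done_b == 0))],
         [np.sum((done_a == 0) & (done_b == 1)), np.sum((done_a == 0) & (done_b == 0))]]
print("McNemar p =", mcnemar(table, exact=True).pvalue)

# Hypothetical times (s) spent on a task in each program by the same participants.
time_a = np.array([17.2, 21.5, 12.8, 19.0, 16.4, 15.1, 22.3, 18.8])
time_b = np.array([15.9, 19.7, 14.2, 17.5, 15.0, 16.8, 20.1, 16.3])

# Kolmogorov-Smirnov normality check, here applied to standardized paired
# differences as one option (strictly, estimating the parameters first calls
# for a Lilliefors correction).
diffs = time_a - time_b
z = (diffs - diffs.mean()) / diffs.std(ddof=1)
print("KS p =", stats.kstest(z, "norm").pvalue)

# Paired t test on the times.
print("t-test p =", stats.ttest_rel(time_a, time_b).pvalue)

# Score-to-category mapping used for the questionnaire responses.
categories = {1: "very bad", 2: "bad", 3: "slightly bad", 4: "neutral",
              5: "slightly satisfactory", 6: "satisfactory", 7: "very satisfactory"}
print(categories[round(np.mean([5, 6, 5, 6, 6]))])
```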
Figure 1. Initial view of the model (software A) and indication of the icon participants could click to complete task 1, in addition to being able to click, hold and drag the model.

Figure 2. Initial view of the model (software B) and indication of the icons participants could click to complete task 1, in addition to clicking, holding and dragging the model.

Figure 3. View of software A and B immediately after selecting the tool to perform measurements (note the arrow pointing to a second button click required in software B to remove the distortion caused in the model by the first click).

Figure 4. Marking of points for measuring linear distance in software A and B (note the presentation of the marked points on the model image and the button to "clean" marked points, allowing new ones to be marked).

3. Results

A total of 109 graduate students in Orthodontics were invited to participate in this research; 29 declined, resulting in a final sample of 80. Participants were between 22 and 48 years of age (mean 28.7, standard deviation 6.2); 14 were men (17.5%) and 66 were women (82.5%). Regarding weekly computer use, responses ranged from zero hours (43 participants, or 54%) to forty hours (1 participant), with a mean of 5.1 hours (standard deviation 8.2). Among those who used a computer for at least one hour per week, an average of 1 (±2) software program was reported. Participants also reported how many orthodontic patients they attended monthly: responses ranged from zero (27, or 34% of participants) to five hundred among the 53 participants (66%) who reported attendances, with a mean of 52 (standard deviation 91) monthly patients.

Tables 1 and 2 show the results obtained in terms of efficacy (task completion) and efficiency (time spent on each task) for both groups (Software A and Software B). In task 3, which consisted of measuring the mesiodistal diameter of the teeth, all participants were able to complete the task once instructions were given. Questionnaire answers are summarized in Figures 5-11; the values shown are the average scores obtained for each question (maximum value = 7), and the overall average score of each group accompanies the software name.

Table 1. Completion of tasks (efficacy) by the participants.

             Software A          Software B
             Yes      No         Yes      No         p value
Task 1       80%      20%        86.3%    13.7%      0.345
Task 2       52.5%    47.5%      33.8%    66.2%      0.163
Task 3       100%     0          100%     0          -

Table 2. Mean value (standard deviation) for time spent on tasks (s) and tooth measurements (mm).

                                                     Software A     Software B     p value
Time spent on task 1                                 17.7 (9.4)     15.9 (9.4)     0.191
Time spent on task 2                                 22.8 (9.7)     25.7 (7.5)     0.252
Time spent on task 3                                 112 (39)       108 (48)       0.553
  Min. and max. values                               66 and 249     50 and 257     -
Dental measurements (first molar to first molar)     89.21 (5.3)    86.8 (9.5)     0.798
  Min. and max. values                               72.8-110.3     75.2-120.3     -

Figure 5. First group of questions, adequacy to the task (mean scores: Software A, 5.5; Software B, 5.5).

Figure 6. Second group of questions, self-description (Software A, 5.1; Software B, 5.1).

Figure 7. Third group of questions, controllability (Software A, 4.9; Software B, 5.6).

Figure 8. Fourth group of questions, compliance with user expectations (Software A, 6.1; Software B, 6.0).

Figure 9. Fifth group of questions, error tolerance (Software A, 5.4; Software B, 5.1).

Figure 10. Sixth group of questions, support for individualization (Software A, 5.0; Software B, 4.5).

Figure 11. Seventh group of questions, appropriateness to learning (Software A, 5.6; Software B, 5.1).

4. Discussion

Software A (3D Viewer®) and B (3D-Tool®) were evaluated for their usability. Efficacy, which referred to the ability of participants to complete three pre-selected tasks, was tested, with most participants achieving the desired results in both programs, except for task 2 in software B. The ability to click and drag a model would be more intuitive than the other options for accomplishing this task (Westerlund et al. 2015); however, difficulties were observed.
The first difficulty concerned the mousepad, which was also observed in the usability test carried out by Joe et al. (2015). Handling it was relatively difficult for some users, so that small movements could cause undesired turns in the model's position. In addition, in software A, clicking and dragging the model had to be done with the mousepad's right button. It also became evident that some participants did not relate buttons, icons and texts to their actions, since both programs were rated "slightly satisfactory" for the understandability of their interfaces. Westerlund et al. (2015) compared four programs for usability and also reported that their participants had difficulty locating desired functionalities because of this discrepancy. A divergence between what a button represents and what its action produces creates confusion during use (Richardson et al. 2017). Another important factor to remember is that, in the present study, neither program offered the participants' native language as an option. While the technology taught in orthodontic courses remains analog (Shastry and Park 2014), this adaptation is unlikely to be prioritized by manufacturers; nevertheless, language needs to be considered in order to improve user satisfaction in orthodontics. Facing difficulties in performing basic, necessary tasks can cause stress and frustration in professionals (De Góes et al. 2015; Roman et al. 2017), reducing their motivation to use digital models, even when there is no additional cost to the workflow.

When software efficiency was evaluated, no significant differences were observed. For Grunheid et al. (2014), only differences between 20 and 25% would be relevant; by this parameter, software A and B performed similarly (approximately 2 minutes per arch in both). It is worth noting that no statistically significant differences were found between the measurements obtained in the two programs, although the largest measurement, 120.33 mm, was obtained in software B, compared with 110.32 mm in software A. It would also be possible to relate this difference to unfriendly software interfaces, since a friendly interface reduces the professional's chances of making mistakes during use and thus supports good user satisfaction. The possibility of fixing mistakes with little effort was considered "slightly satisfactory" for both programs, with software B rated worse.

The two programs had a predictable response time; that is, as soon as participants performed an action, such as marking two points, they instantly obtained a response, such as a tooth measurement. This was the best-rated characteristic of both programs. When interaction within a program is slow, a professional may believe that his or her action did not work and seek other, ineffective ways to achieve the desired goal (Joe et al. 2015). This workflow breakdown can generate errors, preventing task completion (O'Malley et al. 2014). However, this did not happen in either program.
In the evaluation of software support for individualization, neither program was considered fully satisfactory for use by people with different levels of knowledge, a finding that corroborates several authors (Westerlund et al. 2015; Seifert et al. 2017), who also reported greater ease of use among more experienced users.

Finally, software A was considered "satisfactory" with respect to the estimated time needed to learn how to use it. It was also considered "satisfactory" with respect to encouraging experimentation with its functionalities, the amount of detail that must be memorized to repeat a task, and ease of use in general. Meanwhile, software B was considered "slightly satisfactory", demonstrating greater satisfaction of use for software A in this group of questions. Training software users proves to be fundamental for faster and safer use (Weinerth et al. 2014; Seifert et al. 2017). Even so, it is important to be aware that not every professional is willing to face a new learning curve, even when the software is freely available. It is possible that the replacement of plaster models with digital models will occur only when postgraduate courses in Orthodontics begin to teach primarily a digital workflow. This would reduce the learning curve and generate greater awareness that this format also meets the needs of orthodontists. In this context, it must be considered that what seems more technological or advanced is not necessarily what is needed. Software can be perceived differently by different users. It is up to each professional to choose the tools that are most satisfactory within their own workflow, while respecting two essential criteria for using a diagnostic resource: accuracy and reliability, both already proven for digital orthodontic models (Rossini et al. 2016).

5. Conclusions

The two programs evaluated were considered slightly satisfactory in six of the seven evaluated criteria. It is suggested that manufacturers provide adequate training to professionals and optimize software interfaces to improve the user experience. There is a strong need to test software with potential users before release, anticipating possible problems or difficulties. Doing so could encourage orthodontists to adopt new resources that would benefit their day-to-day routines, even when these are freely available.

Authors' Contributions: FELTER, M.: conception and design, acquisition of data, analysis and interpretation of data, drafting the article; LENZA, M.G.: analysis and interpretation of data; SHIBASAKI, W.M.M.: conception and design, analysis and interpretation of data; SILVA, R.F.: conception and design, drafting the article. All authors have read and approved the final version of the manuscript.

Conflicts of Interest: The authors declare no conflicts of interest.

Ethics Approval: Approved by the Research Ethics Committee of the Federal University of Goiás. CAAE: 82443617.6.0000.5083.

Acknowledgments: The authors would like to thank the Brazilian agencies CAPES (Coordenação de Aperfeiçoamento de Pessoal de Nível Superior - Brasil), Finance Code 001, and CNPq (Conselho Nacional de Desenvolvimento Científico e Tecnológico - Brasil), Finance Code 001, for funding this study.

References

ARAGÓN, M.L.C., et al. Validity and reliability of intraoral scanners compared to conventional gypsum models measurements: a systematic review. European Journal of Orthodontics.
2016, 38(4), 429-434. https://doi.org/10.1093/ejo/cjw033

ASARBAKHSH, M. and SANDARS, J. E-learning: the essential usability perspective. The Clinical Teacher. 2013, 10(1), 47-50. https://doi.org/10.1111/j.1743-498X.2012.00627.x

BAUER, A.M., et al. Applying the principles for digital development: case study of a smartphone app to support collaborative care for rural patients with posttraumatic stress disorder or bipolar disorder. Journal of Medical Internet Research. 2018, 20(6), e10048. https://doi.org/10.2196/10048

BURHARDT, L., et al. Treatment comfort, time perception and preference for conventional and digital impression techniques: a comparative study in young patients. American Journal of Orthodontics and Dentofacial Orthopedics. 2016, 150(2), 261-267. https://doi.org/10.1016/j.ajodo.2015.12.027

BURZYNSKI, J.A., et al. Comparison of digital intraoral scanners and alginate impressions: time and patient satisfaction. American Journal of Orthodontics and Dentofacial Orthopedics. 2018, 153(4), 534-541. https://doi.org/10.1016/j.ajodo.2017.08.017

CAMARDELLA, L.T. and VILELLA, O.V. Modelos digitais em Ortodontia: novas perspectivas, métodos de confecção, precisão e confiabilidade [Digital models in Orthodontics: new perspectives, fabrication methods, accuracy and reliability]. Revista Clínica de Ortodontia Dental Press. 2015, 14(2), 76-84.

COLOMBO, G., RIZZI, C. and REGAZZONI, D. 3D environment for the design of medical devices. International Journal on Interactive Design and Manufacturing. 2018, 12, 699-715.

DE GÓES, F.D.O.S., et al. Educational technology "Anatomy and vital signs": evaluation study of content, appearance and usability. International Journal of Medical Informatics. 2015, 84(11), 982-987. https://doi.org/10.1016/j.ijmedinf.2015.06.005

FELTER, M., et al. Comparative study of the usability of two software programs for visualization and analysis of digital orthodontic models. Journal of Dental Research, Dental Clinics, Dental Prospects. 2018, 12(2), 213-220. https://doi.org/10.15171/joddd.2018.033

GRINDROD, K.A., LI, M. and GATES, A. Evaluating user perceptions of mobile medication management applications with older adults: a usability study. JMIR Mhealth Uhealth. 2014, 2(1), e11. https://doi.org/10.2196/mhealth.3048

GRUNHEID, T., et al. Accuracy, reproducibility, and time efficiency of dental measurements using different technologies. American Journal of Orthodontics and Dentofacial Orthopedics. 2014, 145(5), 157-164. https://doi.org/10.1016/j.ajodo.2013.10.012

HORTON, H.M.I., et al. Technique comparison for efficient orthodontic tooth measurements using digital models. Angle Orthodontist. 2010, 80(2), 254-261. https://doi.org/10.2319/041709-219.1

ISO. ISO NORM 9241-11: Ergonomics of human–system interaction – Part 11 Usability: Definitions and concepts. 2018. Available from: https://www.iso.org/standard/63500.html

JACOX, L.A., et al. Understanding technology adoption by orthodontists: a qualitative study. American Journal of Orthodontics and Dentofacial Orthopedics. 2019, 155(3), 432-442. https://doi.org/10.1016/j.ajodo.2018.08.018

JOE, J., et al. The use of Think-Aloud and Instant Data Analysis in evaluation research: exemplar and lessons learned. Journal of Biomedical Informatics. 2015, 56, 284-291. https://doi.org/10.1016/j.jbi.2015.06.001

LIVAS, C., et al.
Concurrent validity and reliability of cephalometric analysis using smartphone apps and computer software. Angle Orthodontist. 2019, 89(6), 889-896. https://doi.org/10.2319/021919-124.1

MATTHEWS, D.C., et al. Factors influencing adoption of new technologies into dental practice: a qualitative study. JDR Clinical & Translational Research. 2016, 1(1), 77-85. https://doi.org/10.1177/2380084415627129

O'MALLEY, G., et al. Exploring the usability of a mobile app for adolescent obesity management. JMIR Mhealth Uhealth. 2014, 2(2), e29. https://doi.org/10.2196/mhealth.3262

PARK, J.H. and LASLOVICH, J. Trends in the use of digital study models and other technologies among practicing orthodontists. Journal of Clinical Orthodontics. 2016, 50(7), 413-419.

RICHARDSON, S., et al. "Think aloud" and "Near live" usability testing of two complex clinical decision support tools. International Journal of Medical Informatics. 2017, 106, 1-8. https://doi.org/10.1016/j.ijmedinf.2017.06.003

ROMAN, L.C., et al. Navigation in the electronic health record: a review of the safety and usability literature. Journal of Biomedical Informatics. 2017, 67, 69-79. https://doi.org/10.1016/j.jbi.2017.01.005

ROSSINI, G., et al. Diagnostic accuracy and measurement sensitivity of digital models for orthodontic purposes: a systematic review. American Journal of Orthodontics and Dentofacial Orthopedics. 2016, 149(2), 161-170. https://doi.org/10.1016/j.ajodo.2015.06.029

SEIFERT, A., et al. The use of mobile devices for physical activity tracking in older adults' everyday life. Digital Health. 2017, 3, 1-12. https://doi.org/10.1177/2055207617740088

SHASTRY, S. and PARK, J.H. Evaluation of the use of digital study models in postgraduate orthodontic programs in the United States and Canada. Angle Orthodontist. 2014, 84(1), 62-67. https://doi.org/10.2319/030813-197.1

SIMBLETT, S., et al. Barriers to and facilitators of engagement with remote measurement technology for managing health: systematic review and content analysis of findings. Journal of Medical Internet Research. 2018, 20(7), e10480. https://doi.org/10.2196/10480

WAN HASSAN, W.N., et al. Assessing agreement in measurements of orthodontic study models: digital caliper on plaster models vs 3-dimensional software on models scanned by structured-light scanner. American Journal of Orthodontics and Dentofacial Orthopedics. 2016, 150(5), 886-895. https://doi.org/10.1016/j.ajodo.2016.04.021

WEINERTH, K., et al. Concept maps: a useful and usable tool for computer-based knowledge assessment? A literature review with a focus on usability.
Computers & Education. 2014, 78, 201-209. https://doi.org/10.1016/j.compedu.2014.06.002

WESTERLUND, A., et al. Digital casts in orthodontics: a comparison of 4 software systems. American Journal of Orthodontics and Dentofacial Orthopedics. 2015, 147(4), 509-516. https://doi.org/10.1016/j.ajodo.2014.11.020

WOZNEY, L., et al. Usability, learnability and performance evaluation of Intelligent Research and Intervention Software: a delivery platform for eHealth interventions. Health Informatics Journal. 2016, 22(3), 730-743. https://doi.org/10.1177/1460458215586803

ZANDE, M.M., GORTER, R.C. and WISMEIJER, D. Dental practitioners and a digital future: an initial exploration of barriers and incentives to adopting digital technologies. British Dental Journal. 2013, 215(11), E21. https://doi.org/10.1038/sj.bdj.2013.1146

Received: 19 August 2020 | Accepted: 17 October 2020 | Published: 2 July 2021

This is an Open Access article distributed under the terms of the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.